I am Ayad Ramadhan Ali


Assistant Lecturer

Specializations

Numerical Analysis

Education

M.Sc. in Mathematics

Mathematics, University of Zakho

2022

B.Sc. in Mathematics

Mathematics, University of Zakho

2015

Academic Titles

Assistant Lecturer

2023-10-25

Researcher

2022-12-20

Assistant Researcher

2017-10-01

Publications

Jambura Journal of Mathematics (Volume 8, Issue 1)
A Hybrid Grey Wolf Optimizer-Zebra Optimization Algorithm for Solving Optimization Problems

Metaheuristic algorithms are widely applied to complex optimization problems, yet many suffer from premature convergence or slow search efficiency. To address these limitations, this paper proposes a new hybrid algorithm, Grey Wolf Optimizer–Zebra Optimization Algorithm (GWO–ZOA). The algorithm integrates the exploitation ability of the Grey Wolf Optimizer with the exploration capability of the Zebra Optimization Algorithm in a sequential framework, thereby enhancing both convergence accuracy and global search ability. The performance of GWO–ZOA is first evaluated on 23 standard benchmark functions, where it demonstrates competitive results in both unimodal and multimodal landscapes. Further validation is carried out on the CEC2017 and CEC2020 benchmark suites, confirming the hybrid’s robustness across higher-dimensional and more challenging composite problems. In all three benchmark categories, the Friedman statistical test ranks GWO–ZOA first among the compared algorithms, highlighting its superior overall performance. Finally, the algorithm is applied to two real-world engineering design problems, where it consistently achieves high-quality feasible solutions and demonstrates practical effectiveness. These results confirm that the proposed GWO–ZOA algorithm is both robust and reliable for solving diverse and complex optimization tasks.
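
The abstract describes a sequential GWO-to-ZOA coupling but not its exact form. Below is a minimal Python sketch of one plausible reading, assuming each iteration runs a standard GWO leader-following update followed by a simplified ZOA foraging step toward the current best member; the function name gwo_zoa and all parameter choices are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch of a sequential GWO -> ZOA hybrid (assumed coupling).
import numpy as np

def gwo_zoa(obj, dim, lb, ub, pop_size=30, max_iter=500, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop_size, dim))
    fit = np.apply_along_axis(obj, 1, X)

    for t in range(max_iter):
        a = 2.0 * (1 - t / max_iter)           # GWO control parameter: 2 -> 0
        order = np.argsort(fit)
        alpha, beta, delta = X[order[:3]]      # three best wolves lead the pack

        # --- GWO phase: exploitation around the three leaders ---
        for i in range(pop_size):
            cand = np.zeros(dim)
            for leader in (alpha, beta, delta):
                A = 2 * a * rng.random(dim) - a
                C = 2 * rng.random(dim)
                D = np.abs(C * leader - X[i])
                cand += (leader - A * D) / 3.0
            cand = np.clip(cand, lb, ub)
            f = obj(cand)
            if f < fit[i]:
                X[i], fit[i] = cand, f

        # --- ZOA phase: simplified foraging step toward the pioneer (best) zebra ---
        pioneer = X[np.argmin(fit)]
        for i in range(pop_size):
            I = rng.integers(1, 3)             # intensity factor in {1, 2}
            cand = X[i] + rng.random(dim) * (pioneer - I * X[i])
            cand = np.clip(cand, lb, ub)
            f = obj(cand)
            if f < fit[i]:
                X[i], fit[i] = cand, f

    best = np.argmin(fit)
    return X[best], fit[best]
```

For example, gwo_zoa(lambda x: float(np.sum(x ** 2)), dim=30, lb=-100.0, ub=100.0) drives the sphere function (the usual F1 of the 23 classical benchmarks) toward its minimum at the origin.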

 2026-01
Journal of Applied Computer Science & Mathematics (Volume 19, Issue 2)
A Modified Sine–Cosine Algorithm with Improved Convergence for Solving Optimization Problems

Swarm intelligence–based metaheuristics have emerged as powerful tools for solving complex optimization problems due to their adaptability and ease of implementation. Among them, the sine–cosine algorithm (SCA) is a well-known method, but it often suffers from slow convergence and premature stagnation in local optima. To address these limitations, this study introduces a modified sine–cosine algorithm (MSCA) that incorporates an adaptive operator to achieve a better balance between global exploration and local exploitation. The proposed MSCA was extensively evaluated using 23 classical benchmark functions, categorized into unimodal, multimodal, and fixed-dimension multimodal groups. Its performance was benchmarked against several state-of-the-art algorithms and the standard SCA. Experimental results demonstrate that MSCA consistently outperforms the competitor algorithms in terms of convergence speed, accuracy, and robustness. Furthermore, statistical validation using the Wilcoxon rank-sum test and Friedman test confirms the significant superiority and scalability of MSCA across high-dimensional search spaces. Overall, the proposed MSCA offers a reliable and effective optimization framework with strong potential for addressing diverse and large-scale real-world applications.
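
As a reference for the description above, here is a minimal sketch of the standard SCA position update, with an assumed adaptive operator (an exponential decay of the amplitude r1) standing in for the paper's unspecified modification; the name msca and every constant are illustrative.

```python
# Standard SCA update with an *assumed* adaptive amplitude schedule.
import numpy as np

def msca(obj, dim, lb, ub, pop_size=30, max_iter=500, a=2.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop_size, dim))
    fit = np.apply_along_axis(obj, 1, X)
    best = X[np.argmin(fit)].copy()
    best_f = fit.min()

    for t in range(max_iter):
        # Assumed adaptive operator: exponential decay instead of the
        # classical linear schedule r1 = a - t * a / max_iter.
        r1 = a * np.exp(-4.0 * t / max_iter)
        for i in range(pop_size):
            r2 = rng.uniform(0, 2 * np.pi, dim)
            r3 = rng.uniform(0, 2, dim)
            r4 = rng.random(dim)
            step = np.where(
                r4 < 0.5,
                r1 * np.sin(r2) * np.abs(r3 * best - X[i]),  # sine branch
                r1 * np.cos(r2) * np.abs(r3 * best - X[i]),  # cosine branch
            )
            X[i] = np.clip(X[i] + step, lb, ub)
            fit[i] = obj(X[i])
            if fit[i] < best_f:
                best, best_f = X[i].copy(), fit[i]

    return best, best_f
```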

 2025-11
IJISCS (International Journal of Information System and Computer Science) (Volume 9, Issue 2)
A Novel Approach: Three-Group Exploration Strategy Algorithm for Solving Optimization Problems

In this study, we present a novel optimization technique, known as the Three-Group Exploration Strategy (TGES) algorithm, inspired by the collaborative group dynamics often seen in problem-solving. We performed extensive testing on 26 widely recognized benchmark functions, providing a rigorous comparison between TGES and several well-established optimization algorithms. The results highlight TGES's effectiveness in finding optimal solutions with high reliability and accuracy. Furthermore, the practical applicability of TGES is demonstrated by successfully solving six real-world engineering problems, showcasing its adaptability and robustness. The experimental results indicate that TGES not only exhibits superior optimization performance but also achieves faster convergence and higher solution quality compared to several leading algorithms. This establishes the TGES algorithm as a strong and adaptable tool for solving a variety of engineering optimization problems.
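
The abstract does not disclose TGES's internal mechanics, so the sketch below is purely hypothetical: it only illustrates the general idea of a population partitioned into three groups with distinct search roles (exploit, balance, explore), not the paper's actual update rules.

```python
# Hypothetical illustration of a three-group search strategy (not TGES itself).
import numpy as np

def three_group_step(X, fit, obj, lb, ub, rng):
    """One illustrative iteration over a population split into three groups."""
    n, dim = X.shape
    order = np.argsort(fit)
    best = X[order[0]]
    groups = np.array_split(order, 3)

    for role, idx in zip(("exploit", "balance", "explore"), groups):
        for i in idx:
            if role == "exploit":      # small moves toward the best solution
                cand = X[i] + rng.normal(0, 0.1, dim) * (best - X[i])
            elif role == "balance":    # move toward a random peer
                j = rng.integers(n)
                cand = X[i] + rng.random(dim) * (X[j] - X[i])
            else:                      # explore: random restart in the bounds
                cand = rng.uniform(lb, ub, dim)
            cand = np.clip(cand, lb, ub)
            f = obj(cand)
            if f < fit[i]:             # greedy selection keeps improvements
                X[i], fit[i] = cand, f
    return X, fit
```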

 2025-08
Journal of Duhok University (Volume 25, Issue 2)
Hybridization of Gradient-Based Methods with Genetic Algorithm for Solving Systems of Linear Equations

In this paper, we propose two hybrids of gradient-based methods with a genetic algorithm for solving systems of linear equations with fast convergence. The first proposed hybrid method uses the steepest descent method and the second the Cauchy-Barzilai-Borwein (CBB) method. These algorithms are based on minimizing the residual of the solution, which acts as the genetic fitness. They are compared with the plain genetic algorithm and the standard gradient-based methods to show their accuracy and convergence speed. Since the conjugate gradient method is recommended for solving large, sparse, symmetric positive definite systems, we also compare the numerical results of our proposed algorithms with this method. The numerical results demonstrate the robustness and efficiency of the proposed algorithms. Moreover, we observe that our hybridization of the CBB method with the genetic algorithm gives more accurate results with faster convergence than the other mentioned methods in all given cases.
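
One plausible reading of this hybridization, sketched under explicit assumptions: a genetic algorithm evolves candidate solutions ranked by residual norm, and each offspring is refined with one exact-line-search steepest-descent step. The helper sd_step, the blend crossover, and all parameters are illustrative; the paper's CBB variant would use a different step-length rule inside sd_step.

```python
# Hedged sketch of a steepest descent + GA hybrid for A x = b (A SPD).
import numpy as np

def sd_step(A, b, x):
    """One steepest-descent step with exact line search on
    f(x) = 0.5 x^T A x - b^T x, whose gradient is g = A x - b."""
    g = A @ x - b
    denom = g @ (A @ g)
    if denom == 0:
        return x
    return x - (g @ g) / denom * g

def hybrid_sd_ga(A, b, pop_size=20, gens=200, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n = len(b)
    pop = rng.normal(0, 1, (pop_size, n))
    res = lambda x: np.linalg.norm(A @ x - b)    # fitness: residual norm

    for _ in range(gens):
        fits = np.array([res(x) for x in pop])
        order = np.argsort(fits)
        parents = pop[order[: pop_size // 2]]    # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            w = rng.random(n)
            child = w * p1 + (1 - w) * p2        # blend crossover
            child += rng.normal(0, sigma, n)     # Gaussian mutation
            children.append(sd_step(A, b, child))  # gradient refinement
        pop = np.vstack([parents, children])

    fits = np.array([res(x) for x in pop])
    return pop[np.argmin(fits)]
```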

 2022-11
General Letters in Mathematics (GLM) (Volume 12, Issue 2)
New search direction of steepest descent method for solving large linear systems

The steepest descent (SD) method is well known as the simplest method in optimization. In this paper, we propose a new SD search direction for solving systems of linear equations Ax = b. We also prove that the proposed SD method with exact line search satisfies the descent condition and possesses global convergence properties. The proposed method is motivated by previous work on the SD method by Zubai'ah-Mustafa-Rivaie-Ismail (ZMRI) [2]. Numerical comparisons with the classical SD algorithm and the ZMRI algorithm show that this algorithm is very effective in terms of the number of iterations (NOI) and CPU time.
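
The proposed search direction itself is not reproduced here; for reference, this is the classical SD baseline it is compared against, for Ax = b with A symmetric positive definite and the exact line search alpha_k = (r_k^T r_k) / (r_k^T A r_k).

```python
# Classical steepest descent for A x = b (A symmetric positive definite).
import numpy as np

def steepest_descent(A, b, x0=None, tol=1e-10, max_iter=10_000):
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    for k in range(max_iter):
        r = b - A @ x                 # residual = negative gradient
        rr = r @ r
        if np.sqrt(rr) < tol:         # converged: ||r|| below tolerance
            return x, k               # also report the iteration count
        alpha = rr / (r @ (A @ r))    # exact line search step length
        x = x + alpha * r
    return x, max_iter
```

Returning the iteration count alongside the solution mirrors the NOI measure used in the paper's comparisons.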

 2022-08

Theses

2022
New Gradient Optimization and Genetic Algorithms Hybridized for Fast Convergence

Hybrid Between New Gradient Methods and Genetic Algorithm for Solving Linear Optimization Problems

 2026

Training Courses

2023-01-01 – 2023-06-28
Pedagogy Training Course

 2023
2021-09-05 – 2021-10-27
Pre-Intermediate

Training Course on English

 2021
2020-10-29 – 2020-12-22
Elementary

Training Course on English

 2020