
Academic Projects

  • Project Title: Optimal Control for 1D Convection-Diffusion Equation. (Submitted on May 08, 2019)

       Course: MATH 4553: Linear and Nonlinear Programming (Oklahoma State University).

       Instructor: Dr. Weiwei Hu.

       Summary: 

In this project, an optimal temperature tracking problem is studied in which the temperature distribution over a 1D rod is driven by an optimal control input toward a desired distribution. The governing equation, a one-dimensional convection-diffusion equation over a thin rod with homogeneous boundary conditions, is discretized using a second-order finite-difference scheme for the diffusion term and forward Euler's method for the advection term. First, the analytical optimal solution is obtained by applying Lagrange's theorem. This analytical solution is then compared with the solutions of a numerical optimal control algorithm for different control weights. Finally, the numerical algorithm is run again with an inequality constraint on the control input. The results are presented as temperature distribution and control input vectors along the rod's length. The analytical and numerical solutions show similar trends, with nearly equal values of the objective functional. Based on these values, a control weight of 0.1 produces the optimal control with the lowest cost function value. However, increasing the control weight yields more accurate results from the numerical solver in fewer iterations. [Full-text Report]
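As a minimal sketch of the discretization described above (with hypothetical grid sizes and coefficients, not the report's actual code), one explicit time step of the convection-diffusion equation can be written as:

```python
import numpy as np

def step(T, c, nu, dx, dt):
    """Advance T_t + c*T_x = nu*T_xx one forward-Euler step on a 1D rod
    with homogeneous Dirichlet boundaries (hypothetical parameters)."""
    Tn = T.copy()
    diffusion = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2  # 2nd-order central
    advection = (T[1:-1] - T[:-2]) / dx                   # 1st-order upwind, c > 0
    Tn[1:-1] = T[1:-1] + dt * (nu * diffusion - c * advection)
    Tn[0] = Tn[-1] = 0.0                                  # homogeneous boundary values
    return Tn
```

Being fully explicit, this scheme is only stable for small steps, roughly dt ≤ min(dx²/(2ν), dx/c).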

 

     Computational Tool: The optimization code is developed and the post-processing is done in Python.
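The box-constrained version of the problem can likewise be sketched with a projected-gradient iteration on a small quadratic tracking objective; the matrix, target, and bounds below are hypothetical stand-ins for the discretized system, not the report's data:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 5))   # hypothetical discrete control-to-state map
T_d = rng.standard_normal(8)      # desired temperature distribution (synthetic)
beta = 0.1                        # control weight, as in the report
lo, hi = -1.0, 1.0                # inequality (box) constraint on the control

# Minimize 0.5*||A u - T_d||^2 + 0.5*beta*||u||^2  subject to  lo <= u <= hi
u = np.zeros(5)
L = np.linalg.norm(A, 2) ** 2 + beta     # Lipschitz constant of the gradient
for _ in range(5000):
    grad = A.T @ (A @ u - T_d) + beta * u
    u = np.clip(u - grad / L, lo, hi)    # gradient step, then project onto the box
```

At convergence the projected-gradient step leaves `u` unchanged, which is the first-order optimality condition for the box-constrained problem.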


           Figure: Comparison between the analytical temperature distribution and the numerical temperature distribution for different control values.


  • Project Title: Implementation of Machine Learning Algorithms on a Single-Layer Quasi-Geostrophic Ocean Circulation Model. (Submitted on December 12, 2018) 

       Course: CS 5783: Machine Learning (Oklahoma State University).

       Instructor: Dr. Christopher Crick.

       Summary: 

In this project, an ocean circulation model governed by the barotropic vorticity equation (BVE) is solved using a physics-based solver combined with machine learning algorithms, to test whether the solution can be predicted by a data-driven method. Traditionally, most atmospheric flow simulations have been performed with physics-based turbulence modeling solvers: a mathematical model is first formulated from the governing equations defining the physics to be resolved, and those equations are then solved using numerical schemes and assumptions. The main limitations of physics-based modeling are the limited utilization of existing high-fidelity numerical or experimental data, as well as the overall computational cost of achieving a very accurate solution. To address these issues, another approach to modeling physical phenomena, known as data-driven modeling, has very recently been introduced to the computational fluid dynamics community. The goal of this approach is to use an existing data set and a suitable data-driven algorithm to predict the flow solutions, both to reduce the computational cost (a working data-driven model is much faster than a high-performance physics-based solver) and to avoid other issues associated with traditional physics-based solvers, such as numerical instabilities arising from the numerical schemes. However, the application of data-driven, i.e., machine learning, techniques to physical models in fluid dynamics is very new, and whether they can predict the flow solutions remains an open question. In this project, an ocean model that in general has a quasi-stationary solution is tested with a combined physics-based and machine-learning solver, which makes it a particularly challenging test case.


Computational Tool: The training data are generated using a Fortran code for 2D QG flow, and the ML algorithms are implemented using Python's Keras/TensorFlow libraries. Post-processing is done with Python and Tecplot.


            Figure: Comparison between the true solution and the stream function solutions predicted by different linear regression models for the second time step. Training is done with the initial time step's vorticity and stream function fields.


  • Project Title: 2-D Incompressible Navier-Stokes Equation Solver for a Channel Flow. (Submitted on May 13, 2018) 

       Course: MAE 6263: Computational Fluid Dynamics (Oklahoma State University).

       Instructor: Dr. Balaji Jayaraman.

       Part I: Developing Unsteady Two-dimensional Heat Diffusion Equation Solver. (Submitted on April 02, 2018) [Full-text Report]

       Part II: Developing 2D Nonlinear Burgers Equation Solver. (Submitted on April 17, 2018) [Full-text Report]

       Part III Summary: 

In Projects I and II, solvers are developed for the unsteady heat diffusion equation and for the Burgers equation, a model equation for the Navier-Stokes equations. In this project, the pressure-velocity coupling is taken into consideration by adding the pressure Poisson equation to the solution of the incompressible Navier-Stokes equations to model a 2D pressure-driven channel flow. For incompressible flows, density is not a function of pressure and hence remains constant, while the energy equation becomes redundant for isothermal flows. Since the momentum equations determine the velocity field, the pressure becomes coupled to the continuity equation; as a result, mass conservation must be satisfied for the pressure field to be calculated correctly. That means, at each time step, it must be ensured that mass is conserved, i.e., that the velocity field is divergence-free, while the pressure field is computed from the pressure Poisson equation. This is the most challenging part of modeling incompressible flows, and it is observed and analyzed in this course project. [Full-text Report]

 

     Computational Tool: The codes are developed in Fortran and the post-processing is done using Tecplot.


                         Figure: Isocontour plots of pressure and velocity fields for the compact pressure Poisson solver.


Figure: Centerline plots for the compact pressure Poisson solver.


Figure: Plots of energy and mass in the system for the compact pressure Poisson solver.


  • Project Title: Developing Contactless Digital Tachometer to Measure Motor Shaft Rotation. (Submitted on April 05, 2014)

       Course: ME 362: Instrumentation and Measurement Sessional (Bangladesh University of Engineering & Technology).

       Instructor: Mr. Kazi Arafat Rahman and Mr. Aminul Islam Khan.

       Software: The microcontroller programming is done using WinAVR and AVR Studio, and the design is done in SolidWorks.
