Search results for: Ghulam Jilani
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27

27 Convective Heat Transfer Enhancement in an Enclosure with Fin Utilizing Nano Fluids

Authors: S. H. Anilkumar, Ghulam Jilani

Abstract:

The objective of the present work is to conduct investigations leading to a more complete explanation of single-phase natural convective heat transfer in an enclosure with a fin utilizing nanofluids. The nanofluid used, composed of aluminium oxide nanoparticles suspended in ethylene glycol, is considered at various volume fractions. The study is carried out numerically for a range of Rayleigh numbers, fin heights and aspect ratios. The flow and temperature distributions are taken to be two-dimensional. Regions with the same velocity and temperature distributions are identified as sections of symmetry, and one half of such a rectangular region is chosen as the computational domain, taking into account the symmetry about the fin. The transport equations are modeled by a stream function-vorticity formulation and are solved numerically by finite-difference schemes. Comparisons with previously published work are made for special cases. Results are presented in the form of streamline, vector and isotherm plots, as well as the variation of the local Nusselt number along the fin under different conditions.
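As a hedged illustration of the numerical approach, the sketch below shows a Jacobi iteration for the stream-function Poisson equation that appears in a stream function-vorticity formulation; the grid size, boundary condition and vorticity field are illustrative assumptions, not the configuration studied in the paper.

```python
import numpy as np

# Minimal Jacobi solver for the stream-function Poisson equation
#   d2(psi)/dx2 + d2(psi)/dy2 = -omega
# on a uniform grid with psi = 0 on the boundary (illustrative setup only).
def solve_stream_function(omega, h, n_iter=5000, tol=1e-6):
    psi = np.zeros_like(omega)
    for _ in range(n_iter):
        psi_new = psi.copy()
        psi_new[1:-1, 1:-1] = 0.25 * (
            psi[2:, 1:-1] + psi[:-2, 1:-1] +
            psi[1:-1, 2:] + psi[1:-1, :-2] +
            h * h * omega[1:-1, 1:-1]
        )
        if np.max(np.abs(psi_new - psi)) < tol:
            return psi_new
        psi = psi_new
    return psi

# Example: 41 x 41 grid with a uniform vorticity patch (purely illustrative)
h = 1.0 / 40
omega = np.ones((41, 41))
psi = solve_stream_function(omega, h)
```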

Keywords: Fin height, nanofluid, natural convection, Rayleigh number.

26 Levenberg-Marquardt Algorithm for Karachi Stock Exchange Share Rates Forecasting

Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil

Abstract:

Financial forecasting is an example of a signal processing problem. A number of ways to train/learn the network are available. We have used the Levenberg-Marquardt algorithm with error back-propagation for weight adjustment. Pre-processing of the data has rescaled much of the large-scale variation to a smaller scale, reducing the variation in the training data.
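As a hedged sketch of the weight-update step named above, the following shows one damped Gauss-Newton (Levenberg-Marquardt) step for a least-squares error; the residuals, Jacobian and damping factor are illustrative stand-ins, not the share-rate network of the paper.

```python
import numpy as np

# One Levenberg-Marquardt weight update for a least-squares error
# E = 0.5 * sum(e_i^2). J is the Jacobian of the residuals e with respect to
# the weights w; lam is the damping factor (all values illustrative).
def lm_step(w, e, J, lam):
    H_approx = J.T @ J + lam * np.eye(len(w))   # damped Gauss-Newton Hessian
    delta = np.linalg.solve(H_approx, J.T @ e)  # solve (J'J + lam*I) d = J'e
    return w - delta

# Toy usage with random residuals/Jacobian (not share-rate data)
rng = np.random.default_rng(0)
w = rng.normal(size=5)
e = rng.normal(size=20)
J = rng.normal(size=(20, 5))
w_new = lm_step(w, e, J, lam=0.01)
```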

Keywords: Gradient descent method, Jacobian matrix, Levenberg-Marquardt algorithm, quadratic error surfaces.

25 Reductive Control in the Management of Redundant Actuation

Authors: Mkhinini Maher, Knani Jilani

Abstract:

In this work we present the performance of an omnidirectional mobile robot, evaluated through its management of actuation redundancy, leading to the predictive control that is implemented.

The distribution of the wrench over the robot's actuators, through the Moore-Penrose pseudo-inverse, corresponds to a "geometric" distribution of efforts. We show that the load on the vehicle wheels is not equally distributed, depending on the wheel configuration and the robot's movement.

Thus, the sliding threshold is not the same for the three wheels of the vehicle. We suggest exploiting the actuation redundancy to reduce the risk of wheel slip and thereby improve the accuracy of displacement. This kind of approach has previously been studied for legged robots.
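As a hedged illustration of the pseudo-inverse distribution mentioned above, the sketch below splits a desired planar wrench over three omnidirectional wheels with the Moore-Penrose pseudo-inverse; the wheel Jacobian and wrench values are assumptions for illustration and do not describe the robot in the paper.

```python
import numpy as np

# Distributing a desired planar wrench (Fx, Fy, Mz) over three omnidirectional
# wheels with the Moore-Penrose pseudo-inverse (illustrative geometry only).
J = np.array([
    [1.0, 0.0, 1.0],     # contribution of wheel 1 to (Fx, Fy, Mz)
    [-0.5, 0.866, 1.0],  # wheel 2
    [-0.5, -0.866, 1.0], # wheel 3
]).T                     # shape: (3 wrench components, 3 wheels)

wrench = np.array([2.0, 0.5, 0.1])          # desired (Fx, Fy, Mz)
wheel_efforts = np.linalg.pinv(J) @ wrench  # minimum-norm "geometric" split
print(wheel_efforts)
```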

Keywords: Mobile robot, actuation, redundancy, omnidirectional, Moore-Penrose pseudo-inverse, reductive control.

24 A Comparison of First and Second Order Training Algorithms for Artificial Neural Networks

Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil

Abstract:

Minimization methods for training feed-forward networks with back-propagation are compared. Feed-forward network training is a special case of functional minimization, where no explicit model of the data is assumed. Therefore, due to the high dimensionality of the data, linearization of the training problem through the use of orthogonal basis functions is not desirable. The focus is functional minimization on any basis. A number of methods based on local gradient and Hessian matrices are discussed. Modifications of many first- and second-order training methods are considered. Using share-rate data, it is shown experimentally that conjugate gradient and quasi-Newton methods outperform gradient descent methods. The Levenberg-Marquardt algorithm is of special interest in financial forecasting.
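To make the first-order versus second-order contrast concrete, here is a hedged sketch comparing plain gradient descent with linear conjugate gradient (which enforces the conjugacy condition through an exact line search) on a small quadratic error surface; the matrix and vector are illustrative, not a trained network.

```python
import numpy as np

# Quadratic error surface E(w) = 0.5 w'Aw - b'w (illustrative A, b).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])
grad = lambda w: A @ w - b

def gradient_descent(w, lr=0.1, steps=50):
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def conjugate_gradient(w, steps=2):
    r = b - A @ w            # negative gradient
    d = r.copy()
    for _ in range(steps):   # converges in dim(w) steps on a quadratic
        alpha = (r @ r) / (d @ A @ d)     # exact line search
        w = w + alpha * d
        r_new = r - alpha * (A @ d)
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves conjugacy update
        d = r_new + beta * d
        r = r_new
    return w

w0 = np.zeros(2)
print(gradient_descent(w0), conjugate_gradient(w0), np.linalg.solve(A, b))
```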

Keywords: Backpropagation algorithm, conjugacy condition, line search, matrix perturbation

23 Fuzzy Metric Approach for Fuzzy Time Series Forecasting based on Frequency Density Based Partitioning

Authors: Tahseen Ahmed Jilani, Syed Muhammad Aqil Burney, C. Ardil

Abstract:

In the last 15 years, a number of methods have been proposed for forecasting based on fuzzy time series. Most of the fuzzy time series methods have been demonstrated on forecasting enrollments at the University of Alabama. However, the forecasting accuracy rates of the existing methods are not good enough. In this paper, we compare our proposed new method of fuzzy time series forecasting with existing methods. Our method is based on frequency-density-based partitioning of the historical enrollment data. The proposed method is a kth-order, time-variant method. The proposed method achieves a better forecasting accuracy rate for enrollments than the existing methods.
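As a hedged sketch of the partitioning idea named above, the following subdivides the most heavily populated intervals of the universe of discourse further; the initial interval count, split factors and data are illustrative assumptions rather than the enrollment series or the exact rules used in the paper.

```python
import numpy as np

# Frequency-density-based partitioning: intervals holding more historical
# observations are split into more sub-intervals (illustrative scheme).
def frequency_density_partition(data, n_initial=7, n_splits=(4, 3, 2)):
    lo, hi = data.min(), data.max()
    edges = np.linspace(lo, hi, n_initial + 1)
    counts, _ = np.histogram(data, bins=edges)
    order = np.argsort(counts)[::-1]        # most populated intervals first
    refined = []
    for rank, idx in enumerate(order):
        parts = n_splits[rank] if rank < len(n_splits) else 1
        refined.extend(np.linspace(edges[idx], edges[idx + 1], parts + 1)[:-1])
    refined.append(hi)
    return np.sort(np.array(refined))

rng = np.random.default_rng(0)
data = rng.normal(loc=15000, scale=800, size=30)   # synthetic "enrollments"
print(frequency_density_partition(data))
```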

Keywords: Fuzzy logical groups, fuzzified enrollments, fuzzy sets, fuzzy time series.

22 Fuzzy Time Series Forecasting Using Percentage Change as the Universe of Discourse

Authors: Meredith Stevenson, John E. Porter

Abstract:

Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory methods. These methods have used either enrollment numbers or differences of enrollments as the universe of discourse. We propose using the year-to-year percentage change as the universe of discourse, and in this communication the approach of Jilani, Burney, and Ardil is modified accordingly. We use enrollment figures for the University of Alabama to illustrate the proposed method, which results in better forecasting accuracy than existing models.

Keywords: Fuzzy forecasting, fuzzy time series, fuzzified enrollments, time-invariant model

21 Release Behavior of Biodegradable and Nonbiodegradable Polymeric Microparticles Loaded with Nimesulide

Authors: Shujaat A. Khan, Ghulam Murtaza

Abstract:

This presentation narrates a comparative analysis of the dissolution data of nimesulide microparticles prepared with ethylcellulose, hydroxypropyl methylcellulose (HPMC), chitosan and poly(D,L-lactide-co-glycolide) as polymers. The analysis of the release profiles showed that the variations noted in the release behavior of nimesulide from the various microparticulate formulations are due to the nature of the polymer used. In addition, maximum retardation of the nimesulide release was observed with HPMC (floating particles). Thus, HPMC microparticles may be preferably employed for sustained release dosage form development.

Keywords: Nimesulide, microparticles, ethylcellulose, hydroxypropyl methylcellulose, chitosan, poly(D,L-lactide-co-glycolide).

20 A New Quantile Based Fuzzy Time Series Forecasting Model

Authors: Tahseen A. Jilani, Aqil S. Burney, C. Ardil

Abstract:

Time series models have been used to make predictions about academic enrollments, weather, road accident casualties, stock prices, etc. Based on the concepts of quantile regression models, we have developed a simple time-variant quantile-based fuzzy time series forecasting method. The proposed method bases the forecast on a prediction of the future trend of the data. In place of the actual quantiles of the data at each point, we convert the statistical concept into a fuzzy one by using fuzzy quantiles built from an ensemble of fuzzy membership functions. We give a fuzzy metric to use the trend forecast and calculate the future value. The proposed model is applied to TAIFEX forecasting. It is shown that the proposed method works best among the compared models with respect to model complexity and forecasting accuracy.

Keywords: Quantile regression, fuzzy time series, fuzzy logical relationship groups, heuristic trend prediction.

19 Approximate Bounded Knowledge Extraction Using Type-I Fuzzy Logic

Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil

Abstract:

Using a neural network, we try to model the unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of the crisp neural network produce different values of the weight factors that are directly affected by the change of different parameters. We propose the idea that, for each neuron in the network, we can obtain quasi-fuzzy weight sets (QFWS) using repeated simulation of the crisp neural network. Such fuzzy weight functions may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data are subject to noise and uncertainty, QFWS may be helpful in simplifying such complex problems. Secondly, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
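A hedged sketch of the idea: repeat the crisp training several times and summarize each weight by a triangular quasi-fuzzy number built from the minimum, median and maximum of the runs. The stand-in `train_once` function below is an assumption used for illustration, not an actual back-propagation routine.

```python
import numpy as np

# Build quasi-fuzzy weight sets from repeated runs of a crisp training routine:
# each weight is summarized as a triangle (min, mode ~ median, max).
def quasi_fuzzy_weights(train_once, n_runs=30):
    runs = np.array([train_once() for _ in range(n_runs)])  # (runs, n_weights)
    return np.stack([runs.min(axis=0),
                     np.median(runs, axis=0),
                     runs.max(axis=0)], axis=1)             # (n_weights, 3)

rng = np.random.default_rng(1)
true_w = np.array([0.8, -1.2, 0.3])
train_once = lambda: true_w + rng.normal(scale=0.05, size=true_w.shape)  # stand-in "training"
print(quasi_fuzzy_weights(train_once))  # one (min, mode, max) triple per weight
```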

Keywords: Crisp neural networks, fuzzy systems, extraction of logical rules, quasi-fuzzy numbers.

18 Acute Coronary Syndrome Prediction Using Data Mining Techniques - An Application

Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil

Abstract:

In this paper we use data mining techniques to investigate factors that contribute significantly to enhancing the risk of acute coronary syndrome. We assume that the dependent variable is the diagnosis, with dichotomous values showing the presence or absence of disease. We have applied binary logistic regression to the factors affecting the dependent variable. The data set has been taken from two different cardiac hospitals in Karachi, Pakistan. We have a total of sixteen variables, of which one is the dependent variable and the other fifteen are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques such as principal component analysis are applied. Based on the results of data reduction, we have considered only fourteen of the sixteen factors.
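A hedged sketch of this kind of pipeline, assuming scikit-learn is available: principal component analysis for data reduction followed by binary logistic regression. The synthetic feature matrix and labels are illustrative, not the hospital data used in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: 15 candidate risk factors and a dichotomous diagnosis.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

model = make_pipeline(StandardScaler(),
                      PCA(n_components=14),    # data reduction: keep 14 components
                      LogisticRegression())    # binary logistic regression
model.fit(X, y)
print(model.score(X, y))                       # in-sample accuracy
```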

Keywords: Acute coronary syndrome (ACS), binary logistic regression analysis, myocardial ischemia (MI), principal component analysis, unstable angina (UA).

17 Salbutamol Sulphate-Ethylcellulose Tabletted Microcapsules: Pharmacokinetic Study using Convolution Approach

Authors: Ghulam Murtaza, Kalsoom Farzana

Abstract:

The aim of this article is to narrate the utility of a novel simulation approach, i.e. the convolution method, to predict the blood concentration of a drug from the dissolution data of salbutamol sulphate microparticulate formulations with different release patterns (1:1, 1:2 and 1:3, drug:polymer). Dissolution apparatus II (USP 2007) with 900 ml of double-distilled water stirred at 50 rpm was employed for the dissolution analysis. From the dissolution data, the blood drug concentration was predicted, and in turn the predicted blood drug concentration data were used to calculate the pharmacokinetic parameters, i.e. Cmax, Tmax and AUC. Convolution is a useful biowaiver technique; however, its utility would be improved by applying it under conditions where biorelevant dissolution media are used.
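A hedged sketch of the convolution idea: the predicted plasma profile is the discrete convolution of the in-vitro input rate with a unit impulse response. The one-compartment impulse response, rate constants, dose and volume below are assumptions for illustration, not the salbutamol parameters from the study.

```python
import numpy as np

# Predict a plasma profile by convolving the release (input) rate derived from
# dissolution data with a one-compartment unit impulse response (illustrative).
t = np.arange(0, 24, 0.5)                       # hours
frac_dissolved = 1 - np.exp(-0.3 * t)           # cumulative fraction released
input_rate = np.gradient(frac_dissolved, t)     # release rate (1/h)

ke, V = 0.2, 50.0                               # elimination rate (1/h), volume (L)
dose = 8.0                                      # mg
impulse_response = np.exp(-ke * t) / V          # concentration per unit dose

dt = t[1] - t[0]
conc = np.convolve(dose * input_rate, impulse_response)[:len(t)] * dt

cmax, tmax = conc.max(), t[np.argmax(conc)]
auc = np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(t))   # trapezoidal AUC
print(cmax, tmax, auc)
```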

Keywords: Convolution, Dissolution, Pharmacokinetics, Salbutamol sulphate

16 Multivariate High Order Fuzzy Time Series Forecasting for Car Road Accidents

Authors: Tahseen A. Jilani, S. M. Aqil Burney, C. Ardil

Abstract:

In this paper, we have presented a new multivariate fuzzy time series forecasting method. This method assumes m factors, with one main factor of interest. The history of the past three years is used for making new forecasts. This new method is applied to forecasting the total number of car accidents in Belgium using four secondary factors. We also compare our proposed method with existing fuzzy time series forecasting methods. Experimentally, it is shown that our proposed method performs better than the existing fuzzy time series forecasting methods. In practice, actuaries are interested in analysing the patterns of casualties in road accidents. Thus, using fuzzy time series, actuaries can define fuzzy premiums and fuzzy underwriting for car and life insurance. The National Institute of Statistics, Belgium, provides a risk classification by region for each road. Using this risk classification, we can predict premium rates and the underwriting of insurance policy holders.

Keywords: Average forecasting error rate (AFER), fuzziness of fuzzy sets, fuzzy If-Then rules, multivariate fuzzy time series.

15 Stochastic Scheduling to Minimize Expected Lateness in Multiple Identical Machines

Authors: Ghulam Zakria, Zailin Guan, Yasser Riaz Awan, Wan Lizhi

Abstract:

There are many real-world problems in which parameters like the arrival time of new jobs, failure of resources, and completion time of jobs change continuously. This paper tackles the problem of scheduling jobs with random due dates on multiple identical machines in a stochastic environment. First, the longest processing time (LPT) rule is used to assign jobs to the different machine centers; after that, the particular sequence of jobs to be processed on each machine is found using simple stochastic techniques. The performance parameter under consideration is the maximum lateness with respect to the stochastic due dates, which are independent and exponentially distributed. Finally, a representative problem is solved using the techniques presented in the paper.
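A hedged sketch of the two ingredients described above: LPT assignment of jobs to identical machines, then a crude Monte Carlo estimate of the expected maximum lateness under independent exponentially distributed due dates. The processing times, due-date rate and simulation count are illustrative assumptions, not the instance solved in the paper.

```python
import heapq
import random

# Assign jobs to identical machines by the LPT rule (longest job first, always
# to the currently least-loaded machine).
def lpt_assign(proc_times, n_machines):
    loads = [(0.0, m, []) for m in range(n_machines)]   # (load, id, job list)
    heapq.heapify(loads)
    for job, p in sorted(enumerate(proc_times), key=lambda jp: -jp[1]):
        load, m, jobs = heapq.heappop(loads)
        jobs.append(job)
        heapq.heappush(loads, (load + p, m, jobs))
    return loads

# Monte Carlo estimate of expected maximum lateness with exponential due dates.
def expected_max_lateness(proc_times, n_machines, due_rate=0.05, n_sim=2000):
    machines = lpt_assign(proc_times, n_machines)
    total = 0.0
    for _ in range(n_sim):
        worst = float("-inf")
        for load, _, jobs in machines:
            t = 0.0
            for job in jobs:                 # process in assigned order
                t += proc_times[job]
                due = random.expovariate(due_rate)
                worst = max(worst, t - due)  # lateness of this job
        total += worst
    return total / n_sim

random.seed(0)
proc = [random.uniform(1, 10) for _ in range(12)]
print(expected_max_lateness(proc, n_machines=3))
```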

Keywords: Quantity Production Flow Shop, LPT Scheduling, Stochastic Scheduling, Maximum Lateness, Random Due Dates

14 Solubility of CO2 in Aqueous Solutions of 2-Amino-2-Methyl-1-Propanol at High Pressure

Authors: Azmi Mohd Shariff, Ghulam Murshid, K.K. Lau, Mohammad Azmi Bustam, Faizan Ahamd

Abstract:

Carbon dioxide is one of the major greenhouse gases. It is removed from different streams using amine absorption processes. Sterically hindered amines are suggested as good CO2 absorbers. The solubility of carbon dioxide (CO2) was measured in aqueous solutions of 2-amino-2-methyl-1-propanol (AMP) at temperatures of 30 °C, 40 °C and 60 °C. The effect of pressure and temperature was studied over various concentrations of AMP. It was found that pressure has a positive effect on CO2 solubility, whereas solubility decreased with increasing temperature. The absorption performance of AMP increased with increasing pressure. The solubility in aqueous AMP was compared with monoethanolamine (MEA), and the absorption capacity of aqueous solutions of AMP was found to be better.

Keywords: Global warming, Carbon dioxide, Amine, Solubility

13 Software Architecture Recovery

Authors: Ghulam Rasool, Nadim Asif

Abstract:

The advent of modern technology casts its repercussions on successful legacy systems, making them obsolete with time. These systems leave large organizations with major problems in terms of new business requirements, response time, financial depreciation and maintenance. The major difficulty is due to constant system evolution and the incomplete, inconsistent and obsolete documentation that a legacy system tends to have. The myriad dimensions of these systems can only be explored through reverse engineering, which in this context is the best method to extract useful artifacts, and by exploiting these artifacts to reengineer existing legacy systems to meet the new requirements of organizations. A case study is conducted on six different types of software systems, with source code in different programming languages, using the architectural recovery framework.

Keywords: Reverse Engineering, Architecture recovery, Architecture artifacts, Reengineering.

12 Comparative Optical Analysis of Offset Reflector Antenna in GRASP

Authors: Ghulam Ahmad

Abstract:

In this paper, a comparison of reflector antenna analysis techniques based on the wave and ray nature of optics is presented for an offset reflector antenna using the GRASP (General Reflector Antenna Analysis Software Package) software. The results obtained using PO (Physical Optics), PTD (Physical Theory of Diffraction), and GTD (Geometrical Theory of Diffraction) are compared. The validity of the PO and GTD techniques in regions around the antenna, the caustic behavior of GTD in the main beam, and the deviation of GTD for the near-in sidelobes of the radiation pattern are discussed. The comparison of the far-out sidelobes predicted by PO, PO + PTD and GTD is described. The effect of direct radiation from the feed, which drives the feed selection for the system, is addressed.

Keywords: Geometrical optics & geometrical theory of diffraction, offset reflector antenna, physical optics & physical theory of diffraction, PO & GO comparison.

11 Corruption in India: Causes and Remedial Measures

Authors: Ghulam Nabi Naz

Abstract:

After independence, the popular belief that Gandhians would not indulge in corruption suffered a setback; the post-independence setup paved the way for heavy corruption. The menace, which could have been dealt with by strong legal provisions, has become a way of life in Indian society. Corruption is recognized as the single biggest problem facing the country today. It undermines democracy and the rule of law, violates human rights, distorts markets and corrodes the moral fibre of the people. The paper discusses the causes and possible remedial measures of corruption and the response of people in Indian society. It emphasizes the factors that provide fertile ground for the growth of corruption, such as the degradation of moral values; the absence of a strong anti-corruption law, its effective enforcement, accountability and consistency; and a defective system of fighting elections. The paper also highlights the reforms necessary for fighting corruption in India.

Keywords: Embezzlement, colonial, licence Raj, good governance, misappropriation, Sangh ideologue, Anna movement.

10 Recovering Artifacts from Legacy Systems Using Pattern Matching

Authors: Ghulam Rasool, Ilka Philippow

Abstract:

Modernizing legacy applications is a key issue facing IT managers today because there is enormous pressure on organizations to change the way they run their business to meet new requirements. The importance of software maintenance and reengineering is forever increasing. Understanding the architecture of existing legacy applications is the most critical issue for maintenance and reengineering. Artifact recovery can be facilitated with different recovery approaches, methods and tools. The existing methods provide static and dynamic sets of techniques for extracting architectural information, but they are not suitable for all users in different domains. This paper presents a simple and lightweight pattern extraction technique to extract different artifacts from legacy systems using regular expression pattern specifications with multiple language support. We used our custom-built tool DRT to recover artifacts from existing systems at different levels of abstraction. In order to evaluate our approach, a case study is conducted.
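A hedged sketch of regular-expression-based artifact extraction of the kind described above. The pattern specifications and the sample C-style source are illustrative assumptions; they are not the pattern catalogue or the DRT tool from the paper.

```python
import re

# Illustrative pattern specifications for recovering simple artifacts
# (includes, function definitions, class names) from C-style source text.
PATTERNS = {
    "include":  re.compile(r'#include\s+[<"]([\w./]+)[>"]'),
    "function": re.compile(r'^\s*\w[\w\s\*]+?\s(\w+)\s*\([^;]*\)\s*\{', re.M),
    "class":    re.compile(r'\bclass\s+(\w+)'),
}

def recover_artifacts(source_text):
    return {name: pattern.findall(source_text) for name, pattern in PATTERNS.items()}

sample = """
#include <stdio.h>
int add(int a, int b) {
    return a + b;
}
"""
print(recover_artifacts(sample))  # {'include': ['stdio.h'], 'function': ['add'], 'class': []}
```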

Keywords: Artifacts recovery, pattern matching, reverse engineering, program understanding, regular expressions, source code analysis.

9 Effect of Wheat Flour Extraction Rates on Flour Composition, Farinographic Characteristics and Sensory Perception of Sourdough Naans

Authors: Ghulam Mueen-ud-Din, Salim-ur-Rehman, Faqir M. Anjum, Haq Nawaz, Mian A. Murtaza

Abstract:

The effect of wheat flour extraction rates on flour composition, farinographic characteristics and the quality of sourdough naans was investigated. The results indicated that by increasing the extraction rate, the amounts of protein, fiber, fat and ash increased, whereas the moisture content decreased. Farinographic characteristics such as water absorption and dough development time increased with an increase in flour extraction rate, but dough stability and tolerance index were reduced. Titratable acidity for both the sourdough and the sourdough naans also increased along with the flour extraction rate. The study showed that the overall quality of the sourdough naans was affected by both the flour extraction rate and the starter culture used. Sensory analysis of the sourdough naans revealed that the desirable extraction rate for sourdough naan was 76%.

Keywords: Extraction rates, Farinographic characteristics, Flour composition, Sourdough naans, Wheat flour.

8 EZW Coding System with Artificial Neural Networks

Authors: Saudagar Abdul Khader Jilani, Syed Abdul Sattar

Abstract:

Image compression plays a vital role in today's communication. The limitation in allocated bandwidth leads to slower communication. To improve the rate of transmission within the limited bandwidth, the image data must be compressed before transmission. Basically, there are two types of compression: 1) lossy compression and 2) lossless compression. Although lossy compression gives more compression than lossless compression, the accuracy of retrieval is lower for lossy compression than for lossless compression. The JPEG and JPEG2000 image compression systems follow Huffman coding for image compression. The JPEG2000 coding system uses the wavelet transform, which decomposes the image into different levels, where the coefficients in each sub-band are uncorrelated with the coefficients of other sub-bands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance than existing wavelet transforms. For further improvement of compression applications, other coding methods have recently been suggested; an ANN-based approach is one such method. Artificial neural networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. A performance analysis over different images is presented for an EZW coding system with the error back-propagation algorithm. The implementation and analysis show approximately 30% more accuracy in the retrieved image compared to the existing EZW coding system.

Keywords: Accuracy, Compression, EZW, JPEG2000, Performance.

7 Metoprolol Tartrate-Ethylcellulose Tabletted Microparticles: Development of a Validated In-vitro In-vivo Correlation

Authors: Fatima Rasool, Mahmood Ahmad, Ghulam Murtaza, Haji M. S. Khan, Shujaat A. Khan, Sonia Khiljee, Muhammad Qamar-Uz-Zaman

Abstract:

This study describes the methodology for the development of a validated in-vitro in-vivo correlation (IVIVC) for metoprolol tartrate modified release dosage forms with distinctive release rate characteristics. The modified release dosage forms were formulated by microencapsulation of metoprolol tartrate in different amounts of ethylcellulose by the non-solvent addition technique. In-vitro and in-vivo studies were then conducted to develop and validate a level A IVIVC for metoprolol tartrate. The values of the regression coefficient (R2) for the IVIVC of the T2 and T3 formulations were not significantly (p<0.05) different from 1, while the R2 values for the IVIVC of T1 and Mepressor® were significantly (p<0.05) different from 1. The internal prediction errors of the IVIVC, calculated from the observed area under the curve (AUC) and the predicted AUC, were less than 10%. This study successfully presents a valid level A IVIVC for metoprolol tartrate modified release dosage forms.

Keywords: Metoprolol tartrate, Dissolution, Bioavailability, Validated in-vitro in-vivo correlation.

6 Learning and Practicing Assessment in a Pre-service Teacher Education Program: Comparative Perspective of UK and Pakistani Universities

Authors: Malik Ghulam Behlol, Alison Fox, Faiza Masood, Sabiha Arshad

Abstract:

This paper explores the barriers to the application of learning-supportive assessment at the teaching practicum while investigating the roles of university teachers (UTs), cooperative teachers (CTs), prospective teachers (PTs) and heads of the practicum schools (HPSs) in the selected universities of Pakistan and the UK. It is a qualitative case study, and data were collected through lesson observation of UTs in the pre-service teacher education setting and of PTs in practicum schools. Interviews with UTs and HPSs, and focus group discussions with PTs, were also conducted. The study concludes that, compared with their UK counterparts, PTs in Pakistan face significant barriers in applying learning-supportive assessment in school practicum settings because of large class sizes, a lack of institutionalised collaboration between universities and schools, poor modelling of the lesson, ineffective feedback practices, lower-order thinking assignments, and limited opportunities to use technology in school settings.

Keywords: Learning supportive assessment, pre-service teacher education, theory-practice gap, teacher education.

5 Solid Waste Management Challenges and Possible Solution in Kabul City

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Most developing nations face energy production and supply problems. This is also the case in Afghanistan, whose generating capacity does not meet its energy demand. This is due in part to the high security risk caused by war, which deters foreign investment, and to insufficient internal revenue. To address the issue above, this paper suggests an alternative and affordable way to deal with the energy problem: converting solid waste to energy. As a result, this approach tackles the municipal solid waste issue (a potential cause of several diseases) and contributes to improving the quality of life, the local economy, and so on. While addressing the solid waste problem in general, this paper samples specifically one municipality, District-12, one of the 22 districts of Kabul city. Using geographic information system (GIS) technology, District-12 is divided into nine different zones whose municipal solid waste is respectively collected, processed, converted into electricity and distributed to the closest area. It is important to mention that GIS has been used to estimate the amount of electricity to be distributed and to optimally position the production plant.

Keywords: Energy problem, estimation of electricity, GIS zones, solid waste management system.

4 Validation and Application of a New Optimized RP-HPLC-Fluorescent Detection Method for Norfloxacin

Authors: Mahmood Ahmad, Ghulam Murtaza, Sonia Khiljee, Muhammad Asadullah Madni

Abstract:

A new reverse-phase high performance liquid chromatography (RP-HPLC) method with a fluorescent detector (FLD) was developed and optimized for norfloxacin determination in human plasma. The mobile phase specifications, the extraction method, and the excitation and emission wavelengths were varied for optimization. The HPLC system contained a reverse-phase C18 (5 μm, 4.6 mm × 150 mm) column with the FLD operated at an excitation wavelength of 330 nm and an emission wavelength of 440 nm. The optimized mobile phase consisted of 14% acetonitrile in buffer solution. The aqueous phase was prepared by mixing 2 g of citric acid, 2 g of sodium acetate and 1 ml of triethylamine in 1 L of Milli-Q water, and the mobile phase was run at a flow rate of 1.2 mL/min. The standard curve was linear over the range tested (0.156–20 μg/mL) and the coefficient of determination was 0.9978. Aceclofenac sodium was used as the internal standard. A detection limit of 0.078 μg/mL was achieved. The run time was set at 10 minutes, while the retention time of norfloxacin was 0.99 min, which shows the rapidity of this method of analysis. The present assay showed good accuracy, precision and sensitivity for norfloxacin determination in human plasma with a new internal standard and can be applied to the pharmacokinetic evaluation of norfloxacin tablets after oral administration in humans.

Keywords: Norfloxacin, aceclofenac sodium, method optimization, RP-HPLC method, fluorescent detection, calibration curve.

3 Study of Natural Patterns on Digital Image Correlation Using Simulation Method

Authors: Gang Li, Ghulam Mubashar Hassan, Arcady Dyskin, Cara MacNish

Abstract:

Digital image correlation (DIC) is a contactless full-field displacement and strain reconstruction technique commonly used in the field of experimental mechanics. Compared with physical measuring devices, such as strain gauges, which provide only very restricted coverage and are expensive to deploy widely, the DIC technique provides results with full-field coverage and relatively high accuracy using an inexpensive and simple experimental setup. It is very important to study the effect of natural patterns on the DIC technique because the preparation of artificial patterns is a time-consuming and laborious process. The objective of this research is to study the effect of using images having natural patterns on the performance of DIC. A systematic simulation method is used to build the simulated deformed images used in DIC. A parameter (subset size) used in DIC can have an effect on the processing and accuracy of DIC and can even cause DIC to fail. Regarding the image parameters (correlation coefficient), high similarity between two subsets can lead the DIC process to fail and make the result less accurate. Images of good and bad quality for DIC methods are presented and, more importantly, this provides a systematic way to evaluate the quality of an image with natural patterns before the measurement devices are installed.
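A hedged sketch of the core matching step behind DIC: for a reference subset, search for the integer displacement in the deformed image that maximizes the zero-normalized cross-correlation. The subset size, search range and synthetic speckle-like images are illustrative assumptions, not the simulation method or natural-pattern images of the paper.

```python
import numpy as np

# Zero-normalized cross-correlation between two equally sized subsets.
def zncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

# Integer-pixel subset matching: scan a search window in the deformed image.
def match_subset(ref_img, def_img, center, subset=21, search=10):
    half = subset // 2
    cy, cx = center
    ref = ref_img[cy - half:cy + half + 1, cx - half:cx + half + 1]
    best, best_uv = -np.inf, (0, 0)
    for v in range(-search, search + 1):
        for u in range(-search, search + 1):
            y, x = cy + v, cx + u
            cand = def_img[y - half:y + half + 1, x - half:x + half + 1]
            score = zncc(ref, cand)
            if score > best:
                best, best_uv = score, (u, v)
    return best_uv, best   # integer displacement and its correlation value

rng = np.random.default_rng(2)
ref_img = rng.random((200, 200))                        # stand-in speckle pattern
def_img = np.roll(ref_img, shift=(3, 5), axis=(0, 1))   # pure translation
print(match_subset(ref_img, def_img, center=(100, 100)))
```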

Keywords: Digital image correlation (DIC), Deformation simulation, Natural pattern, Subset size.

2 Thermo-Physical Properties and Solubility of CO2 in Piperazine Activated Aqueous Solutions of β-Alanine

Authors: Ghulam Murshid

Abstract:

Carbon dioxide is one of the major greenhouse gas (GHG) contributors. It is an obligation of industry to reduce carbon dioxide emissions to acceptable limits. Tremendous research has been reported in the past, and the quest for a suitable and economical solution to this problem still needs to be explored in order to develop the most plausible absorbent for carbon dioxide removal. Amino acids can be potential alternative solvents for carbon dioxide capture from gaseous streams. This is due to their ability to resist oxidative degradation, their low volatility and their ionic structure. In addition, the introduction of a promoter such as piperazine to the amino acid helps to further enhance the solubility. In this work, the effect of piperazine on the thermo-physical properties and CO2 solubility of aqueous β-alanine solutions was studied for various concentrations. The measured physicochemical property data were correlated as a function of temperature using the least-squares method, and the correlation parameters are reported together with their respective standard deviations. The effect of the activator piperazine on the CO2 loading performance of the selected amino acid under high-pressure conditions (1 bar to 10 bar) over the temperature range of 30 °C to 60 °C was also studied. The solubility of CO2 decreases with increasing temperature and increases with increasing pressure. A quadratic representation of solubility using Response Surface Methodology (RSM) shows that the most important parameter for optimizing solubility is the system pressure. The addition of the promoter increases the solubility effect of the solvent.
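A hedged sketch of the least-squares temperature correlation mentioned above, fitting a second-order polynomial to an illustrative property and reporting the standard deviation of the fit; the density values are invented for the example, not measured data from the study.

```python
import numpy as np

# Correlate a measured property with temperature by least squares and report
# the standard deviation of the fit (illustrative values only).
T = np.array([303.15, 313.15, 323.15, 333.15])       # K
rho = np.array([1032.4, 1027.9, 1022.8, 1017.1])     # kg/m^3 (illustrative)

coeffs = np.polyfit(T, rho, deg=2)                   # second-order least-squares fit
rho_fit = np.polyval(coeffs, T)
sd = np.sqrt(np.mean((rho - rho_fit) ** 2))          # standard deviation of the fit

print(coeffs, sd)
```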

Keywords: Amino acids, CO2, Global warming, Solubility.

1 Three Computational Mathematics Techniques: Comparative Determination of Area under Curve

Authors: Khalid Pervaiz Akhter, Mahmood Ahmad, Ghulam Murtaza, Ishrat Shafi, Zafar Javed

Abstract:

The objective of this manuscript is to find the area under the plasma concentration-time curve (AUC) for multiple doses of salbutamol sulphate sustained release tablets (Ventolin® oral tablets SR 8 mg, GSK, Pakistan) in a group of 18 healthy adults by using computational mathematics techniques. Following the administration of 4 doses of Ventolin® tablets 12-hourly to 24 healthy human subjects and bioanalysis of the obtained plasma samples, the plasma drug concentration-time profile was constructed. AUC, an important pharmacokinetic parameter, was measured using the integrated equation for multiple oral dose regimens. The approximated AUC was also calculated using computational mathematics techniques, namely the repeated rectangular, repeated trapezium and repeated Simpson's rules, and compared with the exact value of AUC calculated from the integrated equation for multiple oral dose regimens, in order to find the computational mathematics method that gives AUC values closest to the exact ones. The exact values of AUC for the four consecutive doses of Ventolin® oral tablets were 150.5819473, 157.8131756, 164.4178231 and 162.78 ng.h/ml, while the closest approximated AUC values were 149.245962, 157.336171, 164.2585768 and 162.289224 ng.h/ml, respectively, as found by the repeated rectangular rule. The errors in the approximated values of AUC were negligible. It is concluded that all the computational tools approximated the values of AUC accurately, but the repeated rectangular rule gives slightly better approximated values of AUC compared to the repeated trapezium and repeated Simpson's rules.
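A hedged sketch of the three composite rules on a uniformly sampled concentration-time profile; the one-compartment curve and sampling grid are illustrative assumptions, not the Ventolin® plasma data reported above.

```python
import numpy as np

# Composite quadrature rules for approximating AUC on a uniform time grid.
def rectangular(t, c):
    h = t[1] - t[0]
    return h * np.sum(c[:-1])            # left-endpoint (repeated rectangular) rule

def trapezium(t, c):
    h = t[1] - t[0]
    return h * (0.5 * c[0] + np.sum(c[1:-1]) + 0.5 * c[-1])   # repeated trapezium rule

def simpson(t, c):
    h = t[1] - t[0]                      # requires an even number of intervals
    return (h / 3) * (c[0] + c[-1] + 4 * np.sum(c[1:-1:2]) + 2 * np.sum(c[2:-2:2]))

t = np.linspace(0, 12, 13)                          # hours (12 intervals)
c = 10 * (np.exp(-0.1 * t) - np.exp(-1.0 * t))      # illustrative one-compartment curve
print(rectangular(t, c), trapezium(t, c), simpson(t, c))
```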

Keywords: Salbutamol sulphate, Area under curve (AUC), repeated rectangular rule, repeated trapezium rule, repeated Simpson's rule.
