Search results for: Naveed Sarfraz Khattak
28 Despeckling of Synthetic Aperture Radar Images Using Inner Product Spaces in Undecimated Wavelet Domain
Authors: Syed Musharaf Ali, Muhammad Younus Javed, Naveed Sarfraz Khattak, Athar Mohsin, Umar Farooq
Abstract:
This paper introduces an effective speckle-reduction method for synthetic aperture radar (SAR) images using inner product spaces in the undecimated wavelet domain. There are two major areas of the projection-onto-span algorithm where improvement can be made. The first is the use of the undecimated wavelet transformation instead of the discrete wavelet transformation. The second is the addition of a smoothing step, namely a directional smoothing filter. The proposed method does not require any noise estimation or thresholding technique. Moreover, it gives good results on both single-polarimetric and fully polarimetric SAR images.
Keywords: Directional smoothing, inner product, length of vector, undecimated wavelet transformation.
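The directional smoothing filter added as the extra step above is only named in the abstract. The sketch below (Python/NumPy) implements one common variant of such a filter, averaging neighbours along four directions and keeping the directional mean closest to the centre pixel; the window size and the chosen directions are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def directional_smoothing(img, half_window=2):
    """One common directional smoothing variant (an illustrative assumption,
    not necessarily the authors' exact filter): for every pixel, average the
    neighbours along four directions and keep the mean closest to the pixel."""
    padded = np.pad(img.astype(float), half_window, mode="reflect")
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    offsets = range(-half_window, half_window + 1)
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]  # horizontal, vertical, diagonals
    for i in range(h):
        for j in range(w):
            ci, cj = i + half_window, j + half_window
            centre = padded[ci, cj]
            means = [np.mean([padded[ci + k * di, cj + k * dj] for k in offsets])
                     for di, dj in directions]
            out[i, j] = min(means, key=lambda m: abs(m - centre))
    return out
```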
27 A Robust Method for Encrypted Data Hiding Technique Based on Neighborhood Pixels Information
Authors: Ali Shariq Imran, M. Younus Javed, Naveed Sarfraz Khattak
Abstract:
This paper presents a novel method for data hiding based on neighborhood pixels information, used to calculate the number of bits available for substitution, together with a modified Least Significant Bits technique for data embedding. The modified solution is independent of the nature of the data to be hidden and gives correct results with unnoticeable image degradation. To find the number of bits that can be used for data hiding, the technique uses the green component of the image, since the human eye is less sensitive to it, making it very hard to perceive whether the image carries hidden data. The application further encrypts the data using a custom-designed algorithm before embedding the bits into the image for additional security. The overall process consists of three main modules, namely embedding, encryption, and extraction.
Keywords: Data hiding, image processing, information security, steganography.
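As a rough illustration of the modified-LSB embedding described in the abstract above, the sketch below substitutes a fixed number of least significant bits in the green channel. The per-pixel bit capacity derived from neighbourhood information and the custom encryption step are not reproduced here; `n_lsb` is an assumed constant.

```python
import numpy as np

def embed_bits_green_lsb(image, bits, n_lsb=2):
    """Minimal LSB-substitution sketch in the green channel of an RGB image.
    `n_lsb` is a fixed assumption; the paper instead derives the usable bit
    count per pixel from neighbourhood pixel information."""
    img = image.copy()
    green = img[:, :, 1].astype(np.uint8).ravel()
    if len(bits) > green.size * n_lsb:
        raise ValueError("payload exceeds capacity")
    for i in range(0, len(bits), n_lsb):
        chunk = bits[i:i + n_lsb]
        value = int("".join(map(str, chunk)), 2)
        mask = 0xFF ^ ((1 << len(chunk)) - 1)      # clear the lowest bits
        pixel = i // n_lsb
        green[pixel] = (green[pixel] & mask) | value
    img[:, :, 1] = green.reshape(img.shape[:2])
    return img
```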
26 Wavelet-Based Despeckling of Synthetic Aperture Radar Images Using Adaptive and Mean Filters
Authors: Syed Musharaf Ali, Muhammad Younus Javed, Naveed Sarfraz Khattak
Abstract:
In this paper, we introduce a new wavelet-based algorithm for speckle reduction of synthetic aperture radar images, which uses a combination of the undecimated wavelet transformation, the Wiener filter (an adaptive filter), and the mean filter. Furthermore, instead of using existing thresholding techniques such as SURE shrinkage, Bayesian shrinkage, universal thresholding, normal thresholding, visu thresholding, and soft and hard thresholding, we use brute-force thresholding, which iteratively runs the whole algorithm for each candidate threshold value, saves each result, and finally selects the threshold that gives the best result. It is therefore slow compared to existing thresholding techniques, but it gives the best results obtainable under the given speckle-reduction algorithm.
Keywords: Brute force thresholding, directional smoothing, direction dependent mask, undecimated wavelet transformation.
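A minimal sketch of the brute-force thresholding loop described above: the whole despeckling pipeline is rerun for every candidate threshold and the best result is kept. The `despeckle` routine and the `quality_score` criterion are placeholders, since the abstract does not state how the best result is judged.

```python
import numpy as np

def brute_force_threshold(image, despeckle, quality_score, candidates):
    """Run the full despeckling pipeline for every candidate threshold and
    keep the result with the best score (placeholder quality criterion)."""
    results = []
    for t in candidates:
        denoised = despeckle(image, t)          # e.g. UWT + Wiener + mean filter
        results.append((quality_score(denoised), t, denoised))
    best_score, best_t, best_img = max(results, key=lambda r: r[0])
    return best_t, best_img

# Example usage with placeholder callables and a grid of 21 candidate values:
# best_t, best_img = brute_force_threshold(img, my_despeckle, my_score,
#                                          np.linspace(0.0, 1.0, 21))
```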
25 Image Magnification Using Adaptive Interpolation by Pixel Level Data-Dependent Geometrical Shapes
Authors: Muhammad Sajjad, Naveed Khattak, Noman Jafri
Abstract:
The world has entered the 21st century. The technology of computer graphics and digital cameras is prevalent, and high-resolution displays and printers are available. High-resolution images are therefore needed in order to produce high-quality display images and high-quality prints. However, since high-resolution images are not usually provided, there is a need to magnify the original images. One common difficulty of previous magnification techniques is preserving details, i.e. edges, while at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution is still an open issue. In this paper, an image magnification method using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed that takes into account information about the edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies interpolation regions in the form of geometrical shapes, and then assigns suitable values to the undefined pixels inside each interpolation region while preserving the sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest neighbour (NN), bilinear (BL), and bicubic (BC) interpolation, while the quantitative results are competitive and consistent with NN, BL, BC, and the others.
Keywords: Adaptive, digital image processing, image magnification, interpolation, geometrical shapes, qualitative & quantitative analysis.
24 Mercury Removal Techniques for Industrial Waste Water
Authors: Amir Shafeeq, Ayyaz Muhammad, Waqas Sarfraz, Ali Toqeer, Shazib Rashid, M. K. Rafiq
Abstract:
The current work reviews the harmful effects of mercury released from a number of sources, most of them industrial waste water streams. Different mercury removal techniques are discussed and briefly compared. Experimental work was conducted for the two most widely used mercury removal methods, and their efficiencies were compared.
Keywords: Mercury, waste water, adsorption.
23 Maximum Entropy Based Image Segmentation of Human Skin Lesion
Authors: Sheema Shuja Khattak, Gule Saman, Imran Khan, Abdus Salam
Abstract:
Image segmentation plays an important role in medical imaging applications; accurate methods are therefore needed for the successful segmentation of medical images for the diagnosis and detection of various diseases. In this paper, we use maximum entropy to achieve image segmentation. Maximum entropy has been calculated using the Shannon, Rényi, and Tsallis entropies. The novelty of this work lies in the detection of skin lesions caused by the bite of the sand fly, which causes the disease cutaneous leishmaniasis.
Keywords: Maximum entropy, Shannon entropy, Rényi entropy, Tsallis entropy.
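A minimal sketch of maximum-entropy thresholding with the Shannon entropy (a Kapur-style two-class criterion); the Rényi and Tsallis variants replace the entropy expression. This is a generic illustration, not necessarily the authors' exact segmentation pipeline.

```python
import numpy as np

def max_entropy_threshold(gray):
    """Pick the threshold that maximises the sum of the Shannon entropies of
    the background and foreground histogram classes (Kapur-style criterion)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# lesion_mask = gray_image >= max_entropy_threshold(gray_image)
```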
22 Selection of Appropriate Classification Technique for Lithological Mapping of Gali Jagir Area, Pakistan
Authors: Khunsa Fatima, Umar K. Khattak, Allah Bakhsh Kausar
Abstract:
Satellite image interpretation and analysis assist geologists by providing valuable information about the geology and minerals of an area to be surveyed. A test site in Fatejang, district Attock, has been studied using Landsat ETM+ and ASTER satellite images for lithological mapping. Five supervised image classification techniques, namely maximum likelihood, parallelepiped, minimum distance to mean, Mahalanobis distance, and spectral angle mapper, have been applied to both satellite images to find the most suitable classification technique for lithological mapping in the study area. The results of these five classification techniques were compared with the geological map produced by the Geological Survey of Pakistan. The maximum likelihood classification applied to the ASTER image has the highest correlation with the geological map, at 0.66. Field observations and XRD spectra of field samples also verified the results. A lithological map was then prepared based on the maximum likelihood classification of the ASTER image.
Keywords: ASTER, Landsat-ETM+, Satellite, Image classification.
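A minimal sketch of supervised maximum likelihood classification of a multispectral band stack, using scikit-learn's quadratic discriminant analysis (one Gaussian per class, the statistical model behind the classic remote-sensing maximum likelihood classifier). The band stack shape, training pixels, and lithology labels are placeholders, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

def ml_classify(band_stack, train_spectra, train_labels):
    """band_stack: (rows, cols, bands) reflectance array; train_spectra:
    (n, bands) spectra with known lithology labels. QDA fits one Gaussian per
    class, i.e. the per-class model used by maximum likelihood classification."""
    rows, cols, bands = band_stack.shape
    clf = QuadraticDiscriminantAnalysis()
    clf.fit(train_spectra, train_labels)
    labels = clf.predict(band_stack.reshape(-1, bands))
    return labels.reshape(rows, cols)
```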
21 Effect of Different BER Performance Comparison of MAP and ML Detection
Authors: Naveed Ur Rehman, Rehan Jamil, Irfan Jamil
Abstract:
In this paper, we consider a coded transmission over a frequency-selective channel. We study analytically the convergence of the turbo detector using a maximum a posteriori (MAP) equalizer and a MAP decoder. We demonstrate that the densities of the maximum likelihood (ML) values exchanged during the iterations are e-symmetric and output-symmetric. Under the Gaussian approximation, this property allows a one-dimensional analysis of the turbo detector. By deriving the analytical expressions of the ML distributions under the Gaussian approximation, we confirm that the bit error rate (BER) performance of the turbo detector converges to the BER performance of the coded additive white Gaussian noise (AWGN) channel at high signal-to-noise ratio (SNR), for any frequency-selective channel.
Keywords: MAP, ML, SNR, Decoder, BER, Coded transmission.
20 Numerical Studies of Galerkin-type Time-discretizations Applied to Transient Convection-diffusion-reaction Equations
Authors: Naveed Ahmed, Gunar Matthies
Abstract:
We deal with the numerical solution of time-dependent convection-diffusion-reaction equations. We combine the local projection stabilization method for the space discretization with two different time discretization schemes: the continuous Galerkin-Petrov (cGP) method and the discontinuous Galerkin (dG) method with polynomials of degree k. We establish optimal error estimates and present numerical results which show that the cGP(k) and dG(k) methods are accurate of order k+1 in the whole time interval. Moreover, the cGP(k) method is superconvergent of order 2k and the dG(k) method of order 2k+1 at the discrete time points. Furthermore, the dependence of the results on the choice of the stabilization parameter is discussed and compared.
Keywords: Convection-diffusion-reaction equations, stabilized finite elements, discontinuous Galerkin, continuous Galerkin-Petrov.
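Read as error estimates, the claimed orders can be summarized as follows (a hedged restatement: the abstract does not specify the norms, so a generic norm and the omission of the spatial error contribution are assumptions here), with $\tau$ the time step, $u$ the exact solution, $u_{\tau h}$ the fully discrete solution, and $t_n$ the discrete time points:

\[
\max_{t\in[0,T]}\|u(t)-u_{\tau h}(t)\| \le C\,\tau^{k+1},
\qquad
\max_{n}\|u(t_n)-u_{\tau h}(t_n)\| \le
\begin{cases}
C\,\tau^{2k}, & \text{cGP}(k),\\
C\,\tau^{2k+1}, & \text{dG}(k).
\end{cases}
\]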
19 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data
Authors: Salam Khalifa, Naveed Ahmed
Abstract:
We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames independently of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that, despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to exploit temporal coherence to faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
Keywords: 3D video, 3D animation, RGB-D video, Temporally Coherent 3D Animation.
18 Solid Particle Erosion of Heat Treated TNB-V4 at Ambient and Elevated Temperatures
Authors: Muhammad Naveed, Richard Stechow, Sebastian Bolz, Katharina Hobusch, Sabine Weiß
Abstract:
Solid particle erosion has been identified as a critical wear phenomenon which takes place during the operation of aeroengines in dusty environments. The present work discusses the erosion behavior of the Ti-44.5Al-6.25Nb-0.8Mo-0.1B alloy (TNB-V4), which finds its application in low pressure gas turbines and can also be used for high pressure compressors. Prior to the erosion tests, the alloy was heat treated to improve its mechanical properties. Afterwards, specimens were eroded at impact angles of 30° and 90° at room and elevated temperatures (100 °C-400 °C). Volume loss and erosion behavior are studied through gravimetric analysis, whereas erosion mechanisms are characterized through scanning electron microscopy. The results indicate a clear difference in the erosion mechanism for different impact angles. The influence of the test temperature on the erosion behavior of the alloy is also discussed in the present contribution.
Keywords: Solid particle erosion, gamma TiAl, TNB-V4, high temperature erosion.
17 Aerodynamic Stall Control of a Generic Airfoil using Synthetic Jet Actuator
Authors: Basharat Ali Haider, Naveed Durrani, Nadeem Aizud, Salimuddin Zahir
Abstract:
The aerodynamic stall control of a baseline 13-percent-thick NASA GA(W)-2 airfoil using a synthetic jet actuator (SJA) is presented in this paper. Unsteady Reynolds-averaged Navier-Stokes equations are solved on a hybrid grid using commercial software to simulate the effects of a synthetic jet actuator located at 13% of the chord from the leading edge at a Reynolds number Re = 2.1×10^6 and incidence angles from 16 to 22 degrees. The experimental data for the pressure distribution at Re = 3×10^6 and aerodynamic coefficients at Re = 2.1×10^6 (angle of attack varied from -16 to 22 degrees) without the SJA are compared with the computational fluid dynamics (CFD) simulation as a baseline validation. Good agreement of the CFD simulations is obtained for the aerodynamic coefficients and pressure distribution. A working SJA has been integrated with the baseline airfoil, and the initial focus is on aerodynamic stall control at angles of attack from 16 to 22 degrees. The results show a noticeable improvement in aerodynamic performance, with an increase in lift and a decrease in drag in these post-stall regimes.
Keywords: Active flow control, aerodynamic stall, airfoil performance, synthetic jet actuator.
16 Enhanced Interference Management Technique for Multi-Cell Multi-Antenna System
Authors: Simon E. Uguru, Victor E. Idigo, Obinna S. Oguejiofor, Naveed Nawaz
Abstract:
As the deployment of Fifth Generation (5G) mobile communication networks takes shape all over the world, achieving spectral efficiency and energy efficiency and dealing with interference are among the greatest challenges encountered so far. The aim of this study is to mitigate inter-cell interference (ICI) in a multi-cell multi-antenna system while maximizing the spectral efficiency of the system. A system model was devised as a miniature representation of a multi-cell multi-antenna system. Based on this system model, a convex optimization problem was formulated to maximize the spectral efficiency of the system while mitigating the ICI. This optimization problem was solved using CVX, a modeling system for constructing and solving disciplined convex programs. The solutions to the optimization problem are sub-optimal coordinated beamformers. These coordinated beamformers direct each data stream to its served user equipment (UE) in each cell without interference during downlink transmission, thereby maximizing the system-wide spectral efficiency.
Keywords: Coordinated beamforming, convex optimization, inter-cell interference, multi-antenna, multi-cell, spectral efficiency.
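The abstract does not state the exact optimization problem solved in CVX. A closely related, classically convex formulation is the power-minimization coordinated beamforming problem under per-user SINR targets, whose second-order-cone form is sketched below in CVXPY (a Python analogue of the CVX modeling system). The channel matrix, SINR target, and noise power are illustrative placeholders, and the single joint antenna array is a simplification of the multi-cell setup.

```python
import numpy as np
import cvxpy as cp

def coordinated_beamforming(H, gamma, sigma2):
    """Power-minimization coordinated beamforming (an illustrative convex
    variant, not necessarily the paper's exact objective). H[k] is the channel
    to UE k; each UE must meet the SINR target gamma while the total transmit
    power is minimized."""
    K, Nt = H.shape
    W = [cp.Variable(Nt, complex=True) for _ in range(K)]
    constraints = []
    for k in range(K):
        desired = H[k].conj() @ W[k]
        interference = cp.hstack([H[k].conj() @ W[j] for j in range(K) if j != k]
                                 + [np.sqrt(sigma2)])
        # Standard second-order-cone reformulation of the SINR constraint
        constraints += [cp.imag(desired) == 0,
                        cp.real(desired) >= np.sqrt(gamma) * cp.norm(interference, 2)]
    prob = cp.Problem(cp.Minimize(sum(cp.sum_squares(w) for w in W)), constraints)
    prob.solve()
    return [w.value for w in W]

# Example: 3 UEs, 4 antennas, SINR target of 10 (linear scale)
# H = (np.random.randn(3, 4) + 1j * np.random.randn(3, 4)) / np.sqrt(2)
# beams = coordinated_beamforming(H, gamma=10.0, sigma2=1.0)
```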
15 Neighbour Cell List Reduction in Multi-Tier Heterogeneous Networks
Authors: Mohanad Alhabo, Naveed Nawaz
Abstract:
An ongoing call or data session must be maintained to ensure a good quality of service, which can be accomplished by performing a handover procedure while the user is on the move. However, the dense deployment of small cells in 5G networks is a challenging issue due to the extensive number of handovers. In this paper, a neighbour cell list method is proposed to reduce the number of target small cells and hence minimize the number of handovers. The neighbour cell list is built by omitting cells that could cause an unnecessary handover and/or handover failure because of the short time of stay of a user in these cells. A multi-attribute decision making technique, simple additive weighting, is then applied to the optimized neighbour cell list. The performance of the proposed method is analysed and compared with that of existing methods. Results show that our method reduces the candidate small cell list, unnecessary handovers, handover failures, and short-time-of-stay cells compared to the competing method.
Keywords: Handover, HetNets, MADM, small cells.
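A minimal sketch of the simple additive weighting (SAW) ranking applied to the reduced neighbour cell list: each candidate cell is scored as a weighted sum of min-max normalized attributes. The attribute set, weights, and benefit/cost labels are placeholders; the paper's own criteria and weights are not reproduced here.

```python
import numpy as np

def saw_rank(decision_matrix, weights, benefit):
    """Simple additive weighting over a (cells x attributes) matrix.
    benefit[j] is True when larger values of attribute j are better."""
    m = decision_matrix.astype(float)
    norm = np.zeros_like(m)
    for j in range(m.shape[1]):
        col, span = m[:, j], m[:, j].max() - m[:, j].min()
        if span == 0:
            continue                      # constant attribute carries no information
        norm[:, j] = (col - col.min()) / span if benefit[j] else (col.max() - col) / span
    scores = norm @ np.asarray(weights, dtype=float)
    return np.argsort(scores)[::-1], scores   # best candidate first

# Placeholder attributes per candidate cell: [RSRP (dBm), time of stay (s), load]
# order, scores = saw_rank(np.array([[-80.0, 4.0, 0.3],
#                                    [-75.0, 1.0, 0.7]]),
#                          weights=[0.4, 0.4, 0.2],
#                          benefit=[True, True, False])
```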
14 Need of National Space Legislation for Space Faring Nations
Authors: Muhammad Naveed, Yang Caixia
Abstract:
The need for national space legislation is pivotal, particularly in light of the fact that in recent years space activities have grown immensely both in volume and diversity. Countries are progressively developing capabilities in space exploration and scientific discovery, marketing their capabilities to manufacture satellites, providing launch services from their facilities, and looking to privatize and commercialize their space resources. Today, nations are also seeking to comprehend the technological and financial potential of the private sector, and are considering sharing their financial burdens with it and limiting their exposure to risk, but they lag behind in the corresponding legal framework. In the light of these emerging developments, it is felt that national space legislation should be enacted with the goal of building and implementing a vibrant and transparent legal framework at the national level, to hasten investment and to ensure growth in this capital-intensive, high-yield strategic sector. This study looks at (I) the international legal framework that governs space activities; (II) the motivation behind making national space laws; and (III) the need for national space legislation. The paper concludes with some recommendations regarding the conceivable future direction of national space legislation, in particular space-empowered sub-areas for countries.
Keywords: International conventions, national legislation, space faring nation, space law.
13 Integrated Grey Rational Analysis-Standard Deviation Method for Handover in Heterogeneous Networks
Authors: Mohanad Alhabo, Naveed Nawaz, Mahmoud Al-Faris
Abstract:
The dense deployment of small cells is a promising solution to enhance the coverage and capacity of heterogeneous networks (HetNets). However, unplanned deployment can bring new challenges to the network, ranging from interference to unnecessary handovers and handover failures, causing a degradation in the quality of service (QoS) delivered to the end user. In this paper, we propose an integrated Grey Rational Analysis-Standard Deviation based handover method (GRA-SD) for HetNets. The proposed method integrates the standard deviation (SD) technique to acquire the weights of the handover metrics and the GRA method to select the best handover base station. The performance of the GRA-SD method is evaluated and compared with traditional Multiple Attribute Decision Making (MADM) methods, including the Simple Additive Weighting (SAW) and VIKOR methods. Results reveal that the proposed method outperforms the other methods in terms of minimizing the number of frequent unnecessary handovers and handover failures, in addition to improving energy efficiency.
Keywords: Energy efficiency, handover, HetNets, MADM, small cells.
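A minimal sketch of the two ingredients named above: standard-deviation weighting of the handover metrics followed by grey relational analysis to rank the candidate base stations. The min-max normalization and the distinguishing coefficient of 0.5 are common defaults assumed here, not values taken from the paper.

```python
import numpy as np

def gra_sd_rank(decision_matrix, benefit, zeta=0.5):
    """decision_matrix: (candidate base stations x metrics). benefit[j] is True
    when larger values of metric j are better. Returns candidates ranked best-first."""
    m = decision_matrix.astype(float)
    # 1) Min-max normalize every metric to [0, 1]
    norm = np.zeros_like(m)
    for j in range(m.shape[1]):
        col, span = m[:, j], m[:, j].max() - m[:, j].min()
        if span == 0:
            norm[:, j] = 1.0
            continue
        norm[:, j] = (col - col.min()) / span if benefit[j] else (col.max() - col) / span
    # 2) Standard-deviation weights: metrics that vary more get more weight
    sd = norm.std(axis=0)
    weights = sd / sd.sum()
    # 3) Grey relational coefficients against the ideal alternative (all ones)
    delta = np.abs(1.0 - norm)
    dmax = delta.max() if delta.max() > 0 else 1.0
    coeff = (delta.min() + zeta * dmax) / (delta + zeta * dmax)
    # 4) Grey relational grade = weighted sum of the coefficients
    grades = coeff @ weights
    return np.argsort(grades)[::-1], grades
```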
12 Design Transformation to Reduce Cost in Irrigation Using Value Engineering
Authors: F. S. Al-Anzi, M. Sarfraz, A. Elmi, A. R. Khan
Abstract:
Researchers are responding to the environmental challenges of Kuwait in localized, innovative, effective, and economic ways. One of the most significant natural challenges is the lack of water and desertification. In this research, the project team focuses on redesigning a prototype, using the Value Engineering methodology, that would provide functionality similar to the well-known Waterboxx kits while reducing capital and operational costs and simplifying manufacturing and usability for regular farmers. The design employs used tires and recycled plastic sheets as raw materials. Hence, this approach helps not only in fighting desertification but also in getting rid of the ever-growing tire dumps in Kuwait and in avoiding the hazards of tire fires, yielding a safer and friendlier environment. Several alternatives for implementing the prototype have been considered, and the best alternative in terms of value has been selected after a thorough Function Analysis System Technique (FAST) exercise. A prototype has been fabricated and tested in a controlled, simulated lab environment, followed by field testing in a real environment. Water and soil analyses were conducted on the experiment site to compare the composition of the soil before and after the experiment and to ensure that the prototype being tested is environmentally safe. Experimentation shows that the design was at least as effective as, and may exceed, the original design, with significant cost savings: an estimated total cost reduction of 43.84% over the original design using the VE approach. This figure does not consider the intangible benefits of waste recycling, which may further increase the total savings of the alternative VE design. This case study shows that the Value Engineering methodology can be an important tool in innovating new designs for reducing costs.
Keywords: Desertification, functional analysis, scrap tires, value engineering, waste recycling, water irrigation rationing.
11 Handover for Dense Small Cells Heterogeneous Networks: A Power-Efficient Game Theoretical Approach
Authors: Mohanad Alhabo, Li Zhang, Naveed Nawaz
Abstract:
In this paper, a non-cooperative game is formulated in which all players compete to transmit at higher power; every base station represents a player in the game. The game is solved by obtaining the Nash equilibrium (NE), at which the game converges to optimality. The proposed method, named the Power Efficient Handover Game Theoretic (PEHO-GT) approach, aims to control handover in dense small cell networks. Players optimize their payoff by adjusting the transmission power to improve performance in terms of throughput, handover, power consumption, and load balancing. To select the desired transmission power for a player, the payoff function considers the gain of increasing the transmission power. Cell selection then takes place by deploying the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A game theoretical method is implemented for heterogeneous networks to validate the improvement obtained. Results reveal that the proposed method improves throughput while reducing power consumption and minimizing frequent handovers.
Keywords: Energy efficiency, game theory, handover, HetNets, small cells.
10 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks
Authors: Naveed Ghani, Samreen Javed
Abstract:
In today’s heterogeneous network environment, there is a growing demand for mutually distrusting clients to jointly run a secure network and prevent malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk is always present, no matter what solutions are implemented or which security methodology or standards are adopted. Security is a first and crucial concern in the field of computer science, and the main aim of computer security is the gathering of information over a secure network. There is little doubt about what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. Through our survey, we learn about the importance of whitelisting, anti-malware programs, security patches, log files, honeypots, and other measures used in banks for financial data protection. However, there is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to the target. In this paper, the writer proposes implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.
Keywords: Network worms, malware infection propagating malicious code, virus, security, VPN.
9 Representing Shared Join Points with State Charts: A High Level Design Approach
Authors: Muhammad Naveed, Muhammad Khalid Abdullah, Khalid Rashid, Hafiz Farooq Ahmad
Abstract:
Aspect Oriented Programming promises many advantages at the programming level by separating crosscutting concerns into units called aspects. Join points are a distinguishing feature of Aspect Oriented Programming, as they define the points where core requirements and crosscutting concerns are (inter)connected. Currently, there is a problem with the composition of multiple aspects at the same join point, which introduces issues such as ordering and controlling these superimposed aspects. Dynamic strategies are required to handle these issues as early as possible. The state chart is an effective modeling tool for capturing dynamic behavior at the high-level design stage. This paper provides a methodology for formulating strategies for multiple-aspect composition at a high level, which helps to better implement these strategies at the coding level. It also highlights the need to design shared join points at a high level by providing solutions to these issues using state chart diagrams in UML 2.0. A high-level design representation of shared join points also helps to implement the designed strategy in a systematic way.
Keywords: Aspect Oriented Software Development, Shared Join Points.
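As a language-level analogy for the shared join point problem discussed above (the paper targets AspectJ-style aspects modeled with UML state charts, not Python), the sketch below superimposes two cross-cutting concerns on one function using decorators; the ordering question it illustrates is the one the proposed high-level strategies are meant to resolve.

```python
import functools

def logging_aspect(func):
    """Cross-cutting concern 1: trace every call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

def auth_aspect(func):
    """Cross-cutting concern 2: reject unauthorised callers."""
    @functools.wraps(func)
    def wrapper(user, *args, **kwargs):
        if not user.get("authorised"):
            raise PermissionError("not authorised")
        return func(user, *args, **kwargs)
    return wrapper

# transfer_funds is a shared join point: both aspects are superimposed on it,
# and their order matters. With auth_aspect outermost, unauthorised calls are
# rejected before anything is logged; swapping the decorators logs the attempt first.
@auth_aspect
@logging_aspect
def transfer_funds(user, amount):
    return f"{user['name']} transferred {amount}"
```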
8 Hybrid Weighted Multiple Attribute Decision Making Handover Method for Heterogeneous Networks
Authors: Mohanad Alhabo, Li Zhang, Naveed Nawaz
Abstract:
Small cell deployment in 5G networks is a promising technology to enhance capacity and coverage. However, unplanned deployment may cause high interference levels and a high number of unnecessary handovers, which in turn increase the signalling overhead. To guarantee service continuity, minimize unnecessary handovers, and reduce signalling overhead in heterogeneous networks, it is essential to model the handover decision problem properly. In this paper, we model the handover decision problem using a Multiple Attribute Decision Making (MADM) method, specifically the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS), and propose a hybrid TOPSIS method to control handover in heterogeneous networks. The proposed method adopts a hybrid weighting policy that combines entropy and standard deviation weighting. A hybrid weighting control parameter is introduced to balance the impact of the standard deviation and entropy weights on the network selection process and the overall performance. Our proposed method shows better performance, in terms of the number of frequent handovers and the mean user throughput, compared to existing methods.
Keywords: Handover, HetNets, interference, MADM, small cells, TOPSIS, weight.
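A minimal sketch of the hybrid weighting plus TOPSIS ranking described above: entropy and standard-deviation weights are blended by a control parameter `alpha` and fed into a standard TOPSIS closeness ranking. The normalization choices and the default `alpha` are assumptions, not values taken from the paper, and the metrics are assumed nonnegative for the entropy step.

```python
import numpy as np

def hybrid_topsis(decision_matrix, benefit, alpha=0.5):
    """Rank candidate cells with TOPSIS using hybrid entropy / standard-deviation
    weights. decision_matrix: (cells x metrics), nonnegative; benefit[j] is True
    when larger values of metric j are better; alpha is the hybrid control parameter."""
    m = decision_matrix.astype(float)
    r = m / np.linalg.norm(m, axis=0)                    # vector normalization
    p = m / m.sum(axis=0)                                # column-stochastic proportions
    e = -np.sum(p * np.log(p + 1e-12), axis=0) / np.log(m.shape[0])
    w_entropy = (1 - e) / (1 - e).sum()                  # entropy weights
    sd = r.std(axis=0)
    w_sd = sd / sd.sum()                                 # standard-deviation weights
    w = alpha * w_entropy + (1 - alpha) * w_sd           # hybrid weights
    v = r * w
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    closeness = d_minus / (d_plus + d_minus)
    return np.argsort(closeness)[::-1], closeness        # best candidate first
```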
7 New Simultaneous High Performance Liquid Chromatographic Method for Determination of NSAIDs and Opioid Analgesics in Advanced Drug Delivery Systems and Human Plasma
Authors: Asad Ullah Madni, Mahmood Ahmad, Naveed Akhtar, Muhammad Usman
Abstract:
A new and cost-effective RP-HPLC method was developed and validated for the simultaneous analysis of the non-steroidal anti-inflammatory drugs diclofenac sodium (DFS) and flurbiprofen (FLP) and the opioid analgesic tramadol (TMD) in advanced drug delivery systems (liposomes and microcapsules), marketed brands, and human plasma. An isocratic system was employed, with a mobile phase consisting of 10 mM sodium dihydrogen phosphate buffer and acetonitrile in a molar ratio of 67:33 adjusted to pH 3.2. The stationary phase was a Hypersil ODS column (C18, 250×4.6 mm i.d., 5 μm) with a controlled temperature of 30 °C. DFS in liposomes, microcapsules, and marketed drug products was determined in the range of 99.76-99.84%. FLP and TMD in microcapsules and brand formulations were 99.78-99.94% and 99.80-99.82%, respectively. A single-step liquid-liquid extraction procedure using a combination of acetonitrile and trichloroacetic acid (TCA) as the protein precipitating agent was employed. The detection limits (at S/N ratio 3) of quality control solutions and plasma samples were 10, 20, and 20 ng/ml for DFS, FLP, and TMD, respectively. The assay was acceptable over the linear dynamic range, and all other validation parameters were within the limits of the FDA and ICH method validation guidelines. The proposed method is sensitive, accurate, and precise, and could be applied to routine analysis in the pharmaceutical industry as well as to human plasma samples for bioequivalence and pharmacokinetic studies.
Keywords: Diclofenac sodium, Flurbiprofen, Tramadol, HPLC-UV detection, Validation.
6 In Vivo Evaluation of Stable Cream Containing Flavonoids on Hydration and TEWL of Human Skin
Authors: Haji M Shoaib Khan, Naveed Akhtar, Fatima Rasool, Barkat Ali Khan, Tariq Mahmood, Muhammad Shuaib Khan
Abstract:
Antioxidants contribute to endogenous photoprotection and are important for the maintenance of skin health. This study was carried out to compare the skin hydration and transepidermal water loss (TEWL) effects of a stable cosmetic preparation containing flavonoids, following two applications a day over a period of ten weeks. Skin TEWL and hydration were measured at the beginning and throughout the ten-week study period using a Corneometer and a TEWA meter (non-invasive probes). Two formulations were developed for this study design: a control formulation containing no apple juice extract (flavonoids), and an active formulation in which apple juice extract (3%) containing flavonoids was incorporated into a water-in-oil emulsion using Abil EM 90 as the emulsifier. The stable formulations (control and active) were applied to human cheeks (n = 12) for the study period of 10 weeks. Skin hydration and TEWL were measured for each volunteer with the Corneometer and TEWA meter. Using ANOVA and the paired-sample t-test for statistical evaluation, the results of the base and the active formulation were compared. Statistically significant results (p ≤ 0.05) were observed for skin hydration and TEWL when the two creams, control and active, were compared, indicating that the active formulation may have interesting applications as a moisturizing cream for healthy skin.
Keywords: Apple juice extract, TEWL, Corneometer, flavonoids.
5 Pharmaceutical Microencapsulation Technology for Development of Controlled Release Drug Delivery Systems
Authors: Mahmood Ahmad, Asadullah Madni, Muhammad Usman, Abubakar Munir, Naveed Akhtar, Haji M. Shoaib Khan
Abstract:
This article demonstrates the development of a controlled release system for an NSAID drug, diclofenac sodium, employing different ratios of ethyl cellulose. Diclofenac sodium and ethyl cellulose in different proportions were processed by microencapsulation based on a phase separation technique to formulate microcapsules. The prepared microcapsules were then compressed into tablets to obtain controlled release oral formulations. In-vitro evaluation was performed by a dissolution test of each preparation, conducted in 900 ml of phosphate buffer solution of pH 7.2, maintained at 37 ± 0.5 °C and stirred at 50 rpm, with samples collected at predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20 and 24 h). The drug concentration in the collected samples was determined by UV spectrophotometry at 276 nm. The physical characteristics of the diclofenac sodium microcapsules were within the accepted range: they were off-white, free flowing, and spherical in shape. The release of diclofenac sodium from the microcapsules was found to be directly proportional to the proportion of ethylcellulose and the coat thickness. The in-vitro release pattern showed that with ratios of 1:1 and 1:2 (drug:polymer), the percentage of drug released in the first hour was 16.91% and 11.52%, respectively, compared with only 6.87% for the 1:3 ratio. The release mechanism followed the Higuchi model. The tablet formulation (F2) of the present study was found comparable in release profile to the marketed brand Phlogin-SR, and the microcapsules showed an extended release beyond 24 h. Furthermore, a good correlation was found between drug release and the proportion of ethylcellulose in the microcapsules. Microencapsulation based on coacervation was found to be a good technique for controlling the release of diclofenac sodium in controlled release formulations.
Keywords: Diclofenac sodium, Microencapsulation technology, Ethylcellulose, In-vitro release profile.
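Since the abstract reports that release follows the Higuchi model (cumulative release proportional to the square root of time), a minimal sketch of fitting that model to dissolution data is shown below. The time points mirror the sampling schedule above, but the release values are synthetic placeholders, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def higuchi(t, k_h):
    """Higuchi model: cumulative release Q(t) = k_H * sqrt(t)."""
    return k_h * np.sqrt(t)

# Sampling times from the dissolution test (h); release values are placeholders.
t = np.array([0.5, 1, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20, 24], dtype=float)
q_placeholder = 20.0 * np.sqrt(t) + np.random.normal(0.0, 1.0, t.size)

(k_h,), _ = curve_fit(higuchi, t, q_placeholder)
print(f"fitted Higuchi constant k_H = {k_h:.2f} % released per sqrt(h)")
```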
4 Analysis of Rural Roads in Developing Countries Using Principal Component Analysis and Simple Average Technique in the Development of a Road Safety Performance Index
Authors: Muhammad Tufail, Jawad Hussain, Hammad Hussain, Imran Hafeez, Naveed Ahmad
Abstract:
A road safety performance index is a composite index which combines various indicators of road safety into a single number. Developing a road safety performance index using appropriate safety performance indicators is essential to enhance road safety; however, road safety performance indices have not been given as much priority as needed in developing countries. The primary objective of this research is to develop a general Road Safety Performance Index (RSPI) for developing countries based on the road facility as well as the behavior of road users. The secondary objectives include finding the critical inputs to the RSPI and identifying the better method of constructing the index. In this study, the RSPI is developed by selecting four main safety performance indicators: protective systems (seat belt, helmet, etc.), the road (road width, signalized intersections, number of lanes, speed limit), the number of pedestrians, and the number of vehicles. Data on these four safety performance indicators were collected through an observation survey on a 20 km section of the National Highway N-125, Taxila, Pakistan. For the development of this composite index, two methods are used: (a) Principal Component Analysis (PCA) and (b) the Equal Weighting (EW) method. PCA is used for the extraction, weighting, and linear aggregation of the indicators to obtain a single value; an individual index score is calculated for each road section by multiplying the weights by the standardized values of each safety performance indicator. Under the EW method, the simple average technique is used for the weighting and linear aggregation of the indicators. The road sections are ranked according to their RSPI scores using both methods. The two weighting methods are compared, and the PCA method is found to be much more reliable than the simple average technique.
Keywords: Aggregation, index score, indicators, principal component analysis, weighting.
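A minimal sketch of the PCA-based index construction described above: the four indicators are standardized, PCA supplies the weights (here the first component's loadings, an assumption, since the abstract does not state how many components are retained), and the weighted sum gives each road section's score; the equal-weighting score is the plain mean of the standardized indicators.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def rspi_scores(indicators):
    """indicators: (road sections x 4) matrix of safety performance indicators.
    Returns PCA-weighted and equal-weighted index scores per road section."""
    z = StandardScaler().fit_transform(indicators)
    pca = PCA(n_components=1).fit(z)
    weights = np.abs(pca.components_[0])          # first-component loadings (assumed)
    weights = weights / weights.sum()
    pca_index = z @ weights                       # PCA-weighted linear aggregation
    ew_index = z.mean(axis=1)                     # equal-weighting aggregation
    return pca_index, ew_index

# ranking_by_pca = np.argsort(rspi_scores(data)[0])[::-1]
```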
3 Data Projects for “Social Good”: Challenges and Opportunities
Authors: Mikel Niño, Roberto V. Zicari, Todor Ivanov, Kim Hee, Naveed Mushtaq, Marten Rosselli, Concha Sánchez-Ocaña, Karsten Tolle, José Miguel Blanco, Arantza Illarramendi, Jörg Besier, Harry Underwood
Abstract:
One of the application fields for data analysis techniques and technologies gaining momentum is the area of social good or “common good”, covering cases related to humanitarian crises, global health care, or ecology and environmental issues, among others. The promotion of data-driven projects in this field aims at increasing the efficacy and efficiency of social initiatives, improving the way these actions help humanity in general and people in need in particular. This application field, however, poses its own barriers and challenges when developing data-driven projects, lagging behind in comparison with other scenarios. These challenges derive from aspects such as the scope and scale of the social issue to solve, cultural and political barriers, the skills of the main stakeholders and the technological resources available, the motivation to be engaged in such projects, or the ethical and legal issues related to sensitive data. This paper analyzes the application of data projects in the field of social good, reviewing its current state and noteworthy initiatives, and presenting a framework covering the key aspects to analyze in such projects. The goal is to provide guidelines to understand the main challenges and opportunities for this type of data project, as well as to identify the main differences compared to “classical” data projects in general. A case study is presented on the initial steps and stakeholder analysis of a data project for the inclusion of refugees in the city of Frankfurt, Germany, in order to empirically confront the framework with a real example.
Keywords: Data-driven projects, humanitarian operations, personal and sensitive data, social good, stakeholder analysis.
2 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have been the major security threat, so it is important to impede them. Preventing such intrusions relies entirely on their detection, which is the primary concern of any security tool such as an intrusion detection system (IDS). It is therefore imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance, which can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw dataset for classification: the classifier may get confused by redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Patterns (LBP) can be applied to transform the raw features into a principal feature space and to select features based on their sensitivity, with eigenvalues used to determine the sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power; this optimal feature subset is used to perform classification. For classification, Support Vector Machines (SVM) and Multilayer Perceptrons (MLP) are used due to their proven classification ability. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was used as a benchmark for evaluating security detection mechanisms. The proposed approach provides an optimal intrusion detection mechanism that outperforms existing approaches and has the capability to minimize the number of features and maximize the detection rate.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP).
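A minimal sketch of the core classification pipeline described above, using scikit-learn: standardization, a PCA feature transform, and an SVM classifier. The PSO/greedy feature-selection stage, the LDA/LBP alternatives, and the KDD'99 preprocessing are omitted for brevity; the component count and SVM settings are illustrative assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def build_ids_classifier(X, y, n_components=10):
    """Fit a PCA + SVM intrusion classifier on labelled traffic features
    (X: samples x numeric features, y: attack/normal labels)."""
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=n_components),
                          SVC(kernel="rbf", C=1.0))
    model.fit(X_train, y_train)
    print(f"detection accuracy on held-out data: {model.score(X_test, y_test):.3f}")
    return model
```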
1 Construction Noise Management: Hong Kong Reviews and International Best Practices
Authors: Morgan Cheng, Wilson Ho, Max Yiu, Dragon Tsui, Wylog Wong, Yasir A. Naveed, C. S. Loong, Richard Kwan, K. C. Lam, Hannah Lo, C. L. Wong
Abstract:
Hong Kong is known worldwide for high-density living and the ability to thrive under trying circumstances. The 7.5 million residents of this busy metropolis live primarily in high-rise buildings, which are built and demolished incessantly, so Hong Kong residents are continuously affected by numerous construction activities. In 2020, the Hong Kong Environmental Protection Department (EPD) commissioned a feasibility study on the management of construction noise, including that associated with the renovation of domestic premises. A key component of the study focused on the review of practices concerning the management and control of construction noise in metropolises in other parts of the world. To benefit from international best practices, this extensive review aimed at identifying possible areas of improvement in Hong Kong. The study first referred to the United Nations report “The World’s Cities in 2016” and examined the top 100 cities therein. The 20 most suitable cities were then chosen for further review. Upon further screening, 12 cities with more relevant management practices were selected for further scrutiny: Asia – Tokyo, Seoul, Taipei, Guangzhou, Singapore; Europe – City of Westminster (London), Berlin; North America – Toronto, New York City, San Francisco; Oceania – Sydney, Melbourne. Subsequently, three cities, namely Sydney, the City of Westminster, and New York City, were selected for in-depth review. These three were chosen primarily because of the maturity, success, and effectiveness of their construction noise management and control measures, as well as their similarity to Hong Kong in certain key aspects. One of the more important findings of the review is the importance of an early focus on potential noise issues, with the objective of designing the noise away wherever practicable, and the study examined the similar yet different early-focus mechanisms of these three cities. This paper describes this extensive worldwide review of international best practices for construction noise management and control at the source, along the noise transmission path, and at the receiver end. The methodology, approach, and key findings are presented succinctly. By sharing the findings with acoustics professionals worldwide, it is hoped that more advanced and mature construction noise management practices can be developed to attain urban sustainability.
Keywords: Construction noise, international best practices, noise control and noise management.