Search results for: Syed Najmul Hejaz Azmi
39 Multidimensional Visualization Tools for Analysis of Expression Data
Authors: Urska Cvek, Marjan Trutschl, Randolph Stone II, Zanobia Syed, John L. Clifford, Anita L. Sabichi
Abstract:
Expression data analysis is based mostly on statistical approaches that are indispensable for the study of biological systems. Large amounts of multidimensional data resulting from high-throughput technologies are not completely served by biostatistical techniques and are usually complemented with visual, knowledge discovery and other computational tools. In many cases, in biological systems we only speculate on the processes that are causing the changes, and it is the visual explorative analysis of data during which a hypothesis is formed. We would like to show the usability of multidimensional visualization tools and promote their use in life sciences. We survey and show some of the multidimensional visualization tools in the process of data exploration, such as parallel coordinates and radviz, and we extend them by combining them with the self-organizing map algorithm. We use a time course data set of transitional cell carcinoma of the bladder in our examples. Analysis of data with these tools has the potential to uncover additional relationships and non-trivial structures.
Keywords: Microarrays, visualization, parallel coordinates, radviz, self-organizing maps.
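A minimal sketch of the kind of parallel-coordinates exploration described above, using pandas plotting on a hypothetical gene-by-time-point matrix (not the bladder carcinoma data set used in the paper):

```python
# Parallel-coordinates view of a synthetic expression matrix (illustrative only).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(0)
n_genes = 60
# Hypothetical log-expression values at four time points.
data = pd.DataFrame(
    rng.normal(size=(n_genes, 4)).cumsum(axis=1),
    columns=["t0", "t6h", "t24h", "t72h"],
)
# Label each gene by a crude trend class so the lines can be colored.
data["trend"] = np.where(data["t72h"] > data["t0"], "up", "down")

parallel_coordinates(data, class_column="trend", alpha=0.4)
plt.ylabel("log expression")
plt.title("Parallel coordinates view of a synthetic expression matrix")
plt.show()
```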
38 Approximate Bounded Knowledge Extraction Using Type-I Fuzzy Logic
Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil
Abstract:
Using a neural network, we try to model an unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of the crisp neural network produce different values of the weight factors, which are directly affected by changes in different parameters. We propose the idea that, for each neuron in the network, we can obtain quasi-fuzzy weight sets (QFWS) using repeated simulation of the crisp neural network. This type of fuzzy weight function may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data are subject to noise and uncertainty, QFWS may be helpful in the simplification of such complex problems. Secondly, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
Keywords: Crisp neural networks, fuzzy systems, extraction of logical rules, quasi-fuzzy numbers.
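A minimal sketch of the QFWS idea, under the assumption that each weight is summarised across repeated crisp trainings by a triangular (min, mean, max) fuzzy number; the data and single-neuron model below are hypothetical stand-ins, not the paper's setup:

```python
# Build quasi-fuzzy weight sets from repeated simulations of a crisp model.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 3))
true_w = np.array([0.8, -0.5, 1.2])
y = X @ true_w + rng.normal(scale=0.1, size=200)   # noisy crisp data

def train_once(seed, epochs=200, lr=0.05):
    r = np.random.default_rng(seed)
    w = r.normal(size=3)                           # random crisp initial weights
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)      # MSE gradient
        w -= lr * grad
    return w

weights = np.array([train_once(s) for s in range(30)])   # repeated simulations
# Triangular quasi-fuzzy weight set per connection: (low, modal, high).
qfws = np.stack([weights.min(0), weights.mean(0), weights.max(0)], axis=1)
print(qfws)
```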
37 PIIN Suppression Using Random Diagonal Code for Spectral Amplitude Coding Optical CDMA System
Authors: Hilal Adnan Fadhil, Syed Alwei, R. Badlishah Ahmad
Abstract:
A new code for spectral-amplitude coding optical code-division multiple-access systems is proposed, called Random Diagonal (RD) code. This code is constructed using a code segment and a data segment. One of the important properties of this code is that the cross correlation at the data segment is always zero, which means that phase-induced intensity noise (PIIN) is reduced. For the performance analysis, the effects of phase-induced intensity noise, shot noise, and thermal noise are considered simultaneously. Bit-error rate (BER) performance is compared with Hadamard and Modified Frequency Hopping (MFH) codes. It is shown that a system using the new code matrices not only suppresses PIIN but also allows a larger number of active users compared with other codes. Simulation results show that, using point-to-point transmission with three encoded channels, the RD code has better BER performance than the other codes; it is also found that at 0 dBm the PIIN noise is 10⁻¹⁰ and 10⁻¹¹ for RD and MFH respectively.
Keywords: OCDMA, MFH, PIIN, BER.
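A minimal sketch of checking the zero cross-correlation property claimed for the data segment; the two code words below are hypothetical illustrations, since the actual RD construction is not reproduced in the abstract:

```python
# Verify zero in-phase cross correlation over the data segment of two code words.
import numpy as np

data_seg_len = 3
# Hypothetical code words split into [data segment | code segment].
user1 = np.array([1, 0, 0,  1, 1, 0, 1])
user2 = np.array([0, 1, 0,  0, 1, 1, 1])

d1, d2 = user1[:data_seg_len], user2[:data_seg_len]
cross_corr_data = int(np.dot(d1, d2))   # cross correlation of the data segments
print("data-segment cross correlation:", cross_corr_data)  # 0 -> PIIN suppressed
```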
36 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning
Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul
Abstract:
In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes or atoms, to avoid the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input, rather than extracting features, to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which are then used to improve the classification. The classification performance was evaluated with ECG data under two different preprocessing environments. In the first category, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power line noise. In the second category, the data are further filtered using a fast moving average smoother. The experimental results on the QT database confirm that our proposed algorithm shows a classification accuracy of 92%.
Keywords: Electrocardiogram, dictionary learning, sparse coding, classification.
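A minimal sketch of the pipeline described above: learn a complete dictionary directly from raw beat segments, sparse-code them, and classify the codes. The ECG beats and labels here are synthetic stand-ins, not the QT database:

```python
# Data-driven complete dictionary learning + sparse coding + classification (toy data).
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_beats, beat_len = 300, 64
labels = rng.integers(0, 2, n_beats)                 # normal vs abnormal (toy classes)
t = np.linspace(0, 1, beat_len)
beats = np.array([np.sin(2 * np.pi * (3 + 2 * c) * t) for c in labels])
beats += 0.2 * rng.normal(size=beats.shape)          # measurement noise

# "Complete" dictionary: as many atoms as signal dimensions.
dico = DictionaryLearning(n_components=beat_len, transform_algorithm="omp",
                          transform_n_nonzero_coefs=8, max_iter=20, random_state=0)
codes = dico.fit(beats).transform(beats)             # sparse codes used as features

Xtr, Xte, ytr, yte = train_test_split(codes, labels, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("toy classification accuracy:", clf.score(Xte, yte))
```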
35 Seismic Fragility Curves for Shallow Circular Tunnels under Different Soil Conditions
Authors: Siti Khadijah Che Osmi, Syed Mohd Ahmad
Abstract:
This paper presents a methodology to develop fragility curves for shallow tunnels so as to describe a relationship between seismic hazard and tunnel vulnerability. Emphasis is given to the influence of surrounding soil material properties because the dynamic behaviour of the tunnel mostly depends on it. Four ground properties of soils ranging from stiff to soft soils are selected. A 3D nonlinear time history analysis is used to evaluate the seismic response of the tunnel when subjected to five real earthquake ground intensities. The derived curves show the future probabilistic performance of the tunnels based on the predicted level of damage states corresponding to the peak ground acceleration. A comparison of the obtained results with the previous literature is provided to validate the reliability of the proposed fragility curves. Results show the significant role of soil properties and input motions in evaluating the seismic performance and response of shallow tunnels.
Keywords: Fragility analysis, seismic performance, tunnel lining, vulnerability.
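A minimal sketch of what a fragility curve expresses: the probability of exceeding a damage state as a function of peak ground acceleration under a lognormal model. The medians and dispersions below are hypothetical illustrations, not values derived in the paper:

```python
# Plot lognormal fragility curves P(DS exceeded | PGA) = Phi(ln(PGA/theta)/beta).
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

pga = np.linspace(0.01, 2.0, 200)                       # PGA in g
damage_states = {"minor": (0.25, 0.5),                  # (median in g, dispersion beta)
                 "moderate": (0.55, 0.5),
                 "extensive": (1.00, 0.5)}

for name, (theta, beta) in damage_states.items():
    p_exceed = norm.cdf(np.log(pga / theta) / beta)     # lognormal fragility model
    plt.plot(pga, p_exceed, label=name)

plt.xlabel("PGA (g)")
plt.ylabel("P(damage state exceeded)")
plt.legend()
plt.title("Hypothetical fragility curves for a shallow circular tunnel")
plt.show()
```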
34 Multi Antenna Systems for 5G Mobile Phones
Authors: Muhammad N. Khan, Syed O. Gillani, Mohsin Jamil, Tarbia Iftikhar
Abstract:
With the increasing demand for bandwidth and data rate, there is a dire need to implement antenna systems in mobile phones that are able to fulfill user requirements. A monopole antenna system with multi-antenna configurations is proposed considering feasibility and user demand. The multi-antenna structure is referred to as a multi-input multi-output (MIMO) antenna system. The system comprises four antennas operating below 6 GHz for 4G/LTE and four antennas for 5G applications at 28 GHz, and the dimension of the board is 120 × 70 × 0.8 mm³. The suggested design is feasible with a low-profile planar-antenna structure and is adaptable to smartphones and handheld devices. To the best of our knowledge, this is the first design in the literature with an integrated antenna system for two standards, i.e., 4G and 5G. All MIMO antenna systems are simulated in commercially available software, the High Frequency Structure Simulator (HFSS).
Keywords: High Frequency Structure Simulator (HFSS), multi-input multi-output (MIMO), monopole antenna, slot antenna.
33 Smart Surveillance using PDA
Authors: Basem Mustafa Abd. Amer, Syed Abdul Rahman Al-Attas
Abstract:
The aim of this research is to develop a fast and reliable surveillance system based on a personal digital assistant (PDA) device. This extends to the PDA the moving-object detection capability already available on personal computers. Secondly, the performance of background subtraction (BS) and temporal frame differencing (TFD) techniques is compared on the PDA platform to determine which is more suitable. In order to reduce noise and to prepare frames for the moving-object detection stage, each frame is first converted to a gray-scale representation and then smoothed using a Gaussian low-pass filter. Two moving-object detection schemes, i.e., BS and TFD, have been analyzed. The background frame is updated using an Infinite Impulse Response (IIR) filter so that it adapts to varying illumination conditions and geometry settings. In order to reduce the effect of noise pixels resulting from frame differencing, morphological filters (erosion and dilation) are applied. In this research, it has been found that the TFD technique is more suitable for motion detection than BS in terms of speed; on average, TFD is approximately 170 ms faster than the BS technique.
Keywords: Surveillance, PDA, motion detection, image processing, background subtraction.
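A minimal sketch of the two detection schemes compared above, background subtraction with an IIR-updated background versus temporal frame differencing; it uses a desktop webcam via OpenCV as a stand-in for the PDA camera, and the threshold, IIR gain and kernel size are illustrative choices:

```python
# Background subtraction (IIR background) vs temporal frame differencing.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
alpha = 0.05                                    # IIR gain for background adaptation
kernel = np.ones((3, 3), np.uint8)
background = None
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)    # noise suppression

    if background is None:
        background = gray.astype(np.float32)
        prev_gray = gray
        continue

    # Background subtraction with IIR background update.
    bs_diff = cv2.absdiff(gray, cv2.convertScaleAbs(background))
    background = alpha * gray + (1 - alpha) * background

    # Temporal frame differencing against the previous frame only.
    tfd_diff = cv2.absdiff(gray, prev_gray)
    prev_gray = gray

    for name, diff in (("BS", bs_diff), ("TFD", tfd_diff)):
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        mask = cv2.dilate(cv2.erode(mask, kernel), kernel)  # remove speckle noise
        cv2.imshow(name, mask)

    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```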
32 Impact of Brand Origin on Brand Loyalty: A Case of Personal Care Products in Pakistan
Authors: Aimen Batool Bint-E-Rashid, Syed Muhammad Dawood Ali Shah, Muhammad Usman Farooq, Mahgul Anwar
Abstract:
As the world progresses, the needs and demands of the consumer market are also changing, and consumer purchase decisions now depend upon multiple factors. This study aims to identify the impact of country of origin on the perception of and devotion towards daily personal care products, specifically with reference to knowledge and awareness regarding a particular brand in Pakistan. To corroborate this study, a 30-item brand origin questionnaire has been used with 300 purchase decision makers belonging to different age groups. To illustrate this study, a model has been developed based on brand origin, brand awareness and brand loyalty. Correlation and regression analysis have been used to obtain the results, which conclude, from the perspective of Pakistan's consumer market, that brand origin has a direct relationship with brand loyalty provided that the consumer has positive brand awareness. Support for the fact that brand origin impacts brand loyalty through brand awareness is presented in this study.
Keywords: Brand awareness, brand loyalty, brand origin, personal care products, P&G, Unilever.
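A minimal sketch of the correlation/regression analysis described above, run on hypothetical questionnaire scores (not the study's survey data), with origin driving awareness and awareness driving loyalty:

```python
# Correlation matrix and OLS regression of loyalty on origin and awareness (toy data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300                                          # 300 respondents, matching the study design
origin = rng.normal(4, 1, n)                     # perceived brand origin score
awareness = 0.6 * origin + rng.normal(0, 0.8, n) # awareness partly driven by origin
loyalty = 0.7 * awareness + rng.normal(0, 0.8, n)

df = pd.DataFrame({"origin": origin, "awareness": awareness, "loyalty": loyalty})
print(df.corr())                                 # pairwise Pearson correlations

X = sm.add_constant(df[["origin", "awareness"]])
model = sm.OLS(df["loyalty"], X).fit()           # regression of loyalty on predictors
print(model.summary())
```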
31 Simplified Space Vector Based Decoupled Switching Strategy for Indirect Vector Controlled Open-End Winding Induction Motor Drive
Authors: Syed Munvar Ali, V. Vijaya Kumar Reddy, M. Surya Kalavathi
Abstract:
In this paper, a dual inverter configuration has been implemented for an induction motor drive. This isolated dual inverter is capable of producing high-quality output voltage and minimizing common mode voltage (CMV). For this isolated dual inverter, a decoupled space vector based pulse width modulation (PWM) technique is proposed. Conventional space vector based PWM (SVPWM) techniques require reference voltage vector calculation and sector identification. The proposed decoupled SVPWM technique generates gating pulses from instantaneous phase voltages and gives a CMV of ±Vdc/6. To evaluate the proposed algorithm, MATLAB based simulation studies are carried out on an indirect vector controlled open-end winding induction motor drive.
Keywords: Inverter configuration, decoupled SVPWM, common mode voltage, vector control.
30 An Optimization Analysis on an Automotive Component with Fatigue Constraint Using HyperWorks Software for Environmental Sustainability
Authors: W. M. Wan Muhamad, E. Sujatmika, M.R. Idris, S.A. Syed Ahmad
Abstract:
A finite element analysis (FEA) software package, HyperWorks, is utilized in re-designing an automotive component to reduce its mass. Reduction of component mass contributes towards environmental sustainability by saving the world's valuable metal resources and by reducing carbon emission through improved overall vehicle fuel efficiency. A shape optimization analysis was performed on a rear spindle component. Pre-processing and solving procedures were performed using HyperMesh and RADIOSS respectively. Shape variables were defined using HyperMorph. The optimization solver OptiStruct was then utilized with fatigue life set as a design constraint. Since stress-number of cycles (S-N) theory deals with uniaxial stress, the signed von Mises stress on the component was used for looking up damage on the S-N curve, with the Gerber criterion for mean stress corrections. The optimization analysis resulted in a mass reduction of 24% of the original mass. The study proved that the adopted approach has high potential for use in environmental sustainability.
Keywords: Environmental Sustainability, Shape Optimization, Fatigue, Rear Spindle.
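A minimal sketch of the fatigue constraint evaluation mentioned above: a Gerber mean-stress correction applied to an alternating stress, followed by a Basquin-type S-N life estimate. All material constants are hypothetical illustrations, not values from the study:

```python
# Gerber mean-stress correction + Basquin S-N life lookup (illustrative constants).
sigma_u = 600.0    # ultimate tensile strength, MPa (assumed)
sigma_f = 900.0    # fatigue strength coefficient, MPa (assumed)
b = -0.09          # Basquin exponent (assumed)

def cycles_to_failure(sigma_a, sigma_m):
    """Gerber-corrected equivalent fully reversed stress, then S-N life."""
    sigma_ar = sigma_a / (1.0 - (sigma_m / sigma_u) ** 2)   # Gerber criterion
    return 0.5 * (sigma_ar / sigma_f) ** (1.0 / b)          # sigma_ar = sigma_f'*(2N)^b

# Example load cycle on the spindle: 180 MPa alternating, 60 MPa mean (hypothetical).
print(f"predicted life: {cycles_to_failure(180.0, 60.0):.3e} cycles")
```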
29 Hazard Contributing Factors Classification for Petrol Fuel Station
Authors: Mirza Munir Ahmed, S.R.M. Kutty, Mohd Faris Khamidi, Idris Othman, Azmi Mohd Shariff
Abstract:
A petrol fuel station (PFS) poses potential hazards to the people, assets, environment and reputation of an operating company. Fire hazards, static electricity and air pollution evoked by aliphatic and aromatic organic compounds are major causes of accident/incident occurrence at fuel stations. Activities involving carelessness, maintenance, housekeeping, slips, trips and falls, transportation hazards, major and minor injuries, robbery and snake bites have the potential to create unsafe conditions. The level of risk of these hazards varies according to location and country, and the emphasis placed on safety considerations by governments varies around the world: developed countries' safety records are much better than those of developing countries. There is no significant approach available to highlight the unsafe acts and unsafe conditions during operation and maintenance of a fuel station. The fuel station is one of the most commonly available facilities that contain flammable and hazardous materials, and due to its continuous operation it poses various hazards to people, the environment and the assets of an organization. To control these hazards, a specific approach is needed. PFS operation is unique compared to other businesses; for smooth operations it demands the involvement of the operating company, contractor and operator group. This study focuses on hazard contributing factors that have the potential to make PFS operation risky. One year of data was collected, 902 activities were analyzed, and comparisons were made to highlight significant contributing factors. The study will provide help and assistance to PFS outlet marketing companies in making their fuel station operations safer, and will help health, safety and environment (HSE) professionals to close the gaps related to safety matters at PFS.
Keywords: Accident, contributing factors, carelessness, fire, explosion, injuries.
28 Implementing Knowledge Transfer Solution through Web-based Help Desk System
Authors: Mazeyanti M. Ariffin, Noreen Izza Arshad, Ainol Rahmah Shaarani, Syed Uzair Shah
Abstract:
Knowledge management (KM) is the process of taking whatever steps are needed to get the most out of available knowledge resources. KM involves several steps: capturing knowledge, discovering new knowledge, sharing knowledge, and applying knowledge in the decision-making process. In applying knowledge, it is not necessary for the individual who uses the knowledge to comprehend it, as long as the available knowledge is used to guide decision making and actions. When an expert is called and provides a step-by-step procedure for solving a problem to the caller, the expert is transferring knowledge or giving direction to the caller, and the caller is 'applying' the knowledge by following the instructions given by the expert. An appropriate mechanism is needed to ensure effective knowledge transfer, which in this case is by telephone or email. The problem with email and telephone is that the knowledge is not fully circulated and disseminated to all users. In this paper, drawing on the experience of a local university Help Desk, the use of Information Technology (IT) is proposed to effectively support knowledge transfer in the organization. The issues covered include the existing knowledge, the related works, the methodology used in defining the knowledge management requirements, as well as an overview of the prototype.
Keywords: Knowledge management, knowledge transfer, help desk, web-based system.
27 Assessing the Corporate Identity of Malaysia Universities in the East Coast Region with the Market Conditions in Ensuring Self-Sustainability: A Study on Universiti Sultan Zainal Abidin
Authors: Suffian H. Ayub, Mohammed R. Hamzah, Nor H. Abdullah, Sharipah N. Syed Azmy, Hishammudin S.
Abstract:
The liberalisation of the education industry has exposed institutes of higher learning (IHL) in Malaysia to financial challenges. Without good financial standing, public institutions must rely on government funding; ostensibly, this contradicts the government's aspiration to make universities self-sufficient. With stiff competition from private institutes of higher learning, IHL need to be prepared at the forefront level. The corporate identity itself is the entrance to the world of higher learning, and it is through this uniqueness that an institution is able to distinguish itself from competitors. This paper examined the perception of the stakeholders at one of the public universities in the east coast region of Malaysia on the university's perceived reputation and how it communicates its preparedness for self-sustainability through corporate identity. The findings indicated that while the stakeholders embraced the challenges of facing stiff competition and struggling market conditions, most of them felt the university should put more effort into mobilising the corporate identity to its constituencies.
Keywords: Communication, corporate identity, market conditions, universities.
26 Automated Separation of Organic Liquids through Their Boiling Points
Authors: Muhammad Tahir Qadri, Syed Shafi-Uddin Qadri, Faizan Farid, Nabeel Abid
Abstract:
This paper discusses the separation of miscible liquids by means of fractional distillation. For complete separation of the liquids, the processes of heating, condensation, separation and storage are carried out automatically. A PIC microcontroller has been used to control every stage of the work; the controller also controls the storage process by activating and deactivating the conveyors. The liquids are heated and, on reaching their respective boiling points, evaporate and enter the condensation chamber, where they convert back into liquid. The liquids are then directed to their respective tanks by means of a stepper motor which moves in three directions, each movement leading to a different tank. When a tank is filled, it sends a signal to the controller, which then opens the solenoid valves and empties the tank into the beaker below the nozzle. As the beaker fills, the nozzle closes and the conveyors come into operation; the filled beaker is replaced by an empty beaker from behind. The work can be used in oil industries, chemical industries and paint industries.
Keywords: Miscible liquid separation unit, distillation, waste water treatment, organic liquids collection.
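A minimal sketch of the control sequence described above, written as a simple state machine in Python for illustration only; the actual system runs on a PIC microcontroller with real sensors and actuators, and all names and readings below are hypothetical:

```python
# Illustrative state machine mirroring the heat -> condense -> route -> dispense cycle.
from enum import Enum, auto

class State(Enum):
    HEATING = auto()
    CONDENSING = auto()
    ROUTING = auto()
    DISPENSING = auto()

def controller(temp_c, boiling_points, tank_full, beaker_full, state):
    """Return (next_state, action) from sensor readings; names are illustrative."""
    if state is State.HEATING and any(temp_c >= bp for bp in boiling_points):
        return State.CONDENSING, "route vapour to condensation chamber"
    if state is State.CONDENSING:
        return State.ROUTING, "step stepper motor towards the matching tank"
    if state is State.ROUTING and tank_full:
        return State.DISPENSING, "open solenoid valve over the beaker"
    if state is State.DISPENSING and beaker_full:
        return State.HEATING, "close nozzle, advance conveyor, load empty beaker"
    return state, "hold"

state = State.HEATING
state, action = controller(80.0, [78.4, 100.0], False, False, state)
print(state, "->", action)
```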
25 A Real-Time Specific Weed Recognition System Using Statistical Methods
Authors: Imran Ahmed, Muhammad Islam, Syed Inayat Ali Shah, Awais Adnan
Abstract:
The identification and classification of weeds are of major technical and economic importance in the agricultural industry. To automate these activities, a weed control system based on features such as shape, color and texture is feasible. The goal of this paper is to build a real-time, machine-vision weed control system that can detect weed locations. In order to accomplish this objective, a real-time robotic system is developed to identify and locate outdoor plants using machine vision technology and pattern recognition. The algorithm is developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed algorithm has been tested on weeds at various locations, and the tests have shown that the algorithm is very effective in weed identification. Further, the results show very reliable performance on weeds under varying field conditions. The analysis of the results shows over 90 percent classification accuracy over 140 sample images (broad and narrow), with 70 samples from each category of weeds.
Keywords: Weed detection, image processing, real-time recognition, standard deviation.
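A minimal sketch of a statistics-based broad/narrow classifier in the spirit of the paper: threshold the standard deviation of plant-pixel coverage in a binarised image. The threshold and the synthetic test images are illustrative assumptions, not the paper's trained values:

```python
# Broad vs narrow weed classification from a simple spread statistic (toy images).
import numpy as np

def classify_weed(binary_img, std_threshold=0.12):
    """binary_img: 2-D array with 1 for plant pixels, 0 for background."""
    col_density = binary_img.mean(axis=0)        # per-column plant coverage
    spread = float(np.std(col_density))          # broad leaves -> larger spread
    return "broad" if spread > std_threshold else "narrow"

rng = np.random.default_rng(0)
narrow = (rng.random((100, 100)) < 0.05).astype(int)       # thin, scattered pixels
broad = np.zeros((100, 100), dtype=int)
broad[30:70, 20:80] = 1                                     # one large leaf blob
print(classify_weed(narrow), classify_weed(broad))          # -> narrow broad
```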
24 Numerical Analysis and Experimental Validation of a Downhole Stress/Strain Measurement Tool
Authors: Abhay Bodake, Ping Sui, Hafeez Syed, Ratish Kadam
Abstract:
Real-time measurement of applied forces, like tension, compression, torsion, and bending moment, identifies the transferred energies being applied to the bottomhole assembly (BHA). These forces are highly detrimental to measurement/logging-while-drilling tools and downhole equipment. Real-time measurement of the dynamic downhole behavior, including weight, torque, bending on bit, and vibration, establishes a real-time feedback loop between the downhole drilling system and drilling team at the surface. This paper describes the numerical analysis of the strain data acquired by the measurement tool at different locations on the strain pockets. The strain values obtained by FEA for various loading conditions (tension, compression, torque, and bending moment) are compared against experimental results obtained from an identical experimental setup. Numerical analyses results agree with experimental data within 8% and, therefore, substantiate and validate the FEA model. This FEA model can be used to analyze the combined loading conditions that reflect the actual drilling environment.
Keywords: FEA, M/LWD, Oil & Gas, Strain Measurement.
23 Fire Resistance of High Alumina Cement and Slag Based Ultra High Performance Fibre-Reinforced Cementitious Composites
Authors: A. Q. Sobia, M. S. Hamidah, I. Azmi, S. F. A. Rafeeqi
Abstract:
Fibre-reinforced polymer (FRP) strengthened reinforced concrete (RC) structures are susceptible to intense deterioration when exposed to elevated temperatures, particularly in the event of fire. FRP has the tendency to lose bond with the substrate due to the low glass transition temperature of epoxy, the key component of the FRP matrix. In the past few decades, various types of high performance cementitious composites (HPCC) were explored for the protection of RC structural members against elevated temperature. However, there is inadequate information on the influence of elevated temperature on ultra high performance fibre-reinforced cementitious composites (UHPFRCC) containing ground granulated blast furnace slag (GGBS) as a replacement of high alumina cement (HAC) in conjunction with hybrid fibres (basalt and polypropylene fibres), which could be a prospective fire resisting material for structural components. The influence of elevated temperatures on the compressive as well as flexural strength of UHPFRCC, made of HAC-GGBS and hybrid fibres, was examined in this study. Besides the control sample (without fibres), three other samples, containing 0.5%, 1% and 1.5% of basalt fibres by total weight of mix together with 1 kg/m3 of polypropylene fibres, were prepared and tested. Another mix was also prepared with only 1 kg/m3 of polypropylene fibres. Each of the samples was kept at ambient temperature as well as exposed to 400, 700 and 1000 °C, followed by testing after 28 and 56 days of conventional curing. Investigation of the results disclosed that the use of hybrid fibres significantly helped to improve the ambient temperature compressive and flexural strengths of UHPFRCC, which were found to be 80 and 14.3 MPa respectively. However, the optimum residual compressive strength was marked by UHPFRCC-CP (with polypropylene fibres only), equally after both curing periods (28 and 56 days), i.e. 41%. In addition, the highest residual flexural strengths after 28 and 56 days of curing were marked by UHPFRCC-CP and UHPFRCC-CB2 (1 kg/m3 of PP fibres + 1% of basalt fibres), i.e. 39% and 48.5% respectively.
Keywords: Fibre reinforced polymer materials, ground granulated blast furnace slag, high-alumina cement, hybrid fibres.
22 Digital Encoder Based Power Frequency Deviation Measurement
Authors: Syed Javed Arif, Mohd Ayyub Khan, Saleem Anwar Khan
Abstract:
In this paper, a simple method is presented for the measurement of power frequency deviations. A phase locked loop (PLL) is used to multiply the signal under test by a factor of 100. The number of pulses in this pulse train signal is counted over a stable known period, using decade driving assemblies (DDAs) and flip-flops. These signals are combined using logic gates and then passed through decade counters to give a unique combination of pulses or levels, which are further encoded. These pulses are equally suitable for both control applications and display units. The experimental circuit developed gives a resolution of 1 Hz within a measurement period of 20 ms. The proposed circuit is also simulated in Verilog hardware description language (Verilog HDL) and implemented using Field Programmable Gate Arrays (FPGAs). A mixed-signal oscilloscope (MSO) is used to observe the results of the FPGA implementation. These results are compared with the results of the proposed circuit built from discrete components. The proposed system is useful for frequency deviation measurement and control in power systems.
Keywords: Frequency measurement, digital control, phase locked loop, encoding, Verilog HDL.
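A minimal sketch of the counting principle behind the scheme: the PLL multiplies the input frequency by 100 and pulses are counted over a fixed 20 ms gate, so the count maps back to frequency as f = N / (100 · T). This is a pure software illustration of idealised counting, not a model of the hardware:

```python
# Idealised pulse counting over a 20 ms gate after x100 frequency multiplication.
GATE_S = 0.020        # stable known measurement period, 20 ms
MULT = 100            # PLL multiplication factor

def measured_frequency(true_freq_hz):
    pulses = int(true_freq_hz * MULT * GATE_S)   # ideal pulse count within the gate
    return pulses / (MULT * GATE_S)              # frequency recovered from the count

for f in (49.0, 49.5, 50.0, 50.7, 51.0):
    print(f"input {f:5.1f} Hz -> measured {measured_frequency(f):5.1f} Hz")
```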
21 FPGA Based Parallel Architecture for the Computation of Third-Order Cross Moments
Authors: Syed Manzoor Qasim, Shuja Abbasi, Saleh Alshebeili, Bandar Almashary, Ateeq Ahmad Khan
Abstract:
Higher-order statistics (HOS), also known as cumulants and cross moments, and their frequency domain counterparts, known as polyspectra, have emerged as a powerful signal processing tool for the synthesis and analysis of signals and systems. Algorithms used for the computation of cross moments are computationally intensive and require high computational speed for real-time applications. For efficiency and high speed, it is often advantageous to realize computation-intensive algorithms in hardware. A promising solution that combines high flexibility with the speed of traditional hardware is the Field Programmable Gate Array (FPGA). In this paper, we present an FPGA-based parallel architecture for the computation of third-order cross moments. The proposed design is coded in Very High Speed Integrated Circuit (VHSIC) Hardware Description Language (VHDL) and functionally verified by implementing it on a Xilinx Spartan-3 XC3S2000FG900-4 FPGA. Implementation results are presented, and they show that the proposed design can operate at a maximum frequency of 86.618 MHz.
Keywords: Cross moments, cumulants, FPGA, hardware implementation.
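A minimal sketch of the quantity the hardware computes, the third-order cross moment m3(k, l) = E[x(n)·y(n+k)·z(n+l)], evaluated here in NumPy as a software reference model with synthetic signals:

```python
# Software reference for the third-order cross moment of three sequences.
import numpy as np

def third_order_cross_moment(x, y, z, k, l):
    n = len(x)
    idx = np.arange(n)
    valid = (idx + k < n) & (idx + l < n) & (idx + k >= 0) & (idx + l >= 0)
    i = idx[valid]
    return float(np.mean(x[i] * y[i + k] * z[i + l]))

rng = np.random.default_rng(0)
x = rng.exponential(size=4096) - 1.0   # zero-mean but skewed, so m3(0,0) is non-zero
print(third_order_cross_moment(x, x, x, 0, 0))   # ~ third central moment (about 2)
print(third_order_cross_moment(x, x, x, 3, 7))   # independent lags -> near zero
```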
20 Process and Supply-Chain Optimization for Testing and Verification of Formation Tester/Pressure-While-Drilling Tools
Authors: Vivek V, Hafeez Syed, Darren W Terrell, Harit Naik, Halliburton
Abstract:
Applying a rigorous process to optimize the elements of a supply-chain network resulted in a reduction of waiting time for the service provider and the customer. Different sources of downtime of the hydraulic pressure controller/calibrator (HPC) were causing interruptions in operations. The process examined all the issues to drive greater efficiencies. The issues included inherent design issues with the HPC pump, contamination of the HPC with impurities, and the lead time required for annual calibration in the USA. The HPC is used for mandatory testing/verification of formation tester/pressure measurement/logging-while-drilling tools by oilfield service providers, including Halliburton. After market study and analysis, it was concluded that the current HPC model is best suited to the oilfield industry. To use the existing HPC model effectively, design and contamination issues were addressed through design and process improvements. An optimum network is proposed after comparing different supply-chain models for calibration lead-time reduction.
Keywords: Hydraulic pressure controller/calibrator, M/LWD, pressure, FTWD.
19 Support Vector Machine based Intelligent Watermark Decoding for Anticipated Attack
Authors: Syed Fahad Tahir, Asifullah Khan, Abdul Majid, Anwar M. Mirza
Abstract:
In this paper, we present an innovative scheme for blindly extracting message bits from an image distorted by an attack. A Support Vector Machine (SVM) is used to nonlinearly classify the bits of the embedded message. Traditionally, a hard decoder is used with the assumption that the underlying modeling of the Discrete Cosine Transform (DCT) coefficients does not change appreciably. In case of an attack, the distribution of the image coefficients is heavily altered. The distributions of the sufficient statistics at the receiving end corresponding to the antipodal signals overlap, and a simple hard decoder fails to classify them properly. We consider message retrieval of the antipodal signal as a binary classification problem, and machine learning techniques such as SVM are used to retrieve the message when a certain specific class of attacks is most probable. In order to validate the SVM-based decoding scheme, we have taken Gaussian noise as a test case. We generate a data set using 125 images and 25 different keys. The polynomial kernel of the SVM has achieved 100 percent accuracy on the test data.
Keywords: Bit correct ratio (BCR), grid search, intelligent decoding, jackknife technique, support vector machine (SVM), watermarking.
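A minimal sketch of treating watermark extraction as binary classification: antipodal embedding statistics corrupted by a Gaussian-noise "attack", decoded with a polynomial-kernel SVM tuned by grid search. The feature model is a synthetic stand-in for the paper's DCT-domain statistics:

```python
# Polynomial-kernel SVM decoding of antipodal watermark bits under Gaussian attack.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
n_bits, n_feat = 2000, 8
bits = rng.integers(0, 2, n_bits)
signal = np.where(bits[:, None] == 1, 1.0, -1.0)                 # antipodal signalling
features = 0.5 * signal + rng.normal(0, 1.0, (n_bits, n_feat))   # Gaussian "attack"

Xtr, Xte, ytr, yte = train_test_split(features, bits, test_size=0.3, random_state=0)
grid = GridSearchCV(SVC(kernel="poly"),
                    {"C": [0.1, 1, 10], "degree": [2, 3], "gamma": ["scale", 0.1]},
                    cv=5)
grid.fit(Xtr, ytr)
bcr = grid.score(Xte, yte)                                       # bit correct ratio
print("best params:", grid.best_params_, "BCR:", bcr)
```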
18 Synthesis and Characterization of ZnO and Fe3O4 Nanocrystals from Oleate-based Organometallic Compounds
Authors: PoiSim Khiew, WeeSiong Chiu, ThianKhoonTan, Shahidan Radiman, Roslan Abd-Shukor, Muhammad Azmi Abd-Hamid, ChinHua Chia
Abstract:
Magnetic and semiconductor nanomaterials exhibit novel magnetic and optical properties owing to their unique size- and shape-dependent effects. As the size shrinks down to the nanoscale region, various anomalous properties that are normally not present in the bulk start to dominate. The ability to harness these anomalous properties for the design of various advanced electronic devices depends strictly on the synthetic strategy. Hence, current research has focused on developing rational synthetic control to produce high quality nanocrystals by using an organometallic approach to tune both the size and the shape of the nanomaterials. In order to elucidate the growth mechanism, transmission electron microscopy was employed as a powerful tool for performing time-resolved morphological and structural characterization of magnetic (Fe3O4) and semiconductor (ZnO) nanocrystals. The current synthetic approach is found to be able to produce nanostructures with well-defined shapes. We have found that oleic acid is an effective capping ligand for preparing oxide-based nanostructures without any agglomeration, even at high temperature. The oleate-based precursors and capping ligands are fatty acid compounds derived from natural palm oil with low toxicity. In comparison with other synthetic approaches to producing nanostructures, the current synthetic method offers an effective route to oxide-based nanomaterials with well-defined shapes and good monodispersity. The nanocrystals are well separated from each other without any stacking effect. In addition, the as-synthesized nanopellets are chemically and physically stable compared to previously reported nanomaterials. Further development and extension of the current synthetic strategy are being pursued to combine both of these materials into a nanocomposite form that will be used as a "smart magnetic nanophotocatalyst" for industrial waste water treatment.
Keywords: Metal oxide nanomaterials, nanophotocatalyst, organometallic synthesis, morphology control.
17 A High-Speed and Low-Energy Ternary Content Addressable Memory Design Using Feedback in Match-Line Sense Amplifier
Authors: Syed Iftekhar Ali, M. S. Islam
Abstract:
In this paper we present an energy efficient match-line (ML) sensing scheme for high-speed ternary content-addressable memory (TCAM). The proposed scheme isolates the sensing unit of the sense amplifier from the large and variable ML capacitance. It employs feedback in the sense amplifier to successfully detect a match while keeping the ML voltage swing low. This reduced voltage swing results in large energy savings. Simulation performed using 130 nm, 1.2 V CMOS logic shows at least 30% total energy saving in our scheme compared to the popular current race (CR) scheme for similar search speed. In terms of speed, dynamic energy, peak power consumption and transistor count, our scheme also shows better performance than the mismatch-dependent (MD) power allocation technique, which also employs feedback in the sense amplifier. Additionally, the implementation of our scheme is simpler than the CR or MD schemes because of the absence of the analog control voltage and programmable delay circuit used in those schemes.
Keywords: Content-addressable memory, energy consumption, feedback, peak power, sensing scheme, sense amplifier, ternary.
16 Effect of Atmospheric Turbulence on Hybrid FSO/RF Link Availability under Qatar Harsh Climate
Authors: Abir Touati, Syed Jawad Hussain, Farid Touati, Ammar Bouallegue
Abstract:
Although there has been a growing interest in hybrid free-space optical/radio frequency (FSO/RF) communication systems, the current literature is limited to results obtained in moderate or cold environments. In this paper, using a soft switching approach, we investigate the effect of weather inhomogeneities on the strength of turbulence, and hence the channel refractive index, under Qatar's harsh environment, and their influence on hybrid FSO/RF availability. In this approach, either FSO or RF, both simultaneously, or neither can be active. Based on the soft switching approach and a finite state Markov chain (FSMC) process, we model the channel fading for the two links and derive a mathematical expression for the outage probability of the hybrid system. Then, we evaluate the behavior of the hybrid FSO/RF system under hazy and harsh weather. Results show that FSO/RF soft switching renders the system outage probability less than that of each link individually. A soft switching algorithm is being implemented on FPGAs using Raptor codes interfaced to the two terminals of a 1 Gbps/100 Mbps FSO/RF hybrid system, the first being implemented in the region. Experimental results are compared to the above simulation results.
Keywords: Atmospheric turbulence, haze, soft switching, Raptor codes, refractive index.
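A minimal sketch of why soft switching lowers outage: model each link as a two-state (up/down) Markov chain, compute steady-state down probabilities, and compare the individual outages with the hybrid outage when both links are down. The transition probabilities are illustrative assumptions, not measured Qatar-channel values, and the two links are assumed to fail independently:

```python
# Two-state Markov chain per link; hybrid outage = both links down simultaneously.
import numpy as np

def steady_state(P):
    """Stationary distribution of a 2-state Markov chain transition matrix P."""
    evals, evecs = np.linalg.eig(P.T)
    v = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    return v / v.sum()

# States: [up, down]; rows are the current state, columns the next state.
P_fso = np.array([[0.95, 0.05],     # FSO degrades more often under haze/turbulence
                  [0.40, 0.60]])
P_rf  = np.array([[0.99, 0.01],     # RF link is more robust but lower rate
                  [0.70, 0.30]])

p_fso_down = steady_state(P_fso)[1]
p_rf_down  = steady_state(P_rf)[1]
p_hybrid_out = p_fso_down * p_rf_down   # assumes independent link failures
print(f"FSO outage {p_fso_down:.3f}, RF outage {p_rf_down:.3f}, "
      f"hybrid outage {p_hybrid_out:.4f}")
```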
15 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Organizations, including governments, generate (big) data that are high in volume, velocity and veracity, and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing within the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting services to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We showcase a graphical view of the actors, their roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.
Keywords: Big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review.
14 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs
Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao
Abstract:
A compositional reservoir simulation model (CMG-GEM) was used for cyclic CO2 injection process in unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production. The study of cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs is mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of cyclic CO2 injection process to distinguish the parameters with maximum effect on the oil recovery and to comprehend the behavior of cyclic CO2 injection in tight reservoir. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry due to its plugging onto the porous media which reduces the oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and permanent trapping techniques when considering CO2 storage mechanisms in geological formations. However, the effects of the above uncertain parameters on the process of CO2 enhanced oil recovery have not been understood systematically. Hence, it is absolutely necessary to study the most significant parameters which dominate the process. The main objective of this study is to improve techniques for designing cyclic CO2 injection process while considering the effects of asphaltene deposition and solubility of CO2 in the brine in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.
Keywords: Tight reservoirs, cyclic CO2 injection, asphaltene, solubility, reservoir simulation.
13 Serum Nitric Oxide and Sialic Acid: Possible Biochemical Markers for Progression of Diabetic Nephropathy
Authors: Syed M. Shahid, Rozeena Shaikh, Syeda N. Nawab, Shah A. Qader, Abid Azhar, Tabassum Mahboob
Abstract:
This study was designed to investigate the role of serum nitric oxide and sialic acid in the development of diabetic nephropathy as disease markers. A total of 210 age- and sex-matched diabetic patients were selected after informed consent and divided into four groups (70 each) as I: control; II: diabetic; III: diabetic hypertensive; IV: diabetic nephropathy. The blood samples of all subjects were collected and analyzed for serum nitric oxide, sialic acid, fasting blood glucose, serum urea, creatinine, HbA1c and GFR. The BMI, systolic and diastolic blood pressures, blood glucose, HbA1c and serum sialic acid levels were high (p<0.01) in group II as compared to control subjects. Higher levels (p<0.01) of BMI, systolic and diastolic blood pressures, blood glucose, HbA1c, serum urea, creatinine and sialic acid were observed in groups III and IV as compared to controls. Significantly low levels of GFR and serum nitric oxide (p<0.01) were observed in groups III and IV as compared to controls. Results indicated that serum nitric oxide and sialic acid are major biochemical indicators of the micro- and macrovascular complications of diabetes such as hypertension and nephropathy. These should be taken into account during screening procedures to identify diabetic patients and protect them from progressive renal impairment leading to end-stage renal disease (ESRD).
Keywords: Diabetic nephropathy, hypertension, nitric oxide, sialic acid.
12 EZW Coding System with Artificial Neural Networks
Authors: Saudagar Abdul Khader Jilani, Syed Abdul Sattar
Abstract:
Image compression plays a vital role in today's communication. The limitation in allocated bandwidth leads to slower communication; to increase the rate of transmission within the limited bandwidth, the image data must be compressed before transmission. Basically there are two types of compression: 1) lossy compression and 2) lossless compression. Though lossy compression gives more compression than lossless compression, the accuracy of retrieval is lower for lossy compression than for lossless compression. The JPEG and JPEG2000 image compression systems use Huffman coding for image compression. The JPEG2000 coding system uses the wavelet transform, which decomposes the image into different levels, where the coefficients in each sub-band are uncorrelated with the coefficients of other sub-bands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance compared to existing wavelet transforms. For further improvement of compression applications, other coding methods have recently been suggested; an ANN-based approach is one such method. Artificial neural networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. A performance analysis on different images is proposed with an analysis of the EZW coding system using the error backpropagation algorithm. The implementation and analysis show approximately 30% more accuracy in the retrieved image compared to the existing EZW coding system.
Keywords: Accuracy, compression, EZW, JPEG2000, performance.
11 Web Content Mining: A Solution to Consumer's Product Hunt
Authors: Syed Salman Ahmed, Zahid Halim, Rauf Baig, Shariq Bashir
Abstract:
With the rapid growth in business size, today's businesses orient towards electronic technologies; Amazon.com and e-bay.com are some of the major stakeholders in this regard. Unfortunately, the enormous size and hugely unstructured nature of data on the web, even for a single commodity, has become a cause of ambiguity for consumers. Extracting valuable information from such ever-increasing data is an extremely tedious task and is fast becoming critical to the success of businesses. Web content mining can play a major role in solving these issues. It involves using efficient algorithmic techniques to search and retrieve the desired information from the seemingly impossible-to-search unstructured data on the Internet. Application of web content mining can be very encouraging in the areas of customer relations modeling, billing records, logistics investigations, product cataloguing and quality management. In this paper we present a review of some very interesting, efficient yet implementable techniques from the field of web content mining and study their impact in areas specific to business user needs, focusing on the customer as well as the producer. The techniques reviewed include mining by developing a knowledge-base repository of the domain, iterative refinement of user queries for personalized search, using a graph-based approach for the development of a web crawler, and filtering information for personalized search using website captions. These techniques have been analyzed and compared on the basis of their execution time and the relevance of the results they produced against a particular search.
Keywords: Data mining, web mining, search engines, knowledge discovery.
10 An Investigation of Surface Texturing by Ultrasonic Impingement of Micro-Particles
Authors: Nagalingam Arun Prasanth, Ahmed Syed Adnan, S. H. Yeo
Abstract:
Surface topography plays a significant role in the functional performance of engineered parts. It is important to have control of the surface geometry and an understanding of the surface details to obtain the desired performance. Hence, in the current research contribution, a non-contact micro-texturing technique has been explored and developed. The technique involves ultrasonic excitation of a tool as the prime source of surface texturing for aluminum alloy workpieces. The specimen surface is polished first and is then immersed in a liquid bath containing 10% weight concentration of Ti6Al4V grade 5 spherical powders. A submerged slurry jet is used to recirculate the spherical powders under the ultrasonic horn, which is excited at an ultrasonic frequency and amplitude of 40 kHz and 70 µm respectively. The distance between the horn and the workpiece surface was kept fixed at 200 µm using a precision control stage. Texturing effects were investigated for different process timings of 1, 3 and 5 s. Thereafter, the specimens were cleaned in an ultrasonic bath for 5 minutes to remove loose debris on the surface. The developed surfaces are characterized by optical and contact surface profilers. The optical microscopic images show a texture of circular spots on the workpiece surface indented by the titanium spherical balls. Waviness patterns obtained from the contact surface profiler support the texturing effect produced by the proposed technique. Furthermore, water droplet tests were performed to show the efficacy of the proposed technique in developing hydrophilic surfaces and to quantify the texturing effect produced.
Keywords: Surface texturing, surface modification, topography, ultrasonic.