Search results for: automated quantification
633 IT Perspective of Service-Oriented e-Government Enterprise
Authors: Anu Paul, Varghese Paul
Abstract:
The focal aim of e-Government (eGovt) is to offer citizen-centered service delivery. Accordingly, citizens consume services from multiple government agencies through a national portal. Thus, eGovt is an enterprise whose primary business motive is transparent, efficient and effective public services to its citizenry, and its logical structure is the e-Government Enterprise Architecture (eGEA). Since eGovt is an IT-oriented, multifaceted, service-centric system, EA does little for an automated enterprise beyond the business artifacts. The emergence of Service-Oriented Architecture (SOA) led some governments to apply it in their eGovts, but SOA alone limits the source of business artifacts. The concurrent use of EA and SOA in eGovt delivers interoperability and integration and leads to the Service-Oriented e-Government Enterprise (SOeGE). Consequently, an agile eGovt system becomes a reality. From an IT perspective, eGovt comprises centralized public service artifacts together with the existing application logic belonging to various departments at the central, state and local levels. The eGovt is transformed into an SOeGE by applying Service-Orientation (SO) principles across the entire system. This paper explores the IT perspective of SOeGE in India, covering the public service models and illustrating them with a case study of the Passport service of India.
Keywords: enterprise architecture, service-oriented e-Government enterprise, service interface layer, service model
Procedia PDF Downloads 521
632 SPR Immunosensor for the Detection of Staphylococcus aureus
Authors: Muhammad Ali Syed, Arshad Saleem Bhatti, Chen-zhong Li, Habib Ali Bokhari
Abstract:
Surface plasmon resonance (SPR) biosensors have emerged as a promising technique for bioanalysis as well as microbial detection and identification. Real-time, sensitive, cost-effective, and label-free detection of biomolecules from complex samples is required for early and accurate diagnosis of infectious diseases. Like many other types of optical techniques, SPR biosensors may also be successfully utilized for microbial detection for accurate, point-of-care, and rapid results. In the present study, we have utilized a commercially available automated SPR biosensor from BI company to study microbial detection from water samples spiked with different concentrations of Staphylococcus aureus bacterial cells. The gold thin-film sensor surface was functionalized to react with proteins such as protein G, which was used for directed immobilization of monoclonal antibodies against Staphylococcus aureus. The results of our work reveal that this immunosensor can be used to detect a very small number of bacterial cells with high sensitivity and specificity. In our case, 10^3 cells/ml of water have been successfully detected. Therefore, it may be concluded that this technique has a strong potential to be used in microbial detection and identification.
Keywords: surface plasmon resonance (SPR), Staphylococcus aureus, biosensors, microbial detection
Procedia PDF Downloads 475
631 Automated Ultrasound Carotid Artery Image Segmentation Using Curvelet Threshold Decomposition
Authors: Latha Subbiah, Dhanalakshmi Samiappan
Abstract:
In this paper, we propose denoising of Common Carotid Artery (CCA) B-mode ultrasound images by a decomposition approach with curvelet thresholding and automatic segmentation of the intima-media thickness and the adventitia boundary. Through decomposition, the local geometry of the image and its gradient directions are well preserved. The components are combined into a single vector-valued function, which removes noise patches. A double threshold is applied to remove the speckle noise inherent in the image. The denoised image is segmented by active contours without specifying seed points; combined with level set theory, these provide sub-regions with continuous boundaries. The deformable contours match the shapes and motion of objects in the images. A constrained curve or surface is evolved over the image so that it is pulled towards the required image features. Region-based and boundary-based information are integrated to obtain the contour. The method handles the multiplicative speckle noise, as reflected in both objective and subjective quality measurements, and thus leads to better segmentation results. The proposed denoising method gives better performance metrics compared with other state-of-the-art denoising algorithms.
Keywords: curvelet, decomposition, levelset, ultrasound
Procedia PDF Downloads 340
630 An Automated System for the Detection of Citrus Greening Disease Based on Visual Descriptors
Authors: Sidra Naeem, Ayesha Naeem, Sahar Rahim, Nadia Nawaz Qadri
Abstract:
Citrus greening is a bacterial disease that causes considerable damage to citrus fruits worldwide. An efficient method for detecting this disease is needed to minimize production loss. This paper presents a pattern recognition system that comprises three stages for the detection of citrus greening from orange leaves: segmentation, feature extraction and classification. Image segmentation is accomplished by adaptive thresholding. The feature extraction stage comprises three visual descriptors, i.e., shape, color and texture. For shape we used the asymmetry index, for color the histogram of the Cb component in the YCbCr domain, and for texture the local binary pattern. Classification was done using support vector machines and k-nearest neighbors. The best performance of the system, accuracy = 88.02% and AUROC = 90.1%, was achieved on automatically segmented images. Our experiments validate that (1) segmentation is an imperative preprocessing step for computer-assisted diagnosis of citrus greening, and (2) the combination of shape, color and texture features forms a complementary set towards the identification of citrus greening disease.
Keywords: citrus greening, pattern recognition, feature extraction, classification
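As an illustration of the descriptor-plus-classifier pipeline summarized in this abstract, below is a minimal sketch; it is not the authors' implementation, and the library choices (OpenCV, scikit-image, scikit-learn), the asymmetry-index definition and all parameter values are assumptions.

```python
import cv2
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def leaf_features(bgr_image, mask):
    """Concatenate shape, color and texture descriptors for one segmented leaf."""
    # Shape: asymmetry index = non-overlapping area after flipping the leaf mask about
    # its vertical axis, normalized by leaf area (one common definition; assumed here).
    flipped = cv2.flip(mask, 1)
    asymmetry = np.logical_xor(mask > 0, flipped > 0).sum() / max((mask > 0).sum(), 1)

    # Color: histogram of the Cb channel (YCrCb color space) inside the leaf mask.
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    cb = ycrcb[:, :, 2][mask > 0]
    cb_hist, _ = np.histogram(cb, bins=32, range=(0, 256), density=True)

    # Texture: histogram of uniform local binary patterns on the grey-level image.
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp[mask > 0], bins=10, range=(0, 10), density=True)

    return np.concatenate([[asymmetry], cb_hist, lbp_hist])

def train_classifier(X, y):
    """X: feature matrix from leaf_features, y: healthy / greening labels."""
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    return clf.fit(X, y)
```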
Procedia PDF Downloads 184
629 A TgCNN-Based Surrogate Model for Subsurface Oil-Water Phase Flow under Multi-Well Conditions
Authors: Jian Li
Abstract:
The uncertainty quantification and inversion problems of subsurface oil-water phase flow usually require extensive repeated forward calculations for new runs with changed conditions. To reduce the computational time, various forms of surrogate models have been built. Related research shows that deep learning has emerged as an effective surrogate model, while most surrogate models with deep learning are purely data-driven, which always leads to poor robustness and abnormal results. To make the model more consistent with the physical laws, a coupled theory-guided convolutional neural network (TgCNN) based surrogate model is built to facilitate computation efficiency under the premise of satisfactory accuracy. The model is a convolutional neural network based on multi-well reservoir simulation. The core notion of the proposed method is to bridge two separate blocks on top of an overall network; together they underlie the TgCNN model in a coupled form, which reflects the coupled nature of pressure and water saturation in the two-phase flow equation. The model is driven not only by labeled data but also by scientific theories, including governing equations, stochastic parameterization, boundary and initial conditions, well conditions, and expert knowledge. The results show that the TgCNN-based surrogate model exhibits satisfactory accuracy and efficiency in subsurface oil-water phase flow under multi-well conditions.
Keywords: coupled theory-guided convolutional neural network, multi-well conditions, surrogate model, subsurface oil-water phase
Procedia PDF Downloads 86
628 Automatic Product Identification Based on Deep-Learning Theory in an Assembly Line
Authors: Fidel Lòpez Saca, Carlos Avilés-Cruz, Miguel Magos-Rivera, José Antonio Lara-Chávez
Abstract:
Automated object recognition and identification systems are widely used throughout the world, particularly in assembly lines, where they perform quality control and automatic part selection tasks. This article presents the design and implementation of an object recognition system in an assembly line. The proposed shape-color recognition system is based on deep learning theory in a specially designed convolutional network architecture. The methodology involves stages such as image capture, color filtering, location of object mass centers, determination of horizontal and vertical object boundaries, and object clipping. Once the objects are cut out, they are sent to a convolutional neural network, which automatically identifies the type of figure. The identification system works in real time. The implementation was done on a Raspberry Pi 3 system and on a Jetson Nano device. The proposal is used in an assembly course of the bachelor's degree in industrial engineering. The results presented include a study of recognition efficiency and processing time.
Keywords: deep-learning, image classification, image identification, industrial engineering
Procedia PDF Downloads 160
627 Application of Principle Component Analysis for Classification of Random Doppler-Radar Targets during the Surveillance Operations
Authors: G. C. Tikkiwal, Mukesh Upadhyay
Abstract:
During surveillance operations in war or peacetime, the radar operator gets a scatter of targets on the screen. This may be a tracked vehicle such as a tank (T72, BMP, etc.), a wheeled vehicle such as ALS, TATRA, 2.5 Tonne or Shaktiman, or moving troops and convoys. The radar operator selects one of the promising targets into Single Target Tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in his headphones and, drawing on experience and training gained over time, identifies the random target. But this process is cumbersome, is solely dependent on the skills of the operator, and may thus lead to misclassification of the object. In this paper we present a technique using mathematical and statistical methods like the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA) to identify the random objects. The process of classification is based on transforming the audible signature of the target into music octave notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.
Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP
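For illustration only, a hypothetical sketch of the FFT-plus-PCA idea described above: the audible Doppler signature is converted into octave-band spectral energies, reduced with PCA and classified; the sampling rate, band edges and classifier choice are assumptions, not details from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

FS = 8000  # Hz, assumed sampling rate of the audio heard in the operator's headphones

def octave_band_energies(signal, n_bands=8, f0=62.5):
    """Total spectral energy in successive octave bands [f0*2^i, f0*2^(i+1))."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    edges = f0 * 2.0 ** np.arange(n_bands + 1)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])

def train(audio_snippets, labels):
    """audio_snippets: list of 1-D Doppler audio arrays; labels e.g. 'tracked'/'wheeled'."""
    X = np.array([octave_band_energies(s) for s in audio_snippets])
    model = make_pipeline(PCA(n_components=3), KNeighborsClassifier(n_neighbors=3))
    return model.fit(X, labels)
```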
Procedia PDF Downloads 346
626 Multi-Agent System for Irrigation Using Fuzzy Logic Algorithm and Open Platform Communication Data Access
Authors: T. Wanyama, B. Far
Abstract:
Automatic irrigation systems conveniently protect landscape investments. While conventional irrigation systems are known to be inefficient, automated ones have the potential to optimize water usage. In fact, there is a new generation of irrigation systems that are smart in the sense that they monitor the weather, soil conditions, evaporation and plant water use, and automatically adjust the irrigation schedule. In this paper, we present an agent-based smart irrigation system. The agents are built using a mix of commercial off-the-shelf software, including MATLAB, Microsoft Excel and the KEPServerEX 5 OPC server, and custom written code. The Irrigation Scheduler Agent uses fuzzy logic to integrate the information that affects the irrigation schedule. In addition, the multi-agent system uses Open Platform Communications (OPC) technology to share data. OPC technology enables the Irrigation Scheduler Agent to communicate over the Internet, making the system scalable to a municipal or regional agent-based water monitoring, management, and optimization system. Finally, this paper presents simulation and pilot installation test results that show the operational effectiveness of our system.
Keywords: community water usage, fuzzy logic, irrigation, multi-agent system
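A hand-rolled, weighted-average (Sugeno-style) fuzzy inference sketch in the spirit of the Irrigation Scheduler Agent described above; the membership functions, rule base and variable names are invented for illustration and differ from the authors' MATLAB/OPC implementation.

```python
import numpy as np

def falling(x, lo, hi):
    """Shoulder membership: 1 below lo, decreasing linearly to 0 at hi."""
    return float(np.clip((hi - x) / (hi - lo), 0.0, 1.0))

def rising(x, lo, hi):
    """Shoulder membership: 0 below lo, increasing linearly to 1 at hi."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def triangle(x, a, b, c):
    """Triangular membership with peak at b."""
    return float(max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def irrigation_minutes(soil_moisture_pct, evapotranspiration_mm):
    # Fuzzify the two inputs.
    dry = falling(soil_moisture_pct, 20, 40)
    moist = triangle(soil_moisture_pct, 20, 50, 80)
    wet = rising(soil_moisture_pct, 60, 90)
    low_et = falling(evapotranspiration_mm, 2, 4)
    high_et = rising(evapotranspiration_mm, 3, 6)

    # Rule base: (firing strength via min-AND, output singleton in minutes of irrigation).
    rules = [
        (min(dry, high_et), 45.0),   # dry soil and high ET -> irrigate long
        (min(dry, low_et), 30.0),
        (min(moist, high_et), 20.0),
        (min(moist, low_et), 10.0),
        (wet, 0.0),                  # wet soil -> no irrigation
    ]

    # Weighted-average defuzzification over the singleton outputs.
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total > 0 else 0.0

print(irrigation_minutes(soil_moisture_pct=25, evapotranspiration_mm=6))  # ~40 minutes
```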
Procedia PDF Downloads 298
625 Optimization of Gastro-Retentive Matrix Formulation and Its Gamma Scintigraphic Evaluation
Authors: Swapnila V. Shinde, Hemant P. Joshi, Sumit R. Dhas, Dhananjaysingh B. Rajput
Abstract:
The objective of the present study is to develop a hydro-dynamically balanced system for atenolol, a β-blocker, as a single-unit floating tablet. Atenolol shows pH-dependent solubility, resulting in a bioavailability of 36%. Thus, a site-specific oral controlled-release floating drug delivery system was developed. The formulation includes the novel use of a rate-controlling polymer, locust bean gum (LBG), in combination with HPMC K4M and the gas-generating agent sodium bicarbonate. Tablets were prepared by the direct compression method and evaluated for physico-mechanical properties. A statistical method was utilized to optimize the effect of the independent variables, namely the amounts of HPMC K4M and LBG, on three dependent responses: cumulative drug release, floating lag time and floating time. Graphical and mathematical analysis of the results allowed the identification and quantification of the formulation variables influencing the selected responses. To study the gastrointestinal transit of the optimized gastro-retentive formulation, in vivo gamma scintigraphy was carried out in six healthy rabbits after radiolabeling the formulation with 99mTc. The transit profiles demonstrated that the dosage form was retained in the stomach for more than 5 hrs. The study signifies the potential of the developed system for stomach-targeted delivery of atenolol with improved bioavailability.
Keywords: floating tablet, factorial design, gamma scintigraphy, antihypertensive model drug, HPMC, locust bean gum
Procedia PDF Downloads 275
624 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework
Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi
Abstract:
There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics from this huge database is a challenging task. Therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework. Hence, the Hadoop framework for distributed computing is used for the big video data. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology on key-frames. The OCR output and the detected slide text line types are adopted for keyword extraction, by which both video- and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.
Keywords: video lectures, big video data, video retrieval, hadoop
Procedia PDF Downloads 533
623 Quantification of Biomethane Potential from Anaerobic Digestion of Food Waste at Vaal University of Technology
Authors: Kgomotso Matobole, Pascal Mwenge, Tumisang Seodigeng
Abstract:
Global urbanisation and worldwide economic growth have caused a high rate of food waste generation, resulting in environmental pollution. Food waste disposed of in landfills decomposes to produce methane (CH4), a greenhouse gas. Inadequate waste management practices contribute to food waste polluting the environment. Thus, effective management and treatment of the organic fraction of municipal solid waste (OFMSW) are attracting widespread attention in many countries. This problem can be minimised by employing the anaerobic digestion process, since food waste is rich in organic matter and highly biodegradable, resulting in energy generation and waste volume reduction. The current study investigated the Biomethane Potential (BMP) of Vaal University of Technology canteen food waste using anaerobic digestion. Tests were performed on canteen food waste, as a substrate, with total solids (TS) of 22%, volatile solids (VS) of 21% and moisture content of 78%. The tests were performed in batch reactors, at a mesophilic temperature of 37 °C, with two different types of inoculum, primary and digested sludge. The resulting CH4 yields for food waste with digested sludge and with primary sludge were equal, at 357 Nml/g VS. This indicated that food waste from this canteen is rich in organic matter and highly biodegradable; hence, it can be used as a substrate for the anaerobic digestion process. Both digestions fitted the first-order kinetic model, with k for the primary sludge inoculated food waste being 0.278 day-1 (R2 of 0.98), whereas k for the digested sludge inoculated food waste was 0.034 day-1 (R2 of 0.847).
Keywords: anaerobic digestion, biogas, bio-methane potential, food waste
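For illustration, a short sketch of fitting the first-order kinetic model B(t) = B0(1 − exp(−kt)) mentioned above to cumulative methane-yield data; the data points are invented, and only the fitted k values quoted in the abstract come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, B0, k):
    """Cumulative methane yield (Nml/g VS) after t days."""
    return B0 * (1.0 - np.exp(-k * t))

t_days = np.array([0, 2, 5, 10, 15, 20, 25, 30], dtype=float)            # hypothetical
yield_nml = np.array([0, 120, 230, 310, 340, 352, 356, 357], dtype=float)  # hypothetical

(B0_fit, k_fit), _ = curve_fit(first_order, t_days, yield_nml, p0=(350.0, 0.1))
residuals = yield_nml - first_order(t_days, B0_fit, k_fit)
r_squared = 1 - np.sum(residuals**2) / np.sum((yield_nml - yield_nml.mean())**2)
print(f"B0 = {B0_fit:.1f} Nml/g VS, k = {k_fit:.3f} 1/day, R^2 = {r_squared:.3f}")
```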
Procedia PDF Downloads 235
622 Development and Validation of High-Performance Liquid Chromatography Method for the Determination and Pharmacokinetic Study of Linagliptin in Rat Plasma
Authors: Hoda Mahgoub, Abeer Hanafy
Abstract:
Linagliptin (LNG) belongs to the dipeptidyl-peptidase-4 (DPP-4) inhibitor class. DPP-4 inhibitors represent a new therapeutic approach for the treatment of type 2 diabetes in adults. The aim of this work was to develop and validate an accurate and reproducible HPLC method for the determination of LNG with high sensitivity in rat plasma. The method involved separation of LNG and pindolol (internal standard) at ambient temperature on a Zorbax Eclipse XDB C18 column with a mobile phase composed of 75% methanol : 25% formic acid 0.1% (pH 4.1) at a flow rate of 1.0 mL.min-1. UV detection was performed at 254 nm. The method was validated in compliance with ICH guidelines and found to be linear in the range of 5–1000 ng.mL-1. The limit of quantification (LOQ) was found to be 5 ng.mL-1 based on 100 µL of plasma. The variations for intra- and inter-assay precision were less than 10%, and the accuracy values ranged between 93.3% and 102.5%. The extraction recovery (R%) was more than 83%. The method involved a single extraction step on a very small plasma volume (100 µL). The assay was successfully applied to an in vivo pharmacokinetic study of LNG in rats that were administered a single oral dose of 10 mg.kg-1 LNG. The maximum concentration (Cmax) was found to be 927.5 ± 23.9 ng.mL-1. The area under the plasma concentration-time curve (AUC0-72) was 18285.02 ± 605.76 h.ng.mL-1. In conclusion, the good accuracy and low LOQ of the bioanalytical HPLC method make it suitable for monitoring the full pharmacokinetic profile of LNG in rats. The main advantages of the method are its sensitivity, small sample volume, single-step extraction procedure and short analysis time.
Keywords: HPLC, linagliptin, pharmacokinetic study, rat plasma
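As a worked illustration of the non-compartmental quantities reported above, the sketch below computes Cmax and AUC(0-t) with the linear trapezoidal rule; the sampling times and concentrations are invented and are not the study's data.

```python
import numpy as np

t_h = np.array([0, 0.5, 1, 2, 4, 8, 12, 24, 48, 72], dtype=float)             # hours (hypothetical)
c_ng_ml = np.array([0, 410, 780, 927.5, 850, 620, 430, 210, 60, 15], float)   # ng/mL (hypothetical)

cmax = c_ng_ml.max()
tmax = t_h[c_ng_ml.argmax()]

# Linear trapezoidal rule: sum of 0.5*(C_i + C_{i+1})*(t_{i+1} - t_i), in h*ng/mL.
auc_0_72 = (0.5 * (c_ng_ml[1:] + c_ng_ml[:-1]) * np.diff(t_h)).sum()

print(f"Cmax = {cmax:.1f} ng/mL at Tmax = {tmax:.1f} h; AUC(0-72) = {auc_0_72:.1f} h*ng/mL")
```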
Procedia PDF Downloads 241
621 Identification of Indices to Quantify Gentrification
Authors: Sophy Ann Xavier, Lakshmi A
Abstract:
Gentrification is the process of altering a neighborhood's character through the influx of wealthier people and establishments. This idea has subsequently been expanded to encompass brand-new, high-status construction projects that involve regenerating brownfield sites or demolishing and rebuilding residential neighborhoods. Inequality is made worse by gentrification in ways that go beyond socioeconomic position. The elderly, members of racial and ethnic minorities, individuals with disabilities, and people with mental health conditions all suffer disproportionately when they are displaced. Cities must cultivate openness, diversity, and inclusion in their collaborations, as well as cooperation on objectives and results. The papers compiled in this issue concentrate on the new gentrification discussions, the rising residential allure of central cities, and the indices to measure this process in its various forms. The study makes an effort to fill the research gap in the area of gentrification studies, which is the absence of a set of indices for measuring gentrification in a specific area. Studies on gentrification that contain maps of historical change highlight trends that will aid in the production of displacement risk maps, which will guide future interventions by allowing residents and policymakers to extrapolate into the future. Additionally, these maps give locals a glimpse into the future of their communities and serve as a political call to action in areas where residents are expected to be displaced. This study intends to pinpoint metrics and approaches for measuring gentrification that can then be applied to create a spatiotemporal map of a region and tactics for its inclusive planning. An understanding of the various approaches will enable planners and policymakers to select the best approach and create appropriate plans.
Keywords: gentrification, indices, methods, quantification
Procedia PDF Downloads 76
620 Intelligent System and Renewable Energy: A Farming Platform in Precision Agriculture
Authors: Ryan B. Escorial, Elmer A. Maravillas, Chris Jordan G. Aliac
Abstract:
This study presents a small-scale water pumping system utilizing a fuzzy logic inference system attached to a renewable energy source. The fuzzy logic controller was designed and simulated in the MATLAB fuzzy logic toolbox to examine the properties and characteristics of the input and output variables. The result of the simulation was implemented in a microcontroller, together with sensors, modules, and photovoltaic cells. The study used the Grand Rapids variety of lettuce, organic substrates, and foliar fertilizer to observe the capability of the device to irrigate crops. Two plant boxes intended for manual and automated irrigation were prepared, with each box having 48 heads of lettuce. The observation of the system took 22-31 days, which is one harvest period of the crop. Results showed a 22.55% increase in agricultural productivity compared to manual irrigation. Aside from reducing human effort and time, the smart irrigation system could help lessen some of the shortcomings of manual irrigation. It could facilitate the economical utilization of water, reducing consumption by 25%. The use of renewable energy could also help farmers reduce the cost of production by minimizing the use of diesel and gasoline.
Keywords: fuzzy logic, intelligent system, precision agriculture, renewable energy
Procedia PDF Downloads 128
619 Air Dispersion Model for Prediction Fugitive Landfill Gaseous Emission Impact in Ambient Atmosphere
Authors: Moustafa Osman Mohammed
Abstract:
This paper explores the formation of HCl aerosol in the atmospheric boundary layer and encourages the uptake of environmental modeling systems (EMSs) as a practical evaluation of gaseous emissions ('framework measures') from small and medium-sized enterprises (SMEs). The conceptual model predicts greenhouse gas emissions at ecological points beyond landfill site operations. It focuses on incorporating traditional knowledge into baseline information for both the measurement data and the mathematical results, with regard to the parameters that influence the model variable inputs. The paper simplifies the parameters of aerosol processes based on more complex aerosol process computations. The simple model can be applied to both Gaussian and Eulerian rural dispersion models. The aerosol processes considered in this study were (i) the coagulation of particles, (ii) the condensation and evaporation of organic vapors, and (iii) dry deposition. The chemical transformation of gas-phase compounds is taken into account through a photochemical formulation, with exposure effects according to HCl concentrations as the starting point of the risk assessment. The discussion sets out distinct aspects of sustainability, reflecting inputs, outputs, and modes of impact on the environment. Thereby, the models incorporate abiotic and biotic species to broaden the scope of integration for both impact quantification and risk assessment. These environmental obligations ultimately suggest either a recommendation or a decision on what should be achieved legislatively for landfill gas (LFG) mitigation measures.
Keywords: air pollution, landfill emission, environmental management, monitoring/methods and impact assessment
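For orientation, a generic Gaussian plume sketch of the kind of dispersion calculation referred to above; it is not the authors' model, and the emission rate, wind speed, release height and power-law dispersion coefficients are hypothetical placeholders.

```python
import numpy as np

def gaussian_plume(Q, u, H, x, y, z, a=0.08, b=0.06):
    """Concentration (g/m^3) at downwind distance x, crosswind offset y, height z.

    Q: emission rate (g/s), u: wind speed (m/s), H: effective release height (m).
    sigma_y, sigma_z are crude power-law fits; real studies use stability-class curves.
    """
    sigma_y = a * x ** 0.9
    sigma_z = b * x ** 0.85
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H) ** 2 / (2 * sigma_z**2)) +
                np.exp(-(z + H) ** 2 / (2 * sigma_z**2)))   # ground-reflection term
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example: HCl concentration 500 m downwind on the plume centerline at breathing height.
print(gaussian_plume(Q=0.5, u=3.0, H=10.0, x=500.0, y=0.0, z=1.5))
```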
Procedia PDF Downloads 323
618 Development and Automation of Medium-Scale NFT Hydroponic Systems: Design Methodology and State of the Art Review
Authors: Oscar Armando González-Marin, Jhon F. Rodríguez-León, Oscar Mota-Pérez, Jorge Pineda-Piñón, Roberto S. Velázquez-González., Julio C. Sosa-Savedra
Abstract:
Over the past six years, the World Meteorological Organization (WMO) has recorded the warmest years since 1880, primarily attributed to climate change. In addition, the overexploitation of agricultural lands, combined with food and water scarcity, has highlighted the urgent need for sustainable cultivation methods. Hydroponics has emerged as a sustainable farming technique that enables plant cultivation using nutrient solutions without the requirement for traditional soil. Among hydroponic methods, the Nutrient Film Technique (NFT) facilitates plant growth by circulating a nutrient solution continuously. This approach allows the monitoring and precise control of nutritional parameters, with potential for automation and technological integration. This study aims to present the state of the art of automated NFT hydroponic systems, discussing their design methodologies and considerations for implementation. Moreover, a medium-scale NFT system developed at CICATA-QRO is introduced, detailing its current manual management and progress toward automation.Keywords: automation, hydroponics, nutrient film technique, sustainability
Procedia PDF Downloads 39
617 Development of K-Factor for Road Geometric Design: A Case Study of North Coast Road in Java
Authors: Edwin Hidayat, Redi Yulianto, Disi Hanafiah
Abstract:
The parameters used for determining the number of lanes in new road construction are the annual average daily traffic (AADT) and the peak hour factor (K-factor). However, the K-factor values listed in the guidelines and manuals for road planning in Indonesia are adopted or adapted from foreign guidelines or manuals. Thus, these values are less suitable for Indonesian conditions due to differences in road conditions, vehicle types, and driving behavior. The purpose of this study is to provide an example of how to determine K-factor values for a road segment with particular conditions on the north coast road, West Java. The methodology starts with collecting traffic volume data for 24 hours a day over 365 days using PLATO (Automated Traffic Counter), which is based on video image processing. The traffic volume data are then broken down by hour and analyzed by comparing the peak traffic volume of the 30th highest hour (or another rank) with the AADT of the same year. The analysis resulted in a K-factor of 0.97 for the 30th peak hour. This value can be used for planning road geometry or evaluating the road capacity performance of the 4/2D interurban road.
Keywords: road geometry, K-factor, annual average daily traffic, north coast road
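A minimal sketch of the K-factor computation described above, K = (30th highest hourly volume of the year) / AADT; the hourly counts are randomly generated stand-ins for the PLATO data.

```python
import numpy as np

rng = np.random.default_rng(0)
hourly_volumes = rng.gamma(shape=2.0, scale=300.0, size=365 * 24)  # stand-in for PLATO counts

aadt = hourly_volumes.reshape(365, 24).sum(axis=1).mean()          # mean daily traffic (veh/day)
k30 = np.sort(hourly_volumes)[::-1][29] / aadt                     # 30th highest hour / AADT

print(f"AADT = {aadt:.0f} veh/day, K30 = {k30:.3f}")
```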
Procedia PDF Downloads 161
616 Validation of Visibility Data from Road Weather Information Systems by Comparing Three Data Resources: Case Study in Ohio
Authors: Fan Ye
Abstract:
Adverse weather conditions, particularly those with low visibility, are critical to the driving tasks. However, the direct relationship between visibility distances and traffic flow/roadway safety is uncertain due to the limitation of visibility data availability. The recent growth of deployment of Road Weather Information Systems (RWIS) makes segment-specific visibility information available which can be integrated with other Intelligent Transportation System, such as automated warning system and variable speed limit, to improve mobility and safety. Before applying the RWIS visibility measurements in traffic study and operations, it is critical to validate the data. Therefore, an attempt was made in the paper to examine the validity and viability of RWIS visibility data by comparing visibility measurements among RWIS, airport weather stations, and weather information recorded by police in crash reports, based on Ohio data. The results indicated that RWIS visibility measurements were significantly different from airport visibility data in Ohio, but no conclusion regarding the reliability of RWIS visibility could be drawn in the consideration of no verified ground truth in the comparisons. It was suggested that more objective methods are needed to validate the RWIS visibility measurements, such as continuous in-field measurements associated with various weather events using calibrated visibility sensors.Keywords: RWIS, visibility distance, low visibility, adverse weather
Procedia PDF Downloads 249
615 Machine Learning Approach for Mutation Testing
Authors: Michael Stewart
Abstract:
Mutation testing is a type of software testing proposed in the 1970s where program statements are deliberately changed to introduce simple errors so that test cases can be validated to determine if they can detect the errors. Test cases are executed against the mutant code to determine if one fails, detects the error and ensures the program is correct. One major issue with this type of testing is that it becomes computationally intensive to generate and test all possible mutations for complex programs. This paper used reinforcement learning and parallel processing within the context of mutation testing for the selection of mutation operators and test cases that reduced the computational cost of testing and improved test suite effectiveness. Experiments were conducted using sample programs to determine how well the reinforcement-learning-based algorithm performed with one live mutation, multiple live mutations and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve the accuracy of its predictions. The performance was then evaluated on multiprocessor computers. With reinforcement learning, the number of mutation operators utilized was reduced by 50–100%.
Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing
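An illustrative epsilon-greedy bandit sketch of using reinforcement learning to select mutation operators, in the spirit of the abstract above; the operator names and the run_mutation_round() stand-in are hypothetical and do not reflect the paper's exact algorithm.

```python
import random

OPERATORS = ["AOR", "ROR", "LCR", "SDL", "UOI"]  # illustrative operator names

def run_mutation_round(operator):
    """Placeholder: generate mutants with one operator, run the test suite, and
    return the kill rate (killed mutants / generated mutants)."""
    return random.random()  # replace with real mutation generation + test execution

def select_operators(episodes=200, epsilon=0.1):
    value = {op: 0.0 for op in OPERATORS}   # running estimate of each operator's kill rate
    count = {op: 0 for op in OPERATORS}
    for _ in range(episodes):
        if random.random() < epsilon:                     # explore
            op = random.choice(OPERATORS)
        else:                                             # exploit the best estimate so far
            op = max(value, key=value.get)
        reward = run_mutation_round(op)
        count[op] += 1
        value[op] += (reward - value[op]) / count[op]     # incremental mean update
    # Keep only operators whose estimated kill rate is above the average.
    threshold = sum(value.values()) / len(value)
    return [op for op, v in value.items() if v >= threshold]

print(select_operators())
```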
Procedia PDF Downloads 198
614 Tibyan Automated Arabic Correction Using Machine-Learning in Detecting Syntactical Mistakes
Authors: Ashwag O. Maghraby, Nida N. Khan, Hosnia A. Ahmed, Ghufran N. Brohi, Hind F. Assouli, Jawaher S. Melibari
Abstract:
The Arabic language is one of the most important languages. Learning it is important for many people around the world because of its religious and economic importance, and the real challenge lies in practicing it without grammatical or syntactical mistakes. This research focused on detecting and correcting syntactic mistakes in Arabic according to their position in the sentence, concentrating on two of the main syntactical rules in Arabic: dual and plural. The system analyzes each sentence in the text using the Stanford CoreNLP morphological analyzer and a machine-learning approach in order to detect syntactical mistakes and then correct them. A prototype of the proposed system was implemented and evaluated. It uses the support vector machine (SVM) algorithm to detect Arabic grammatical errors and corrects them using a rule-based approach. The prototype system achieves a fair accuracy of 81%. In general, it shows a set of useful grammatical suggestions that the user may overlook while writing, due to lack of familiarity with grammar or the speed of writing, such as alerting the user when a plural term is used to indicate one person.
Keywords: Arabic language acquisition and learning, natural language processing, morphological analyzer, part-of-speech
Procedia PDF Downloads 152
613 CSRFDtool: Automated Detection and Prevention of a Reflected Cross-Site Request Forgery
Authors: Alaa A. Almarzuki, Nora A. Farraj, Aisha M. Alshiky, Omar A. Batarfi
Abstract:
The number of internet users increases dramatically every year. Most of these users are exposed to the dangers of attackers in one way or another. The reason for this lies in the presence of many weaknesses that are not known to novice users. In addition, the lack of user awareness is considered the main reason for falling into attackers' snares. Cross-Site Request Forgery (CSRF) is placed on the list of the most dangerous security threats in the OWASP Top Ten for 2013. CSRF is an attack that forces the user's browser to send or perform an unwanted request or action without the user's awareness by exploiting a valid session between the browser and the server. When a CSRF attack succeeds, it leads to many bad consequences: an attacker may reach private and personal information and modify it. This paper aims to detect and prevent a specific type of CSRF, called reflected CSRF. In a reflected CSRF, malicious code can be injected by the attackers. This paper explores how the CSRF Detection Extension prevents reflected CSRF by checking browser-specific information. Our evaluation shows that the proposed solution succeeds in preventing this type of attack.
Keywords: CSRF, CSRF detection extension, attackers, attacks
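For context, a generic server-side sketch of two common CSRF defenses (origin verification and a synchronizer token); this is not the browser-extension approach evaluated in the paper, and the host name and helper functions are assumptions.

```python
import hmac
import secrets
from urllib.parse import urlparse

TRUSTED_HOST = "example.org"            # hypothetical application host

def issue_csrf_token(session):
    """Store a fresh random token in the server-side session and return it for the form."""
    session["csrf_token"] = secrets.token_urlsafe(32)
    return session["csrf_token"]

def is_request_allowed(headers, form, session):
    """Reject state-changing requests whose origin or token does not match the session."""
    origin = headers.get("Origin") or headers.get("Referer", "")
    if urlparse(origin).hostname != TRUSTED_HOST:
        return False                                         # cross-site origin
    expected = session.get("csrf_token", "")
    supplied = form.get("csrf_token", "")
    return bool(expected) and hmac.compare_digest(expected, supplied)
```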
Procedia PDF Downloads 414
612 A Validated High-Performance Liquid Chromatography-UV Method for Determination of Malondialdehyde-Application to Study in Chronic Ciprofloxacin Treated Rats
Authors: Anil P. Dewani, Ravindra L. Bakal, Anil V. Chandewar
Abstract:
The present work demonstrates the applicability of high-performance liquid chromatography (HPLC) with UV detection for the determination of malondialdehyde, as the malondialdehyde-thiobarbituric acid (MDA-TBA) complex, in vivo in rats. The HPLC-UV method for MDA-TBA was run in isocratic mode on a reverse-phase C18 column (250 mm × 4.6 mm) at a flow rate of 1.0 mL min−1, followed by UV detection at 278 nm. The chromatographic conditions were optimized by varying the concentration and pH, followed by changes in the percentage of organic phase; the optimal mobile phase consisted of a mixture of water (0.2% triethylamine, pH adjusted to 2.3 with ortho-phosphoric acid) and acetonitrile in an 80:20 (% v/v) ratio. The retention time of the MDA-TBA complex was 3.7 min. The developed method was sensitive, with the limit of detection and limit of quantification (LOD and LOQ) for the MDA-TBA complex, calculated from the standard deviation of the response and the slope of the calibration curve, being 110 ng/ml and 363 ng/ml respectively. The method was linear for MDA spiked in plasma and subjected to derivatization at concentrations ranging from 100 to 1000 ng/ml. The precision of the developed method, measured in terms of relative standard deviations for intra-day and inter-day studies, was 1.6–5.0% and 1.9–3.6% respectively. The HPLC method was applied for monitoring MDA levels in rats subjected to chronic treatment with ciprofloxacin (CFL) (5 mg/kg/day) for 21 days. Results were compared with findings in control-group rats. Mean peak areas of the two study groups were compared using the unpaired Student's t-test to find p-values. The p-value was < 0.001, indicating significant results and suggesting increased MDA levels in rats subjected to 21 days of chronic CFL treatment.
Keywords: MDA, TBA, ciprofloxacin, HPLC-UV
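A short sketch of the ICH-style sensitivity estimate implied above, LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response and S the slope of the calibration curve; the calibration points below are invented.

```python
import numpy as np

conc = np.array([100, 250, 500, 750, 1000], dtype=float)                # ng/mL (hypothetical)
peak_area = np.array([1.05e4, 2.60e4, 5.15e4, 7.70e4, 1.03e5])          # response (hypothetical)

slope, intercept = np.polyfit(conc, peak_area, 1)                       # linear calibration fit
residual_sd = np.std(peak_area - (slope * conc + intercept), ddof=2)    # sigma of the response

lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
print(f"LOD = {lod:.1f} ng/mL, LOQ = {loq:.1f} ng/mL")
```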
Procedia PDF Downloads 325
611 A Research Using Remote Monitoring Technology for Pump Output Monitoring in Distributed Fuel Stations in Nigeria
Authors: Ofoegbu Ositadinma Edward
Abstract:
This research paper discusses a web-based monitoring system that enables effective monitoring of fuel pump output and sales volume from distributed fuel stations under the domain of a single company/organization. The traditional method of operation of these organizations in Nigeria is non-automated, and accounting for dispensed product is usually approximate and manual, as there is little or no technology in place to provide information on the state of affairs in the station, both to on-ground staff and to supervisory staff who are not physically present in the station. This results in unaccountable losses in product and revenue as well as slow decision making. Remote monitoring technology, a vast research field with numerous application areas incorporating various data collation techniques and sensor networks, can be applied to reliably provide information on fuel pump status in distributed fuel stations. Thus, the proposed system relies upon a microcontroller, keypad and pump to demonstrate the traditional fuel dispenser, while a web-enabled PC with an accompanying graphical user interface (GUI), designed in Visual Basic and connected to the microcontroller via the serial port, provides the web implementation.
Keywords: fuel pump, microcontroller, GUI, web
Procedia PDF Downloads 433
610 Infectivity of Hyalomma Ticks for Theileria annulata Using 18s rRNA PCR
Authors: Muhammad S. Sajid, A. Iqbal, A. Kausar, M. Jawad-ul-Hassan, Z. Iqbal, Hafiz M. Rizwan, M. Saqib
Abstract:
Among the ixodid ticks, species of the genus Hyalomma are of prime importance as they can survive harsh conditions better than other species. Similarly, among the various tick-borne pathogens, Theileria (T.) annulata, the causative agent of tropical theileriosis in large ruminants, is responsible for reduced productivity and ultimately substantial economic losses due to morbidity and mortality. The present study was planned to screen vector ticks through molecular techniques for the determination of tick-borne theileriosis in the district of Toba Tek Singh (T. T. Singh), Punjab, Pakistan. For this purpose, among the ticks (n = 2252) collected from livestock and their microclimate, Hyalomma spp. were dissected to procure the salivary glands (SGs), which were pooled (an average of 8 acini per pool). Each pool of acini was used for DNA extraction, quantification and primer-specific amplification of the 18S rRNA of Theileria (T.) annulata. The amplicons were electrophoresed on a 1.8% agarose gel, followed by imaging to identify the band specific for T. annulata. For confirmation, the positive amplicons were subjected to sequencing, BLAST analysis and homology search using NCBI software. The number of Theileria-infected acini was significantly higher (P < 0.05) in female vs. male ticks, infesting vs. questing ticks, and riverine- vs. non-riverine-collected ticks. The data represent the first attempt to quantify the vectorial capacity of ixodid ticks in Pakistan for T. annulata, which can be helpful in estimating the risk of theileriosis to the domestic livestock population of the country.
Keywords: Hyalomma anatolicum, ixodids, PCR, Theileria annulata
Procedia PDF Downloads 288
609 Behavioral and Electroantennographic Responses of the Tea Shot Hole Borer, Euwallacea fornicatus, Eichhoff (Scolytidae: Coleoptera) to Volatiles Compounds of Montanoa bipinnatifida (Compositae: Asteraceae) and Development of a Kairomone Trap
Authors: Sachin Paul James, Selvasundaram Rajagopal, Muraleedharan Nair, Babu Azariah
Abstract:
The shot hole borer (SHB), Euwallacea fornicatus (= Xyleborus fornicatus) (Scolytidae: Coleoptera), is one of the major pests of tea in southern India and Sri Lanka. The partially dried cut stem of a jungle plant, Montanoa bipinnatifida (C. Koch) (Compositae: Asteraceae), is reported to attract shot hole borer beetles in the field. Collection, isolation, identification and quantification of the volatiles emitted from the partially dried cut stems of M. bipinnatifida using dynamic headspace sampling and GC-MS revealed the presence of seven compounds, viz. α-pinene, β-phellandrene, β-pinene, D-limonene, trans-caryophyllene, iso-caryophyllene and germacrene-D. Behavioral bioassays using an electroantennogram (EAG) and a wind tunnel proved that, among these identified compounds, only α-pinene, trans-caryophyllene, β-phellandrene and germacrene-D evoked a significant behavioral response, and the maximum response was obtained with a specific blend of these four compounds at 10:1:0.1:3. Field trapping experiments with this blend, conducted in an SHB-infested field using multiple funnel traps, further proved the efficiency of the blend, with a mean trap catch of 176.7 ± 13.1 beetles. Mass trapping studies in the field helped to develop a kairomone trap for the management of SHB in the tea fields of southern India.
Keywords: electroantennogram, kairomone trap, Montanoa bipinnatifida, tea shot hole borer
Procedia PDF Downloads 223
608 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. Therefore, a probability-based damage detection (PBDD) procedure based on a model updating procedure is presented in this paper, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multi-agent algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in different directions and cover all the available space owing to their high speed and to collisions among themselves and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, an initial population of gas molecules is randomly generated, and the governing equations for the velocity of gas molecules and the collisions between them are utilized. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and competitively in the PBDD of structures.
Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification
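A schematic sketch of the probability-based damage detection loop described above: model updating is repeated on noise-perturbed modal data and the resulting samples of stiffness-reduction factors are turned into element-wise damage probabilities; the update_model() placeholder stands in for the EIGMM optimizer and all numbers are hypothetical.

```python
import numpy as np

def update_model(noisy_frequencies, noisy_mode_shapes):
    """Placeholder for FE model updating (EIGMM in the paper): return one vector of
    estimated stiffness-reduction factors, one entry per structural element."""
    n_elements = 10
    return np.random.default_rng().uniform(0.0, 0.3, size=n_elements)  # dummy values

def damage_probability(measured_freqs, measured_shapes, n_runs=500,
                       noise_level=0.02, damage_threshold=0.10):
    rng = np.random.default_rng(1)
    samples = []
    for _ in range(n_runs):
        # Perturb the measurements to represent measurement noise.
        f_noisy = measured_freqs * (1 + noise_level * rng.standard_normal(measured_freqs.shape))
        s_noisy = measured_shapes * (1 + noise_level * rng.standard_normal(measured_shapes.shape))
        samples.append(update_model(f_noisy, s_noisy))
    samples = np.array(samples)                          # shape: (n_runs, n_elements)
    # Probability that each element's stiffness reduction exceeds the damage threshold.
    return (samples > damage_threshold).mean(axis=0)

print(damage_probability(np.array([12.1, 33.4, 58.2]), np.ones((3, 10))))
```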
Procedia PDF Downloads 277
607 Proposal of Blue and Green Infrastructure for the Jaguaré Stream Watershed, São Paulo, Brazil
Authors: Juliana C. Alencar, Monica Ferreira do Amaral Porto
Abstract:
In recent years, blue-green infrastructure has been pointed out as a way to increase the environmental quality of watersheds. The regulation ecosystem services brought by these areas are many, such as improvement of air, water, soil and microclimate quality, besides helping to control peak flows and promoting the quality of life of the population. This study proposes a blue-green infrastructure scenario for the Jaguaré watershed, located in the western zone of the city of São Paulo, Brazil. Based on the proposed scenario, the study assessed the impact of adopting blue and green infrastructure on the control of the peak flow of the basin, the benefits for the avifauna, which are also reflected in the flora, and finally the quantification of the regulation ecosystem services brought by adopting the proposed scenario. A survey of existing green areas and of potential areas for expansion and connection of these areas to form a network in the watershed was carried out. Based on this proposed new network of green areas, the peak flow for the proposed scenario was calculated with the help of the ABC6 software. Finally, a survey of the ecosystem services contemplated in the proposed scenario was made. It was possible to conclude that the blue and green infrastructure would provide several regulation ecosystem services for the watershed, such as control of the peak flow, a connecting framework between the forest fragments that promotes their environmental enrichment, improvement of the microclimate and the provision of leisure areas for the population.
Keywords: green and blue infrastructure, sustainable drainage, urban waters, ecosystem services
Procedia PDF Downloads 117
606 Micro-Ribonucleic Acid-21 as High Potential Prostate Cancer Biomarker
Authors: Regina R. Gunawan, Indwiani Astuti, H. Raden Danarto
Abstract:
Cancer is the leading cause of death worldwide. Cancer is caused by mutations that alter the function of normal human genes and give rise to cancer genes. MicroRNA (miRNA) is a small non-coding RNA that regulates genes through complementary binding to a target mRNA, causing mRNA degradation. miRNA works by either promoting or suppressing cell proliferation. miRNA expression levels in cancer may offer additional value for miRNA as a biomarker in cancer diagnostics. miRNA-21 is believed to have a role in carcinogenesis by enhancing proliferation, anti-apoptosis, cell cycle progression and invasion of tumor cells. The hsa-miR-21-5p marker has been identified in the urine of Prostate Cancer (PCa) and Benign Prostatic Hyperplasia (BPH) patients. This research planned to explore the diagnostic performance of miR-21 in differentiating PCa and BPH patients. In this study, urine samples were collected from 20 PCa patients and 20 BPH patients. The relative expression of miR-21 against the reference gene was analyzed and compared between the two groups. miRNA expression was analyzed using the comparative quantification method to find the fold change. The validity of miR-21 in identifying PCa patients was assessed by quantifying sensitivity and specificity with a contingency table. The relative expression of miR-21 against miR-16 differed between PCa and BPH patients by a fold change of 12.98. From the contingency table of miR-21 Cq values for identifying PCa patients versus BPH patients, Cq miR-21 has 100% sensitivity and 75% specificity. miR-21 relative expression can be used to discriminate PCa from BPH using a urine sample. Furthermore, the expression of miR-21 has higher sensitivity compared to PSA (prostate-specific antigen); therefore, miR-21 has high potential for further analysis and development.
Keywords: benign prostate hyperplasia, biomarker, miRNA-21, prostate cancer
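An illustrative sketch of the comparative (2^−ΔΔCq) quantification and the contingency-table calculation mentioned above; the Cq values are invented, and only the use of miR-16 as reference and the 100%/75% sensitivity/specificity figures come from the abstract.

```python
import numpy as np

def fold_change(cq_target_case, cq_ref_case, cq_target_ctrl, cq_ref_ctrl):
    """2^-ddCq relative expression of the case group versus the control group."""
    d_cq_case = cq_target_case - cq_ref_case      # dCq = Cq(miR-21) - Cq(miR-16)
    d_cq_ctrl = cq_target_ctrl - cq_ref_ctrl
    return 2.0 ** -(np.mean(d_cq_case) - np.mean(d_cq_ctrl))

def sensitivity_specificity(tp, fn, tn, fp):
    """Diagnostic performance from a 2x2 contingency table."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical Cq values for PCa (case) and BPH (control) urine samples.
fc = fold_change(np.array([24.1, 23.8]), np.array([21.5, 21.3]),
                 np.array([28.0, 27.6]), np.array([21.6, 21.4]))
sens, spec = sensitivity_specificity(tp=20, fn=0, tn=15, fp=5)   # 100% / 75% as reported
print(f"fold change = {fc:.2f}, sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```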
Procedia PDF Downloads 159
605 Quantification of Effect of Linear Anionic Polyacrylamide on Seepage in Irrigation Channels
Authors: Hamil Uribe, Cristian Arancibia
Abstract:
In Chile, water for irrigation and hydropower generation is delivered essentially through unlined earthen channels, which have high seepage losses. Traditional seepage-abatement technologies are very expensive. The goals of this work were to quantify water loss in unlined channels and to select reaches to evaluate the use of linear anionic polyacrylamide (LA-PAM) to reduce seepage losses. The study was carried out in the Maule Region, in the central area of Chile. Water users indicated reaches with potential seepage losses, 45 km of channels in total, whose flow varied between 1.07 and 23.6 m³ s⁻¹. According to seepage measurements, 4 channel reaches, 4.5 km in total, were selected for LA-PAM application. One to four LA-PAM applications were performed at a rate of 11 kg ha⁻¹, taking the wetted perimeter area as the basis of calculation. In large channels, a motorboat moving against the current was used to carry out the LA-PAM application. For the applications, a seeder machine was used to distribute the granulated polymer evenly on the water surface. Water flow was measured (StreamPro ADCP) upstream and downstream in the selected reaches to estimate seepage losses before and after LA-PAM application. Weekly measurements were made to quantify the treatment effect and its duration. In each case, water turbidity and temperature were measured. Channels showed variable losses of up to 13.5%. Channels showing water gains were not treated with PAM. In all cases, the LA-PAM effect was positive, reducing average losses from 8% to 3.1%. Water loss was confirmed, and it was possible to reduce seepage through LA-PAM applications provided that losses were known and correctly determined when applying the polymer. This could increase irrigation security in critical periods, especially under drought conditions.
Keywords: canal seepage, irrigation, polyacrylamide, water management
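A small sketch (with invented discharge values) of how reach seepage loss and the LA-PAM effect are quantified from the upstream/downstream flow measurements mentioned above.

```python
def seepage_loss_pct(q_upstream_m3s, q_downstream_m3s):
    """Percent of the reach inflow lost between the two gauging sections."""
    return 100.0 * (q_upstream_m3s - q_downstream_m3s) / q_upstream_m3s

before = seepage_loss_pct(q_upstream_m3s=10.0, q_downstream_m3s=9.20)   # ~8.0 % loss
after = seepage_loss_pct(q_upstream_m3s=10.0, q_downstream_m3s=9.69)    # ~3.1 % loss

print(f"loss before LA-PAM: {before:.1f} %, after: {after:.1f} %, "
      f"reduction: {before - after:.1f} percentage points")
```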
Procedia PDF Downloads 174
604 On the Added Value of Probabilistic Forecasts Applied to the Optimal Scheduling of a PV Power Plant with Batteries in French Guiana
Authors: Rafael Alvarenga, Hubert Herbaux, Laurent Linguet
Abstract:
The uncertainty concerning the power production of intermittent renewable energy is one of the main barriers to the integration of such assets into the power grid. Efforts have thus been made to develop methods to quantify this uncertainty, allowing producers to ensure more reliable and profitable engagements related to their future power delivery. Even though a diversity of probabilistic approaches has been proposed in the literature with promising results, the added value of adopting such methods for scheduling intermittent power plants is still unclear. In this study, the profits obtained by a decision-making model used to optimally schedule an existing PV power plant connected to batteries are compared when the model is fed with deterministic and probabilistic forecasts generated with two of the most recent methods proposed in the literature. Moreover, deterministic forecasts with different accuracy levels were used in the experiments, testing the utility and the capability of probabilistic methods of modeling the progressively increasing uncertainty. Even though probabilistic approaches have unquestionably been developed in the recent literature, the results obtained through a case study show that deterministic forecasts still provide the best performance if accurate, ensuring a gain of 14% in final profits compared to the average performance of probabilistic models conditioned on the same forecasts. When the accuracy of deterministic forecasts progressively decreases, probabilistic approaches start to become competitive options until they completely outperform deterministic forecasts when these are very inaccurate, generating 73% more profits in the case considered compared to the deterministic approach.
Keywords: PV power forecasting, uncertainty quantification, optimal scheduling, power systems
Procedia PDF Downloads 87