Search results for: inviscid regularization technique
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6633

5223 Obtaining Nanocrystalline Ferrites and Other Complex Oxides by Sol-Gel Method with Participation of Auto-Combustion

Authors: V. S. Bushkova

Abstract:

It is well known that in recent years magnetic materials have received increased attention due to their useful properties. For this reason, a significant number of patents published during the last decade are oriented towards the synthesis and study of such materials. The aim of this work is to create and study nanocrystalline ferrite materials with a spinel structure using sol-gel technology with participation of auto-combustion. This method is promising because it is a cheap, low-temperature technique that allows fine control over the product's chemical composition.

Keywords: magnetic materials, ferrites, sol-gel technology, nanocrystalline powders

Procedia PDF Downloads 409
5222 Gradient Boosted Trees on Spark Platform for Supervised Learning in Health Care Big Data

Authors: Gayathri Nagarajan, L. D. Dhinesh Babu

Abstract:

Health care is one of the prominent industries that generate voluminous data, thereby creating a need for machine learning techniques with big data solutions for efficient processing and prediction. Missing data, incomplete data, real-time streaming data, sensitive data, privacy, and heterogeneity are a few of the common challenges to be addressed for efficient processing and mining of health care data. In comparison with other applications, accuracy and fast processing are of higher importance for health care applications, as they relate directly to human life. Though there are many machine learning techniques and big data solutions used for efficient processing and prediction of health care data, different techniques and different frameworks have proved effective for different applications, largely depending on the characteristics of the datasets. In this paper, we present a framework that uses the ensemble machine learning technique gradient boosted trees for data classification in health care big data. The framework is built on the Spark platform, which is fast in comparison with other traditional frameworks. Unlike other works that focus on a single technique, our work presents a comparison of six different machine learning techniques along with gradient boosted trees on datasets of different characteristics. Five benchmark health care datasets are considered for experimentation, and the results of the different machine learning techniques are discussed in comparison with gradient boosted trees. The metrics chosen for comparison are the misclassification error rate and the run time of the algorithms. The goals of this paper are to i) compare the performance of gradient boosted trees with other machine learning techniques on the Spark platform specifically for health care big data and ii) discuss the results from the experiments conducted on datasets of different characteristics, thereby drawing inferences and conclusions. The experimental results show that the accuracy of the other machine learning techniques is largely dependent on the characteristics of the datasets, whereas gradient boosted trees yield reasonably stable accuracy without depending as strongly on the dataset characteristics.
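
As an illustration of the kind of pipeline described above, the following minimal PySpark sketch trains a gradient boosted trees classifier and reports the misclassification error rate. The synthetic DataFrame, column names, and parameter values are placeholders, not the benchmark datasets or settings used in the paper.

```python
# Minimal sketch of gradient boosted trees on Spark (PySpark ML API).
# The synthetic DataFrame stands in for a real health care dataset.
from pyspark.sql import SparkSession, functions as F
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import GBTClassifier
from pyspark.ml.evaluation import MulticlassClassificationEvaluator

spark = SparkSession.builder.appName("gbt-healthcare-sketch").getOrCreate()

# Three numeric features and a binary label; replace with spark.read.csv(...)
# for a real dataset.
df = spark.range(2000).select(
    F.rand(seed=1).alias("f1"),
    F.rand(seed=2).alias("f2"),
    F.rand(seed=3).alias("f3"),
)
df = df.withColumn("label", (df.f1 + df.f2 > 1.0).cast("double"))

assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
gbt = GBTClassifier(labelCol="label", featuresCol="features", maxIter=50)
pipeline = Pipeline(stages=[assembler, gbt])

train, test = df.randomSplit([0.8, 0.2], seed=42)
predictions = pipeline.fit(train).transform(test)

accuracy = MulticlassClassificationEvaluator(
    labelCol="label", predictionCol="prediction", metricName="accuracy"
).evaluate(predictions)
print("Misclassification error rate:", 1.0 - accuracy)
```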

Keywords: big data analytics, ensemble machine learning, gradient boosted trees, Spark platform

Procedia PDF Downloads 240
5221 Voluntary Work Monetary Value and Cost-Benefit Analysis with 'Value Audit and Voluntary Investment' Technique: Case Study of Yazd Red Crescent Society Youth Members Voluntary Work in Health and Safety Plan for New Year's Passengers

Authors: Hamed Seddighi Khavidak

Abstract:

Voluntary work has many economic and social benefits for a country, but its economic value is often ignored because the work is unpaid. The aim of this study is to review methods for estimating the monetary value of voluntary work and to compare the opportunity cost and replacement cost methods, both in theory and in practice. Besides monetary value, this study presents a cost-benefit analysis of the New Year health and safety plan conducted by young volunteers of the Red Crescent Society of Iran. Method: We discuss eight methods for the monetary valuation of voluntary work: the Alternative-Employment Wage Approach, Leisure-Adjusted OCA, Volunteer Judgment OCA, Replacement Wage Approach, Volunteer Judgment RWA, Supervisor Judgment RWA, Cost of Counterpart Goods and Services, and Beneficiary Judgment. For the cost-benefit analysis, we drew on the 'Value Audit and Volunteer Investment' (VIVA) technique, which is widely used in voluntary organizations such as the International Federation of Red Cross and Red Crescent Societies. Findings: Using the replacement cost approach, the voluntary work of 1,034 youth volunteers was valued at 938,000,000 Riyals, and using the Replacement Wage Approach it was valued at 2,268,713,232 Riyals. Moreover, the Yazd Red Crescent Society spent 212,800,000 Riyals on food and other costs for these volunteers. Discussion and conclusion: Using the Volunteer Investment and Value Audit (VIVA) cost-benefit method, the VIVA rate showed that for every Riyal the Red Crescent Society invested in the health and safety of New Year's travelers in its volunteer project, four Riyals were returned, and using the wage replacement approach, 11 Riyals were returned. Therefore, the New Year's travelers' health and safety project was successful, and economically it was worthwhile for the Red Crescent Society because the output was much larger than the input costs.
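
The VIVA rate quoted above is simply the monetary value generated by the volunteers divided by the organization's investment in them. The short sketch below reproduces that arithmetic with the figures from the abstract (values in Riyals); it is an illustration of the ratio only, not of the full audit procedure.

```python
# VIVA (Volunteer Investment and Value Audit) ratio: value of volunteer
# output divided by the organization's investment in the volunteers.
investment = 212_800_000                 # Riyals spent on food and other costs
value_replacement_cost = 938_000_000     # replacement cost approach
value_replacement_wage = 2_268_713_232   # replacement wage approach

for label, value in [("replacement cost", value_replacement_cost),
                     ("replacement wage", value_replacement_wage)]:
    print(f"VIVA ratio ({label}): {value / investment:.1f}")
# Roughly 4.4 and 10.7, i.e. about 4 and 11 Riyals returned per Riyal invested.
```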

Keywords: voluntary work, monetary value, youth, red crescent society

Procedia PDF Downloads 216
5220 Application of an Optical Method for the Calculation of Deformation in Object Samples

Authors: R. Daira

Abstract:

The electronic speckle interferometry technique used to measure the deformation of scattering objects is based on the subtraction of interference patterns. A speckle image is first recorded in the RAM of a computer before deformation of the object, and a second image is recorded after deformation. The square of the difference between the two images reveals correlation fringes observable in real time directly on a monitor. The interpretation of these fringes allows the deformation to be determined. In this paper, we present experimental results of out-of-plane deformation for samples of aluminum, electronic boards, and stainless steel.
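
A minimal sketch of the fringe-formation step described above, using synthetic speckle intensities in place of the recorded images; the phase fields and the Gaussian "bump" deformation are assumptions for illustration only.

```python
# Correlation fringes by subtraction of speckle images (sketch).
import numpy as np

rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(-1, 1, 512), np.linspace(-1, 1, 512))

# Synthetic stand-ins for the two recorded images: a random speckle phase,
# and the same field shifted by a smooth (assumed) out-of-plane deformation.
speckle_phase = rng.uniform(0, 2 * np.pi, (512, 512))
deformation_phase = 6 * np.pi * np.exp(-(x**2 + y**2) / 0.2)

before = 1 + np.cos(speckle_phase)
after = 1 + np.cos(speckle_phase + deformation_phase)

# Square of the difference of the two images: the correlation fringes.
fringes = (after - before) ** 2
print("fringe pattern shape:", fringes.shape, "max intensity:", fringes.max())
```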

Keywords: optical method, holography, interferometry, deformation

Procedia PDF Downloads 404
5219 Linking Business Owners’ Choice of Organizational Form to Appraisers’ Determination of Value: An Agency Theory Perspective

Authors: Majdi Anwar Quttainah, William Paczkowski, Ali Muhammad

Abstract:

Determining the value of privately held firms confounds those in academia as well as practitioners in the fields of appraisal, forensic accounting, and law. Divergent parties to the transfer look to apply the valuation technique that best serves their own interests. This paper seeks to explore how agency theory induces owners to choose the form of their businesses at inception and how this choice affects the appraisers' valuation of the firm at the transfer of ownership.

Keywords: organizational form, agency theory, value

Procedia PDF Downloads 430
5218 An Approximation Technique to Automate Tron

Authors: P. Jayashree, S. Rajkumar

Abstract:

With virtual and augmented reality environments booming to provide a life-like experience, gaming is a major tool for supporting such learning environments. In this work, a variant of the Voronoi heuristic employing supervised learning is proposed for the TRON game. The paper discusses the features that are most useful when a machine learning bot is used as an opponent against a human player. Various game scenarios, the nature of the bot, and experimental results are provided for the proposed variant to show that the approach performs better than those currently followed.
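
For readers unfamiliar with the underlying Voronoi heuristic, the sketch below shows its basic form on a small grid: each free cell is credited to whichever player can reach it first (by breadth-first search), and the difference in counts scores the position. The arena layout and player positions are illustrative assumptions, not the game setup of the paper.

```python
# Basic Voronoi heuristic for TRON: score a position by the number of free
# cells each player can reach before the other (BFS from each head).
from collections import deque

def bfs_distances(grid, start):
    """Shortest path lengths from start to every reachable free cell."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in dist:
                dist[(nr, nc)] = dist[(r, c)] + 1
                queue.append((nr, nc))
    return dist

def voronoi_score(grid, me, opponent):
    """Positive when more free cells are closer to 'me' than to 'opponent'."""
    d_me, d_op = bfs_distances(grid, me), bfs_distances(grid, opponent)
    score = 0
    for cell in set(d_me) | set(d_op):
        if cell in (me, opponent):
            continue
        if cell not in d_op or d_me.get(cell, float("inf")) < d_op[cell]:
            score += 1
        elif cell not in d_me or d_op[cell] < d_me[cell]:
            score -= 1
    return score

# Illustrative 6x6 arena: 0 = free, 1 = wall/trail.
arena = [[0] * 6 for _ in range(6)]
arena[2][2] = arena[2][3] = 1
print(voronoi_score(arena, me=(0, 0), opponent=(5, 5)))
```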

Keywords: artificial intelligence, automation, machine learning, TRON game, Voronoi heuristics

Procedia PDF Downloads 466
5217 3D-Mesh Robust Watermarking Technique for Ownership Protection and Authentication

Authors: Farhan A. Alenizi

Abstract:

Digital watermarking has evolved over the past years as an important means for data authentication and ownership protection. Image and video watermarking is well known in the field of multimedia processing; however, watermarking techniques for 3D objects have emerged as an important means for the same purposes, as 3D mesh models are in increasing use in scientific, industrial, and medical applications. Like image watermarking techniques, 3D watermarking can take place in either the spatial or the transform domain. Unlike images and video, where the frames have regular structures in both the spatial and temporal domains, 3D objects are represented as meshes that are basically irregular samplings of surfaces; moreover, meshes can undergo a large variety of alterations which may be hard to tackle. This makes the watermarking process more challenging. While transform-domain watermarking is preferable for images and videos, it is still difficult to implement for 3D meshes due to the huge number of vertices involved and the complicated topology and geometry, and hence the difficulty of performing the spectral decomposition, even though significant work has been done in the field. Spatial-domain watermarking has attracted significant attention in the past years; such methods can act either on the topology or on the geometry of the model. Exploiting the statistical characteristics of 3D mesh models from both geometrical and topological aspects has proved useful for hiding data. However, doing so with minimal surface distortion to the mesh has attracted significant research in the field. A blind 3D mesh watermarking technique is proposed in this research. The watermarking method depends on modifying the vertices' positions with respect to the center of the object. An optimal method will be developed to reduce the errors, minimizing the distortions that the 3D object may experience due to the watermarking process, and reducing the computational complexity due to the iterations and other factors. The technique relies on displacing the vertices' locations according to the modification of the variances of the vertices' norms. Statistical analyses were performed to establish the distributions that best fit each mesh and hence to establish the bin sizes. Several optimization approaches were introduced concerning the mesh local roughness, the statistical distributions of the norms, and the displacements of the mesh centers. To evaluate the algorithm's robustness against common geometry and connectivity attacks, the watermarked objects were subjected to uniform noise, Laplacian smoothing, vertex quantization, simplification, and cropping. Experimental results showed that the approach is robust in terms of both perceptual and quantitative quality. It was also robust against both geometry and connectivity attacks. Moreover, the probability of true-positive detection versus the probability of false-positive detection was evaluated. To validate the accuracy of the test cases, receiver operating characteristic (ROC) curves were drawn, and they showed robustness in this respect as well. 3D watermarking is still a new field, but a promising one.
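
To make the embedding idea concrete, the following simplified sketch bins the vertex norms measured from the mesh center and nudges each bin's mean up or down to encode one watermark bit per bin. It is not the paper's optimized algorithm: for clarity the extraction step here uses the original mesh (i.e. it is non-blind), distortion control is omitted, and the vertex data are randomly generated placeholders.

```python
# Simplified spatial-domain mesh watermarking sketch: vertex norms (distances
# from the object's center) are split into bins, and the norms in each bin are
# shifted slightly so that the bin statistic encodes one watermark bit.
import numpy as np

def embed(vertices, bits, strength=0.02):
    center = vertices.mean(axis=0)
    offsets = vertices - center
    norms = np.linalg.norm(offsets, axis=1)
    order = np.argsort(norms)                     # group similar norms together
    bins = np.array_split(order, len(bits))
    new_norms = norms.copy()
    for bin_idx, bit in zip(bins, bits):
        # Shift the bin's norms slightly up for bit 1, down for bit 0.
        shift = strength * norms[bin_idx].mean() * (1 if bit else -1)
        new_norms[bin_idx] = norms[bin_idx] + shift
    scale = (new_norms / np.maximum(norms, 1e-12))[:, None]
    return center + offsets * scale

def extract(watermarked, original, n_bits):
    # Non-blind extraction for illustration only (the paper's scheme is blind).
    ref_norms = np.linalg.norm(original - original.mean(axis=0), axis=1)
    wm_norms = np.linalg.norm(watermarked - watermarked.mean(axis=0), axis=1)
    order = np.argsort(ref_norms)
    bins = np.array_split(order, n_bits)
    return [int(wm_norms[b].mean() > ref_norms[b].mean()) for b in bins]

rng = np.random.default_rng(1)
mesh = rng.normal(size=(2000, 3))                 # placeholder vertex cloud
bits = [1, 0, 1, 1, 0, 0, 1, 0]
watermarked = embed(mesh, bits)
print("recovered:", extract(watermarked, mesh, len(bits)))
```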

Keywords: watermarking, mesh objects, local roughness, Laplacian Smoothing

Procedia PDF Downloads 160
5216 Prevalence and Antibiotic Resistance of Bacteria Isolated from Farmers’ Market Fruits and Vegetables Collected from Frostburg and Cumberland Areas in Maryland

Authors: Kumudini Apsara Munasinghe, Devin Gregory Lissau, Ryan Thomas Wade

Abstract:

Fresh fruits and vegetables are rich in vitamins, minerals, and fiber and help maintain a healthy weight compared with high-calorie food. Eating fruits and vegetables protects us from free radicals produced by metabolic reactions and safeguards us from cardiovascular disease and cancer. However, there has been increased concern about foodborne diseases tied to contaminated farmers' market produce. In addition, very little information is available about the contribution of eating raw fruits and vegetables to human exposure to antibiotic-resistant bacteria. This research aims to identify bacteria isolated from farmers' market fruits and vegetables and understand their antibiotic resistance. Vegetables and fruits were collected from farmers' markets around the Frostburg and Cumberland areas in Maryland and transported to the microbiology lab at Frostburg State University for the isolation of bacteria. Bacteria were extracted from tomatoes, cucumbers, strawberries, and lettuce using Tryptic soy broth overnight at 37°C, and Tryptic soy agar was used with the streak plate technique to isolate bacteria. Pure cultures were used to identify bacteria using biochemical reactions after Gram staining. The research used several differential and selective media, including Mannitol salt agar, MacConkey agar, and Eosin methylene blue agar, for identification. Antibiotic sensitivity was tested for many different antibiotics, including amoxicillin, penicillin, tetracycline, ampicillin, and erythromycin. The most prevalent bacteria in the isolates were Staphylococcus, Bacillus, Micrococcus, Enterococcus, Enterobacter, Citrobacter, and other bacteria from the family Enterobacteriaceae. The data obtained from this research will be useful for educating and training farmers and individuals involved in post-harvest processes such as transportation and selling in farmers' markets. Further results on bacterial antibiotic resistance will be obtained, and unculturable bacteria will be identified by next-generation DNA sequencing.

Keywords: antibiotic resistance, farmers markets, fruits, bacteria, vegetables

Procedia PDF Downloads 68
5215 Early and Mid-Term Results of Anesthetic Management of Minimally Invasive Coronary Artery Bypass Grafting Using One Lung Ventilation

Authors: Devendra Gupta, S. P. Ambesh, P. K Singh

Abstract:

Introduction: Minimally invasive coronary artery bypass grafting (MICABG) is a less invasive method of performing surgical revascularization. Minimally invasive direct coronary artery bypass (MIDCAB) presents many anesthetic challenges, including one lung ventilation (OLV), managing myocardial ischemia, and pain. We present early and mid-term results of the use of this technique with OLV. Method: We enrolled 62 patients operated on between 2008 and 2012 for analysis. Patients were anesthetized, and a left endobronchial tube was placed. During the procedure, the left lung was isolated and one lung ventilation was maintained through the right lung. The operation was performed utilizing the off-pump technique of coronary artery bypass grafting through a minimally invasive incision. A left internal mammary artery graft was used for single vessel disease, and the radial artery was utilized for other grafts if required. Postoperative ventilation was performed with a single-lumen endotracheal tube. Median follow-up was 2.5 years (6 months to 4 years). Results: Median age was 58.5 years (41-77), and all patients were male. Single vessel disease was present in 36, double vessel disease in 24, and triple vessel disease in 2 patients. All the patients had normal left ventricular size and function. In 2 cases, difficulty was encountered in placement of the endobronchial tube. In 1 case, the cuff of the endobronchial tube was ruptured during intubation. High airway pressure developed on OLV in 1 case, and surgery was accomplished with two-lung ventilation with low tidal volume. Mean postoperative ventilation time was 14.4 hours (11-22). There was no perioperative or 30-day mortality. Conversion to median sternotomy to complete the operation was required in 3.23% (2 out of 62 patients). One patient had an acute myocardial infarction postoperatively, and there were no deaths during follow-up. Conclusion: MICABG is a safe and effective method of revascularization with OLV in low-risk candidates for coronary artery bypass grafting.

Keywords: MIDCABG, one lung ventilation, coronary artery bypass grafting, endobronchial tube

Procedia PDF Downloads 425
5214 Micro-Filtration with an Inorganic Membrane

Authors: Benyamina, Ouldabess, Bensalah

Abstract:

The aim of this study is to use a membrane technique for the filtration of a coloring solution. The preparation of the micro-filtration membranes is based on a low-cost natural clay powder deposited on macro-porous ceramic supports. The micro-filtration membrane provided a very large permeation flow. Indeed, the filtration effectiveness of the membrane was demonstrated by the total discoloration of a bromothymol blue solution with an initial concentration of 10⁻³ mg/L within the first minutes.

Keywords: the inorganic membrane, micro-filtration, coloring solution, natural clay powder

Procedia PDF Downloads 513
5213 Development of Standard Evaluation Technique for Car Carpet Floor

Authors: In-Sung Lee, Un-Hwan Park, Jun-Hyeok Heo, Tae-Hyeon Oh, Dae-Gyu Park

Abstract:

Statistical Energy Analysis (SEA) is considered to be the most effective CAE method for air-borne noise analysis in the automotive area. This study deals with a method to predict the noise level inside the car under steady-state conditions using an SEA model of the car for air-borne noise analysis. Using the model, we can identify weak parts in terms of their acoustic material properties. Therefore, it is useful for material structural design.

Keywords: air-born noise, material structural design, acoustic material properties, absorbing

Procedia PDF Downloads 422
5212 New Machine Learning Optimization Approach Based on Input Variables Disposition Applied for Time Series Prediction

Authors: Hervice Roméo Fogno Fotsoa, Germaine Djuidje Kenmoe, Claude Vidal Aloyem Kazé

Abstract:

One of the main applications of machine learning is the prediction of time series, but more accurate prediction requires a more optimal machine learning model. Several optimization techniques have been developed, but without considering the disposition of the system's input variables. Thus, this work aims to present a new machine learning architecture optimization technique based on the optimal disposition of the input variables. The validations are done on the prediction of wind time series, using data collected in Cameroon. The number of possible dispositions with four input variables is twenty-four. Each of the dispositions is used to perform the prediction, with the main criteria being the training and prediction performances. The results obtained from a static architecture and a dynamic architecture of neural networks have shown that these performances are a function of the input variables' disposition, and in a different way for each architecture. This analysis revealed that it is necessary to take the input variables' disposition into account in order to develop a more optimal neural network model. Thus, a new neural network training algorithm is proposed by introducing the search for the optimal input variable disposition into the traditional back-propagation algorithm. The results of applying this new optimization approach to the two single neural network architectures are compared step by step with the previously obtained results. Moreover, the proposed approach is validated in a collaborative optimization method with a single-objective optimization technique, i.e., genetic algorithm back-propagation neural networks. From these comparisons, it is concluded that each proposed model outperforms its traditional counterpart in terms of training and prediction performance for time series. This shows that the proposed optimization approach can be useful for improving the accuracy of machine learning based time series prediction.
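
A minimal sketch of the disposition search described above: all 24 orderings of four input variables are evaluated by training a small neural network on each ordering and comparing validation error. The synthetic series, lag-based inputs, and network size are illustrative assumptions, not the wind data or architectures of the study.

```python
# Sketch: evaluate every disposition (ordering) of four input variables for a
# time series prediction task and keep the best-performing one.
from itertools import permutations
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
t = np.arange(2000)
series = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)   # synthetic series

# Four lagged values are the input variables; the next value is the target.
lags = 4
X = np.column_stack([series[i:-(lags - i)] for i in range(lags)])
y = series[lags:]
split = int(0.8 * len(y))

results = {}
for disposition in permutations(range(lags)):               # 4! = 24 orderings
    Xd = X[:, disposition]
    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=500, random_state=0)
    model.fit(Xd[:split], y[:split])
    results[disposition] = mean_squared_error(y[split:], model.predict(Xd[split:]))

best = min(results, key=results.get)
print("best disposition:", best, "validation MSE:", round(results[best], 5))
```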

Keywords: input variable disposition, machine learning, optimization, performance, time series prediction

Procedia PDF Downloads 109
5211 A Comparative Assessment of Information Value, Fuzzy Expert System Models for Landslide Susceptibility Mapping of Dharamshala and Surrounding, Himachal Pradesh, India

Authors: Kumari Sweta, Ajanta Goswami, Abhilasha Dixit

Abstract:

Landslide is a geomorphic process that plays an essential role in the evolution of hill-slopes and in long-term landscape evolution. But its abrupt nature and the associated catastrophic forces can have undesirable socio-economic impacts, like substantial economic losses, fatalities, and ecosystem, geomorphologic, and infrastructure disturbances. The estimated fatality rate is approximately 1 person/100 sq. km, and the average economic loss is more than 550 crores/year in the Himalayan belt due to landslides. This study presents a comparative performance of a statistical bivariate method and a machine learning technique for landslide susceptibility mapping in and around Dharamshala, Himachal Pradesh. The final landslide susceptibility maps (LSMs), produced with better accuracy, could be used for land-use planning to prevent future losses. Dharamshala, a part of the North-western Himalaya, is one of the fastest-growing tourism hubs with a total population of 30,764 according to the 2011 census and is amongst the hundred Indian cities to be developed as smart cities under the PM's Smart Cities Mission. A total of 209 landslide locations were identified using high-resolution linear imaging self-scanning (LISS IV) data. The thematic maps of parameters influencing landslide occurrence were generated using remote sensing and other ancillary data in a GIS environment. The landslide causative parameters used in the study are slope angle, slope aspect, elevation, curvature, topographic wetness index, relative relief, distance from lineaments, land use land cover, and geology. LSMs were prepared using the information value (Info Val) and Fuzzy Expert System (FES) models. Info Val is a statistical bivariate method in which information values are calculated as the ratio of the landslide pixel proportion per factor class (Si/Ni) to the landslide pixel proportion for the whole parameter (S/N). Using these information values, all parameters were reclassified and then summed in GIS to obtain the landslide susceptibility index (LSI) map. The FES method is a machine learning technique based on a 'mean and neighbour' strategy for the construction of the fuzzifier (input) and defuzzifier (output) membership function (MF) structure, and the FR method is used for formulating if-then rules. Two types of membership structures were utilized for the membership functions: Bell-Gaussian (BG) and Trapezoidal-Triangular (TT). The LSI for BG and TT was obtained by applying the membership functions and if-then rules in MATLAB. The final LSMs were spatially and statistically validated. The validation results showed that, in terms of accuracy, Info Val (83.4%) is better than BG (83.0%) and TT (82.6%), whereas, in terms of spatial distribution, BG is best. Hence, considering both statistical and spatial accuracy, BG is the most accurate one.
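
A small sketch of the information value computation described above: each class of a causative factor is scored by the ratio of its landslide pixel density (Si/Ni) to the overall landslide pixel density of the map (S/N). The pixel counts below are illustrative numbers only, not the study's data.

```python
# Information value (Info Val) sketch: InfoVal = (Si / Ni) / (S / N), where
# Si and Ni are the landslide and total pixels in a class, and S and N are
# the landslide and total pixels in the whole map. (The natural log of this
# ratio is also commonly reported.)
slope_classes = {
    "0-15 deg":  {"landslide_pixels": 12, "class_pixels": 40000},
    "15-30 deg": {"landslide_pixels": 85, "class_pixels": 35000},
    "30-45 deg": {"landslide_pixels": 96, "class_pixels": 20000},
    ">45 deg":   {"landslide_pixels": 16, "class_pixels": 5000},
}

S = sum(c["landslide_pixels"] for c in slope_classes.values())
N = sum(c["class_pixels"] for c in slope_classes.values())

for name, c in slope_classes.items():
    info_val = (c["landslide_pixels"] / c["class_pixels"]) / (S / N)
    print(f"{name:>10}: InfoVal = {info_val:.2f}")
# Classes with InfoVal > 1 are more landslide-prone than average; the
# reclassified factor maps are summed in GIS to obtain the LSI.
```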

Keywords: bivariate statistical techniques, BG and TT membership structure, fuzzy expert system, information value method, machine learning technique

Procedia PDF Downloads 127
5210 Field Emission Scanning Microscope Image Analysis for Porosity Characterization of Autoclaved Aerated Concrete

Authors: Venuka Kuruwita Arachchige Don, Mohamed Shaheen, Chris Goodier

Abstract:

Autoclaved aerated concrete (AAC) is known for its light weight, easy handling, high thermal insulation, and extremely porous structure. Investigation of pore behavior in AAC is crucial for characterizing the material, standardizing design and production techniques, enhancing mechanical, durability, and thermal performance, studying the effectiveness of protective measures, and analyzing the effects of weather conditions. The finer details of the pores are difficult to observe with acknowledged accuracy. High-resolution Field Emission Scanning Electron Microscope (FESEM) image analysis is a promising technique for investigating the pore behavior and density of AAC, and it is adopted in this study. A mercury intrusion porosimeter and a gas pycnometer were employed to characterize the porosity distribution and density parameters. The analysis considered three different densities of AAC blocks and three layers in the altitude direction within each block. A set of procedures is presented to extract and analyze the details of pore shape, pore size, pore connectivity, and pore percentages from FESEM images of AAC. Average pore behavior outcomes per unit area are presented. Comparison of porosity distribution and density parameters revealed significant variations. FESEM imaging offered unparalleled insights into porosity behavior, surpassing the capabilities of the other techniques. The multi-stage analysis provides the porosity percentage occupied by various pore categories, the total porosity, the variation of pore distribution with respect to AAC densities and layers, the number of two-dimensional and three-dimensional pores, the variation of apparent and matrix densities with respect to pore behavior, the variation of pore behavior with respect to aluminum content, and the relationship among shape, diameter, connectivity, and percentage in each pore classification.
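
As an illustration of this kind of image-analysis workflow, the sketch below thresholds a synthetic grayscale image standing in for a FESEM micrograph, labels the pore regions, and reports the area porosity and basic per-pore shape descriptors with scikit-image. Pixel-size calibration and the study's full multi-stage pore classification are not reproduced.

```python
# Sketch of pore extraction from a grayscale micrograph: threshold, label
# connected pore regions, and compute porosity and per-pore descriptors.
import numpy as np
from skimage import filters, measure

rng = np.random.default_rng(0)
image = rng.normal(0.7, 0.05, (512, 512))           # bright solid matrix
yy, xx = np.mgrid[0:512, 0:512]
for cx, cy, r in rng.uniform(20, 490, (40, 3)) * [1, 1, 0.05]:
    image[(yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2] = 0.2   # dark "pores"

threshold = filters.threshold_otsu(image)
pores = image < threshold                            # pores appear dark
porosity = pores.mean()                              # area fraction of pores

labels = measure.label(pores)
props = measure.regionprops(labels)
print(f"area porosity: {porosity:.1%}, pore count: {len(props)}")
for p in props[:5]:
    print(f"pore {p.label}: area={p.area} px, eccentricity={p.eccentricity:.2f}")
```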

Keywords: autoclaved aerated concrete, density, imaging technique, microstructure, porosity behavior

Procedia PDF Downloads 68
5209 H.263 Based Video Transceiver for Wireless Camera System

Authors: Won-Ho Kim

Abstract:

In this paper, a design of an H.263-based wireless video transceiver is presented for a wireless camera system. It uses a standard Wi-Fi transceiver, and the coverage area is up to 100 m. Furthermore, the standard H.263 video encoding technique is used for video compression, since a wireless video transmitter is unable to transmit high-capacity raw data in real time; the implemented system is capable of streaming NTSC 720x480 video at less than 1 Mbps.

Keywords: wireless video transceiver, video surveillance camera, H.263 video encoding, digital signal processing

Procedia PDF Downloads 364
5208 Analytic Hierarchy Process

Authors: Hadia Rafi

Abstract:

Making a decision in any work, task, or project involves many factors that need to be considered. The Analytic Hierarchy Process (AHP) is based on the judgments of experts to derive the required results: the technique measures intangibles, and then, with the help of judgment and software analysis, pairwise comparisons are made which show how much one element/unit dominates another. AHP also addresses how an inconsistent judgment should be made consistent and how the judgment should be improved when possible. The priority scales are obtained by multiplying local priorities by the priority of their parent node and then adding them.
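
The sketch below illustrates the core AHP computation on a small, assumed pairwise comparison matrix: the priority vector is taken from the principal eigenvector, and the consistency ratio indicates whether the judgments should be revised.

```python
# AHP sketch: derive priorities from a pairwise comparison matrix via the
# principal eigenvector, and check judgment consistency (CR < 0.1 is the
# usual acceptance threshold). The 3x3 matrix is an assumed example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],        # criterion 1 compared with 1, 2, 3
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)
priorities = np.abs(eigenvectors[:, k].real)
priorities /= priorities.sum()                 # local priority scale

n = A.shape[0]
lambda_max = eigenvalues.real[k]
CI = (lambda_max - n) / (n - 1)                # consistency index
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
CR = CI / RI                                   # consistency ratio

print("priorities:", np.round(priorities, 3), "CR:", round(CR, 3))
# Global priorities are obtained by multiplying each local priority by the
# priority of its parent node and summing over the hierarchy.
```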

Keywords: AHP, priority scales, parent node, software analysis

Procedia PDF Downloads 406
5205 Using the Bootstrap for Problems in Statistics

Authors: Brahim Boukabcha, Amar Rebbouh

Abstract:

The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article, we present a theoretical study of the different bootstrapping methods and use the re-sampling technique in statistical inference to calculate the standard error of the mean of an estimator and to determine a confidence interval for an estimated parameter. We apply and test these methods on regression models and the Pareto model, obtaining the best approximations.
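
As a concrete illustration of the resampling idea, the following sketch estimates the standard error of the sample mean and a 95% percentile confidence interval by bootstrapping; the data are randomly generated placeholders, not the regression or Pareto data of the study.

```python
# Bootstrap sketch: standard error of the mean and a 95% percentile
# confidence interval from B resamples of the original sample.
import numpy as np

rng = np.random.default_rng(42)
sample = rng.exponential(scale=2.0, size=100)    # placeholder data

B = 5000
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(B)
])

standard_error = boot_means.std(ddof=1)
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {sample.mean():.3f}, bootstrap SE = {standard_error:.3f}")
print(f"95% percentile CI: ({ci_low:.3f}, {ci_high:.3f})")
```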

Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models

Procedia PDF Downloads 380
5206 Development of Ketorolac Tromethamine Encapsulated Stealth Liposomes: Pharmacokinetics and Bio Distribution

Authors: Yasmin Begum Mohammed

Abstract:

Ketorolac tromethamine (KTM) is a non-steroidal anti-inflammatory drug with potent analgesic and anti-inflammatory activity due to the prostaglandin-related inhibitory effect of the drug. It is a non-selective cyclo-oxygenase inhibitor. The drug is currently used orally and intramuscularly in multiple divided doses, clinically for the management of arthritis, cancer pain, post-surgical pain, and the treatment of migraine pain. KTM has a short biological half-life of 4 to 6 hours, which necessitates frequent dosing to retain the action. The frequent occurrence of gastrointestinal bleeding, perforation, peptic ulceration, and renal failure has led to the development of other drug delivery strategies for the appropriate delivery of KTM. The ideal solution would be to target the drug only to the cells or tissues affected by the disease. Drug targeting could be achieved effectively by liposomes, which are biocompatible and biodegradable. The aim of the study was to develop a parenteral liposome formulation of KTM with improved efficacy and reduced side effects by targeting the inflammation due to arthritis. PEG-anchored (stealth) and non-PEG-anchored liposomes were prepared by the thin film hydration technique followed by an extrusion cycle and characterized in vitro and in vivo. Stealth liposomes (SLs) exhibited an increased encapsulation efficiency (94%) and 52% drug retention during release studies over 24 h, with good stability for a period of 1 month at -20°C and 4°C. SLs showed a maximum of about 55% edema inhibition with a significant analgesic effect. SLs produced marked differences over non-SL formulations, with an increase in the area under the plasma concentration-time curve, t₁/₂, and mean residence time, and reduced clearance. 0.3% of the drug was detected in the arthritis-induced paw, with significantly reduced drug localization in the liver, spleen, and kidney for SLs when compared to other conventional liposomes. Thus, SLs help to increase the therapeutic efficacy of KTM by increasing the targeting potential at the inflammatory region.

Keywords: biodistribution, ketorolac tromethamine, stealth liposomes, thin film hydration technique

Procedia PDF Downloads 295
5205 A Multi Criteria Approach for Prioritization of Low Volume Rural Roads for Maintenance and Improvement

Authors: L. V. S. S. Phaneendra Bolem, S. Shankar

Abstract:

Low Volume Rural Roads (LVRRs) constitute an integral component of the road system in all countries. They encompass all aspects of the social and economic development of rural communities. It is known that, on a worldwide basis, the length of low-traffic roads far exceeds that of high-volume roads. Across India, 90% of the roads are LVRRs, and they often form the most important link in providing access to educational, medical, recreational, and commercial activities in local and regional areas. In the recent past, the Government of India (GoI), with the initiation of the ambitious programme 'Pradhan Mantri Gram Sadak Yojana' (PMGSY), gave greater importance to LVRRs, realizing their role in the economic development of rural communities. The vast expansion of the road network has brought connectivity to the rural areas of the country. Further, it is noticed that, due to increasing axle loads and a lack of timely maintenance, the deterioration of LVRRs has accelerated. In addition, the limited budget for the maintenance of these roads has necessitated a systematic and scientific approach to utilizing the available resources. This would enable better prioritization and ranking for maintenance and help provide 'all-weather roads'. Taking this into account, the present study adopted a multi-criteria approach. The multi-criteria approach includes parameters such as social, economic, environmental, and pavement condition as the main criteria, along with several sub-criteria, to find the most suitable parameters and their weights. For this purpose, an expert opinion survey was carried out using the Delphi Technique (DT) with Likert scale, pairwise comparison, and ranking methods, and the entire dataset was analyzed. Finally, this study developed the maintenance criteria, considering the socio-economic, environmental, and pavement condition parameters, for effective maintenance of low volume roads based on engineering judgment.
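
A minimal sketch of how criterion weights of this kind could be combined into a priority score for each road section via a weighted sum; the weights and road scores below are illustrative assumptions, not the results of the Delphi survey.

```python
# Weighted-sum prioritization sketch for LVRR maintenance: each road section
# is scored on the main criteria and ranked by the weighted total.
criteria_weights = {"social": 0.25, "economic": 0.30,
                    "environmental": 0.15, "pavement_condition": 0.30}

# Criterion scores per road section, normalized to 0-1 (1 = highest need).
road_sections = {
    "Road A": {"social": 0.8, "economic": 0.6,
               "environmental": 0.4, "pavement_condition": 0.9},
    "Road B": {"social": 0.5, "economic": 0.9,
               "environmental": 0.3, "pavement_condition": 0.6},
    "Road C": {"social": 0.7, "economic": 0.4,
               "environmental": 0.8, "pavement_condition": 0.5},
}

priority = {
    road: sum(criteria_weights[c] * s for c, s in scores.items())
    for road, scores in road_sections.items()
}
for road, score in sorted(priority.items(), key=lambda kv: -kv[1]):
    print(f"{road}: priority score = {score:.2f}")
```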

Keywords: Delphi technique, experts opinion survey, low volume rural road maintenance, multi criteria analysis

Procedia PDF Downloads 166
5204 Acrylamide-Induced Thoracic Spinal Cord Axonopathy

Authors: Afshin Zahedi, Keivan Jamshidi

Abstract:

This study was conducted to determine the neurotoxic effects of different doses of ACR on the thoracic axons of the rat spinal cord. To evaluate this hypothesis in the thoracic axons, the amino-cupric silver staining technique of de Olmos was used to reveal the histopathologic characteristic (argyrophilia) of axonal damage following ACR exposure. For this purpose, 60 adult male rats (Wistar, approximately 250 g) were selected. Rats were housed in polycarbonate boxes, two per box. Randomly assigned groups of rats (10 rats per exposure group, with a total of 5 exposure groups labeled A, B, C, D, and E) were exposed to 0.5, 5, 50, 100, and 500 mg/kg per day for 11 days by intraperitoneal (IP) injection, respectively. The remaining 10 rats were housed as group F, the control group. Control rats received daily injections of 0.9% saline (3 ml/kg). As indices of developing neurotoxicity, weight gain, gait scores, and landing hindlimb foot splay (LHF) were determined. Weight gain was measured daily prior to injection. Gait scoring involved observation of spontaneous open-field locomotion and included evaluations of ataxia, hopping, rearing, and hind foot placement; hindlimb foot splay was determined 3-4 times per week. Gait scores were assigned from 1 to 4. After 11 days, two rats were randomly selected for silver staining and dissected, and appropriate samples were collected from the thoracic portion of the spinal cord. Results showed no neurological signs in groups A, B, and F, whereas severe neurotoxicity was observed in groups C and D. Rats in group E died within 1-2 hours due to severe toxemia. In the histopathological studies based on the de Olmos technique, no argyrophilic neurons or processes were observed in stained sections obtained from the thoracic portion of the spinal cord of rats belonging to groups A, B, and F, while moderate to severe argyrophilic changes were observed in different stained sections obtained from the thoracic portion of the spinal cord of rats belonging to groups C and D.

Keywords: acrylamide, rat, axonopathy, argyrophily, de Olmos

Procedia PDF Downloads 341
5203 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals

Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor

Abstract:

This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification.
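
The two directions discussed above, selecting informative features and combining classifiers, can be illustrated with a short scikit-learn sketch: univariate feature selection feeds a soft-voting ensemble of three base classifiers. The synthetic features stand in for real brain-signal features and the chosen estimators are assumptions, not the specific methods of the article.

```python
# Sketch: feature selection followed by an ensemble (soft voting) of base
# classifiers, evaluated by cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=64, n_informative=10,
                           random_state=0)      # placeholder "brain signal" features

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("svm", SVC(probability=True)),
                ("rf", RandomForestClassifier(n_estimators=100))],
    voting="soft")

pipeline = Pipeline([("select", SelectKBest(f_classif, k=20)),
                     ("ensemble", ensemble)])

scores = cross_val_score(pipeline, X, y, cv=5)
print("cross-validated accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```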

Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers

Procedia PDF Downloads 75
5202 Enhanced Efficiency for Propagation of Phalaenopsis cornu-cervi (Breda) Blume & Rchb. F. Using Trimmed Leaf Technique

Authors: Suphat Rittirat, Sutha Klaocheed, Kanchit Thammasiri

Abstract:

The effects of thidiazuron (TDZ) and benzyladenine (BA) on protocorm-like body (PLB) induction from leaf explants were investigated. It was found that TDZ was superior to BA. The highest percentage and number of PLBs per leaf explant, 30 and 5.3 respectively, were obtained on ½ MS medium supplemented with 9 µM TDZ. The regenerated plantlets were potted and acclimatized in the greenhouse. These plants grew well and developed into normal plants after 3 months of transplantation. 100% survival of plantlets was achieved when they were planted in pots containing sphagnum moss.

Keywords: orchid, PLBs, sphagnum moss, thidiazuron

Procedia PDF Downloads 327
5201 Utilizing Spatial Uncertainty of On-The-Go Measurements to Design Adaptive Sampling of Soil Electrical Conductivity in a Rice Field

Authors: Ismaila Olabisi Ogundiji, Hakeem Mayowa Olujide, Qasim Usamot

Abstract:

The main reasons for site-specific management of agricultural inputs are to increase the profitability of crop production, to protect the environment, and to improve product quality. Information about the variability of different soil attributes within a field is highly essential for the decision-making process. The lack of fast and accurate acquisition of soil characteristics remains one of the biggest limitations of precision agriculture, as conventional sampling is expensive and time-consuming. Adaptive sampling has been proven to be an accurate and affordable sampling technique for planning within a field for site-specific management of agricultural inputs. This study employed the spatial uncertainty of soil apparent electrical conductivity (ECa) estimates to identify adaptive re-survey areas in the field. The original dataset was grouped into validation and calibration groups, where the calibration group was sub-grouped into three sets of different measurement pass intervals. A conditional simulation was performed on the field ECa to evaluate the ECa spatial uncertainty estimates using a geostatistical technique. The grouping of high-uncertainty areas for each set was done using image segmentation in MATLAB; then, areas of high and low uncertainty values were separated. Finally, an adaptive re-survey was carried out on those areas of high uncertainty. Adding adaptive re-surveying significantly reduced the time required for resampling the whole field and resulted in ECa estimates with minimal error. For the transect set with the widest spacing, the root mean square error (RMSE) obtained from the initial crude sampling survey was reduced after the adaptive re-survey to a value close to that of the ECa obtained with an all-field re-survey. The estimated sampling time for the adaptive re-survey was found to be 45% less than that of an all-field re-survey. The results indicate that designing adaptive sampling through spatial uncertainty models significantly mitigates sampling cost while maintaining the accuracy of the observations.
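
The selection step described above can be sketched as follows: given a stack of conditional simulations of ECa on a grid, the per-cell standard deviation serves as the uncertainty estimate, and cells above an uncertainty quantile are flagged for adaptive re-survey. The simulated stack and the 85% quantile threshold are placeholders, not output of the geostatistical simulator or the thresholds used in the study.

```python
# Sketch: flag high-uncertainty areas for adaptive re-survey from a stack of
# conditional simulations of soil ECa. A random stack stands in for the
# output of a geostatistical conditional simulation.
import numpy as np

rng = np.random.default_rng(7)
n_realizations, rows, cols = 100, 50, 80
base_field = rng.normal(20, 4, (rows, cols))               # "true-ish" ECa map
simulations = base_field + rng.normal(0, rng.uniform(0.5, 5, (rows, cols)),
                                      (n_realizations, rows, cols))

uncertainty = simulations.std(axis=0)       # per-cell spread across realizations
threshold = np.quantile(uncertainty, 0.85)  # top 15% most uncertain cells
resurvey_mask = uncertainty > threshold

print(f"cells flagged for adaptive re-survey: {resurvey_mask.sum()} "
      f"of {rows * cols} ({resurvey_mask.mean():.0%})")
```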

Keywords: soil electrical conductivity, adaptive sampling, conditional simulation, spatial uncertainty, site-specific management

Procedia PDF Downloads 132
5200 Improving Public Sectors’ Policy Direction on Large Infrastructure Investment Projects: A Developmental Approach

Authors: Ncedo Cameron Xhala

Abstract:

Several public sector institutions lack policy direction on how to successfully implement their large infrastructure investment projects. It is significant to improve strategic policy direction in public sector institutions in order to improve the planning, management, and implementation of large infrastructure investment projects. It is also significant to improve the understanding of the internal and external pressures that bear on large infrastructure projects. The significance lies in fulfilling the public sector's mandate, aligning the sector's scarce resources and stakeholders, and improving project management processes. The study used a case study approach underpinned by a constructionist perspective. The study used a theoretical sampling technique when selecting study participants, followed by a snowball sampling technique that was used to purposefully select an identified case study project. The study was qualitative in nature; it collected and analyzed qualitative empirical data from five purposefully selected subject matter experts and analyzed the case study documents. The study used a semi-structured interview approach and analyzed the case study documents qualitatively. The interviews were conducted face-to-face and were guided by an interview guide with focused questions. The study used a three-step coding process when analysing the qualitative empirical data. Findings reveal that an improvement of strategic policy direction in public sector institutions improves integration in the planning, management, and implementation of large infrastructure investment projects. Findings show the importance of understanding the external and internal pressures when implementing the public sector's large infrastructure investment projects. The study concludes that strategic policy direction in public sector institutions results in improved planning, financing, delivery, monitoring and evaluation, and successful implementation of the public sector's large infrastructure investment projects.

Keywords: implementation, infrastructure, investment, management

Procedia PDF Downloads 151
5199 Microfluidic Device for Real-Time Electrical Impedance Measurements of Biological Cells

Authors: Anil Koklu, Amin Mansoorifar, Ali Beskok

Abstract:

Dielectric spectroscopy (DS) is a noninvasive, label-free technique for long-term, real-time measurement of the impedance spectra of biological cells. DS enables characterization of cellular dielectric properties such as membrane capacitance and cytoplasmic conductivity. We have developed a lab-on-a-chip device that uses an electro-activated microwell array for loading, DS measurements, and unloading of biological cells. We utilized dielectrophoresis (DEP) to capture target cells inside the wells and release them after the DS measurement. DEP is a label-free technique that exploits differences among the dielectric properties of particles. In detail, DEP is the motion of polarizable particles suspended in an ionic solution and subjected to a spatially non-uniform external electric field. To the best of our knowledge, this is the first microfluidic chip that combines DEP and DS to analyze biological cells using electro-activated wells. Device performance was tested using two different prostate cancer cell lines (RV122, PC-3). Impedance measurements were conducted at 0.2 V in the 10 kHz to 40 MHz range with 6 s time resolution. An equivalent circuit model was developed to extract the cell membrane capacitance and cell cytoplasmic conductivity from the impedance spectra. We report the time course of the variations in the dielectric properties of PC-3 and RV122 cells suspended in a low conductivity buffer (LCB), which enhances the dielectrophoretic and impedance responses, and their response to a sudden pH change from a pH of 7.3 to a pH of 5.8. It is shown that the microfluidic chip allowed online measurements of the dielectric properties of prostate cancer cells and the assessment of cellular-level variations under external stimuli such as different buffer conductivities and pH. Based on these data, we intend to deploy the current device for single cell measurements by fabricating separately addressable N × N electrode platforms. Such a device will allow time-dependent dielectric response measurements for individual cells with the ability to selectively release them using negative DEP and pressure-driven flow.
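
To illustrate the parameter-extraction step, the sketch below fits a simple series-resistance plus parallel R-C equivalent circuit to an impedance magnitude spectrum over the same 10 kHz to 40 MHz band with SciPy. This generic circuit, the synthetic spectrum, and the parameter values are stand-ins for the cell-specific model and measured data used in the device.

```python
# Sketch: extract equivalent-circuit parameters from an impedance spectrum.
# Circuit: solution resistance Rs in series with a parallel R-C element
# (a generic stand-in for the paper's cell-specific circuit model).
import numpy as np
from scipy.optimize import curve_fit

def log_impedance(f, Rs, Rp, C_pF):
    omega = 2 * np.pi * f
    Zp = Rp / (1 + 1j * omega * Rp * C_pF * 1e-12)   # parallel R-C element
    return np.log10(np.abs(Rs + Zp))                  # fit log|Z| for even weighting

# Synthetic "measurement" over 10 kHz - 40 MHz with 1% multiplicative noise.
freqs = np.logspace(4, np.log10(4e7), 60)
true_Rs, true_Rp, true_C_pF = 500.0, 5e4, 50.0        # assumed true values
rng = np.random.default_rng(3)
measured = 10 ** log_impedance(freqs, true_Rs, true_Rp, true_C_pF)
measured *= rng.normal(1.0, 0.01, freqs.size)

popt, _ = curve_fit(log_impedance, freqs, np.log10(measured),
                    p0=(1e3, 1e4, 10.0), maxfev=20000)
Rs, Rp, C_pF = popt
print(f"fitted Rs = {Rs:.0f} ohm, Rp = {Rp:.0f} ohm, C = {C_pF:.1f} pF")
```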

Keywords: microfluidic, microfabrication, lab on a chip, AC electrokinetics, dielectric spectroscopy

Procedia PDF Downloads 151
5198 Combining Experiments and Surveys to Understand the Pinterest User Experience

Authors: Jolie M. Martin

Abstract:

Running experiments while logging detailed user actions has become the standard way of testing product features at Pinterest, as at many other Internet companies. While this technique offers plenty of statistical power to assess the effects of product changes on behavioral metrics, it does not often give us much insight into why users respond the way they do. By combining at-scale experiments with smaller surveys of users in each experimental condition, we have developed a unique approach for measuring the impact of our product and communication treatments on user sentiment, attitudes, and comprehension.

Keywords: experiments, methodology, surveys, user experience

Procedia PDF Downloads 311
5197 An Efficient Motion Recognition System Based on LMA Technique and a Discrete Hidden Markov Model

Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier

Abstract:

Human motion recognition has received extensively increasing attention in recent years due to its importance in a wide range of applications, such as human-computer interaction, intelligent surveillance, augmented reality, content-based video compression and retrieval, etc. However, it is still regarded as a challenging task, especially in realistic scenarios. It can be seen as a general machine learning problem which requires an effective human motion representation and an efficient learning method. In this work, we introduce a descriptor based on the Laban Movement Analysis technique, a formal and universal language for human movement, to capture both quantitative and qualitative aspects of movement. We use a Discrete Hidden Markov Model (DHMM) for training and classification of motions. We improve the classification algorithm by proposing two DHMMs for each motion class to process the motion sequence in two different directions, forward and backward. Such a modification allows avoiding the misclassification that can happen when recognizing similar motions. Two experiments were conducted. In the first one, we evaluate our method on a public dataset, the Microsoft Research Cambridge-12 Kinect gesture dataset (MSRC-12), which is widely used for evaluating action/gesture recognition methods. In the second experiment, we built a dataset composed of 10 gestures (introduce yourself, waving, dance, move, turn left, turn right, stop, sit down, increase velocity, decrease velocity) performed by 20 persons. The evaluation of the system includes testing the efficiency of our descriptor vector based on LMA with the basic DHMM method and comparing the recognition results of the modified DHMM with the original one. Experimental results demonstrate that our method outperforms most existing methods that used the MSRC-12 dataset, and achieves a near perfect classification rate on our dataset.
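
A compact sketch of the classification scheme described above, using hmmlearn's discrete (categorical) HMM as an assumed implementation: one model per gesture class is trained on the forward symbol sequences and one on the reversed sequences, and a test sequence is assigned to the class with the highest combined log-likelihood. The random symbol sequences stand in for quantized LMA descriptors.

```python
# Two-direction DHMM classifier sketch. Assumes hmmlearn's CategoricalHMM
# (called MultinomialHMM in older hmmlearn releases).
import numpy as np
from hmmlearn import hmm

def train_dhmm(sequences, n_states=4):
    X = np.concatenate(sequences).reshape(-1, 1)     # discrete symbols, one column
    lengths = [len(s) for s in sequences]
    model = hmm.CategoricalHMM(n_components=n_states, n_iter=50, random_state=0)
    model.fit(X, lengths)
    return model

rng = np.random.default_rng(0)
classes = ["waving", "turn_left"]
train_data = {c: [rng.integers(0, 8, size=30) for _ in range(20)] for c in classes}

models = {}
for c, seqs in train_data.items():
    models[c] = (train_dhmm(seqs),                        # forward-direction model
                 train_dhmm([s[::-1] for s in seqs]))     # backward-direction model

test_seq = train_data["waving"][0]
scores = {c: fwd.score(test_seq.reshape(-1, 1)) +
             bwd.score(test_seq[::-1].copy().reshape(-1, 1))
          for c, (fwd, bwd) in models.items()}
print("predicted:", max(scores, key=scores.get))
```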

Keywords: human motion recognition, motion representation, Laban Movement Analysis, Discrete Hidden Markov Model

Procedia PDF Downloads 207
5196 Design of DNA Origami Structures Using LAMP Products as a Combined System for the Detection of Extended Spectrum B-Lactamases

Authors: Kalaumari Mayoral-Peña, Ana I. Montejano-Montelongo, Josué Reyes-Muñoz, Gonzalo A. Ortiz-Mancilla, Mayrin Rodríguez-Cruz, Víctor Hernández-Villalobos, Jesús A. Guzmán-López, Santiago García-Jacobo, Iván Licona-Vázquez, Grisel Fierros-Romero, Rosario Flores-Vallejo

Abstract:

The group of B-lactam antibiotics includes some of the most frequently used small drug molecules against bacterial infections. Nevertheless, an alarming decrease in their efficacy has been reported due to the emergence of antibiotic-resistant bacteria. Infections caused by bacteria expressing extended spectrum B-lactamases (ESBLs) are difficult to treat and account for higher morbidity and mortality rates, delayed recovery, and a high economic burden. According to the Global Report on Antimicrobial Resistance Surveillance, it is estimated that mortality due to resistant bacteria will rise to 10 million cases per year worldwide. These facts highlight the importance of developing low-cost and readily accessible methods for detecting drug-resistant ESBL-producing bacteria, to prevent their spread and promote accurate and fast diagnosis. Bacterial detection is commonly done using molecular diagnostic techniques, where PCR stands out for its high performance. However, this technique requires specialized equipment not available everywhere, is time-consuming, and has a high cost. Loop-Mediated Isothermal Amplification (LAMP) is an alternative technique that works at a constant temperature, significantly decreasing the equipment cost. It yields, as a product, double-stranded DNA of several lengths containing repetitions of the target DNA sequence. Although positive and negative results from LAMP can be discriminated by colorimetry, fluorescence, and turbidity, there is still large room for improvement in point-of-care implementation. DNA origami is a technique that allows the formation of 3D nanometric structures by folding a large single-stranded DNA (scaffold) into a determined shape with the help of short DNA sequences (staples), which hybridize with the scaffold. This research aimed to generate DNA origami structures using LAMP products as scaffolds to improve the sensitivity of ESBL detection in point-of-care diagnosis. For this study, the coding sequence of the CTM-X-15 ESBL of E. coli was used to generate the LAMP products. The set of LAMP primers was designed using PrimerExplorerV5. As a result, a target sequence of 200 nucleotides from the CTM-X-15 ESBL was obtained. Afterward, eight different DNA origami structures were designed using the target sequence in SDCadnano and analyzed with CanDo to evaluate the stability of the 3D structures. The designs were constructed minimizing the total number of staples to reduce cost and complexity for point-of-care applications. After analyzing the DNA origami designs, two structures were selected. The first one was a flat zig-zag structure, while the second one had a wall-like shape. Given the sequence repetitions in the scaffold, both could be assembled with only 6 different staples each, ranging from 18 to 80 nucleotides. Simulations of both structures were performed using scaffolds of different sizes, yielding stable structures in all cases. The generation of the LAMP products was tested by colorimetry and electrophoresis. The formation of the DNA structures was analyzed using electrophoresis and colorimetry. The modeling of novel detection methods through bioinformatics tools allows reliable control and prediction of results. To our knowledge, this is the first study that uses LAMP products and DNA origami in combination to detect ESBL-producing bacterial strains, which represents a promising methodology for point-of-care diagnosis.

Keywords: beta-lactamases, antibiotic resistance, DNA origami, isothermal amplification, LAMP technique, molecular diagnosis

Procedia PDF Downloads 222
5195 Structural Health Monitoring Using Fibre Bragg Grating Sensors in Slab and Beams

Authors: Pierre van Tonder, Dinesh Muthoo, Kim Twiname

Abstract:

Many existing and newly built structures are constructed on the basis of the engineer's design and the workmanship of the construction company. However, when considering larger structures where more people are exposed to the building, structural integrity is of great importance for the safety of the occupants (Raghu, 2013). But how can the structural integrity of a building be monitored efficiently and effectively? This is where the fourth industrial revolution steps in: with minimal human interaction, data can be collected, analysed, and stored, and any inconsistencies found in the collected data can be flagged. This is where the Fibre Bragg Grating (FBG) monitoring system is introduced. This paper illustrates how data can be collected and converted to develop stress-strain behaviour and to produce bending moment diagrams for the assessment and prediction of the structure's integrity. Embedded fibre optic sensors were used in this study, fibre Bragg grating sensors in particular. The procedure made use of the shift-in-wavelength demodulation technique and an inscription process based on the phase mask technique. The fibre optic sensors considered in this report were photosensitive and embedded in the slab and beams for data collection and analysis. Two sets of fibre cables were inserted, one purely to collect temperature recordings and the other to collect strain and temperature. The data were collected over a period of time and analysed to produce bending moment diagrams and to make predictions of the structure's integrity. The data indicated that the fibre Bragg grating sensing system proved to be useful and can be used for structural health monitoring in any environment. From the experimental data for the slab and beams, the moments were found to be 64.33 kN.m, 64.35 kN.m, and 45.20 kN.m (from the experimental bending moment diagram), whereas the idealised (Ultimate Limit State) values of 133 kN.m and 226.2 kN.m were obtained. The difference in values gave room for an early warning system, in other words, a reserve capacity of approximately 50% to failure.
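
For context on how recorded Bragg wavelengths become bending moments, the sketch below applies the standard FBG strain relation and simple beam theory. The photo-elastic coefficient, section properties, and wavelength readings are assumed illustrative values, not the instrumented slab and beam data of the study.

```python
# Sketch: convert an FBG wavelength shift to strain and then to a bending
# moment via simple beam theory. All numbers are illustrative assumptions.
# Strain relation: delta_lambda / lambda0 = (1 - p_e) * strain (temperature
# effects compensated by the dedicated temperature fibre), p_e ~ 0.22 for silica.
p_e = 0.22                 # effective photo-elastic coefficient (typical value)
lambda0 = 1550.000e-9      # unstrained Bragg wavelength [m]
measured = 1550.120e-9     # measured Bragg wavelength [m]

strain = (measured - lambda0) / (lambda0 * (1 - p_e))

# Beam theory: curvature = strain / y, moment M = E * I * curvature.
E = 30e9                   # concrete elastic modulus [Pa] (assumed)
I = 2.1e-3                 # second moment of area [m^4] (assumed)
y = 0.25                   # distance of the fibre from the neutral axis [m]

moment = E * I * strain / y
print(f"strain = {strain*1e6:.1f} microstrain, "
      f"bending moment = {moment/1e3:.1f} kN.m")
```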

Keywords: fibre bragg grating, structural health monitoring, fibre optic sensors, beams

Procedia PDF Downloads 139
5194 Efficacy of Gamma Radiation on the Productivity of Bactrocera oleae Gmelin (Diptera: Tephritidae)

Authors: Mehrdad Ahmadi, Mohamad Babaie, Shiva Osouli, Bahareh Salehi, Nadia Kalantaraian

Abstract:

The olive fruit fly, Bactrocera oleae Gmelin (Diptera: Tephritidae), is one of the most serious pests in olive orchards in the olive-growing provinces of Iran. The females lay eggs in green olive fruit, and the larvae hatch inside the fruit, where they feed upon the fruit matter. One of the main ecologically friendly and species-specific systems of pest control is the sterile insect technique (SIT), which is based on the release of large numbers of sterilized insects. The objective of our work was to develop SIT against B. oleae using gamma radiation for laboratory and field trials in Iran. The oviposition of females mated by irradiated males is one of the main parameters used to determine the success of SIT. To determine the sterilizing dose, pupae were exposed to 0 to 160 Gy of gamma radiation. The main factor in SIT is the productivity of females which are mated by irradiated males. The adults emerging from irradiated pupae were mated with untreated adults of the same age by confining them inside transparent cages. The fecundity of irradiated males mated with non-irradiated females decreased with increasing radiation dose. It was observed that the number of eggs and the percentage of egg hatching were significantly (P < 0.05) affected in IM x NF crosses compared with NM x NF crosses in the F1 generation at all doses. Also, the statistical analysis showed a significant difference (P < 0.05) in the mean number of eggs laid between irradiated and non-irradiated females crossed with irradiated males, which suggests that the males were susceptible to gamma radiation. The egg hatching percentage declined markedly with increasing radiation dose of the treated males in the mating trials, which demonstrated that the egg hatch rate was dose dependent. Our results indicated that gamma radiation affects the longevity of irradiated B. oleae larvae (derived from irradiated pupae) and significantly increased their larval duration. The results show that gamma radiation and SIT can be used successfully against the olive fruit fly.

Keywords: fertility, olive fruit fly, radiation, sterile insect technique

Procedia PDF Downloads 196