Search results for: e-content producing algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4961

1781 Modelling of Multi-Agent Systems for the Scheduling of Multi-EV Charging from Power Limited Sources

Authors: Manan’Iarivo Rasolonjanahary, Chris Bingham, Nigel Schofield, Masoud Bazargan

Abstract:

This paper presents the research and application of model predictive scheduled charging of electric vehicles (EVs) subject to a limited available power resource. To focus on algorithmic and operational characteristics, the EV interface to the source is modelled as a battery state equation during the charging operation. The researched methods allow for the priority scheduling of EV charging in a multi-vehicle regime when subject to limited source power availability. Priority attribution for each connected EV is described. The validity of the developed methodology is shown through the simulation of different scenarios of charging operation for multiple connected EVs, including non-scheduled and scheduled operation with various numbers of vehicles. The performance of the developed algorithms is also reported, with recommendations for the choice of suitable parameters.
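
The priority scheduling described above can be sketched in a few lines. This is a minimal illustration, not the authors' model predictive controller: the function names, the fixed per-EV charger rate, and the linear state-of-charge update are illustrative assumptions.

```python
# Hypothetical sketch: allocate a limited power budget among charging EVs,
# highest priority first. Not the paper's MPC formulation.

def schedule_charging(socs, priorities, p_max, p_ev=7.0):
    """Allocate at most p_max kW among EVs, highest priority first.
    Returns the power (kW) granted to each EV for this time step."""
    order = sorted(range(len(socs)), key=lambda i: -priorities[i])
    grants = [0.0] * len(socs)
    remaining = p_max
    for i in order:
        if socs[i] >= 1.0:              # already full: skip
            continue
        p = min(p_ev, remaining)         # grant up to the charger rate
        if p > 0:
            grants[i] = p
            remaining -= p
    return grants

def step_soc(soc, power, dt_h=0.25, capacity_kwh=40.0):
    """Simple battery state equation: SoC grows linearly with energy in."""
    return min(1.0, soc + power * dt_h / capacity_kwh)

# Three EVs, 10 kW budget: the top-priority EV gets the full 7 kW rate,
# the next gets the remaining 3 kW, the last gets nothing this step.
grants = schedule_charging([0.2, 0.5, 0.9], [3, 2, 1], 10.0)
soc_next = step_soc(0.2, grants[0])
```

A real scheduler would re-run this allocation every time step as priorities and states of charge evolve.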

Keywords: model predictive control, non-scheduled, power limited sources, scheduled and stop-start battery charging

Procedia PDF Downloads 158
1780 Lithological Mapping and Iron Deposits Identification in El-Bahariya Depression, Western Desert, Egypt, Using Remote Sensing Data Analysis

Authors: Safaa M. Hassan, Safwat S. Gabr, Mohamed F. Sadek

Abstract:

This study addresses lithological mapping and iron oxide detection in the old mine areas of the El-Bahariya Depression, Western Desert, using ASTER and Landsat-8 remote sensing data. Four old iron ore occurrences, namely the El-Gedida, El-Haraa, Ghurabi, and Nasir mine areas, are found in the El-Bahariya area. This study aims to find new high-potential areas for iron mineralization around the El-Bahariya Depression. Image processing methods such as principal component analysis (PCA) and band ratio images (b4/b5, b5/b6, b6/b7, and 4/2, 6/7, band 6) were used for lithological identification and mapping, including the iron content in the investigated area. ASTER and Landsat-8 visible and short-wave infrared data were found helpful in mapping the ferruginous sandstones, iron oxides, and clay minerals in and around the old mine areas of the El-Bahariya Depression. The Landsat-8 band ratios and principal components of this study showed the distribution of the lithological units well, especially the ferruginous sandstones and iron zones (hematite and limonite), and detected probable high-potential areas for future iron mineralization, proving the ability of Landsat-8 and ASTER data to map these features. Minimum Noise Fraction (MNF), Mixture Tuned Matched Filtering (MTMF), and pixel purity index methods, as well as the Spectral Angle Mapper classifier algorithm, successfully discriminated the hematite and limonite content within the iron zones of the study area. Various ASTER image spectra and ASD field spectra of hematite, limonite, and the surrounding rocks were compared and found to be consistent in terms of the presence of absorption features in the range from 1.95 to 2.3 μm for hematite and limonite. The pixel purity index algorithm and two sub-pixel spectral methods, namely Mixture Tuned Matched Filtering (MTMF) and Matched Filtering (MF), were applied to ASTER bands to delineate zones rich in iron oxides (hematite and limonite) within the rock units.
The results were validated in the field by comparing image spectra of spectrally anomalous zones with USGS resampled laboratory spectra of hematite and limonite samples obtained from ASD measurements. A number of iron-oxide-rich zones, in addition to the main surface exposures of the El-Gedida mine, were confirmed in the field. The proposed method is a successful application of spectral mapping of iron oxide deposits in the exposed rock units (i.e., ferruginous sandstone), and the present approach of combined ASTER and ASD hyperspectral data processing can be used to delineate iron-rich zones within similar geological provinces in other parts of the world.
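
The band-ratio step mentioned in the abstract is simple to illustrate. The sketch below uses the Landsat-8 ratios quoted above (4/2 and 6/7); the random arrays stand in for calibrated reflectance grids, and the 95th-percentile threshold is an illustrative choice, not the study's.

```python
import numpy as np

# Illustrative band-ratio computation for iron-oxide mapping. Real inputs
# would be calibrated Landsat-8 reflectance bands, not random data.
rng = np.random.default_rng(0)
bands = {b: rng.uniform(0.05, 0.6, (100, 100)) for b in (2, 4, 6, 7)}

def band_ratio(num, den, eps=1e-6):
    """Pixel-wise ratio of two bands, guarded against division by zero."""
    return bands[num] / (bands[den] + eps)

iron_oxide = band_ratio(4, 2)   # b4/b2 highlights ferric-iron absorption
clay_like  = band_ratio(6, 7)   # b6/b7 highlights hydroxyl-bearing minerals

# Flag the brightest 5% of the iron-oxide ratio image as candidate pixels
threshold = np.percentile(iron_oxide, 95)
candidates = iron_oxide > threshold
```

In practice the candidate mask would then be intersected with the lithological units mapped by PCA and the classifiers named above.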

Keywords: Landsat-8, ASTER, lithological mapping, iron exploration, western desert

Procedia PDF Downloads 146
1779 Improved Clothing Durability as a Lifespan Extension Strategy: A Framework for Measuring Clothing Durability

Authors: Kate E Morris, Mark Sumner, Mark Taylor, Amanda Joynes, Yue Guo

Abstract:

Garment durability, which encompasses physical and emotional factors, has been identified as a critical ingredient in producing clothing with longer lifespans, battling overconsumption, and thereby tackling the catastrophic effects of climate change. The Eco-design for Sustainable Products Regulation (ESPR) and Extended Producer Responsibility (EPR) schemes, which will be implemented across Europe and the UK, might require brands to declare a garment's durability credentials in order to sell in those markets. There is currently no consistent method of measuring the overall durability of a garment. Measuring the physical durability of garments is difficult, and current assessment methods either lack objectivity and reliability or do not reflect the complex nature of durability for different garment categories. This study presents a novel and reproducible methodology for testing and ranking the absolute durability of five commercially available garment types: formal trousers, casual trousers, denim jeans, casual leggings, and underwear. A total of 112 garments from 21 UK brands were assessed. Due to variations in end use, different factors were considered across the garment categories when evaluating durability. A physical testing protocol, tailored to each category, was created to dictate the test results needed to measure the absolute durability of the garments. Multiple durability factors were used to modulate the ranking, in contrast to previous studies, which reported only single factors. The garments in this study were donated by the signatories of the Waste and Resources Action Programme's (WRAP) Textiles 2030 initiative as part of their strategy to reduce the environmental impact of UK fashion. This methodology presents a consistent system for brands and policymakers to measure and rank the physical durability of various garment types.
Furthermore, with such a methodology, the durability of garments can be measured, and new standards for improving durability can be created to enhance utilisation and improve the sustainability of the clothing on the market.
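
Combining multiple durability factors into one ranking, as the abstract describes, can be sketched with a simple weighted index. The factor names, scores, and weights below are illustrative assumptions, not the study's protocol or data.

```python
# Hypothetical sketch of a multi-factor durability ranking: min-max
# normalise each test score, then combine with weights. Illustrative only.

def rank_garments(results, weights):
    """results: {garment: {factor: score}}; higher score = more durable.
    Returns garment names sorted from most to least durable."""
    factors = list(weights)
    lo = {f: min(r[f] for r in results.values()) for f in factors}
    hi = {f: max(r[f] for r in results.values()) for f in factors}
    def index(r):
        return sum(weights[f] * (r[f] - lo[f]) / ((hi[f] - lo[f]) or 1)
                   for f in factors)
    return sorted(results, key=lambda g: index(results[g]), reverse=True)

# Made-up denim test results for three hypothetical brands
jeans = {
    "brand_A": {"abrasion_cycles": 25000, "seam_strength_N": 320, "washes": 30},
    "brand_B": {"abrasion_cycles": 18000, "seam_strength_N": 350, "washes": 45},
    "brand_C": {"abrasion_cycles": 12000, "seam_strength_N": 250, "washes": 20},
}
weights = {"abrasion_cycles": 0.4, "seam_strength_N": 0.3, "washes": 0.3}
ranking = rank_garments(jeans, weights)
```

The point of the normalisation is that factors with very different units (cycles, newtons, wash counts) contribute on a comparable scale before weighting.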

Keywords: circularity, durability, garment testing, ranking

Procedia PDF Downloads 39
1778 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors

Authors: Susana Aragoneses Garrido

Abstract:

Every day, roads are the scene of numerous accidents involving vehicles, producing thousands of deaths and serious injuries all over the world. Investigations have revealed that human factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (whether from external or internal aspects of the vehicle), which is considered a human factor, is a serious and emerging risk to road safety. Consequently, further analysis of this issue is essential due to its significance for today's society. The objectives of this investigation are the detection and assessment of HF in order to provide solutions (including better vehicle design) that might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyse the major causes involved in road accidents. Third, a simulation across different routes and vehicles is presented, and the causes related to HF are assessed by Failure Mode and Effects Analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA). Additionally, the Siemens PLM Jack tool is used to evaluate the human-factor causes and support the redesign of the vehicles. Finally, improvements in car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations, and the evaluations confirm that accidents are an important issue in today's society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global risk analysis of vehicle accidents.
Moreover, the evaluation of the different car models using the RULA method and the Siemens PLM Jack tool proves the importance of properly adjusting the driver's seat in order to avoid harmful postures and, therefore, distractions. For this reason, a car redesign is proposed so that the driver acquires the optimum position, consequently reducing the role of human factors in road accidents.
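
The FMEA step mentioned above ranks causes by a risk priority number (RPN), the product of severity, occurrence, and detection ratings. The failure modes and ratings below are made up for illustration, not taken from the study.

```python
# Sketch of FMEA scoring for human-factor causes. Each failure mode gets
# severity, occurrence, and detection ratings (1-10); their product, the
# risk priority number (RPN), ranks the causes. Illustrative values only.

failure_modes = {
    "phone use while driving":      (9, 7, 6),
    "misadjusted driver's seat":    (6, 8, 4),
    "glare from dashboard display": (5, 4, 5),
}

rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn, key=rpn.get, reverse=True)
```

The highest-RPN causes are the ones redesign effort would target first.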

Keywords: vehicle analysis, assessment, ergonomics, car redesign

Procedia PDF Downloads 336
1777 Persistent Homology of Convection Cycles in Network Flows

Authors: Minh Quang Le, Dane Taylor

Abstract:

Convection is a well-studied topic in fluid dynamics, yet it is less understood in the context of network flows. Here, we incorporate techniques from topological data analysis (namely, persistent homology) to automate the detection and characterization of convective/cyclic/chiral flows over networks, particularly those that arise for irreversible Markov chains (MCs). As two applications, we study convection cycles arising under the PageRank algorithm, and we investigate chiral edge flows for a stochastic model of a bi-monomer's configuration dynamics. Our experiments highlight how system parameters (e.g., the teleportation rate for PageRank and the transition rates of external and internal state changes for a monomer) can act as homology regularizers of convection, which we summarize with persistence barcodes and homological bifurcation diagrams. Our approach establishes a new connection between the study of convection cycles and homology, the branch of mathematics that formally studies cycles, with diverse potential applications throughout the sciences and engineering.
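
The quantity that makes convection visible for a Markov chain is the net stationary flow across each edge, which vanishes everywhere for a reversible chain. The sketch below computes it for a small irreversible chain; the persistent-homology filtration built on top of these flows is omitted, and the example matrix is an assumption for illustration.

```python
import numpy as np

# Net stationary edge flow pi_i P_ij - pi_j P_ji for a Markov chain.
# Nonzero values trace convection cycles; a reversible chain gives zero.

P = np.array([[0.0, 0.7, 0.3],   # an irreversible (non-reversible) 3-state chain
              [0.3, 0.0, 0.7],
              [0.7, 0.3, 0.0]])

# Stationary distribution: left eigenvector of P for eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

flow = pi[:, None] * P          # stationary edge flows pi_i P_ij
net = flow - flow.T             # antisymmetric part: the convection
```

Here the chain is doubly stochastic, so pi is uniform and the net flow circulates around the 3-cycle with magnitude (0.7 - 0.3)/3 on each edge.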

Keywords: homology, persistent homology, Markov chains, convection cycles, filtration

Procedia PDF Downloads 139
1776 Designing the Management Plan for Health Care (Medical) Wastes in the Cities of Semnan, Mahdishahr and Shahmirzad

Authors: Rasouli Divkalaee Zeinab, Kalteh Safa, Roudbari Aliakbar

Abstract:

Introduction: Medical waste can lead to the generation and transmission of many infectious and contagious diseases due to the presence of pathogenic agents, necessitating special management for its collection, decontamination, and final disposal. This study aimed to design a centralized health care (medical) waste management program for the cities of Semnan, Mahdishahr, and Shahmirzad. Methods: This descriptive-analytical study was conducted over six months in the cities of Semnan, Mahdishahr, and Shahmirzad. The quantitative and qualitative characteristics of the generated wastes were determined by taking samples from all medical waste production centers. The equipment, devices, and machines required for separate collection of the waste from the production centers and for its subsequent decontamination were then estimated. Next, the investment costs, operating costs, and working capital required for collection, decontamination, and final disposal of the wastes were determined. Finally, the fee payable for proper waste management by each category of medical waste-producing center was determined. Results: A total of 1,021 kilograms of medical waste are produced daily in the cities of Semnan, Mahdishahr, and Shahmirzad. It was estimated that a 1000-liter autoclave, a medical waste collection vehicle, four 60-liter bins, four 120-liter bins, and four 1200-liter bins would be required to implement the plan. The total annual medical waste management cost for Semnan City was estimated at 23,283,903,720 Iranian Rials. Conclusion: The study results showed that establishing a proper management system for the medical wastes generated in the three studied cities will cost the medical centers between 334,280 and 1,253,715 Iranian Rials in fees.
The findings of this study provide comprehensive data on medical waste from the point of generation to the landfill site, which is vital for the government and the private sector.
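
A rough back-of-envelope check of the figures quoted above: spreading the annual cost over the annual waste mass gives a per-kilogram cost. Note the assumption here that the three-city daily total all falls under the Semnan cost figure, so this is only an order-of-magnitude sketch.

```python
# Back-of-envelope arithmetic on the abstract's figures (Iranian Rials).
# Rough: the annual cost is quoted for Semnan City, while the daily mass
# covers all three cities, so treat the result as an order of magnitude.

daily_waste_kg = 1021
annual_cost_irr = 23_283_903_720

annual_waste_kg = daily_waste_kg * 365
cost_per_kg = annual_cost_irr / annual_waste_kg   # on the order of 6e4 IRR/kg
```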

Keywords: clinics, decontamination, management, medical waste

Procedia PDF Downloads 79
1775 A Model Based Metaheuristic for Hybrid Hierarchical Community Structure in Social Networks

Authors: Radhia Toujani, Jalel Akaichi

Abstract:

In recent years, the study of community detection in social networks has received great attention. The hierarchical structure of a network can lead to convergence to a locally optimal community structure. In this paper, we aim to avoid this local optimum with the introduced hybrid hierarchical method. To achieve this purpose, we present an objective function that incorporates a modularity measure based on structural and semantic similarity, and a metaheuristic, namely a bee colony algorithm, to optimize this objective function at both the divisive and agglomerative hierarchical levels. In order to assess the efficiency and accuracy of the introduced hybrid bee colony model, we perform an extensive experimental evaluation on both synthetic and real networks.
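
The structural-modularity term of an objective function like the one above can be sketched directly from its definition, Q = (1/2m) Σᵢⱼ [Aᵢⱼ − kᵢkⱼ/2m] δ(cᵢ, cⱼ). The semantic-similarity term and the bee colony search itself are omitted; the toy graph is an assumption for illustration.

```python
import numpy as np

# Newman modularity of a partition, from the adjacency matrix. Only the
# structural term of the paper's objective; semantic similarity omitted.

def modularity(A, communities):
    """A: symmetric adjacency matrix; communities: one label per node."""
    A = np.asarray(A, dtype=float)
    k = A.sum(axis=1)                # node degrees
    two_m = A.sum()                  # 2 * number of edges
    labels = np.asarray(communities)
    same = labels[:, None] == labels[None, :]
    return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

# Two triangles joined by one edge; the natural split scores Q = 5/14
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]])
q = modularity(A, [0, 0, 0, 1, 1, 1])
```

A metaheuristic such as the bee colony algorithm would search over partitions (at each hierarchical level) for the one maximising this score plus the semantic term.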

Keywords: social network, community detection, agglomerative hierarchical clustering, divisive hierarchical clustering, similarity, modularity, metaheuristic, bee colony

Procedia PDF Downloads 380
1774 Effects of Sintering Temperature on Microstructure and Mechanical Properties of Nanostructured Ni-17Cr Alloy

Authors: B. J. Babalola, M. B. Shongwe

Abstract:

The spark plasma sintering technique is a novel processing method that limits grain growth and produces a highly dense variety of materials: alloys, superalloys, and carbides, to mention a few. However, the initial particle size and the spark plasma sintering parameters are factors that influence the grain growth and mechanical properties of sintered materials. Ni-Cr alloys are regarded as among the most promising alloys for aerospace turbine blades, owing to the fact that they meet the basic requirements of desirable mechanical strength at high temperatures and good resistance to oxidation. The conventional method of producing this alloy often results in excessive grain growth and porosity levels that are detrimental to its mechanical properties. The effect of sintering temperature on the microstructure and mechanical properties of the nanostructured Ni-17Cr alloy was evaluated. Nickel and chromium powders were milled independently by high-energy ball milling for 30 hours at a milling speed of 400 rev/min and a ball-to-powder ratio (BPR) of 10:1. The milled powders were mixed in the proportions 83 wt% nickel and 17 wt% chromium. The mixture was sintered at temperatures of 800°C, 900°C, 1000°C, 1100°C, and 1200°C. Structural characteristics such as porosity, grain size, fracture surface, and hardness were analyzed by scanning electron microscopy, X-ray diffraction, Archimedes densitometry, and micro-hardness testing. The results indicated an increase in the densification and hardness of the alloy as the temperature increased. The residual porosity of the alloy decreased with sintering temperature while, in contrast, the grain size increased. The study of the mechanical properties, including hardness and densification, shows that optimum properties were obtained at a sintering temperature of 1100°C. The advantages of the high sinterability of the Ni-17Cr alloy using milled powders, together with microstructural details, are discussed.
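
The Archimedes densitometry step used to report densification reduces to a short calculation. The weighings below are hypothetical, and the rule-of-mixtures theoretical density for Ni-17Cr is an illustrative assumption, not the study's reference value.

```python
# Sketch of Archimedes densitometry for relative density / porosity.
# Weighings are hypothetical; densities in g/cm^3.

RHO_NI, RHO_CR, RHO_WATER = 8.908, 7.19, 0.9982

def archimedes_density(mass_air_g, mass_water_g, rho_fluid=RHO_WATER):
    """Density from a dry weighing and a weighing suspended in fluid."""
    return mass_air_g * rho_fluid / (mass_air_g - mass_water_g)

def theoretical_density(wt_cr=0.17):
    # Inverse rule of mixtures for a two-phase mixture by weight fraction
    return 1.0 / (wt_cr / RHO_CR + (1 - wt_cr) / RHO_NI)

rho = archimedes_density(5.000, 4.400)        # hypothetical weighings
relative_density = rho / theoretical_density()
porosity = 1 - relative_density               # residual porosity fraction
```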

Keywords: densification, grain growth, milling, nanostructured materials, sintering temperature

Procedia PDF Downloads 402
1773 Exploring Marine Bacteria in the Arabian Gulf Region for Antimicrobial Metabolites

Authors: Julie Connelly, Tanvi Toprani, Xin Xie, Dhinoth Kumar Bangarusamy, Kris C. Gunsalus

Abstract:

The overuse of antibiotics worldwide has contributed to the development of multi-drug resistant (MDR) pathogenic bacterial strains. There is an increasing urgency to discover antibiotics to combat MDR pathogens. The microbiome of the Arabian Gulf is a largely unexplored and potentially rich source of novel bioactive compounds. Microbes that inhabit the Abu Dhabi coastal regions are adapted to extreme environments with high salinity, hot temperatures, large temperature fluctuations, and acute exposure to solar energy. The microbes native to this region may produce unique metabolites with therapeutic potential as antibiotics and antifungals. We have isolated 200 pure bacterial strains from mangrove sediments, cyanobacterial mats, and coral reefs of the Abu Dhabi region. In this project, we aim to screen these marine bacterial strains to identify antibiotics, in particular undocumented compounds that show activity against existing antibiotic-resistant strains. We have acquired the ESKAPE pathogen panel, which consists of six antibiotic-resistant gram-positive and gram-negative bacterial pathogens that collectively cause most clinical infections. Our initial primary screen, using a colony-picking co-culture assay, has identified several candidate marine strains producing potential antibiotic compounds. We will next apply different assays, including disk-diffusion and broth turbidity growth assays, to confirm these results. This will be followed by bioactivity-guided purification and characterization of target compounds from scaled-up cultures of the candidate strains, using SPE fractionation, HPLC fractionation, LC-MS, and NMR. For antimicrobial compounds with unknown structures, our final goal is to investigate their mode of action by identifying the molecular target.

Keywords: marine bacteria, natural products, drug discovery, ESKAPE panel

Procedia PDF Downloads 75
1772 Prediction of Solidification Behavior of Al Alloy in a Cube Mold Cavity

Authors: N. P. Yadav, Deepti Verma

Abstract:

This paper focuses on mathematical modeling of the solidification of an Al alloy in a cube mould cavity to study the solidification behavior of the casting process. A parametric investigation of the solidification process inside the cavity was performed using a computational solidification/melting model coupled with a volume-of-fluid (VOF) model. An implicit filling algorithm is used in this study to follow the overall process from the filling stage to solidification in a model metal casting process. The model is validated against past studies under the same conditions. The solidification process is analyzed, including the effects of the pouring velocity and temperature of the liquid metal, the wall temperature, natural convection from the wall, and the geometry of the cavity. These studies show the possibility of various defects arising during the solidification process.
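
The solidification side of such a model can be illustrated with a 1D enthalpy-method toy problem: a liquid column losing heat to a cold wall, with latent heat absorbed across the melting point. This is a stand-in for the coupled VOF/solidification model, not the paper's formulation, and the material constants are round numbers rather than Al-alloy data.

```python
import numpy as np

# 1D enthalpy-method sketch of solidification against a cold wall.
# Explicit diffusion update; illustrative constants, not Al-alloy data.

length, n, dt, steps = 0.1, 50, 0.01, 2000       # m, cells, s, -
dx = length / n
rho, cp, k, latent = 2700.0, 900.0, 100.0, 4e5   # SI units
T_melt, T_pour, T_wall = 900.0, 950.0, 300.0     # K

H = np.full(n, cp * T_pour + latent)             # enthalpy per kg, all liquid

def temperature(H):
    """Invert enthalpy to temperature: solid / mushy (at T_melt) / liquid."""
    T = np.where(H < cp * T_melt, H / cp, T_melt)
    liquid = H > cp * T_melt + latent
    return np.where(liquid, (H - latent) / cp, T)

for _ in range(steps):
    T = temperature(H)
    # Fixed-temperature wall at the left, insulated far end at the right
    Tpad = np.concatenate(([T_wall], T, [T[-1]]))
    H = H + dt * k / (rho * dx**2) * (Tpad[2:] - 2 * T + Tpad[:-2])

frac_solid = float(np.mean(H < cp * T_melt))     # fully solid cell fraction
```

The explicit step is stable here because dt·α/dx² ≈ 0.1 < 0.5; the solid front advances from the wall while the far end stays near the pouring temperature.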

Keywords: buoyancy driven flow, natural convection driven flow, residual flow, secondary flow, volume of fluid

Procedia PDF Downloads 417
1771 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines

Authors: Alexander Guzman Urbina, Atsushi Aoyama

Abstract:

The sustainability of the traditional technologies employed in energy and chemical infrastructure poses a big challenge for our society. When making decisions related to the safety of industrial infrastructure, accidental risk values become relevant points for discussion. However, the challenge lies in the reliability of the models employed to obtain the risk data, since such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome these problems are built using artificial intelligence (AI), and more specifically hybrid systems such as neuro-fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. The adaptation of traditional energy and chemical technologies to the effects of climate change in sensitive environments also represents a critical concern for safety and risk management. The social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by a lack of knowledge about the risks. In addition, since the industrial sector is critical infrastructure with a large impact on the economy in case of failure, industrial safety has become a critical issue for society. Regarding this safety concern, pipeline operators and regulators have been performing risk assessments in attempts to accurately evaluate the probabilities of failure of the infrastructure and the consequences associated with those failures.
However, estimating accidental risks in critical infrastructure involves substantial effort and cost due to the number of variables involved, the complexity, and the lack of information. This paper therefore aims to introduce a well-trained algorithm for risk assessment using deep learning, capable of dealing efficiently with this complexity and uncertainty. The advantage of deep learning on near-miss accident data is that it can be employed in risk assessment as an efficient engineering tool to treat the uncertainty of risk values in complex environments. The basic idea of a near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating how human experts score risks and set tolerance levels. In summary, the method involves a regression analysis called the group method of data handling (GMDH), which determines the optimal configuration of the risk assessment model and its parameters using polynomial theory.
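
The GMDH idea named above — fit small polynomial models on pairs of inputs and keep the ones that validate best — can be sketched in a minimal one-layer form. This is a toy stand-in for the authors' pipeline: the synthetic data, the single layer, and the quadratic basis are all illustrative assumptions.

```python
import numpy as np
from itertools import combinations

# Minimal one-layer GMDH sketch: fit a quadratic polynomial to every pair
# of inputs, keep the pair with the lowest validation error.

def poly_features(x1, x2):
    """Quadratic polynomial basis in two variables."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])

def gmdh_layer(X_tr, y_tr, X_va, y_va):
    """Return (validation MSE, input pair, coefficients) of the best pair."""
    best = None
    for i, j in combinations(range(X_tr.shape[1]), 2):
        A = poly_features(X_tr[:, i], X_tr[:, j])
        coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
        pred = poly_features(X_va[:, i], X_va[:, j]) @ coef
        err = np.mean((pred - y_va) ** 2)
        if best is None or err < best[0]:
            best = (err, (i, j), coef)
    return best

# Synthetic risk-like target that depends on inputs 0 and 2 only
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = 2.0 + X[:, 0] * X[:, 2] + 0.01 * rng.normal(size=200)
err, pair, coef = gmdh_layer(X[:150], y[:150], X[150:], y[150:])
```

A full GMDH stacks such layers, feeding the surviving polynomial outputs forward as new inputs until validation error stops improving.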

Keywords: deep learning, risk assessment, neuro fuzzy, pipelines

Procedia PDF Downloads 292
1770 Message Passing Neural Network (MPNN) Approach to Multiphase Diffusion in Reservoirs for Well Interconnection Assessments

Authors: Margarita Mayoral-Villa, J. Klapp, L. Di G. Sigalotti, J. E. V. Guzmán

Abstract:

Automated learning techniques are widely applied in the energy sector to address challenging problems from a practical point of view. To this end, we discuss the implementation of a message passing algorithm (MPNN) within a graph neural network (GNN) to leverage the neighborhood of a set of nodes during the aggregation process. This approach enables the characterization of multiphase diffusion processes in the reservoir, such that the flow paths underlying the interconnections between multiple wells may be inferred from previously available data on flow rates and bottomhole pressures. The results thus obtained compare favorably with the predictions produced by reduced-order capacitance-resistance models (CRM) and suggest the potential of MPNNs to enhance the robustness of the forecasts while improving computational efficiency.
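
A single message-passing aggregation step of the kind described above can be written in plain numpy. The adjacency matrix, feature dimension, and random weight matrices are placeholders for illustration; a trained MPNN would learn the weights from the rate and pressure data.

```python
import numpy as np

# One message-passing step over a well-connectivity graph: each node sums
# transformed features from its neighbours, then updates its own state.
# Weights are random placeholders, not a trained model.

rng = np.random.default_rng(0)
n_wells, d = 5, 8
A = np.array([[0, 1, 1, 0, 0],       # assumed well adjacency (symmetric)
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
h = rng.normal(size=(n_wells, d))    # node features, e.g. rates/pressures
W_msg = rng.normal(size=(d, d))      # message transform (learned in practice)
W_upd = rng.normal(size=(d, d))      # update transform (learned in practice)

messages = A @ (h @ W_msg)           # sum of neighbours' messages per node
h_next = np.tanh(h @ W_upd + messages)
```

Stacking several such steps lets information propagate along multi-hop flow paths between wells, which is what makes interconnectivity inferable.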

Keywords: multiphase diffusion, message passing neural network, well interconnection, interwell connectivity, graph neural network, capacitance-resistance models

Procedia PDF Downloads 149
1769 Location Detection of Vehicular Accident Using Global Navigation Satellite Systems/Inertial Measurement Units Navigator

Authors: Neda Navidi, Rene Jr. Landry

Abstract:

Vehicle tracking and accident recognition are of interest to many industries, such as insurance and vehicle rental companies. The main goal of this paper is to detect the location of a car accident by combining different methods: Global Navigation Satellite Systems/Inertial Measurement Units (GNSS/IMU)-based navigation and vehicle accident detection algorithms. These are driven by a set of raw measurements obtained from a purpose-designed integrated black box using GNSS and inertial sensors. A further contribution of this paper is an accident detection algorithm that uses the vehicle's jerk to identify the position of the accident. In fact, the results show that, even in GNSS blockage areas, the position of the accident can be detected by GNSS/INS integration with a 50% improvement compared to stand-alone GNSS.
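
The jerk-based detection idea reduces to differentiating the acceleration samples and flagging excursions beyond a threshold. The sketch below is illustrative: the 400 m/s³ limit, the sample rate, and the synthetic signal are assumptions, not the paper's tuning.

```python
import numpy as np

# Sketch of jerk-based accident detection: differentiate acceleration
# samples and flag the first over-limit jerk. Threshold is illustrative.

def detect_accident(accel, dt=0.01, jerk_limit=400.0):
    """accel: acceleration magnitude samples (m/s^2) at interval dt (s).
    Returns the index of the first over-limit jerk sample, or None."""
    jerk = np.diff(accel) / dt
    hits = np.flatnonzero(np.abs(jerk) > jerk_limit)
    return int(hits[0]) if hits.size else None

t = np.arange(0, 1, 0.01)
accel = 0.3 * np.sin(2 * np.pi * t)    # gentle normal-driving signal
accel[60] += 30.0                      # sudden 30 m/s^2 spike at t = 0.6 s
idx = detect_accident(accel)           # index just before the spike
```

In the paper's setting the flagged sample index would then be mapped to a position via the GNSS/INS-integrated trajectory.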

Keywords: driver behavior monitoring, integration, IMU, GNSS, tracking

Procedia PDF Downloads 237
1768 Payment Subsidies for Environmentally-Friendly Agriculture on Rice Production in Japan

Authors: Danielle Katrina Santos, Koji Shimada

Abstract:

Environmentally-friendly agriculture has been promoted for over two decades as a response to the environmental challenges brought by climate change and biodiversity loss. Given its location above the equator, it is possible that Japan may benefit from future climate change; yet Japan is also one of the few developed countries located in the Asian monsoon climate region, making it vulnerable to the impacts of climate change. In this regard, the Japanese government has initiated policies to adapt to the adverse effects of climate change through the promotion and popularization of environmentally-friendly farming practices. This study aims to determine profit efficiency among environmentally-friendly rice farmers in Shiga Prefecture using the stochastic frontier approach. A cross-sectional survey of 66 farmers from the top rice-producing cities was conducted using a structured questionnaire. Results showed that the gross farm income of environmentally-friendly rice farmers was higher by JPY 316,223/ha. Production costs were also higher among environmentally-friendly rice farmers, especially labor costs, which accounted for 32% of the total rice production cost. The resulting net farm income of environmentally-friendly rice farmers was higher by only JPY 18,044/ha. Results from the stochastic frontier analysis further showed that the profit efficiency of conventional farmers was only 69%, compared to environmentally-friendly rice farmers, who had a profit efficiency of 76%. Furthermore, participation in environmentally-friendly agriculture, other types of subsidy, educational level, and farm size were significant factors positively influencing profit efficiency. The study concluded that substituting environmentally-friendly agriculture for conventional rice farming would increase profit efficiency due to the direct payment subsidy and price premium received.
Direct government policies should strengthen the popularization of environmentally-friendly agriculture to increase the production of environmentally-friendly products and reduce the pollution load on the Lake Biwa ecosystem.
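
The frontier idea behind the efficiency figures can be illustrated with the simpler corrected-OLS (COLS) approximation: fit log profit on log inputs, shift the fitted line up so it bounds the data, and read efficiency as observed over frontier profit. The paper's stochastic frontier additionally splits the error into noise and inefficiency; this sketch, with synthetic data, does not.

```python
import numpy as np

# Corrected-OLS (COLS) sketch of profit efficiency on synthetic data.
# A deterministic-frontier approximation, not the paper's stochastic model.

rng = np.random.default_rng(2)
n = 66
log_inputs = rng.normal(1.0, 0.3, size=(n, 2))       # e.g. land, labour
inefficiency = rng.exponential(0.2, size=n)          # one-sided shortfall
log_profit = 1.5 + log_inputs @ np.array([0.6, 0.3]) - inefficiency

X = np.column_stack([np.ones(n), log_inputs])
beta, *_ = np.linalg.lstsq(X, log_profit, rcond=None)
resid = log_profit - X @ beta
frontier = X @ beta + resid.max()        # shift so no farm lies above it
efficiency = np.exp(log_profit - frontier)   # in (0, 1]; 1 = fully efficient
```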

Keywords: profit efficiency, environmentally-friendly agriculture, rice farmers, direct payment subsidies

Procedia PDF Downloads 147
1767 Packaging in the Design Synthesis of Novel Aircraft Configuration

Authors: Paul Okonkwo, Howard Smith

Abstract:

A study was conducted to estimate the size of the cabin and major aircraft components, and to detect and avoid interference between internally placed components and the external surface, during the conceptual design synthesis and optimisation used to explore the design space of a blended wing body (BWB). Component sizing follows the Bradley cabin sizing and rubber engine scaling procedures to size the cabin and engine, respectively. The interference detection and avoidance algorithm relies on the ability of the class shape transformation (CST) parameterisation technique to generate polynomial functions of the surfaces of a BWB configuration from the sizes of the cabin and internal objects using few variables. Interference detection is essential in the packaging of a non-conventional configuration like the BWB because of its non-uniform airfoil-shaped sections and the resulting varying internal space. This unique configuration increases the need for a methodology that prevents objects from being placed in locations that do not sufficiently enclose them within the geometry.
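
The containment test enabled by CST parameterisation can be sketched directly: because the surface is an explicit class function times a Bernstein-polynomial shape function, checking an object reduces to comparing its corner heights with the surface. The coefficients and corner points below are illustrative, not a BWB section.

```python
import numpy as np
from math import comb

# Sketch of a CST surface and a point-containment check. The class function
# x^0.5 (1-x) gives a round nose and sharp tail; coefficients are made up.

def cst_thickness(x, coeffs, n1=0.5, n2=1.0):
    """Class-shape transform: C(x) * S(x) for chordwise x in [0, 1]."""
    x = np.asarray(x, dtype=float)
    n = len(coeffs) - 1
    shape = sum(c * comb(n, i) * x**i * (1 - x)**(n - i)
                for i, c in enumerate(coeffs))     # Bernstein shape function
    return x**n1 * (1 - x)**n2 * shape

def fits_inside(corners, coeffs):
    """corners: (x, z) points of an object; all must sit under the surface."""
    xs, zs = np.asarray(corners, dtype=float).T
    return bool(np.all(zs <= cst_thickness(xs, coeffs)))

coeffs = [0.25, 0.30, 0.25]                        # illustrative section
cabin_ok = fits_inside([(0.3, 0.08), (0.6, 0.08)], coeffs)   # fits
tall_box = fits_inside([(0.9, 0.10)], coeffs)                # too tall aft
```

Because the surface is a smooth function of few variables, this check stays cheap enough to sit inside an optimisation loop.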

Keywords: packaging, optimisation, BWB, parameterisation, aircraft conceptual design

Procedia PDF Downloads 463
1766 Processing and Characterization of Aluminum Matrix Composite Reinforced with Amorphous Zr₃₇.₅Cu₁₈.₆₇Al₄₃.₉₈ Phase

Authors: P. Abachi, S. Karami, K. Purazrang

Abstract:

Amorphous reinforcements (metallic glasses) can be considered promising options for reinforcing lightweight aluminum and its alloys. By using the proper type of reinforcement, one can overcome drawbacks such as interfacial de-cohesion and the undesirable reactions that can occur at the ceramic particle/metallic matrix interface. In this work, a Zr-based amorphous phase was produced via mechanical milling of elemental powders. The glass formability range was predicted based on the Miedema semi-empirical model and diagrams of the formation enthalpies and/or Gibbs free energies of the Zr-Cu amorphous phase in comparison with the crystalline phase. The composite was produced from the powder mixture of aluminum and metallic glass by spark plasma sintering (SPS) at a temperature slightly above the glass transition temperature Tg of the metallic glass particles. The selected temperature and rapid sintering route were suitable for consolidating the aluminum matrix without crystallization of the amorphous phase. To characterize amorphous phase formation, X-ray diffraction (XRD) phase analyses were performed on the powder mixture after specified intervals of milling. The microstructure of the composite was studied by optical and scanning electron microscopy (SEM). Uniaxial compression tests were carried out on composite specimens 4 mm long with a cross-section of 2 × 2 mm². The micrographs indicated an appropriate reinforcement distribution in the metallic matrix. A comparison of the compressive stress-strain curves of the consolidated composite and the non-reinforced Al matrix alloy showed that the enhancement of yield strength and mechanical strength is combined with an appreciable plastic strain at fracture. It can be concluded that metallic glasses (amorphous phases) are an alternative reinforcement material for lightweight metal matrix composites, capable of producing high strength and adequate ductility,
albeit at the expense of a minor increase in density.

Keywords: aluminum matrix composite, amorphous phase, mechanical alloying, spark plasma sintering

Procedia PDF Downloads 365
1765 Detection of High Fructose Corn Syrup in Honey by Near Infrared Spectroscopy and Chemometrics

Authors: Mercedes Bertotto, Marcelo Bello, Hector Goicoechea, Veronica Fusca

Abstract:

The National Service of Agri-Food Health and Quality (SENASA), controls honey to detect contamination by synthetic or natural chemical substances and establishes and controls the traceability of the product. The utility of near-infrared spectroscopy for the detection of adulteration of honey with high fructose corn syrup (HFCS) was investigated. First of all, a mixture of different authentic artisanal Argentinian honey was prepared to cover as much heterogeneity as possible. Then, mixtures were prepared by adding different concentrations of high fructose corn syrup (HFCS) to samples of the honey pool. 237 samples were used, 108 of them were authentic honey and 129 samples corresponded to honey adulterated with HFCS between 1 and 10%. They were stored unrefrigerated from time of production until scanning and were not filtered after receipt in the laboratory. Immediately prior to spectral collection, honey was incubated at 40°C overnight to dissolve any crystalline material, manually stirred to achieve homogeneity and adjusted to a standard solids content (70° Brix) with distilled water. Adulterant solutions were also adjusted to 70° Brix. Samples were measured by NIR spectroscopy in the range of 650 to 7000 cm⁻¹. The technique of specular reflectance was used, with a lens aperture range of 150 mm. Pretreatment of the spectra was performed by Standard Normal Variate (SNV). The ant colony optimization genetic algorithm sample selection (ACOGASS) graphical interface was used, using MATLAB version 5.3, to select the variables with the greatest discriminating power. The data set was divided into a validation set and a calibration set, using the Kennard-Stone (KS) algorithm. A combined method of Potential Functions (PF) was chosen together with Partial Least Square Linear Discriminant Analysis (PLS-DA). 
Different estimators of the predictive capacity of the model were compared; they were obtained using a decreasing number of groups, which implies more demanding validation conditions. The optimal number of latent variables was selected as the number associated with the minimum error and the smallest number of unassigned samples. Once the optimal number of latent variables was defined, the model was applied to the training samples; with the model calibrated on the training samples, the validation samples were then studied. The calibrated model that combines the Potential Functions method with PLS-DA can be considered reliable and stable, since its performance on future samples is expected to be comparable to that achieved for the training samples. By use of Potential Functions (PF) and Partial Least Squares Linear Discriminant Analysis (PLS-DA) classification, authentic honey and honey adulterated with HFCS could be identified with a correct classification rate of 97.9%. The results showed that NIR in combination with the PF and PLS-DA methods can be a simple, fast and low-cost technique for the detection of HFCS in honey with high sensitivity and power of discrimination.
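As a sketch of the calibration/validation split described above, a minimal Kennard-Stone selection can be written as follows (the mock "spectra", sample counts and split sizes are illustrative, not the study's data):

```python
import numpy as np

def kennard_stone(X, n_select):
    """Select n_select calibration samples by the Kennard-Stone algorithm:
    start from the two most mutually distant samples, then repeatedly add
    the sample whose nearest already-selected neighbour is farthest away."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    i, j = np.unravel_index(np.argmax(d), d.shape)              # the two extremes
    selected = [int(i), int(j)]
    remaining = [k for k in range(len(X)) if k not in selected]
    while len(selected) < n_select:
        # distance from each remaining sample to its closest selected sample
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        nxt = remaining[int(np.argmax(min_d))]
        selected.append(nxt)
        remaining.remove(nxt)
    return selected

rng = np.random.default_rng(0)
spectra = rng.normal(size=(20, 5))      # 20 mock "spectra" with 5 variables
cal = kennard_stone(spectra, 12)        # calibration set
val = [k for k in range(20) if k not in cal]  # validation set
print(len(cal), len(val))               # 12 8
```

Because the most distant samples are selected first, the calibration set spans the edges of the data cloud, which is why KS splits are popular for spectroscopic calibration.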

Keywords: adulteration, multivariate analysis, potential functions, regression

Procedia PDF Downloads 127
1764 Settlement Prediction for Tehran Subway Line-3 via FLAC3D and ANFIS

Authors: S. A. Naeini, A. Khalili

Abstract:

Nowadays, tunnels with different applications are being developed, and most of them are subway tunnels. The excavation of shallow tunnels passing under municipal utilities is very important, and the control of surface settlement is an important factor in the design. This study sought to analyze the settlement and to find an appropriate model to predict the behavior of the tunnel in Tehran subway line-3. The displacement in these sections was determined using numerical analyses and numerical modeling. In addition, the Adaptive Neuro-Fuzzy Inference System (ANFIS) method was utilized with a hybrid training algorithm. The database for the optimum network was obtained from 46 subway tunnels in Iran and Turkey constructed by the New Austrian Tunneling Method (NATM) with similar parameters based on their soil type. The surface settlement was measured, and the acquired results were compared to the predicted values. The results disclosed that computational intelligence is a good substitute for numerical modeling.
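An ANFIS model is, at its core, a first-order Takagi-Sugeno fuzzy system whose premise and consequent parameters are tuned by the hybrid algorithm. A minimal numpy sketch of one forward pass (with invented rule parameters, not the trained tunnel model) is:

```python
import numpy as np

def gauss_mf(x, c, s):
    """Gaussian membership function with centre c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno_forward(x, centers, sigmas, consequents):
    """One forward pass of a first-order Takagi-Sugeno system (the structure
    whose parameters ANFIS tunes). Each rule i reads:
    IF x is A_i THEN y_i = a_i * x + b_i; output = weighted average."""
    w = np.array([gauss_mf(x, c, s) for c, s in zip(centers, sigmas)])
    w_norm = w / w.sum()                               # layer-3 normalisation
    y = np.array([a * x + b for a, b in consequents])  # per-rule linear outputs
    return float(np.dot(w_norm, y))                    # defuzzified output

# two rules covering "shallow" and "deep" tunnel cover (illustrative values)
centers, sigmas = [5.0, 15.0], [3.0, 3.0]
consequents = [(0.8, 1.0), (0.2, 8.0)]                 # hypothetical trends
print(round(sugeno_forward(10.0, centers, sigmas, consequents), 3))  # 9.5
```

In a real ANFIS, the hybrid algorithm fits the consequent coefficients by least squares and the membership parameters by backpropagation; here both are fixed for clarity.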

Keywords: settlement, subway line, FLAC3D, ANFIS

Procedia PDF Downloads 234
1763 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption

Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses

Abstract:

This paper addresses the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it therefore meets the aspiration for a computational encryption model that can enhance the security of big data with respect to privacy, confidentiality, and availability for the users. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption scheme. We contribute theoretical presentations of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, together with detailed mathematical concepts underlying fully homomorphic encryption models. This contribution supports the full implementation of a big-data-analytics-based cryptographic security algorithm.
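Fully homomorphic schemes are too involved to reproduce here, but the core idea of computing on ciphertexts without decrypting can be illustrated with the much weaker, merely additively homomorphic Paillier cryptosystem; the tiny primes below are for demonstration only (real keys use primes of about 2048 bits):

```python
import math, random

# Toy Paillier cryptosystem: multiplying ciphertexts adds the plaintexts.
p, q = 17, 19                  # demo primes only
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)           # valid because g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n     # L(u) = (u - 1) / n
    return (L * mu) % n

c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n2         # ciphertext product => plaintext sum
print(decrypt(c_sum))          # 42
```

A fully homomorphic scheme additionally supports multiplication of plaintexts under encryption (via bootstrapping), which is what makes arbitrary computation on encrypted big data possible.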

Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme

Procedia PDF Downloads 382
1762 Ensemble-Based SVM Classification Approach for miRNA Prediction

Authors: Sondos M. Hammad, Sherin M. ElGokhy, Mahmoud M. Fahmy, Elsayed A. Sallam

Abstract:

In this paper, an ensemble-based Support Vector Machine (SVM) classification approach is proposed for miRNA prediction. Three problems commonly associated with previous approaches are alleviated: imposing assumptions on the secondary structure of pre-miRNA, the imbalance between the number of experimentally verified miRNAs and the pseudo-hairpins, and the use of training data sets that do not cover the variety of samples across different species. We aggregate the predicted outputs of three well-known SVM classifiers, namely Triplet-SVM, Virgo and Mirident, weighted by their variant features and without any structural assumptions. An additional SVM layer is used to aggregate the final output. The proposed approach is trained and then tested with balanced data sets. The results of the proposed approach outperform the three base classifiers, achieving an f-score of 88.88%, accuracy of 92.73%, precision of 90.64%, specificity of 96.64%, sensitivity of 87.2%, and an area under the ROC curve of 0.91.
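A stacked ensemble of the kind described, base classifiers whose scores feed a second-layer learner, can be sketched on synthetic data as follows (simple perceptrons and a logistic meta-layer stand in for the SVMs; all data and feature subsets are invented):

```python
import numpy as np

rng = np.random.default_rng(42)

# Well-separated 2-class data standing in for miRNA feature vectors
X0 = rng.normal(-2.0, 0.5, size=(60, 4))
X1 = rng.normal(+2.0, 0.5, size=(60, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 60 + [1] * 60)

def train_perceptron(X, y, epochs=20):
    """A linear stand-in for one base classifier."""
    w, b = np.zeros(X.shape[1]), 0.0
    t = 2 * y - 1                          # labels in {-1, +1}
    for _ in range(epochs):
        for xi, ti in zip(X, t):
            if ti * (xi @ w + b) <= 0:     # misclassified -> update
                w += ti * xi
                b += ti
    return w, b

# Three base classifiers, each seeing a different feature subset
subsets = [(0, 1), (1, 2), (2, 3)]
bases = [train_perceptron(X[:, list(s)], y) for s in subsets]
scores = np.column_stack(
    [X[:, list(s)] @ w + b for s, (w, b) in zip(subsets, bases)])

# Meta-layer: logistic regression on the stacked base scores
wm, bm = np.zeros(3), 0.0
for _ in range(200):
    pr = 1 / (1 + np.exp(-(scores @ wm + bm)))
    wm -= 0.01 * scores.T @ (pr - y) / len(y)
    bm -= 0.01 * np.mean(pr - y)

pred = (1 / (1 + np.exp(-(scores @ wm + bm))) > 0.5).astype(int)
print(round(float(np.mean(pred == y)), 2))
```

The aggregation layer learns how much to trust each base model, which is the mechanism the paper exploits with an SVM in place of the logistic layer.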

Keywords: MiRNAs, SVM classification, ensemble algorithm, assumption problem, imbalance data

Procedia PDF Downloads 350
1761 Arabic Light Stemmer for Better Search Accuracy

Authors: Sahar Khedr, Dina Sayed, Ayman Hanafy

Abstract:

Arabic is one of the most ancient and critical languages in the world. It has more than 250 million native speakers, and more than twenty countries have Arabic as one of their official languages. In the past decade, we have witnessed a rapid evolution in smart devices, social networks and the technology sector, which has led to the need for tools and libraries that properly handle the Arabic language in different domains. Stemming is one of the most crucial linguistic fundamentals. It is used in many applications, especially in information extraction and text mining. The motivation behind this work is to enhance the Arabic light stemmer to serve the data mining industry and to make it available to the open source community. The presented implementation enhances the Arabic light stemmer by extending an existing algorithm with a new set of rules and patterns accompanied by an adjusted procedure. This study demonstrates a significant enhancement in search accuracy, with an average 10% improvement over previous works.
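A light stemmer of the general kind discussed strips frequent affixes with length guards rather than performing full morphological analysis. The rule set below is a small illustrative subset, not the enhanced rules of this work:

```python
# Minimal Arabic light-stemming sketch: strip one common prefix and one
# common suffix, keeping at least a 3-letter stem (real rule sets are richer).
PREFIXES = ["وال", "بال", "كال", "فال", "ال", "لل", "و"]
SUFFIXES = ["ات", "ون", "ين", "ان", "ها", "ية", "ه", "ة"]

def light_stem(word):
    for p in sorted(PREFIXES, key=len, reverse=True):   # longest match first
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word

print(light_stem("الكتاب"))   # كتاب  (definite article stripped)
print(light_stem("مدرسات"))  # مدرس  (feminine plural suffix stripped)
```

Light stemming deliberately stops short of root extraction: conflating surface variants to a common stem is usually enough to improve search recall without the precision loss of aggressive rooting.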

Keywords: Arabic data mining, Arabic information extraction, Arabic light stemmer, Arabic stemmer

Procedia PDF Downloads 311
1760 The Development of Open Access in Latin America and Caribbean: Mapping National and International Policies and Scientific Publications of the Region

Authors: Simone Belli, Sergio Minniti, Valeria Santoro

Abstract:

ICTs and technology transfer can benefit and move a country forward in economic and social development. However, ICT and access to the Internet have been inequitably distributed in most developing countries. In terms of science production and dissemination, this divide articulates itself also through the inequitable distribution of access to scientific knowledge and networks, which results in the exclusion of developing countries from the center of science. Developing countries are on the fringe of Science and Technology (S&T) production due not only to low investment in research but also to the difficulties to access international scholarly literature. In this respect, Open access (OA) initiatives and knowledge infrastructure represent key elements for both producing significant changes in scholarly communication and reducing the problems of developing countries. The spreading of the OA movement in the region, exemplified by the growth of regional and national initiatives, such as the creation of OA institutional repositories (e.g. SciELO and Redalyc) and the establishing of supportive governmental policies, provides evidence of the significant role that OA is playing in reducing the scientific gap between Latin American countries and improving their participation in the so-called ‘global knowledge commons’. In this paper, we map OA publications in Latin America and observe how Latin American countries are moving forward and becoming a leading force in widening access to knowledge. Our analysis, developed as part of the H2020 EULAC Focus research project, is based on mixed methods and consists mainly of a bibliometric analysis of OA publications indexed in the most important scientific databases (Web of Science and Scopus) and OA regional repositories, as well as the qualitative analysis of documents related to the main OA initiatives in Latin America. 
Through our analysis, we aim at reflecting critically on what policies, international standards, and best practices might be adapted to incorporate OA worldwide and improve the infrastructure of the global knowledge commons.

Keywords: open access, LAC countries, scientific publications, bibliometric analysis

Procedia PDF Downloads 215
1759 Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning

Authors: T. Bryan, V. Kepuska, I. Kostnaic

Abstract:

A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using “basis vectors” that are learned from the audio data itself. The basis vectors are shown to give higher data compression and better signal-to-noise enhancement than the Gabor and gammatone “seed atoms” that were used to generate them. The basis vectors are the input weights of a Sparse AutoEncoder (SAE) that is trained using “envelope samples” of windowed segments of the audio data. The envelope samples are extracted by identifying, via matching pursuit, segments of audio data that are locally coherent with Gabor or gammatone seed atoms; they are then formed by taking the Kronecker products of the atomic envelopes with the locally coherent data segments. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from Gabor and gammatone seed atoms, for speech signals as well as early American music recordings. The basis vectors are shown to have higher denoising capability at data compression rates ranging from 90% to 99.84% for speech as well as music. Envelope samples are displayed as images by folding the time series into column vectors; this display method is used to compare the output of the SAE with the envelope samples that produced it. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the best denoising basis vectors.
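The matching-pursuit step used to find segments locally coherent with the seed atoms can be sketched as follows (the Gabor-like atoms and the toy two-atom signal are illustrative, not the paper's dictionary):

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=30):
    """Greedy matching pursuit: at each step pick the unit-norm atom most
    correlated with the residual and subtract its contribution."""
    residual = signal.copy()
    coeffs = np.zeros(len(dictionary))
    for _ in range(n_iter):
        corr = dictionary @ residual
        k = int(np.argmax(np.abs(corr)))
        coeffs[k] += corr[k]
        residual = residual - corr[k] * dictionary[k]
    return coeffs, residual

# Gabor-like dictionary: Gaussian-windowed cosines at several frequencies
n, t = 64, np.arange(64)
window = np.exp(-0.5 * ((t - 32) / 10.0) ** 2)
atoms = np.array([window * np.cos(2 * np.pi * f * t / n) for f in range(1, 9)])
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)   # unit-norm atoms

signal = 3.0 * atoms[2] + 0.5 * atoms[6]                # sparse toy signal
coeffs, residual = matching_pursuit(signal, atoms)
print(np.linalg.norm(residual) < 1e-2 * np.linalg.norm(signal))  # True
```

The atoms selected (and the windows where correlation is high) are exactly the locally coherent segments from which the envelope samples of the abstract are built.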

Keywords: sparse dictionary learning, autoencoder, sparse autoencoder, basis vectors, atomic decomposition, envelope sampling, envelope samples, Gabor, gammatone, matching pursuit

Procedia PDF Downloads 254
1758 Efficiency Validation of Hybrid Geothermal and Radiant Cooling System Implementation in Hot and Humid Climate Houses of Saudi Arabia

Authors: Jamil Hijazi, Stirling Howieson

Abstract:

Over one-quarter of the Kingdom of Saudi Arabia’s total oil production (2.8 million barrels a day) is used for electricity generation. The built environment is estimated to consume 77% of the total energy production; of this amount, air conditioning systems consume about 80%. Apart from considerations surrounding global warming and CO2 production, it has to be recognised that oil is a finite resource, and the KSA, like many other oil-rich countries, will have to consider a horizon where hydrocarbons are not the dominant energy resource. The employment of hybrid ground cooling pipes in combination with black-body solar collection and radiant night cooling systems may have the potential to displace a significant proportion of the oil currently used to run conventional air conditioning plant. This paper presents an investigation into the viability of such hybrid systems, with the specific aim of reducing carbon emissions while providing all-year-round thermal comfort in a typical Saudi Arabian urban housing block. At the outset, air and soil temperatures were measured in the city of Jeddah. A parametric study was then carried out with computational simulation software (DesignBuilder) that utilised the field measurements and predicted the cooling energy consumption of both a base case and an ideal scenario (a typical block retrofitted with insulation, solar shading, ground pipes integrated with hypocaust floor slabs/stack ventilation, and radiant cooling pipes embedded in the floor). Initial simulation results suggest that careful ‘ecological design’ combined with hybrid radiant and ground-pipe cooling techniques can displace air conditioning systems, producing significant cost and carbon savings (both capital and running) without appreciable deprivation of amenity.

Keywords: energy efficiency, ground pipe, hybrid cooling, radiative cooling, thermal comfort

Procedia PDF Downloads 263
1757 Artificial Steady-State-Based Nonlinear MPC for Wheeled Mobile Robot

Authors: M. H. Korayem, Sh. Ameri, N. Yousefi Lademakhi

Abstract:

To ensure the stability of closed-loop nonlinear model predictive control (NMPC) within a finite horizon, appropriately designed terminal ingredients are needed, which can be a time-consuming and challenging effort. Otherwise, to ensure the stability of the control system, an infinite prediction horizon must be considered; increasing the prediction horizon increases the computational demand and slows down the implementation of the method. In this study, a new technique is proposed to ensure system stability without terminal ingredients. This technique has been employed in the design of the NMPC algorithm, reducing both the complexity of designing terminal ingredients and the computational burden. The studied system is a wheeled mobile robot (WMR) subject to non-holonomic constraints. Simulations have been investigated for two problems: trajectory tracking and regulation.
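A receding-horizon NMPC loop without terminal ingredients can be sketched with simple random shooting over a kinematic unicycle model (the cost, input bounds and horizon below are invented for illustration; the paper's actual formulation and stability argument are not reproduced here):

```python
import numpy as np

def step(state, v, w, dt=0.1):
    """Discrete kinematic unicycle: the standard non-holonomic WMR model."""
    x, y, th = state
    return np.array([x + v * np.cos(th) * dt,
                     y + v * np.sin(th) * dt,
                     th + w * dt])

def nmpc_control(state, goal, horizon=10, n_samples=300, rng=None):
    """Sampling-based NMPC sketch: draw candidate input sequences, roll the
    model forward over the horizon, and return the first input of the
    cheapest sequence (receding-horizon principle, no terminal ingredients)."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_cost, best_u = np.inf, (0.0, 0.0)
    for _ in range(n_samples):
        vs = rng.uniform(0.0, 1.0, horizon)    # candidate linear velocities
        ws = rng.uniform(-2.0, 2.0, horizon)   # candidate angular velocities
        s, cost = state, 0.0
        for v, w in zip(vs, ws):
            s = step(s, v, w)
            cost += float(np.sum((s[:2] - goal) ** 2))  # position stage cost
        if cost < best_cost:
            best_cost, best_u = cost, (vs[0], ws[0])
    return best_u

state = np.array([0.0, 0.0, 0.0])
goal = np.array([1.0, 1.0])
rng = np.random.default_rng(1)
for _ in range(30):                            # closed receding-horizon loop
    v, w = nmpc_control(state, goal, rng=rng)
    state = step(state, v, w)
print(np.linalg.norm(state[:2] - goal) < 0.5)
```

Only the first input of each optimized sequence is applied before re-planning, which is what makes the scheme "model predictive" rather than open-loop optimal control.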

Keywords: wheeled mobile robot, nonlinear model predictive control, stability, without terminal ingredients

Procedia PDF Downloads 92
1756 Multiscale Modelization of Multilayered Bi-Dimensional Soils

Authors: I. Hosni, L. Bennaceur Farah, N. Saber, R. Bennaceur

Abstract:

Soil moisture content is a key variable in many environmental sciences. Even though it represents a small proportion of the liquid freshwater on Earth, it modulates interactions between the land surface and the atmosphere, thereby influencing climate and weather. Accurate modeling of the above processes depends on the ability to provide a proper spatial characterization of soil moisture. The measurement of soil moisture content allows assessment of soil water resources in the fields of hydrology and agronomy. The second parameter in interaction with the radar signal is the geometric structure of the soil. Most traditional electromagnetic models consider natural surfaces as single-scale, zero-mean, stationary Gaussian random processes, with roughness behavior characterized by statistical parameters like the root mean square (RMS) height and the correlation length. The main problem is that the agreement between experimental measurements and theoretical values is usually poor, due to the large variability of the correlation function; as a consequence, such models have often failed to predict backscattering correctly. In this study, surfaces are considered as band-limited fractal random processes corresponding to a superposition of a finite number of one-dimensional Gaussian processes, each having its own spatial scale. Multiscale roughness is characterized by two parameters: the first is proportional to the RMS height, and the other is related to the fractal dimension. Soil moisture is related to the complex dielectric constant. This multiscale description has been adapted to two-dimensional profiles using the bi-dimensional wavelet transform and the Mallat algorithm, to describe natural surfaces more correctly. We characterize the soil surfaces and sub-surfaces by a three-layer geo-electrical model. 
The upper layer is described by its dielectric constant, its thickness, a multiscale bi-dimensional surface roughness model based on the wavelet transform and the Mallat algorithm, and volume scattering parameters. The lower layer is divided into three fictive layers separated by assumed plane interfaces; these three layers are modeled by an effective medium characterized by an apparent effective dielectric constant that takes into account the presence of air pockets in the soil. We have adopted a 2D multiscale three-layer small perturbation model, including first air pockets in the soil sub-structure and then a vegetation canopy in the soil surface structure, to simulate the radar backscattering. A sensitivity analysis of the dependence of the backscattering coefficient on multiscale roughness and soil moisture has been performed. We then proposed to change the dielectric constant of the multilayer medium so that it takes into account the different moisture values of each layer of the soil. A sensitivity analysis of the backscattering coefficient, including the air pockets in the volume structure, with respect to the multiscale roughness parameters and the apparent dielectric constant was carried out. Finally, we studied the behavior of the radar backscattering coefficient for a soil having a vegetation layer in its surface structure.
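One level of the Mallat algorithm referred to above, written here with Haar filters for brevity, splits a surface-height map into approximation and detail bands and reconstructs it exactly (the 8×8 random "surface" is a stand-in for a measured roughness profile):

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Mallat algorithm with Haar filters: split an
    even-sized image into approximation (LL) and detail (LH, HL, HH) bands."""
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)   # column lowpass
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)   # column highpass
    LL = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2)
    LH = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2)
    HL = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2)
    HH = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2)
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Inverse of one Haar level (perfect reconstruction)."""
    lo = np.empty((LL.shape[0] * 2, LL.shape[1]))
    hi = np.empty_like(lo)
    lo[0::2, :], lo[1::2, :] = (LL + LH) / np.sqrt(2), (LL - LH) / np.sqrt(2)
    hi[0::2, :], hi[1::2, :] = (HL + HH) / np.sqrt(2), (HL - HH) / np.sqrt(2)
    img = np.empty((lo.shape[0], lo.shape[1] * 2))
    img[:, 0::2], img[:, 1::2] = (lo + hi) / np.sqrt(2), (lo - hi) / np.sqrt(2)
    return img

rng = np.random.default_rng(0)
surface = rng.normal(size=(8, 8))            # mock rough-surface height map
bands = haar2d(surface)
print(np.allclose(ihaar2d(*bands), surface))  # True
```

Applying the decomposition recursively to the LL band yields the multiscale pyramid from which scale-dependent roughness parameters can be read off.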

Keywords: multiscale, bidimensional, wavelets, backscattering, multilayer, SPM, air pockets

Procedia PDF Downloads 126
1755 A Survey on Lossless Compression of Bayer Color Filter Array Images

Authors: Alina Trifan, António J. R. Neves

Abstract:

Although most digital cameras acquire images in a raw format, based on a Color Filter Array (CFA) that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient: by performing a lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors, which increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage allows an improvement of more than 15% in prediction-based methods.
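The channel-splitting pre-processing step can be sketched as follows for an RGGB mosaic (the entropy function is a zeroth-order proxy for compressibility, not one of the surveyed coders):

```python
import numpy as np

def split_bayer_rggb(mosaic):
    """Split an RGGB Bayer mosaic into R, G, B sub-images; the two green
    sub-lattices are stacked side by side into one G image."""
    r = mosaic[0::2, 0::2]
    g = np.hstack([mosaic[0::2, 1::2], mosaic[1::2, 0::2]])
    b = mosaic[1::2, 1::2]
    return r, g, b

def entropy(channel):
    """Zeroth-order entropy in bits/pixel, a rough proxy for compressibility."""
    _, counts = np.unique(channel, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
mosaic = rng.integers(0, 256, size=(16, 16)).astype(np.uint8)  # mock raw frame
r, g, b = split_bayer_rggb(mosaic)
print(r.shape, g.shape, b.shape)  # (8, 8) (8, 16) (8, 8)
```

On real raw frames, each sub-image is spatially smooth within its own color plane, so a predictive coder run per channel sees fewer sharp transitions than one run on the interleaved mosaic.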

Keywords: Bayer image, CFA, lossless compression, image coding standards

Procedia PDF Downloads 322
1754 Alternative Seed System for Enhanced Availability of Quality Seeds and Seed/Varietal Replacement Rate - An Experience

Authors: Basave Gowda, Lokesh K., Prasanth S. M., Bellad S. B., Radha J., Lokesh G. Y., Patil S. B., Vijayakumar D. K., Ganigar B. S., Rakesh C. Mathad

Abstract:

Quality seed plays an important role in enhancing crop productivity. Large-scale verification trials have reported and confirmed that the use of quality seed alone can enhance crop yield by 15 to 20 per cent. At present, quality seed production and distribution through the organised sector, comprising both public and private seed companies, covers only 20-25% of the requirement; the remaining quantity is met through the unorganised sector, which includes farmer-to-farmer saved seed. With the objective of developing an alternative seed system, the University of Agricultural Sciences, Raichur, in Karnataka state, has implemented a Seed Village Programme in more than 100 villages, covering around 5000 farmers every year since 2009-10. In the selected seed villages, groups of 50-150 farmers were each supplied foundation seed of new varieties for 0.4 ha at a 50% subsidy, and two to three training programmes on quality seed production were conducted in the targeted villages. The seed produced by the target groups was processed locally in the university seed processing units, and its distribution in the local villages was arranged by the seed growers themselves. Through this new, modified seed system, the university was able to replace old varieties of pigeon pea and green gram by producing 1482, 2978, 2729, 2560, and 4581 tonnes of seed of new varieties on a large scale under farmer- and scientist-participatory seed village programmes during 2009-10, 2010-11, 2011-12, 2012-13 and 2013-14, respectively. Based on this alternative model, regional seed systems involving farmers, NGOs and voluntary organisations should be promoted on a large scale for the quick and effective replacement of old, low-yielding, disease-susceptible varieties with new high-yielding, disease-resistant ones, for enhanced food production and food security.

Keywords: seed system, seed village, seed replacement, varietal replacement

Procedia PDF Downloads 434
1753 Pollution Associated with Combustion in Stove to Firewood (Eucalyptus) and Pellet (Radiate Pine): Effect of UVA Irradiation

Authors: Y. Vásquez, F. Reyes, P. Oyola, M. Rubio, J. Muñoz, E. Lissi

Abstract:

In several cities in Chile there is significant urban pollution, particularly in Santiago and in cities in the south, where biomass is used as fuel for heating and cooking in a large proportion of homes. This has generated interest in knowing which factors can be modulated to control the level of pollution. In this project, a photochemical chamber (14 m³) was conditioned and set up, equipped with gas monitors (e.g. CO, NOx, O3) and PM monitors (e.g. DustTrak, DMPS, Harvard impactors). The chamber could be exposed to UVA lamps producing a spectrum similar to that generated by the sun. In this chamber, PM and gas emissions associated with biomass burning were studied in the presence and absence of radiation. From the comparative analysis of a wood stove (Eucalyptus globulus) and a pellet stove (radiata pine), it can be concluded that, to a first approximation, 9-nitroanthracene, 4-nitropyrene, levoglucosan, water-soluble potassium and CO present the characteristics of tracers. However, some of them show properties that interfere with this possibility. For example, levoglucosan is decomposed by radiation, and 9-nitroanthracene and 4-nitropyrene are both emitted and formed under radiation; 9-nitroanthracene also has a vapor pressure that involves partitioning between the gas phase and particulate matter. From this analysis, it can be concluded that K+ is the compound that best meets the properties expected of a tracer. The PM2.5 emission measured for the automatic pellet stove used in this thesis project was two orders of magnitude smaller than that registered for the manual wood stove. This has encouraged the use of pellet stoves for indoor heating, particularly in south-central Chile. However, the use of pellets is not without problems, since pellet stoves generate high concentrations of nitro-PAHs (secondary organic contaminants), in particular 4-nitropyrene, a compound of high toxicity. 
Moreover, the primary and secondary particulate matter associated with pellet burning shows a shift toward smaller particle sizes, which leads to deeper penetration of the particles and their toxic components into the respiratory system.

Keywords: biomass burning, photochemical chamber, particulate matter, tracers

Procedia PDF Downloads 194
1752 Determining Earthquake Performances of Existing Reinforced Concrete Buildings by Using ANN

Authors: Musa H. Arslan, Murat Ceylan, Tayfun Koyuncu

Abstract:

In this study, an artificial-intelligence-based (ANN-based) analytical method has been developed for analyzing the earthquake performance of reinforced concrete (RC) buildings. 66 RC buildings of four to ten storeys were subjected to performance analysis according to parameters describing the existing material, loading and geometrical characteristics of the buildings; the selected parameters are thought to be effective on the performance of RC buildings. In the performance analysis stage of the study, the level of performance these buildings could be expected to show in case of an earthquake was determined on the basis of the 4-grade performance levels specified in the Turkish Earthquake Code 2007 (TEC-2007). After obtaining the 4-grade performance level, the 23 selected parameters of each building were matched with the performance level. In this stage, the ANN-based fast evaluation algorithm provided an economical and rapid evaluation of the four- to ten-storey RC buildings. According to the study, the prediction accuracy of the ANN was found to be about 74%.
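A minimal stand-in for the ANN described, a one-hidden-layer softmax classifier mapping building parameters to four performance grades, can be sketched on synthetic clusters (all data, architecture and hyperparameters below are invented; the study's 23 real parameters and trained network are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 23 building parameters and 4 performance grades:
# each grade is a Gaussian cluster around a random centre in feature space.
n_per, dim, n_cls = 40, 23, 4
centres = rng.normal(0.0, 3.0, size=(n_cls, dim))
X = np.vstack([c + rng.normal(0.0, 1.0, size=(n_per, dim)) for c in centres])
y = np.repeat(np.arange(n_cls), n_per)
Y = np.eye(n_cls)[y]                              # one-hot targets

# One-hidden-layer network trained by batch gradient descent
W1 = rng.normal(0, 0.1, size=(dim, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, size=(16, n_cls)); b2 = np.zeros(n_cls)
lr = 0.2

for _ in range(500):
    H = np.tanh(X @ W1 + b1)                      # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)             # softmax class probabilities
    G = (P - Y) / len(X)                          # cross-entropy gradient
    GH = (G @ W2.T) * (1 - H ** 2)                # backprop through tanh
    W2 -= lr * H.T @ G;  b2 -= lr * G.sum(axis=0)
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(axis=0)

acc = float(np.mean(P.argmax(axis=1) == y))
print(acc)
```

In practice the grade labels would come from code-based performance analysis (here, TEC-2007), and generalisation would be reported on held-out buildings rather than the training accuracy printed above.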

Keywords: artificial intelligence, earthquake, performance, reinforced concrete

Procedia PDF Downloads 463