Search results for: predicting models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7254

3354 WHSS: A Platform for Designing Water Harvesting Systems for Multiple Purposes

Authors: Ignacio Sanchez Cohen, Aurelio Pedroza Sandoval, Ricardo Trejo Calzada

Abstract:

Water harvesting systems (WHS) have become the main alternative that farmers in dry areas rely on to survive dry periods. Nevertheless, technicians, agronomists, and users in general have to cope with the difficulty of finding suitable technology for the optimal design of WHS. In this paper, we describe a user-friendly computer program that uses readily available information for the design of multiple WHS depending upon the final use of the water (agriculture, household, conservation, etc.). The application (APP) itself contains several links to help the user complete the input requirements. No computer skills are required to use the APP. Outputs of the APP are the dimensions of the WHS, namely terraces, micro-catchments, cisterns, and small household cisterns for roof water catchment. The APP also provides guidance on crops for backyard agriculture. We believe that this tool may guide users to better optimize WHS for multiple purposes and widen the possibility of coping with dry spells in arid lands.

Keywords: rainfall-catchment, models, computer aid, arid lands

Procedia PDF Downloads 160
3353 Scaling Strategy of a New Experimental Rig for Wheel-Rail Contact

Authors: Meysam Naeimi, Zili Li, Rolf Dollevoet

Abstract:

A new small-scale test rig has been developed for rolling contact fatigue (RCF) investigations in wheel-rail material. This paper presents the scaling strategy of the rig based on dimensional analysis and mechanical modelling. The new experimental rig is a spinning frame structure with multiple wheel components running over a fixed rail-track ring, capable of simulating continuous wheel-rail contact at laboratory scale. This paper describes the dimensional design of the rig in order to derive its overall scaling strategy and to determine the specifications of its key elements. Finite element (FE) modelling is used to simulate the mechanical behavior of the rig with two sample scale factors of 1/5 and 1/7. The results of the FE models are compared with the actual railway system to assess the effectiveness of the chosen scales. The mechanical properties of the components and the variables of the system are finally determined through the design process.

Keywords: new test rig, rolling contact fatigue, rail, small scale

Procedia PDF Downloads 456
3352 UBCSAND Model Calibration for Generic Liquefaction Triggering Curves

Authors: Jui-Ching Chou

Abstract:

Numerical simulation is a popular method for evaluating the effects of soil liquefaction on a structure or the effectiveness of a mitigation plan. Many constitutive models (UBCSAND, PM4, SANISAND, etc.) have been presented to model the liquefaction phenomenon. In general, the inputs of a constitutive model need to be calibrated against the soil cyclic resistance before being applied to the numerical simulation model. Simulation results can then be compared with results from simplified liquefaction potential assessment methods. In this article, the inputs of the UBCSAND model, a simple elastic-plastic stress-strain model, are calibrated against several popular generic liquefaction triggering curves of simplified liquefaction potential assessment methods via the FLAC program. The calibrated inputs enable engineers to perform a preliminary evaluation of an existing structure or a new design project.
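
As a minimal sketch of the calibration idea (not the paper's FLAC/UBCSAND workflow), a generic triggering curve relating cyclic resistance ratio (CRR) to the number of loading cycles N can be represented by a power law CRR = a·N^b and its coefficients recovered by least squares; the target points and coefficients below are illustrative, not taken from the paper:

```python
import numpy as np

def fit_triggering_curve(N, CRR):
    """Least-squares fit of CRR = a * N**b in log-log space."""
    b, ln_a = np.polyfit(np.log(N), np.log(CRR), 1)  # slope = b, intercept = ln(a)
    return np.exp(ln_a), b

# Hypothetical points read off a generic liquefaction triggering curve
N = np.array([2.0, 5.0, 10.0, 15.0, 30.0])   # number of loading cycles
CRR = 0.6 * N ** -0.34                       # cyclic resistance ratio

a, b = fit_triggering_curve(N, CRR)
print(a, b)  # recovers a = 0.6, b = -0.34
```

In an actual calibration, the model inputs would be adjusted iteratively until the simulated cyclic response reproduces such a fitted curve.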

Keywords: calibration, liquefaction, numerical simulation, UBCSAND Model

Procedia PDF Downloads 142
3351 Novel Technique for Calculating Surface Potential Gradient of Overhead Line Conductors

Authors: Sudip Sudhir Godbole

Abstract:

In transmission lines, the surface potential gradient is a critical design parameter for planning overhead lines, as it determines the level of corona loss (CL), radio interference (RI), and audible noise (AN). Higher transmission voltage levels make bulk power transfer possible, but the bundle conductor configurations they require make it more complex to find the accurate surface stress. The majority of existing models for surface gradient calculations are based on analytical methods, which restricts their application in simulating complex surface geometry. This paper proposes a novel technique that utilizes both analytical and numerical procedures to predict the surface gradient. A 400 kV transmission line configuration has been selected as an example to compare the results of the different methods. The different strand shapes are a key variable in determining the surface gradient.
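
For the simplest case of a single cylindrical conductor at height h above ground, the Maxwell potential coefficient method reduces to a closed form: the potential coefficient is p = ln(2h/r)/(2πε₀), the line charge is q = V/p, and the average surface gradient is E = q/(2πε₀r). The sketch below uses illustrative 400 kV line dimensions, not the paper's configuration:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def surface_gradient(v_phase, radius, height):
    """Average surface gradient (V/m) of a single conductor above ground,
    via the Maxwell potential coefficient p = ln(2h/r) / (2*pi*eps0)."""
    p = math.log(2 * height / radius) / (2 * math.pi * EPS0)
    q = v_phase / p                        # charge per unit length, C/m
    return q / (2 * math.pi * EPS0 * radius)

# Illustrative values: 400 kV line (phase-to-ground voltage),
# 15 mm conductor radius, 12 m height above ground
E = surface_gradient(400e3 / math.sqrt(3), 0.015, 12.0)
print(E / 1e5)  # surface gradient expressed in kV/cm
```

Bundled conductors need the full matrix of potential coefficients (or the successive images / charge simulation methods listed in the keywords), which is where the numerical part of the proposed technique comes in.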

Keywords: surface gradient, Maxwell potential coefficient method, Markt and Mengele's method, successive images method, charge simulation method, finite element method

Procedia PDF Downloads 519
3350 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines

Authors: Alexander Guzman Urbina, Atsushi Aoyama

Abstract:

The sustainability of the traditional technologies employed in energy and chemical infrastructure poses a major challenge for our society. In decisions related to the safety of industrial infrastructure, the values of accidental risk are becoming relevant points for discussion. However, the challenge is the reliability of the models employed to obtain the risk data. Such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome those problems are built using Artificial Intelligence (AI), and more specifically using hybrid systems such as neuro-fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. As mentioned above, the sustainability of traditional technologies related to energy and chemical infrastructure constitutes one of the major challenges that today's societies and firms are facing. Besides that, the adaptation of those technologies to the effects of climate change in sensitive environments represents a critical concern for safety and risk management. Regarding this issue, the social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by the lack of knowledge about the risks. In addition to these social consequences, and considering the industrial sector as critical infrastructure due to its large impact on the economy in case of a failure, industrial safety has become a critical issue for society. Regarding this safety concern, pipeline operators and regulators have been performing risk assessments in attempts to evaluate accurately the probabilities of failure of the infrastructure and the consequences associated with those failures.
However, estimating accidental risks in critical infrastructure involves substantial effort and costs due to the number of variables involved, the complexity, and the lack of information. Therefore, this paper aims to introduce a well-trained algorithm for risk assessment using deep learning, capable of dealing efficiently with this complexity and uncertainty. The advantage of deep learning using near-miss accident data is that it can be employed in risk assessment as an efficient engineering tool to treat the uncertainty of the risk values in complex environments. The basic idea of the near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating the human expertise in scoring risks and setting tolerance levels. In summary, the method involves a regression analysis called the group method of data handling (GMDH), which consists of determining the optimal configuration of the risk assessment model and its parameters by employing polynomial theory.
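
The core GMDH step can be sketched in a few lines: for every pair of candidate inputs, fit a quadratic polynomial by least squares and keep the pair with the lowest validation error. This is a minimal one-layer illustration (real GMDH stacks such layers); the synthetic "risk score" and its inputs are invented for the example:

```python
import numpy as np
from itertools import combinations

def quad_features(x1, x2):
    # Kolmogorov-Gabor quadratic basis for one input pair
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def gmdh_layer(X_tr, y_tr, X_va, y_va):
    """One GMDH selection step: fit y on the quadratic features of every
    input pair; return (validation error, pair, weights) of the best pair."""
    best = None
    for i, j in combinations(range(X_tr.shape[1]), 2):
        A = quad_features(X_tr[:, i], X_tr[:, j])
        w, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
        err = np.mean((quad_features(X_va[:, i], X_va[:, j]) @ w - y_va) ** 2)
        if best is None or err < best[0]:
            best = (err, (i, j), w)
    return best

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 4))             # 4 hypothetical risk factors
y = 1.0 + 2 * X[:, 0] * X[:, 2] + 0.5 * X[:, 2]**2  # true score uses x0 and x2 only
err, pair, w = gmdh_layer(X[:120], y[:120], X[120:], y[120:])
print(pair, err)  # selects the informative pair (0, 2) with near-zero error
```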

Keywords: deep learning, risk assessment, neuro fuzzy, pipelines

Procedia PDF Downloads 278
3349 Investigation and Analysis of Vortex-Induced Vibrations in Sliding Gate Valves Using Computational Fluid Dynamics

Authors: Kianoosh Ahadi, Mustafa Ergil

Abstract:

In this study, the vibrations caused by vortices and the distribution of the vortex-induced hydrodynamic forces on sliding gate valves have been investigated. For this reason, a sliding valve was simulated in two dimensions (2D) with the help of computational fluid dynamics (CFD) software, where the flow and turbulence equations were solved for three different valve openings (full, half, and 16.7%). The variety of vortices formed within the vicinity of the valve structure was investigated over time, and the trend of the fluctuations and their occurrence regions were detected. From the gathered solution dataset of the numerical simulations, the pressure coefficient (CP), the lift force coefficient (CL), the drag force coefficient (CD), and the momentum coefficient due to hydrodynamic forces (CM) were examined, and relevant figures were generated; from these results, the vortex-induced vibrations were analyzed.
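
The coefficients named above are all normalizations of the same form, C = F / (½ρU²A); as a small sketch (the force samples, density, velocity, and area below are invented, not the paper's data):

```python
import numpy as np

def force_coefficient(F_time, rho, U, A):
    """Normalize a force time series to a dimensionless coefficient:
    C = F / (0.5 * rho * U**2 * A)."""
    return np.asarray(F_time) / (0.5 * rho * U**2 * A)

# Hypothetical lift-force samples (N) on a gate section in water
CL = force_coefficient([120.0, 135.0, 110.0], rho=1000.0, U=2.0, A=0.5)
print(CL.mean())  # mean lift coefficient; the fluctuation about this
                  # mean is what reveals vortex shedding
```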

Keywords: induced vibrations, computational fluid dynamics, sliding gate valves, vortexes

Procedia PDF Downloads 92
3348 Towards Positive Identity Construction for Japanese Non-Native English Language Teachers

Authors: Yumi Okano

Abstract:

The low level of English proficiency among Japanese people has long been a problem. Japanese non-native English language teachers, under social or ideological constraints, feel a gap between government policy and their language proficiency and cannot maintain high self-esteem. This paper focuses on current Japanese policies and the social context in which teachers are placed, and it examines the measures necessary for their positive identity formation from a macro-meso-micro perspective. Some suggestions for achieving this are: 1) teachers should free themselves from the idea of the native speaker and embrace local needs and accents; 2) teachers should be involved in student discussions as facilitators and individuals so that they can be good role models for their students; 3) teachers should invest in their classrooms; 4) guidelines and training should be provided to help teachers gain confidence, in addition to reducing the workload to make more time available; and 5) opportunities for investment outside the classroom, in the real world, should be expanded.

Keywords: language teacher identity, native speakers, government policy, critical pedagogy, investment

Procedia PDF Downloads 85
3347 Modelling Medieval Vaults: Digital Simulation of the North Transept Vault of St Mary, Nantwich, England

Authors: N. Webb, A. Buchanan

Abstract:

Digital and virtual heritage is often associated with the recreation of lost artefacts and architecture; however, we can also investigate works that were not completed, using digital tools and techniques. Here we explore physical evidence of a fourteenth-century Gothic vault located in the north transept of St Mary’s church in Nantwich, Cheshire, using existing springer stones that are built into the walls as a starting point. Digital surveying tools are used to document the architecture, followed by an analysis process to hypothesise and simulate possible design solutions, had the vault been completed. A number of options, both two-dimensionally and three-dimensionally, are discussed based on comparison with examples of other contemporary vaults, thus adding another specimen to the corpus of vault designs. Dissemination methods such as digital models and 3D prints are also explored as possible resources for demonstrating what the finished vault might have looked like for heritage interpretation and other purposes.

Keywords: digital simulation, heritage interpretation, medieval vaults, virtual heritage, 3d scanning

Procedia PDF Downloads 319
3346 A Study on Automotive Attack Database and Data Flow Diagram for Concretization of HEAVENS: A Car Security Model

Authors: Se-Han Lee, Kwang-Woo Go, Gwang-Hyun Ahn, Hee-Sung Park, Cheol-Kyu Han, Jun-Bo Shim, Geun-Chul Kang, Hyun-Jung Lee

Abstract:

In recent years, with the advent of smart cars and the expansion of the market, the announcement of 'Adventures in Automotive Networks and Control Units' at the DEFCON21 conference in 2013 revealed that cars are not safe from hacking. As a result, the HEAVENS model, which considers not only the functional safety of the vehicle but also its security, has been suggested. However, the HEAVENS model only presents a simple process; there are no detailed procedures and activities for each step, making it difficult to apply to actual vehicle security vulnerability checks. In this paper, we propose an automotive attack database that systematically summarizes attack vectors, attack types, and vulnerable vehicle models to prepare for various car hacking attacks, together with data flow diagrams that can detect various vulnerabilities, and thereby suggest a way to materialize the HEAVENS model.

Keywords: automotive security, HEAVENS, car hacking, security model, information security

Procedia PDF Downloads 333
3345 Model-Independent Price Bounds for the Swiss Re Mortality Bond 2003

Authors: Raj Kumari Bahl, Sotirios Sabanis

Abstract:

In this paper, we are concerned with the valuation of the first catastrophic mortality bond launched in the market, namely the Swiss Re Mortality Bond 2003. This bond encapsulates the behavior of a well-defined mortality index to generate payoffs for the bondholders. Pricing this bond is a challenging task. We express the payoff of the terminal principal of the bond in terms of the payoff of an Asian put option and present an approach to derive model-independent bounds exploiting comonotonicity theory. We invoke Jensen's inequality for the computation of lower bounds and employ the Lagrange optimization technique to achieve the upper bound. The success of these bounds is based on the availability of compatible European mortality options in the market. We carry out Monte Carlo simulations to estimate the bond price and illustrate the strength of these bounds across a variety of models. The fact that our bounds are model-independent is a crucial breakthrough in the pricing of catastrophic mortality bonds.
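
The Jensen-type lower bound is easy to see on a plain Asian put: since E[(K − A)⁺] ≥ (K − E[A])⁺, the discounted put price can never fall below e^(−rT)(K − E[A])⁺, where E[A] is known in closed form under geometric Brownian motion (E[S_t] = S₀e^{rt}). The sketch below checks this on a generic GBM index, not the mortality index of the paper:

```python
import numpy as np

def asian_put_mc_and_bound(S0, r, sigma, T, K, n_steps=12, n_paths=20000, seed=1):
    """Monte Carlo price of an arithmetic-average Asian put under GBM,
    together with the Jensen lower bound e^{-rT} * (K - E[average])^+."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    # cumulative log-returns give the path of log(S_t / S0)
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    S = S0 * np.exp(log_paths)
    avg = S.mean(axis=1)
    price = np.exp(-r * T) * np.maximum(K - avg, 0.0).mean()
    t = dt * np.arange(1, n_steps + 1)
    bound = np.exp(-r * T) * max(K - (S0 * np.exp(r * t)).mean(), 0.0)
    return price, bound

price, bound = asian_put_mc_and_bound(S0=100, r=0.02, sigma=0.2, T=1.0, K=100)
print(price, bound)  # the MC price always sits at or above the Jensen bound
```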

Keywords: mortality bond, Swiss Re Bond, mortality index, comonotonicity

Procedia PDF Downloads 233
3344 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption

Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses

Abstract:

This paper addresses the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby meets the aspiration for computational encryption models that could enhance the security of big data with respect to the privacy, confidentiality, and availability of the users. The cryptographic model applied for the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute theoretical presentations of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, with detailed theoretical mathematical concepts for fully homomorphic encryption models. This contribution supports the full implementation of a big data analytics-based cryptographic security algorithm.
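
The core idea of computing on ciphertexts can be shown with a toy Paillier-style scheme. Note the hedge: this is only *additively* homomorphic and uses deliberately insecure tiny primes; full FHE, the subject of the paper, additionally supports multiplication and bootstrapping:

```python
import math, random

# Toy Paillier-style additively homomorphic scheme (INSECURE demo parameters).
p, q = 13, 17
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam % n, -1, n)          # inverse of L(g^lam mod n^2), since g = n + 1

def encrypt(m, rng=random.Random(0)):
    while True:                    # pick randomizer r coprime to n
        r = rng.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n  # L(x) = (x - 1) / n
    return (L * mu) % n

c = (encrypt(7) * encrypt(15)) % n2   # multiplying ciphertexts...
print(decrypt(c))                     # ...adds the plaintexts: 7 + 15 = 22
```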

Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme

Procedia PDF Downloads 362
3343 Saturation Misbehavior and Field Activation of the Mobility in Polymer-Based OTFTs

Authors: L. Giraudet, O. Simonetti, G. de Tournadre, N. Dumelié, B. Clarenc, F. Reisdorffer

Abstract:

In this paper we intend to give a comprehensive view of the saturation misbehavior of thin film transistors (TFTs) based on disordered semiconductors, such as most organic TFTs, and its link to the field activation of the mobility. Experimental evidence of the field activation of the mobility is given for disordered-semiconductor-based TFTs when the gate length is reduced. Saturation misbehavior is observed simultaneously. Advanced transport models have been implemented in a quasi-2D numerical TFT simulation tool. From the numerical simulations it is clearly established that field activation of the mobility alone cannot explain the saturation misbehavior. Evidence is given that the high longitudinal field gradient at the drain end of the channel is responsible for an excess charge accumulation, preventing saturation. The two effects combined allow reproducing the experimental output characteristics of short-channel TFTs, with S-shaped characteristics and saturation failure.
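
Field activation of the mobility is commonly written in Poole-Frenkel form, μ(E) = μ₀·exp(γ√E); the sketch below (with illustrative μ₀ and γ, not values fitted by the authors) shows why shrinking the gate length at fixed drain voltage raises the apparent mobility:

```python
import numpy as np

def pf_mobility(E, mu0=1e-3, gamma=3e-3):
    """Poole-Frenkel-type field-activated mobility mu = mu0 * exp(gamma*sqrt(E)).
    mu0 in cm^2/(V s), E in V/cm; parameter values are illustrative only."""
    return mu0 * np.exp(gamma * np.sqrt(np.asarray(E, dtype=float)))

# Average channel field grows as the gate length shrinks at fixed 10 V bias
for L_um in (20, 5, 1):
    E = 10.0 / (L_um * 1e-4)          # field in V/cm
    print(L_um, pf_mobility(E))       # shorter channel -> higher mobility
```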

Keywords: mobility field activation, numerical simulation, OTFT, saturation failure

Procedia PDF Downloads 499
3342 Links between Landscape Management and Environmental Risk Assessment: Considerations from the Italian Context

Authors: Mara Balestrieri, Clara Pusceddu

Abstract:

Issues relating to the destructive phenomena that can damage people and goods have returned to the centre of debate in Italy with the increase in catastrophic episodes in recent years in a country that is highly vulnerable to hydrological risk. Environmental factors and geological and geomorphological territorial characteristics play an important role in determining the level of vulnerability and the natural tendency to risk. However, a territory is also subject to the requirements and transformations of society, and this brings other relevant factors into play. The reasons for the increase in destructive phenomena are often to be found in the territorial development models adopted. Stewardship of the landscape and management of risk are related issues. This study aims to summarize the most relevant elements of this connection and, at the same time, to clarify the role of environmental risk assessment as a tool to aid the sustainable management of landscape. How do planners relate to this problem, and which aspects should be monitored in order to prepare responsible and useful interventions?

Keywords: assessment, landscape, risk, planning

Procedia PDF Downloads 443
3341 Removal of Cr⁶⁺, Co²⁺ and Ni²⁺ Ions from Aqueous Solutions by Algerian Enteromorpha compressa (L.) Biomass

Authors: Asma Aid, Samira Amokrane, Djamel Nibou, Hadj Mekatel

Abstract:

The marine Enteromorpha compressa (L.) (ECL) biomass was used as a low-cost biological adsorbent for the removal of Cr⁶⁺, Co²⁺ and Ni²⁺ ions from artificially contaminated aqueous solutions. The operating variables pH, initial concentration C₀, solid/liquid ratio R, and temperature T were studied. A full factorial experimental design technique enabled us to obtain a mathematical model describing the adsorption of Cr⁶⁺, Co²⁺ and Ni²⁺ ions and to study the main effects and interactions among the operational parameters. The equilibrium isotherm has been analyzed by the Langmuir, Freundlich, and Dubinin-Radushkevich models; it has been found that the adsorption process follows the Langmuir model for the ions studied. Kinetic studies showed that the pseudo-second-order model fits our experimental data. Thermodynamic parameters showed that adsorption is endothermic and spontaneous for Cr⁶⁺ ions and exothermic for Co²⁺ and Ni²⁺ ions.
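
The Langmuir analysis mentioned above is usually done with the linearized form Ce/qe = Ce/qm + 1/(qm·b), so that a straight-line fit yields the maximum capacity qm and affinity constant b. A minimal sketch on synthetic equilibrium data (the values are illustrative, not the paper's measurements):

```python
import numpy as np

def fit_langmuir(Ce, qe):
    """Linearized Langmuir fit: Ce/qe = Ce/qm + 1/(qm*b).
    Returns (qm, b): maximum adsorption capacity and affinity constant."""
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qm = 1.0 / slope
    b = slope / intercept
    return qm, b

# Synthetic equilibrium data generated from a known isotherm
qm_true, b_true = 45.0, 0.8                          # mg/g, L/mg
Ce = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])     # equilibrium conc., mg/L
qe = qm_true * b_true * Ce / (1 + b_true * Ce)       # adsorbed amount, mg/g

qm, b = fit_langmuir(Ce, qe)
print(qm, b)  # recovers qm = 45, b = 0.8
```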

Keywords: Enteromorpha compressa, adsorption process, Cr⁶⁺, Co²⁺ and Ni²⁺, equilibrium isotherm

Procedia PDF Downloads 177
3340 Generating Swarm Satellite Data Using LSTM and GAN for the Detection of Seismic Precursors

Authors: Yaxin Bi

Abstract:

Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models tailored to time-series data, to generate synthetic time series based on Swarm satellite data, which will be used for detecting seismic anomalies. The LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the distribution of the time series. These findings highlight both the promise and the challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning for generating synthetic electromagnetic satellite data.
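
Before either model can be trained, the satellite time series must be cut into supervised (window, next value) pairs; this preprocessing step can be sketched as follows (the sine series is a stand-in for a Swarm magnetic-field residual, and the window length is an assumption):

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Slice a 1-D time series into (input window, target) pairs,
    the supervised format an LSTM sequence generator is trained on."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window + horizon - 1])
    return np.array(X), np.array(y)

# Hypothetical smooth signal standing in for satellite data
series = np.sin(np.linspace(0, 20, 200))
X, y = make_windows(series, window=24)
print(X.shape, y.shape)  # 176 windows of length 24, 176 scalar targets
```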

Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors

Procedia PDF Downloads 7
3339 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights

Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy

Abstract:

The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches such as public or private law. On the other hand, law is a national phenomenon. The law of one nation and the legal system applied on the territory of another nation may be completely different. People who are experts in a particular field of law in one country may have insufficient expertise in the law of another country. Today, in addition to the local nature of law, international and even supranational law rules are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools will perhaps serve to produce very meaningful results in terms of human rights. However, algorithms to be used should not be developed by only computer experts, but also need the contribution of people who are familiar with law, values, judicial decisions, and even the social and political culture of the society to which it will provide solutions. Otherwise, even if the algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field in the discipline of law. More AI systems are already being applied in the field of law, with examples such as predicting judicial decisions, text summarization, decision support systems, and classification of documents. Algorithms for legal systems employing AI tools, especially in the field of prediction of judicial decisions and decision support systems, have the capacity to create automatic decisions instead of judges. 
When the judge is removed from this equation, artificial-intelligence-made law, created by an intelligent algorithm on its own, emerges, whether the domain is national or international law. In this work, the aim is to make a general analysis of this new topic. Such an analysis needs both a literature survey and a perspective from computer experts' and lawyers' points of view. In some societies, the use of prediction or decision support systems may be useful for integrating international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and human-rights-protective results than written or living law. In non-democratic countries, it may even be thought that direct decisions and artificial-intelligence-made law would be more protective than a decision "support" system. Since the values of law are directed towards human happiness and well-being, the AI algorithms should always be capable of serving this purpose and be based on the rule of law, the principle of justice and equality, and the protection of human rights.

Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems

Procedia PDF Downloads 57
3338 Mapping Alternative Education in Italy: The Case of Popular and Second-Chance Schools and Interventions in Lombardy

Authors: Valeria Cotza

Abstract:

School drop-out is a multifactorial phenomenon that in Italy concerns all those underage students who, at different stages of school (up to 16 years old) or training (up to 18 years old), manifest educational difficulties, ranging from dropping out of compulsory education without obtaining a qualification to grade repetition and absenteeism. From the 1980s to the 2000s, there was a progressive shift from the economic and social model towards a multifactorial reading of the phenomenon, and the European Commission noted the importance of learning about the phenomenon through approaches able to integrate large-scale quantitative surveys with qualitative analyses. It is not a matter of identifying the contextual factors affecting the phenomenon but of problematising them by means of systemic and comprehensive in-depth analysis. Thus, a privileged point of observation and field of intervention are those schools that propose models of teaching and learning alternative to the traditional ones, such as popular and second-chance schools. Alternative schools and interventions have grown in recent years in Europe as well as in the US and Latin America, working in the direction of greater equity to create the conditions (often absent in conventional schools) for everyone to achieve educational goals. In contrast to the extensive Anglo-Saxon and US literature on this topic, there is as yet no unambiguous definition of alternative education, especially in Europe, where second-chance education has been the most studied. There is little literature on second-chance education in Italy and almost none on alternative education (with the exception of method schools, to which the concept of "alternative" is linked in Italy). This research aims to fill the gap by systematically surveying the alternative interventions in the area and beginning to explore some models of popular and second-chance schools and experiences through a mixed-methods approach.
So, the main research objectives concern the spread of alternative education in the Lombardy region, the main characteristics of these schools and interventions, and their effectiveness in terms of students’ well-being and school results. This paper seeks to answer the first point by presenting the preliminary results of the first phase of the project dedicated to mapping. Through the Google Forms platform, a questionnaire is being distributed to all schools in Lombardy and some schools in the rest of Italy to map the presence of alternative schools and interventions and their main characteristics. The distribution is also taking place thanks to the support of the Milan Territorial and Lombardy Regional School Offices. Moreover, other social realities outside the school system (such as cooperatives and cultural associations) can be questioned. The schools and other realities to be questioned outside Lombardy will also be identified with the support of INDIRE (Istituto Nazionale per Documentazione, Innovazione e Ricerca Educativa, “National Institute for Documentation, Innovation and Educational Research”) and based on existing literature and the indicators of “Futura” Plan of the PNRR (Piano Nazionale di Ripresa e Resilienza, “National Recovery and Resilience Plan”). Mapping will be crucial and functional for the subsequent qualitative and quantitative phase, which will make use of statistical analysis and constructivist grounded theory.

Keywords: school drop-out, alternative education, popular and second-chance schools, map

Procedia PDF Downloads 62
3337 Development of Microwave-Assisted Alkalic Salt Pretreatment Regimes for Enhanced Sugar Recovery from Corn Cobs

Authors: Yeshona Sewsynker

Abstract:

This study presents three microwave-assisted alkalic salt pretreatments to enhance the delignification and enzymatic saccharification of corn cobs. The effects of the process parameters of salt concentration (0-15%), microwave power intensity (0-800 W), and pretreatment time (2-8 min) on the reducing sugar yield from corn cobs were investigated. Pretreatment models were developed with high coefficient of determination values (R² > 0.85). Optimization gave a maximum reducing sugar yield of 0.76 g/g. Scanning electron microscopy (SEM) and Fourier transform infrared (FTIR) analysis showed major changes in the lignocellulosic structure after pretreatment. A 7-fold increase in the sugar yield was observed compared to previous reports on the same substrate. The developed pretreatment strategy was effective for enhancing enzymatic saccharification of lignocellulosic wastes for microbial biofuel production processes and value-added products.

Keywords: pretreatment, lignocellulosic biomass, enzymatic hydrolysis, delignification

Procedia PDF Downloads 483
3336 The Delaying Influence of Degradation on the Divestment of Gas Turbines for Associated Gas Utilisation: Part 1

Authors: Mafel Obhuo, Dodeye I. Igbong, Duabari S. Aziaka, Pericles Pilidis

Abstract:

An important feature of the exploitation of associated gas as fuel for gas turbine engines is a declining supply. So when exploiting this resource, the divestment of prime movers is very important as the fuel supply diminishes with time. This paper explores the influence of engine degradation on the timing of divestments. Hypothetical but realistic gas turbine engines were modelled with Turbomatch, the Cranfield University gas turbine performance simulation tool. The results were deployed in three degradation scenarios within the TERA (Techno-economic and environmental risk analysis) framework to develop economic models. An optimisation with Genetic Algorithms was carried out to maximize the economic benefit. The results show that degradation will have a significant impact. It will delay the divestment of power plants, while they are running less efficiently. Over a 20 year investment, a decrease of $0.11bn, $0.26bn and $0.45bn (billion US dollars) were observed for the three degradation scenarios as against the clean case.
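
The economic comparison above rests on net present value; as a minimal sketch (the cash flows and discount rate below are invented, merely echoing the declining-supply profile, not figures from the TERA study):

```python
def npv(cash_flows, rate):
    """Net present value of end-of-year cash flows (years 1..N)
    at a constant annual discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Illustrative declining revenues from a diminishing gas supply
# (billion USD per year), discounted at 10%
flows = [0.50, 0.45, 0.38, 0.30, 0.20]
print(npv(flows, 0.10))
```

Degradation shifts such cash-flow profiles downward over time, which is what moves the optimal divestment year in the genetic-algorithm search.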

Keywords: economic return, flared associated gas, net present value, optimization

Procedia PDF Downloads 119
3335 Resume Ranking Using Custom Word2vec and Rule-Based Natural Language Processing Techniques

Authors: Subodh Chandra Shakya, Rajendra Sapkota, Aakash Tamang, Shushant Pudasaini, Sujan Adhikari, Sajjan Adhikari

Abstract:

Many efforts have been made to measure the semantic similarity between the text corpora in documents, and techniques have evolved to measure the similarity of two documents. One such state-of-the-art technique in the field of Natural Language Processing (NLP) is the word-to-vector (word2vec) model, which converts words into word embeddings and measures the similarity between the resulting vectors. We found this to be quite useful for the task of resume ranking. This research paper therefore implements the word2vec model, along with other Natural Language Processing techniques, to rank resumes for a particular job description so as to automate the hiring process. The paper proposes the system and reports the findings made during the process of building it.
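
The ranking step can be sketched as averaging word vectors into a document vector and sorting resumes by cosine similarity to the job description. The tiny hand-made vectors below stand in for a trained word2vec model; the tokens and resumes are invented for illustration:

```python
import numpy as np

# Toy 3-d word vectors standing in for a trained word2vec model
vectors = {
    "python":  np.array([0.9, 0.1, 0.0]),
    "ml":      np.array([0.8, 0.3, 0.1]),
    "nursing": np.array([0.0, 0.1, 0.9]),
    "care":    np.array([0.1, 0.0, 0.8]),
}

def doc_vector(tokens):
    """Average the vectors of in-vocabulary tokens into one document vector."""
    vecs = [vectors[t] for t in tokens if t in vectors]
    return np.mean(vecs, axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

job = doc_vector(["python", "ml"])
resumes = {"dev": ["python", "ml"], "nurse": ["nursing", "care"]}
ranked = sorted(resumes, key=lambda r: cosine(job, doc_vector(resumes[r])),
                reverse=True)
print(ranked)  # the software resume ranks above the nursing one: ['dev', 'nurse']
```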

Keywords: chunking, document similarity, information extraction, natural language processing, word2vec, word embedding

Procedia PDF Downloads 135
3334 Multiscale Modelization of Multilayered Bi-Dimensional Soils

Authors: I. Hosni, L. Bennaceur Farah, N. Saber, R Bennaceur

Abstract:

Soil moisture content is a key variable in many environmental sciences. Even though it represents a small proportion of the liquid freshwater on Earth, it modulates interactions between the land surface and the atmosphere, thereby influencing climate and weather. Accurate modeling of the above processes depends on the ability to provide a proper spatial characterization of soil moisture. The measurement of soil moisture content allows assessment of soil water resources in the field of hydrology and agronomy. The second parameter in interaction with the radar signal is the geometric structure of the soil. Most traditional electromagnetic models consider natural surfaces as single scale zero mean stationary Gaussian random processes. Roughness behavior is characterized by statistical parameters like the Root Mean Square (RMS) height and the correlation length. Then, the main problem is that the agreement between experimental measurements and theoretical values is usually poor due to the large variability of the correlation function, and as a consequence, backscattering models have often failed to predict correctly backscattering. In this study, surfaces are considered as band-limited fractal random processes corresponding to a superposition of a finite number of one-dimensional Gaussian process each one having a spatial scale. Multiscale roughness is characterized by two parameters, the first one is proportional to the RMS height, and the other one is related to the fractal dimension. Soil moisture is related to the complex dielectric constant. This multiscale description has been adapted to two-dimensional profiles using the bi-dimensional wavelet transform and the Mallat algorithm to describe more correctly natural surfaces. We characterize the soil surfaces and sub-surfaces by a three layers geo-electrical model. 
The upper layer is described by its dielectric constant, its thickness, a multiscale bi-dimensional surface roughness model based on the wavelet transform and the Mallat algorithm, and volume scattering parameters. The lower layer is divided into three fictitious layers separated by assumed plane interfaces. These three layers are modeled as an effective medium characterized by an apparent effective dielectric constant that takes into account the presence of air pockets in the soil. We adopted the 2D multiscale three-layer small perturbation model (SPM) to simulate the radar backscattering, including first air pockets in the soil sub-structure and then a vegetation canopy in the soil surface structure. A sensitivity analysis of the dependence of the backscattering coefficient on multiscale roughness and soil moisture was performed. We then modified the dielectric constant of the multilayer medium to take into account the different moisture values of each layer in the soil. A sensitivity analysis of the backscattering coefficient, including the air pockets in the volume structure, with respect to the multiscale roughness parameters and the apparent dielectric constant was carried out. Finally, we studied the behavior of the radar backscattering coefficient for a soil with a vegetation layer at its surface.
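As a rough illustration of the band-limited multiscale description in this abstract, a surface profile can be synthesized as a finite superposition of tones whose amplitudes decay across scales. The sketch below is a Weierstrass-Mandelbrot-style construction; the function name, parameters, and scale law are illustrative assumptions, not the authors' exact formulation:

```python
import math
import random

def wm_profile(x, s, nu, n_scales=8, k0=2 * math.pi, b=2.0, seed=0):
    """Band-limited multiscale profile: a superposition of a finite number
    of tones with random phases, one per spatial scale.
    s  ~ overall roughness amplitude (proportional to the RMS height)
    nu ~ amplitude-decay base across scales (related to fractal dimension)"""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_scales)]
    return s * sum(nu ** (-n) * math.sin(k0 * b ** n * x + phases[n])
                   for n in range(n_scales))
```

Here `s` plays the role of the RMS-height-like parameter and `nu` controls how quickly amplitudes decay from one scale to the next, mirroring the two roughness parameters the abstract describes.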

Keywords: multiscale, bidimensional, wavelets, backscattering, multilayer, SPM, air pockets

Procedia PDF Downloads 106
3333 Development of a Miniature Laboratory Lactic Goat Cheese Model to Study the Expression of Spoilage by Pseudomonas Spp. In Cheeses

Authors: Abirami Baleswaran, Christel Couderc, Loubnah Belahcen, Jean Dayde, Hélène Tormo, Gwénaëlle Jard

Abstract:

Cheeses are often reported to be spoiled by Pseudomonas spp., which is responsible for defects in appearance, texture, taste, and smell, leading to cheeses not being marketed and even being destroyed. Despite preventive actions, problems linked to Pseudomonas spp. are difficult to control because of the lack of knowledge and control of these contaminants during cheese manufacturing. Lactic goat cheese producers are not spared by this problem and are looking for solutions to decrease the number of spoiled cheeses. Exploring different hypotheses requires experiments; however, cheese-making experiments at the pilot scale are expensive and time-consuming. Thus, there is a real need to develop a miniature cheese model system under controlled conditions. In a previous study, several miniature cheese models corresponding to different types of commercial cheeses were developed for different purposes. The models were used, for example, to study the influence of milk, starter cultures, pathogen-inhibiting additives, enzymatic reactions, microflora, and the freezing process on cheese. Nevertheless, no miniature model had been described for lactic goat cheese. The aim of this work was to develop a miniature cheese model system under controlled laboratory conditions which resembles commercial lactic goat cheese, in order to study Pseudomonas spp. spoilage during the manufacturing and ripening process. First, a protocol for the preparation of miniature cheeses (3.5 times smaller than a commercial one) was designed based on the cheese factory manufacturing process. The process was adapted from the “Rocamadour” technology and involves maturation of pasteurized milk, coagulation, removal of whey by centrifugation, moulding, and ripening in a small-scale cellar. Microbiological (total bacterial count, yeasts, moulds) and physicochemical (pH, salt-in-moisture, moisture on a fat-free basis) analyses were performed at four key stages of the process (before salting, after salting, first day of ripening, and end of ripening).
Factory and miniature cheese volatilomes were also obtained by full-scan SIFT-MS analysis. Then, Pseudomonas spp. strains isolated from contaminated cheeses were selected based on their origin, their ability to produce pigments, and their enzymatic activities (proteolytic, lecithinase, and lipolytic). Factory and miniature curds were inoculated by spotting the selected strains on the cheese surface. The expression of cheese spoilage was evaluated by counting Pseudomonas spp. levels during ripening and by visual observation and under a UV lamp. The physicochemical and microbiological compositions of the miniature cheeses showed that the miniature process resembles the factory process. As expected, differences in volatilomes were observed, probably because the miniature cheeses are made using pasteurized milk to better control the microbiological conditions, and also because the small cheese format probably affected ripening, even though the humidity and temperature in the cellar were quite similar. The spoilage expression of Pseudomonas spp. was observed in both miniature and factory cheeses. This confirms that the proposed model is suitable for preparing miniature cheese specimens for the study of Pseudomonas spp. spoilage in lactic cheeses. This kind of model could be deployed for other applications and other types of cheese.

Keywords: cheese, miniature, model, pseudomonas spp, spoilage

Procedia PDF Downloads 120
3332 Bioengineering System for Prediction and Early Prenosological Diagnostics of Stomach Diseases Based on Energy Characteristics of Bioactive Points with Fuzzy Logic

Authors: Mahdi Alshamasin, Riad Al-Kasasbeh, Nikolay Korenevskiy

Abstract:

We apply mathematical models of the interaction between internal organs and the biologically active points of meridian structures. Among the diseases for which reflex diagnostics are effective are diseases of the stomach. It is shown that fuzzy logic decision-making, based on the reaction energy of biologically active points (acupuncture points), yields good results for the prediction and early diagnosis of gastrointestinal tract diseases.
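To illustrate the kind of fuzzy decision-making this abstract describes, a minimal sketch can grade the energy reading at each point with a membership function and accumulate evidence with MYCIN-style confidence factors. The membership shape, thresholds, and combination rule below are illustrative assumptions, not the authors' model:

```python
def tri_membership(x, a, b, c):
    """Triangular membership function peaking at b with support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def combine_cf(cf1, cf2):
    """Accumulate two non-negative confidence factors from independent
    pieces of evidence (MYCIN-style combination rule)."""
    return cf1 + cf2 * (1.0 - cf1)

# Example: normalized energy readings at two diagnostically important points,
# graded against a hypothetical "elevated energy" fuzzy set on [0.4, 1.0].
mu1 = tri_membership(0.70, 0.4, 0.8, 1.0)
mu2 = tri_membership(0.55, 0.4, 0.8, 1.0)
risk = combine_cf(mu1, mu2)  # combined confidence in a positive finding
```

Each additional supporting point raises the combined confidence toward 1 without ever exceeding it, which is the property that makes this rule convenient for chaining evidence from several acupuncture points.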

Keywords: acupuncture points, fuzzy logic, diagnostically important points (DIP), confidence factors, membership functions, stomach diseases

Procedia PDF Downloads 445
3331 Fairness in Recommendations Ranking: From Pairwise Approach to Listwise Approach

Authors: Patik Joslin Kenfack, Polyakov Vladimir Mikhailovich

Abstract:

Machine Learning (ML) systems are trained on human-generated data that can be biased, implicitly containing racist, sexist, or otherwise discriminatory content. ML models learn those biases or even amplify them. Recent research has begun to consider issues of fairness, and the concept has been extended to recommendation: a recommender system is considered fair if it does not under-rank items of a protected group (gender, race, demographics, etc.). Several metrics for evaluating fairness concerns in recommendation systems have been proposed which take pairs of items as ‘instances’ in fairness evaluation. They do not take into account the fact that fairness should be evaluated across a list of items. This paper explores a probabilistic approach that generalizes the pairwise metrics by using a list of k items (listwise) as the ‘instance’ in fairness evaluation, parametrized by k. We also explore a new regularization method based on this metric to improve fairness of ranking during model training.
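A minimal sketch of a listwise, top-k fairness measure in the spirit of this abstract compares the average discounted exposure received by protected and unprotected items. The logarithmic discount and the disparity definition are illustrative assumptions, not the paper's exact metric:

```python
import math

def group_exposure(ranking, protected, k):
    """Average discounted exposure (1 / log2(rank + 1)) of protected vs.
    unprotected items within the top-k of a ranking."""
    exposures = {True: [], False: []}
    for rank, item in enumerate(ranking[:k], start=1):
        exposures[item in protected].append(1.0 / math.log2(rank + 1))
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(exposures[True]), avg(exposures[False])

def exposure_disparity(ranking, protected, k):
    """Listwise unfairness for a list of k items: 0 means both groups
    receive equal average exposure in the top-k."""
    prot, unprot = group_exposure(ranking, protected, k)
    return abs(prot - unprot)
```

Evaluated over many recommendation lists, a disparity averaged across lists (and parametrized by k) gives the kind of listwise quantity that could also serve as a differentiable-surrogate target for the regularization the abstract mentions.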

Keywords: fairness, recommender system, ranking, listwise approach

Procedia PDF Downloads 126
3330 Implementation and Design of Fuzzy Controller for High Performance DC-DC Boost Converters

Authors: A. Mansouri, F. Krim

Abstract:

This paper discusses the implementation and design of both linear PI and fuzzy controllers for DC-DC boost converters. The design of the PI controller is based on the temporal response of the closed-loop converter, while the fuzzy controller design is based on heuristic knowledge of boost converters. Linear controller implementation is quite straightforward, relying on mathematical models, while fuzzy controller implementation employs one or more artificial intelligence techniques. The two boost controllers are compared in terms of design. Experimental results show that the proposed fuzzy controller is robust against input voltage and load resistance changes and with respect to start-up transients. The results indicate that the fuzzy controller achieves the best control performance in terms of faster transient response, good steady-state stability, and accuracy under different operating conditions, and is therefore more suitable for controlling boost converters.
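For the linear side of the comparison, a discrete PI loop that computes a clamped duty cycle for the boost stage might look like the following sketch (the gains, sample time, and duty limits are illustrative placeholders, not the paper's tuned values):

```python
class PIController:
    """Discrete PI voltage controller producing a duty cycle for a boost
    converter. Output is clamped to a safe duty range; gains and limits
    here are illustrative, not tuned for any specific converter."""

    def __init__(self, kp, ki, dt, d_min=0.0, d_max=0.9):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.d_min, self.d_max = d_min, d_max
        self.integral = 0.0

    def step(self, v_ref, v_out):
        """One control period: integrate the voltage error and return
        the clamped duty cycle command."""
        err = v_ref - v_out
        self.integral += err * self.dt
        duty = self.kp * err + self.ki * self.integral
        return min(self.d_max, max(self.d_min, duty))
```

A fuzzy controller would replace the fixed-gain law in `step` with rule-based inference on the error and its change, which is what makes it less dependent on an accurate converter model.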

Keywords: boost DC-DC converter, fuzzy, PI controllers, power electronics and control system

Procedia PDF Downloads 450
3329 Local Activities of the Membranes Associated with Glycosaminoglycan-Chitosan Complexes in Bone Cells

Authors: Chih-Chang Yeh, Min-Fang Yang, Hsin-I Chang

Abstract:

Chitosan is a cationic polysaccharide derived from the partial deacetylation of chitin. Hyaluronic acid (HA), chondroitin sulfate (CS), and heparin (HP) are anionic glycosaminoglycans (GCGs) which can regulate osteogenic activity. In this study, chitosan membranes were prepared by a glutaraldehyde crosslinking reaction and then complexed with three different types of GCGs. 7F2 osteoblast-like cells and Raw264.7 macrophages were used as models to study the influence of the chitosan membranes on bone metabolism. Although chitosan membranes are highly hydrophilic, the membranes associated with GCG-chitosan complexes showed about 60-70% cell attachment. Furthermore, the membranes associated with HP-chitosan complexes increased ALP activity in comparison with chitosan films alone. All three types of membranes associated with GCG-chitosan complexes significantly inhibited LPS-induced nitric oxide expression. In addition, chitosan membranes associated with HP and HA, but not with CS, down-regulated tartrate-resistant acid phosphatase (TRAP) activity. Based on these results, we conclude that chitosan membranes associated with HP increase ALP activity in osteoblasts, and that chitosan membranes associated with HP and HA reduce TRAP activity in osteoclasts.

Keywords: osteoblast, osteoclast, chitosan, glycosaminoglycan

Procedia PDF Downloads 504
3328 Exploring Probabilistic Models for Transient Stability Analysis of Renewable-Dominant Power Grid

Authors: Phuong Nguyen

Abstract:

Along with the ongoing energy transition, the electrical power system is becoming more vulnerable with the increasing penetration of renewable energy sources (RES). By replacing a large number of fossil-fuel-based power plants with RES, the rotating mass of the power grid decreases drastically, as has been reported by a number of system operators. This poses a huge challenge for operators to secure the operation of their grids across all time horizons, from sub-seconds to minutes and even hours. There is a need to revise grid capabilities in dealing with transient (angle) stability and voltage dynamics. While traditional approaches relied on deterministic (worst-case) scenarios, there is also a need to cover the whole range of probabilities associated with the wide range of uncertainties introduced by massive RES deployment. To help address these issues, this paper focuses on developing a new analytical approach for transient stability.
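The link between rotating mass and transient (angle) stability that the abstract highlights can be illustrated with the classical swing equation. The sketch below is a generic textbook single-machine formulation with explicit-Euler integration, not the paper's analytical approach:

```python
import math

def swing_step(delta, omega, pm, pe_max, H, dt,
               ws=2.0 * math.pi * 50.0, D=0.0):
    """One explicit-Euler step of the classical swing equation:
        d(delta)/dt = omega
        d(omega)/dt = ws / (2H) * (Pm - Pe_max * sin(delta) - D * omega)
    delta: rotor angle [rad], omega: angle deviation speed [rad/s],
    H: inertia constant [s]. Lower H (less rotating mass) accelerates
    the rotor faster for the same power imbalance."""
    d_delta = omega
    d_omega = ws / (2.0 * H) * (pm - pe_max * math.sin(delta) - D * omega)
    return delta + dt * d_delta, omega + dt * d_omega
```

Sweeping `H` downward in a simulated fault-and-clear sequence shows the angle excursion growing, which is exactly the loss-of-inertia vulnerability the abstract attributes to high RES penetration; a probabilistic analysis would then sample `pm`, `pe_max`, and the fault parameters from distributions instead of fixing a worst case.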

Keywords: transient stability, uncertainties, renewable energy sources, analytical approach

Procedia PDF Downloads 54
3327 The Role of Communicative Grammar in Cross-Cultural Learning Environment

Authors: Tonoyan Lusine

Abstract:

The Communicative Grammar (CG) of a language deals with semantics and pragmatics in the first place, as communication is a process of generating speech. As is well known, people can communicate with the help of a limited set of word expressions and grammatical means. As for non-verbal communication, neither vocabulary nor grammar is essential at all. However, the development of communicative competence lies in verbal, non-verbal, grammatical, socio-cultural, and intercultural awareness. There are several important issues and environment-management strategies related to effective communication that one might need to consider for a positive learning experience. International students bring a broad range of cultural perspectives to the learning environment, and this diversity has the capacity to improve interaction and to enrich the teaching/learning process. An intercultural setting implies creative and thought-provoking work with different cultural worldviews and international perspectives. It is worth mentioning that the use of Communicative Grammar models creates a profound background for effective intercultural communication.

Keywords: CG, cross-cultural communication, intercultural awareness, non-verbal behavior

Procedia PDF Downloads 370
3326 Comparative Analysis of DTC Based Switched Reluctance Motor Drive Using Torque Equation and FEA Models

Authors: P. Srinivas, P. V. N. Prasad

Abstract:

Since torque ripple is the main cause of noise and vibration, the performance of the Switched Reluctance Motor (SRM) can be improved by minimizing its torque ripple using a novel control technique called Direct Torque Control (DTC). In the DTC technique, torque is controlled directly through control of the magnitude of the stator flux and the change in speed of the stator flux vector; the flux and torque are maintained within set hysteresis bands. The DTC of the SRM is analysed by two methods: in one, the actual torque is computed by conducting Finite Element Analysis (FEA) on the design specifications of the motor; in the other, the torque is computed by a simplified torque equation. The variation of peak current, average current, torque ripple, and speed settling time with the simplified torque equation model is compared with the FEA-based model.
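The hysteresis-band logic at the heart of DTC can be sketched with simple comparators. A three-level torque comparator and a two-level flux comparator are assumed here for illustration; the paper's exact band structure and switching table may differ:

```python
def dtc_command(torque_err, flux_err, torque_band, flux_band):
    """Hysteresis comparators of a DTC loop. Each error is pushed back
    inside its band: returns (torque_cmd, flux_cmd) where torque_cmd is
    in {-1, 0, +1} (decrease / hold / increase torque) and flux_cmd is
    in {0, 1} (decrease / increase flux). The commands would then index
    a switching table to pick the inverter voltage vector."""
    if torque_err > torque_band:
        torque_cmd = 1      # torque too low: increase
    elif torque_err < -torque_band:
        torque_cmd = -1     # torque too high: decrease
    else:
        torque_cmd = 0      # inside the band: hold
    flux_cmd = 1 if flux_err > flux_band else 0
    return torque_cmd, flux_cmd
```

Narrowing `torque_band` reduces torque ripple at the cost of a higher switching frequency, which is the basic trade-off a DTC design tunes.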

Keywords: direct torque control, simplified torque equation, finite element analysis, torque ripple

Procedia PDF Downloads 460
3325 A Parametric Study on the Backwater Level Due to a Bridge Constriction

Authors: S. Atabay, T. A. Ali, Md. M. Mortula

Abstract:

This paper presents the results and findings of a parametric study on the water surface elevation upstream of a bridge constriction for subcritical flow. The influence on the backwater level of Manning's roughness coefficient of the main channel (nmc) and of the floodplain (nfp), bridge opening (b), flow rate (Q), and the contraction (kcon) and expansion (kexp) coefficients was investigated. DECK bridge models with different span widths and without any pier were investigated within a two-stage channel having various roughness conditions. The widely used one-dimensional HEC-RAS model was used in this parametric study. The study showed that the effects of the main channel roughness (nmc) and flow rate (Q) on the backwater level are much greater than those of the floodplain roughness (nfp). The bridge opening (b) and the contraction (kcon) and expansion (kexp) coefficients have very little effect on the backwater level within this range of parameters.
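The dominance of Manning's n and flow rate Q over the other parameters can be illustrated with a simple normal-depth solver for a rectangular channel. This is a hand-rolled sketch based on Manning's equation, not HEC-RAS; the channel geometry and parameter values are illustrative:

```python
def manning_q(depth, n, width, slope):
    """Discharge from Manning's equation (SI units) in a rectangular
    channel: Q = (1/n) * A * R^(2/3) * sqrt(S)."""
    area = width * depth
    radius = area / (width + 2.0 * depth)   # hydraulic radius A/P
    return area * radius ** (2.0 / 3.0) * slope ** 0.5 / n

def normal_depth(q, n, width, slope, lo=1e-6, hi=50.0, tol=1e-8):
    """Invert Manning's equation for depth by bisection (Q is monotonic
    in depth for a rectangular section). Depth grows with both n and Q,
    which is why those two parameters dominate the backwater level."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if manning_q(mid, n, width, slope) < q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Sweeping `n` or `q` over a plausible range shifts the computed depth far more than comparable relative changes in secondary coefficients, mirroring the sensitivity ranking the study reports for nmc and Q versus kcon and kexp.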

Keywords: bridge backwater, parametric study, waterways, HEC-RAS model

Procedia PDF Downloads 287