Search results for: feature selection methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17833

15463 The Usage of Negative Emotive Words in Twitter

Authors: Martina Katalin Szabó, István Üveges

Abstract:

In this paper, the usage of negative emotive words is examined on the basis of a large Hungarian Twitter database via NLP methods. The data is analysed from a gender point of view, as well as for changes in language usage over time. The term negative emotive word refers to words that, on their own, without context, have semantic content associated with negative emotion but, in particular cases, may function as intensifiers (e.g. rohadt jó ’damn good’) or as sentiment expressions with positive polarity despite their negative prior polarity (e.g. brutális, ahogy ez a férfi rajzol ’it’s awesome (lit. brutal) how this guy draws’). Based on the findings of several authors, the same phenomenon can be found in other languages, so it is probably a language-independent feature. For the present analysis, 86,162 tweets were collected: 37,818 tweets (19,580 written by females and 18,238 by males) from 2016 and 48,344 (18,379 written by females and 29,965 by males) from 2021. The goal of the research was to compile two datasets comparable from the viewpoint of semantic changes as well as gender specificities. An exhaustive lexicon of Hungarian negative emotive intensifiers (214 words) was also compiled. After basic preprocessing steps, the tweets were processed with ‘magyarlanc’, a toolkit written in Java for the linguistic processing of Hungarian texts. Then, the frequency and collocation features of all these words in the corpus were automatically analyzed (via the parts of speech and sentiment values of the co-occurring words). Finally, the results of all four subcorpora were compared. Some of the main outcomes are as follows: there are almost four times fewer cases in the male corpus than in the female corpus in which the negative emotive intensifier modified a negative polarity word in the tweet (e.g., damn bad).
At the same time, male authors used these intensifiers more frequently to modify a positive polarity or neutral word (e.g., damn good and damn big). The results also show that, in contrast to female authors, male authors used these words much more frequently as positive polarity words themselves (e.g., brutális, ahogy ez a férfi rajzol ’it’s awesome (lit. brutal) how this guy draws’). We also observed that male authors use significantly fewer types of emotive intensifiers than female authors, and that the frequency distribution of the words is more balanced in the female corpus. As for changes in language usage over time, some notable differences in the frequency and collocation features of the examined words were identified: some of the words collocate with more positive words in the second subcorpus than in the first, which points to a semantic change of these words over time.
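The frequency and collocation counts described in this abstract can be sketched in a few lines. The mini-corpus of (intensifier, modified word, polarity) triples below is invented for illustration; the actual study derived such triples from the magyarlanc annotations.

```python
from collections import Counter

# Hypothetical mini-corpus of (intensifier, modified word, polarity of the
# modified word), standing in for the annotated tweet corpus.
tokens = [
    ("damn", "good", "positive"),
    ("damn", "bad", "negative"),
    ("damn", "big", "neutral"),
    ("brutal", "good", "positive"),
    ("damn", "good", "positive"),
]

# Frequency of each intensifier, and of the polarity classes it modifies
intensifier_freq = Counter(t[0] for t in tokens)
polarity_by_intensifier = Counter((t[0], t[2]) for t in tokens)

print(intensifier_freq["damn"])                       # 4
print(polarity_by_intensifier[("damn", "positive")])  # 2
```

Comparing such counts across the four gender/year subcorpora is what yields the proportions reported above.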

Keywords: gender differences, negative emotive words, semantic changes over time, twitter

Procedia PDF Downloads 198
15462 Transient Free Laminar Convection in the Vicinity of a Thermal Conductive Vertical Plate

Authors: Anna Bykalyuk, Frédéric Kuznik, Kévyn Johannes

Abstract:

In this paper, the influence of a vertical plate’s thermal capacity is numerically investigated in order to evaluate the evolution of the thermal boundary layer structure, the convective heat transfer coefficient, and the velocity and temperature profiles, while the heat flux of the heated vertical plate is evaluated under time-dependent boundary conditions. The most important feature of this problem is the unsteadiness of the physical phenomena. A 2D CFD model is developed in the Ansys Fluent 14.0 environment and is validated using unsteady data obtained for plasterboard studied under a dynamic temperature evolution. All the phenomena produced in the vicinity of the thermally conductive vertical plate (plasterboard) are analyzed and discussed. This work is the first stage of a holistic research effort on transient free convection that aims, in the future, to study natural convection in the vicinity of a vertical plate containing Phase Change Materials (PCM).

Keywords: CFD modeling, natural convection, thermal conductive plate, time-dependent boundary conditions

Procedia PDF Downloads 271
15461 Design Approach to Incorporate Unique Performance Characteristics of Special Concrete

Authors: Devendra Kumar Pandey, Debabrata Chakraborty

Abstract:

The advancement of various concrete ingredients such as plasticizers, additives, and fibers has enabled concrete technologists to develop many viable varieties of special concrete in recent decades. These varieties offer significant enhancements in the green as well as hardened properties of concrete. A prudent selection of the appropriate type of concrete can resolve many design and application issues in construction projects. This paper focuses on the usage of self-compacting concrete, high early strength concrete, structural lightweight concrete, fiber reinforced concrete, high performance concrete, and ultra-high strength concrete in structures. The modified properties, such as strength at various ages, flowability, porosity, equilibrium density, flexural strength, elasticity, and permeability, need to be carefully studied and incorporated into the design of the structures. The paper demonstrates various mixture combinations and the concrete properties that can be leveraged. The selection of such products based on the end use of structures is proposed in order to efficiently utilize the modified characteristics of these concrete varieties. The study involves mapping the characteristics to the benefits and savings for the structure from a design perspective. Self-compacting concrete is characterized by high shuttering loads, a better finish, and the feasibility of closer reinforcement spacing; the structural design procedures can accordingly specify higher formwork strength, taller vertical members, reduced cover, and increased ductility, with transverse reinforcement spaced at closer intervals than in regular structural concrete. Structural lightweight concrete allows structures to be designed for reduced dead load and increased insulation; member dimensions and steel requirements can be reduced in proportion to the roughly 25 to 35 percent reduction in dead load due to the self-weight of concrete.
Steel fiber reinforced concrete can be used to design grade slabs without primary reinforcement because of its 70 to 100 percent higher tensile strength; the design procedures incorporate reductions in thickness and joint spacing. High performance concrete extends the life of structures through improved paste characteristics and durability achieved by incorporating supplementary cementitious materials. Often, these mixes are also designed for slower heat generation in the initial phase of hydration, so the structural designer can account for the slower development of strength and specify a 56- or 90-day strength requirement. For high-rise building structures, the creep and elasticity properties of such concrete also need to be considered. Lastly, certain structures require performance under loading conditions much earlier than the final maturity of concrete; high early strength concrete has been designed to cater to a variety of usages at ages as early as 8 to 12 hours. An understanding of the performance specifications of special concrete is therefore a definite step towards a superior structural design approach.

Keywords: high performance concrete, special concrete, structural design, structural lightweight concrete

Procedia PDF Downloads 301
15460 A Literature Review on the Role of Local Potential for Creative Industries

Authors: Maya Irjayanti

Abstract:

Local creativity utilization has been a strategic investment to be expanded as a creative industry due to its significant contribution to the national gross domestic product. Many developed and developing countries look toward creative industries as an agenda for economic growth. This study aims to identify the role of local potential for creative industries from various empirical studies. The method involves a review of peer-reviewed journal articles and conference papers addressing local potential and creative industries. The literature review analysis includes several steps: material collection, descriptive analysis, category selection, and material evaluation. The expected outcome is a clustering of creative industries based on the local potential of various nations. In addition, the findings of this study will serve as a reference for future research exploring particular areas with well-known local potential for creative industry products.

Keywords: business, creativity, local potential, local wisdom

Procedia PDF Downloads 375
15459 Design, Simulation and Fabrication of Electro-Magnetic Pulse Welding Coil and Initial Experimentation

Authors: Bharatkumar Doshi

Abstract:

Electro-Magnetic Pulse Welding (EMPW) is a solid state welding process carried out at almost room temperature, in which joining is enabled by high-impact-velocity deformation. In this process, the energy stored in a high voltage capacitor bank is discharged into an EM coil, resulting in a damped sinusoidal current with an amplitude of several hundred kiloamperes. As a result, transient magnetic fields of a few tens of tesla are generated near the coil. As the conductive part (tube) is positioned in this area, an opposing eddy current is induced in it. Consequently, high Lorentz forces act on the part, accelerating it away from the coil. In the case of a tube, it is compressed at forming velocities of more than 300 meters per second. After crossing the joining gap, it collides with the second metallic joining part (rod), leading to the formation of a jet under appropriate collision conditions. Due to the prevailing high pressure, metallurgical bonding takes place. A characteristic feature is the wavy interface resulting from the heavy plastic deformation. In this process, the formation of intermetallic compounds, which might deteriorate the weld strength, can be avoided, even for metals with dissimilar thermal properties. It is critical to establish and optimize process parameters such as current, voltage, inductance, coil dimensions, workpiece dimensions, air gap, impact velocity, effective plastic strain, and the shear stress acting in the welding/impact zone. These process parameters can be determined by simulation using the Finite Element Method (FEM), in which a coupled electromagnetic-structural field analysis is performed. The feasibility of welding can thus be investigated by varying the parameters in a COMSOL simulation. The simulation results shall be applied in preliminary experiments on welding different alloy steel tubes and/or alloy steel to other materials.
A single turn coil (SS 304) with a field shaper (copper) has been designed and manufactured. The preliminary experiments were performed using the existing EMPW facility available at the Institute for Plasma Research, Gandhinagar, India. In these experiments, a 64 µF capacitor bank was charged to 22 kV and the energy was discharged into the single turn EM coil. Welding of axisymmetric components such as an aluminum tube and rod has been proven experimentally using the EMPW technique. In this paper, the EM coil design, its manufacturing, the electromagnetic-structural FEM simulation of magnetic pulse welding, and the preliminary experimental results are reported.
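The damped sinusoidal discharge current described above follows from modeling the capacitor bank and coil as a series RLC circuit. The sketch below uses the paper's 22 kV / 64 µF values, but the inductance and resistance are hypothetical round numbers chosen only to illustrate why the amplitude lands in the several-hundred-kiloampere range.

```python
import math

# Series RLC discharge of a capacitor bank into an EM coil.
# V0 and C come from the abstract; L and R are assumed values.
V0 = 22e3   # charging voltage, 22 kV
C = 64e-6   # capacitance, 64 uF
L = 1e-6    # assumed total circuit inductance, 1 uH
R = 10e-3   # assumed total circuit resistance, 10 mOhm

alpha = R / (2 * L)                        # damping coefficient
omega = math.sqrt(1 / (L * C) - alpha**2)  # damped angular frequency

def current(t):
    """Discharge current i(t) = (V0 / (omega L)) exp(-alpha t) sin(omega t)."""
    return V0 / (omega * L) * math.exp(-alpha * t) * math.sin(omega * t)

# The first peak occurs near a quarter of the ringing period; with these
# assumed values its amplitude is on the order of 1e5 amperes.
t_quarter = (math.pi / 2) / omega
print(round(current(t_quarter) / 1e3))  # peak current in kA
```

Sweeping L, R, and the charging voltage in such a lumped model is a cheap first pass before the coupled electromagnetic-structural FEM analysis.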

Keywords: COMSOL, EMPW, FEM, Lorentz force

Procedia PDF Downloads 174
15458 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor

Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric

Abstract:

Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in a reasonably short time, which first of all requires a powerful dynamic car detector model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach computes block-wise HOG features from foreground blobs using a configurable search window and pathway, in order to overcome the shortcomings of the HOG descriptor in terms of computing time and to improve its performance in dynamic applications. Indeed, we show that the block-wise HOG descriptor combined with motion parameters is a very suitable car detector, which reaches a satisfactory recognition rate in record time in a dynamic outdoor area and surpasses several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
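As a sketch of the block-wise HOG idea, the following computes per-cell histograms of unsigned gradient orientations with plain NumPy. The cell size, bin count, and toy edge image are illustrative; the real pipeline adds block normalization and restricts the computation to foreground blobs as described above.

```python
import numpy as np

def hog_cell_histograms(image, cell=8, nbins=9):
    """Minimal HOG building block: per-cell histograms of unsigned
    gradient orientations, weighted by gradient magnitude."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # unsigned orientation

    h, w = image.shape
    cells = np.zeros((h // cell, w // cell, nbins))
    bin_width = 180.0 / nbins
    for i in range(h // cell):
        for j in range(w // cell):
            m = mag[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
            a = ang[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
            idx = (a // bin_width).astype(int) % nbins
            for b in range(nbins):
                cells[i, j, b] = m[idx == b].sum()
    return cells

# A vertical step edge puts all gradient energy into one orientation bin.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
hist = hog_cell_histograms(img)
print(hist.shape)  # (2, 2, 9)
```

Computing these histograms only inside motion-detected foreground blobs, instead of over the whole frame, is what buys the computing-time savings claimed in the abstract.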

Keywords: car-detector, HOG, motion, computing time

Procedia PDF Downloads 319
15457 The Effect of the Acquisition and Reconstruction Parameters on the Quality of SPECT Tomographic Images with Attenuation and Scatter Correction

Authors: N. Boutaghane, F. Z. Tounsi

Abstract:

Many physical and technological factors degrade SPECT images, both qualitatively and quantitatively, and technological advances in detection, collimation, reconstruction, and image correction methods do not by themselves guarantee good performance of a tomographic gamma camera. One must first master the choice of the various acquisition and reconstruction parameters accessible in clinical cases and use attenuation and scatter correction methods, so as to optimize image quality while minimizing the dose received by the patient. In this work, a qualitative and quantitative evaluation of tomographic images is performed based on the acquisition parameters (counts per projection) and reconstruction parameters (filter type and associated cutoff frequency). In addition, methods for correcting physical effects such as attenuation and scatter, which degrade image quality and prevent precise quantification of the reconstructed slices, are presented. Two correction approaches are implemented: attenuation correction by the Chang method with a filtered back projection reconstruction algorithm, and scatter correction by the Jaszczak subtraction method. Our results serve as recommendations that permit determining the origin of the different artifacts observed both in quality control tests and in clinical images.
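The reconstruction-parameter dependence studied here can be illustrated with the classic ramp filter of filtered back projection windowed by a Butterworth response. The cutoff and order values below are illustrative, not the ones evaluated in the paper.

```python
import numpy as np

def butterworth_ramp(n_freqs, cutoff, order=5):
    """Ramp (Ram-Lak) filter windowed by a Butterworth low-pass response,
    as used in filtered back projection. `cutoff` is in cycles/pixel
    (Nyquist = 0.5); the order controls the roll-off steepness."""
    f = np.linspace(0.0, 0.5, n_freqs)  # spatial frequencies up to Nyquist
    ramp = f                            # |f| ramp filter
    butter = 1.0 / np.sqrt(1.0 + (f / cutoff) ** (2 * order))
    return ramp * butter

h = butterworth_ramp(128, cutoff=0.25)
h_low = butterworth_ramp(128, cutoff=0.15)

# Lowering the cutoff suppresses high-frequency noise (smoother but
# lower-resolution slices); raising it preserves detail but amplifies noise.
print(h[-1] > h_low[-1])  # True: the lower cutoff attenuates Nyquist more
```

This trade-off between noise suppression and spatial resolution is exactly why the filter type and cutoff frequency are among the reconstruction parameters evaluated in the study.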

Keywords: attenuation, scatter, reconstruction filter, image quality, acquisition and reconstruction parameters, SPECT

Procedia PDF Downloads 438
15456 Aerodynamic Design of a UAV and Stability Analysis with Genetic Algorithm Optimization

Authors: Saul A. Torres Z., Eduardo Liceaga C., Alfredo Arias M.

Abstract:

We seek to develop a UAV for agricultural spraying at a maximum altitude of 5000 meters above sea level, with a payload of 100 liters of fumigant. The aerodynamic design of the aircraft is developed using computational tools such as the Athena Vortex Lattice software, MATLAB, ANSYS FLUENT, and the XFoil package, among others. Structured programming and an exhaustive analysis of optimization and search methods are also employed. The results have a very low margin of error, and the multi-objective formulation can be helpful for future developments. We also developed a method for stability analysis (lateral-directional and longitudinal).
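A genetic algorithm of the kind used for this design optimization can be sketched in a few lines. The toy objective, bounds, and GA settings below are illustrative and stand in for the actual aerodynamic design variables and multi-objective fitness of the paper.

```python
import random

random.seed(0)

# Toy objective standing in for an aerodynamic cost: minimize (x - 3)^2.
def fitness(x):
    return (x - 3.0) ** 2

def evolve(pop_size=40, generations=60, bounds=(-10.0, 10.0)):
    pop = [random.uniform(*bounds) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]           # selection: keep best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                # crossover: blend parents
            child += random.gauss(0.0, 0.1)      # mutation: small perturbation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(round(best, 1))  # converges near the optimum x = 3
```

Real design problems replace the scalar gene with a vector of geometry parameters and evaluate fitness through the aerodynamic solvers, but the selection/crossover/mutation loop is the same.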

Keywords: aerodynamic design, optimization, genetic algorithm, multi-objective problem, longitudinal stability, lateral-directional stability

Procedia PDF Downloads 583
15455 Evaluating the Performance of Color Constancy Algorithm

Authors: Damanjit Kaur, Avani Bhatia

Abstract:

Color constancy is significant for human vision since color is a pictorial cue that helps in solving different vision tasks such as tracking, object recognition, or categorization. Therefore, several computational methods have tried to simulate human color constancy abilities to stabilize machine color representations. Two different kinds of methods have been used, i.e., normalization and constancy. While color normalization creates a new representation of the image by canceling illuminant effects, color constancy directly estimates the color of the illuminant in order to map the image colors to a canonical version. Color constancy is the capability to determine the colors of objects independent of the color of the light source. This research work studies the best-known color constancy algorithms, such as white patch and gray world.
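The two algorithms named above are short enough to sketch directly; the toy image and reddish illuminant below are invented for illustration.

```python
import numpy as np

def gray_world(img):
    """Gray world: assume the average scene color is achromatic and
    scale each channel so its mean matches the global mean."""
    means = img.reshape(-1, 3).mean(axis=0)
    return img * (means.mean() / means)

def white_patch(img):
    """White patch: assume the brightest value per channel reflects the
    illuminant and normalize each channel by its maximum."""
    maxima = img.reshape(-1, 3).max(axis=0)
    return img / maxima

# Toy image under a reddish illuminant: gray world equalizes channel means.
rng = np.random.default_rng(0)
img = rng.random((8, 8, 3)) * np.array([1.5, 1.0, 0.8])
corrected = gray_world(img)
print(np.allclose(corrected.reshape(-1, 3).mean(axis=0),
                  corrected.mean()))  # True: channel means equalized
```

Evaluating such algorithms typically means comparing the estimated illuminant (the per-channel scale factors) against a ground-truth illuminant with an angular error metric.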

Keywords: color constancy, gray world, white patch, modified white patch

Procedia PDF Downloads 311
15454 Field Scale Simulation Study of Miscible Water Alternating CO2 Injection Process in Fractured Reservoirs

Authors: Hooman Fallah

Abstract:

Vast numbers of the world's oil reservoirs are naturally fractured. There are different methods for increasing recovery from fractured reservoirs, and miscible water-alternating-CO2 injection is a good choice among them. In this method, water and CO2 slugs are injected alternately as a miscible agent into the reservoir. This paper studies a water injection scenario and miscible water-alternating-CO2 injection in a two-dimensional, inhomogeneous fractured reservoir. The results show that miscible water-alternating-CO2 injection leads to a 3.95% increase in final oil recovery and a 3.89% decrease in total water production compared with the water injection scenario.

Keywords: simulation study, CO2, water alternating gas injection, fractured reservoirs

Procedia PDF Downloads 285
15453 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument

Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki

Abstract:

To the best knowledge of the authors, here we experimentally demonstrate for the first time a quantified correlation between real-time measured optical features of ambient aerosol and off-line measured toxicity data and, using these correlations, present a novel methodology for the real-time characterisation of ambient toxicity based on multi-wavelength aerosol-phase photoacoustic measurement. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in climate science nowadays. Beyond its climatic impact, atmospheric soot also plays an important role as an air pollutant that harms human health. According to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source, and in health terms it is one of the most harmful atmospheric constituents as well. Despite its importance, a generally accepted standard methodology for the quantitative determination of ambient toxicity is not yet available. Ambient toxicology measurement is dominantly based on the posterior analysis of filter-accumulated aerosol with limited time resolution. Most toxicological studies are based on operational definitions using different measurement protocols; therefore, comprehensive analysis of the existing data sets is very limited in many cases. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be masked significantly by the actual ambient factors. Therefore, improving the time resolution of the existing methodology and developing real-time methodology for air quality monitoring are truly topical issues in air pollution research.
During the last decades, many experimental studies have verified that there is a relation between the chemical composition of carbonaceous particulate matter and its absorption features, quantified by the Absorption Angström Exponent (AAE). Although the scientific community agrees that PhotoAcoustic Spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol in an accurate and reliable way, multi-wavelength PAS instruments able to selectively characterise the wavelength dependency of absorption have become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. We demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), and a scanning mobility particle sizer with optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also present the results of eco-, cyto- and genotoxicity measurements based on the posterior analysis of filter-accumulated aerosol with 6 h time resolution. We demonstrate the diurnal variation of the toxicities and of the AAE data deduced directly from the multi-wavelength absorption measurements.
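The AAE deduced from a multi-wavelength absorption measurement follows from the power-law model b_abs(λ) ∝ λ^(-AAE) evaluated at a wavelength pair. The wavelengths and absorption coefficients below are illustrative values, not campaign data.

```python
import math

def aae(b1, b2, lam1, lam2):
    """Absorption Angström Exponent from a two-wavelength measurement,
    assuming b_abs(lambda) is proportional to lambda**(-AAE)."""
    return -math.log(b1 / b2) / math.log(lam1 / lam2)

# Hypothetical absorption coefficients (Mm^-1) at 266 nm and 1064 nm.
b_266, b_1064 = 40.0, 10.0
# Pure black-carbon-like absorption is expected to give AAE near 1;
# stronger UV absorption (e.g. brown carbon) pushes the AAE above 1.
print(round(aae(b_266, b_1064, 266.0, 1064.0), 2))  # 1.0
```

In practice a multi-wavelength instrument fits the exponent over all available wavelengths rather than a single pair, which makes the retrieved AAE less sensitive to noise at any one wavelength.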

Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames-test

Procedia PDF Downloads 297
15452 Decomposition of the Discount Function Into Impatience and Uncertainty Aversion. How Neurofinance Can Help to Understand Behavioral Anomalies

Authors: Roberta Martino, Viviana Ventre

Abstract:

Intertemporal choices are choices under conditions of uncertainty in which the consequences are distributed over time. The Discounted Utility Model is the essential reference for describing the individual in the context of intertemporal choice. The model is based on the idea that the individual selects the alternative with the highest utility, which is calculated by multiplying the cardinal utility of the outcome, as if its reception were instantaneous, by the discount function, which decreases the utility value according to how far the actual reception of the outcome is from the moment the choice is made. Initially, the discount function was assumed to have an exponential form, with a constant rate of decrease over time, in line with the profile of a rational investor described by classical economics. Empirical evidence, however, called for the formulation of alternative, hyperbolic models that better represent the actual behavior of investors. Attitudes that do not comply with the principles of classical rationality are termed anomalous, i.e., difficult to rationalize and describe through normative models. The development of behavioral finance, which describes investor behavior through cognitive psychology, has shown that deviations from rationality are due to the bounded rationality of human beings: when a choice is made in a very difficult and information-rich environment, the brain strikes a compromise between the cognitive effort required and the selection of an alternative. Moreover, the evaluation and selection of the alternative, as well as the collection and processing of information, are dynamics conditioned by systematic distortions of the decision-making process, namely the behavioral biases involving the individual's emotional and cognitive system. In this paper, we present an original decomposition of the discount function to investigate the psychological principles of hyperbolic discounting.
The curve can be decomposed into two components: the first component is responsible for the decrease in the value of the outcome as time increases and is related to the individual's impatience; the second component relates to the change in direction of the tangent vector to the curve and indicates how strongly the individual perceives the indeterminacy of the future, i.e., his or her aversion to uncertainty. This decomposition allows interesting conclusions to be drawn with respect to the concept of impatience and the emotional drives involved in decision-making. The contribution that neuroscience can make to decision theory and intertemporal choice theory is vast, as it allows the decision-making process to be described in terms of the relationship between the individual's emotional and cognitive factors. Neurofinance is a discipline that uses a multidisciplinary approach to investigate how the brain influences decision-making. Indeed, considering that the decision-making process is linked to the activity of the prefrontal cortex and amygdala, neurofinance can help determine the extent to which anomalous attitudes respect the principles of rationality.
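The contrast between the exponential and hyperbolic discount functions discussed above can be made concrete by comparing their instantaneous discount rates. With the standard forms exp(-kt) and 1/(1+kt), the exponential rate is constant while the hyperbolic rate declines with delay, which is the signature of decreasing impatience; the value of k below is illustrative.

```python
import math

k = 0.1  # illustrative discount rate parameter

def exponential(t):
    """Classical exponential discount function exp(-k t)."""
    return math.exp(-k * t)

def hyperbolic(t):
    """Standard hyperbolic discount function 1 / (1 + k t)."""
    return 1.0 / (1.0 + k * t)

def discount_rate(f, t, dt=1e-6):
    """Instantaneous discount rate -f'(t)/f(t), by central differences."""
    return -(f(t + dt) - f(t - dt)) / (2 * dt) / f(t)

print(round(discount_rate(exponential, 0.0), 3))   # 0.1
print(round(discount_rate(exponential, 20.0), 3))  # still 0.1: constant rate
print(round(discount_rate(hyperbolic, 20.0), 3))   # lower than at t = 0
```

The declining rate k/(1+kt) of the hyperbolic curve is what produces the preference reversals that the exponential model cannot accommodate.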

Keywords: impatience, intertemporal choice, neurofinance, rationality, uncertainty

Procedia PDF Downloads 124
15451 Evaluation of Competency Training Effectiveness in Chosen Sales Departments

Authors: L. Pigon, S. Kot, J. K. Grabara

Abstract:

Nowadays, with organizations facing the challenges of increasing competitiveness, the human capital accumulated by an organization is one of the elements that strongly differentiate companies. Efficient management in a competitive environment requires managing employee competencies so that they remain suited to market fluctuations. The aim of the paper was to determine how employee training intended to improve competencies is verified. The survey was conducted among 37 respondents involved in the selection of training providers and training programs in their enterprises. The results showed that all organizations use a training survey as the basic method for evaluating training effectiveness. Depending on the training contents and organization, the questionnaires contain various questions. Most of these surveys are composed of three basic blocks: the trainer's assessment, the evaluation of the training contents, and the assessment of the materials and the venue. None of the organizations conducted regular job-related observations or examined the attitudes of the training participants.

Keywords: human capital, competencies, training effectiveness, sale department

Procedia PDF Downloads 168
15450 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for the structural analysis and reliability assessment of systems was created. The reliability assessment process includes the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of the logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantitative reliability value; 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems and the research and design of systems with optimal structure are carried out.
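The end result of stages 2-5 above, a reliability value computed from the shortest paths of successful functioning, can be checked on small systems by brute-force state enumeration, which plays the role of the orthogonalized DNF polynomial. The structure and element reliabilities below are made up for illustration.

```python
from itertools import product

# The system works if all elements of at least one minimal path work.
# Hypothetical structure: element "a" in series with a parallel pair "b", "c".
paths = [{"a", "b"}, {"a", "c"}]       # shortest paths of successful functioning
p = {"a": 0.9, "b": 0.8, "c": 0.7}     # element reliabilities

def system_reliability(paths, p):
    """Sum the probability of every element-state combination in which
    some minimal path is fully operational."""
    elems = sorted(p)
    total = 0.0
    for states in product([0, 1], repeat=len(elems)):
        up = {e for e, s in zip(elems, states) if s}
        if any(path <= up for path in paths):
            prob = 1.0
            for e, s in zip(elems, states):
                prob *= p[e] if s else 1.0 - p[e]
            total += prob
    return total

# Closed form for this structure: P(a) * (1 - (1 - P(b)) * (1 - P(c))) = 0.846
print(round(system_reliability(paths, p), 4))  # 0.846
```

The orthogonalization algorithm exists precisely because this enumeration is exponential in the number of elements; the ODNF polynomial gives the same number without visiting every state.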

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element

Procedia PDF Downloads 65
15449 Accelerating Side Channel Analysis with Distributed and Parallelized Processing

Authors: Kyunghee Oh, Dooho Choi

Abstract:

Although a cryptographic algorithm may have no theoretical weakness, Side Channel Analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing, power consumption, electromagnetic leaks, or even sound, which can be exploited to break the system. Differential Power Analysis is one of the most popular analyses; it computes the statistical correlations between secret keys and power consumption. The analysis usually involves huge amounts of data and takes a long time; for some devices with countermeasures, it may take several weeks. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
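The correlation computation at the heart of DPA parallelizes naturally over key guesses, which is what makes distributed processing attractive. The sketch below simulates traces leaking the Hamming weight of (plaintext XOR key) and splits the 256 guesses of one key byte across worker threads; the leakage model, key space, and noise level are simplified for illustration.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Simulated acquisition: each trace leaks HW(plaintext XOR key) plus noise.
rng = np.random.default_rng(1)
SECRET = 0x4B
plaintexts = rng.integers(0, 256, size=2000)
hw = np.array([bin(v).count("1") for v in range(256)])  # Hamming weight table
traces = hw[plaintexts ^ SECRET] + rng.normal(0, 0.5, size=plaintexts.size)

def correlation(guess):
    """Pearson correlation between the hypothetical leakage for one key
    guess and the measured traces."""
    model = hw[plaintexts ^ guess]
    return guess, abs(np.corrcoef(model, traces)[0, 1])

# Each worker handles a share of the key guesses; in a distributed setup
# the shares would go to separate machines instead of threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(correlation, range(256)))

best_guess = max(results, key=lambda r: r[1])[0]
print(hex(best_guess))  # the correct guess shows the highest correlation
```

Since the guesses are independent, the speedup is essentially linear in the number of workers, up to the cost of sharing the trace set.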

Keywords: DPA, distributed computing, parallelized processing, side channel analysis

Procedia PDF Downloads 420
15448 Hierarchical Tree Long Short-Term Memory for Sentence Representations

Authors: Xiuying Wang, Changliang Li, Bo Xu

Abstract:

A fixed-length feature vector is required by many machine learning algorithms in the NLP field. Word embeddings have been very successful at learning lexical information. However, they cannot capture the compositional meaning of sentences, which prevents them from supporting a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split a sentence into three hierarchies: short phrase, long phrase, and full sentence level. The HTLSTM model gives our algorithm the potential to fully exploit the hierarchical information and long-term dependencies of language. We design experiments on both English and Chinese corpora to evaluate our model on the sentiment analysis task, and the results show that our model significantly outperforms several existing state-of-the-art approaches.

Keywords: deep learning, hierarchical tree long short-term memory, sentence representation, sentiment analysis

Procedia PDF Downloads 347
15447 Mediation in Turkey

Authors: Ibrahim Ercan, Mustafa Arikan

Abstract:

In recent years, alternative dispute resolution (ADR) methods have attracted the attention of many countries' legislators. Instead of resolving disputes through litigation, ending a dispute by the parties themselves is more important for the preservation of social peace. Therefore, ADR methods have been discussed intensively in Turkey as well as in the whole world. After these discussions, the Mediation Act was adopted on 07.06.2012 and entered into force on 21.06.2013. According to the Mediation Act, only disputes arising from private law can be mediated. Mediation is not compulsory in Turkish law; it is optional, so the parties are completely free to choose mediation for dispute resolution. Mediators must be lawyers with at least five years of experience; non-lawyers cannot become mediators. Beyond the five years of experience, receiving training and succeeding in examinations, particularly on body language and psychology, is also very important for becoming a mediator. If the parties reach a settlement as a result of mediation, a document is issued. Under certain circumstances, this document is also enforceable, so the parties will not need to apply to the court again; on the contrary, they will have the opportunity to execute this document and thus recover their claims. However, in the nearly two years since the Mediation Act entered into force, it is fair to say that interest in mediation has not reached the expected level. Therefore, making mediation mandatory for some disputes has been discussed recently. Once mediation becomes mandatory and good results follow, this institution will be able to attract serious interest in Turkey; otherwise, if the results are not satisfying, the mediation method may be abandoned.

Keywords: alternative dispute resolution methods, mediation act, mediation, mediator, mediation in Turkey

Procedia PDF Downloads 357
15446 Application of Adaptive Particle Filter for Localizing a Mobile Robot Using 3D Camera Data

Authors: Maysam Shahsavari, Seyed Jamalaldin Haddadi

Abstract:

There are several approaches to localizing a mobile robot: relative, absolute, and probabilistic. In this paper, the particle filter is used because of its simple implementation and because it does not need to know the starting position. This method estimates the position of the mobile robot using a probabilistic distribution, relying on a known map of the environment rather than predicting it, and then updates this estimate by reading input sensors and control commands. To gather information about the surrounding world, such as distances to obstacles, a Kinect is used, which is much cheaper than a laser range finder. Finally, after explaining the Adaptive Particle Filter method and its implementation in detail, we compare this method with dead reckoning and show that it is much more suitable for situations in which a map of the environment is available.
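The predict-update-resample cycle described above can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the map size, landmark positions, and noise parameters are hypothetical, and measured ranges to known landmarks stand in for Kinect depth data.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, control, ranges, landmarks,
            motion_noise=0.05, meas_noise=0.3):
    """One predict-update-resample cycle of a 2D particle filter."""
    n = len(particles)
    # Predict: apply the odometry control (dx, dy) plus motion noise
    particles = particles + control + rng.normal(0.0, motion_noise, particles.shape)
    # Update: weight each particle by the likelihood of the measured
    # ranges to the known landmarks (a stand-in for Kinect depth readings)
    expected = np.linalg.norm(particles[:, None, :] - landmarks[None, :, :], axis=2)
    w = np.exp(-0.5 * np.sum(((ranges - expected) / meas_noise) ** 2, axis=1))
    w = (w + 1e-300) / (w + 1e-300).sum()   # guard against all-zero weights
    # Resample: draw particles in proportion to their weights
    return particles[rng.choice(n, size=n, p=w)]

# Hypothetical setup: a 10 m x 10 m map with three known landmarks
landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
particles = rng.uniform(0.0, 10.0, size=(500, 2))   # unknown start: spread everywhere
true_pose = np.array([2.0, 3.0])
control = np.array([0.1, 0.0])                      # robot drives in +x

for _ in range(30):
    true_pose = true_pose + control
    ranges = np.linalg.norm(true_pose - landmarks, axis=1)
    particles = pf_step(particles, control, ranges, landmarks)

estimate = particles.mean(axis=0)
```

Because the particles start uniformly over the whole map, the sketch also illustrates the abstract's point that no starting position is needed: the cloud collapses onto the true pose after a few updates.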

Keywords: particle filter, localization, methods, odometry, kinect

Procedia PDF Downloads 260
15445 A Review Paper for Detecting Zero-Day Vulnerabilities

Authors: Tshegofatso Rambau, Tonderai Muchenje

Abstract:

Zero-day attacks (ZDA) are increasing day by day, and many vulnerabilities in systems and software date back decades. Companies keep discovering vulnerabilities in their systems and software and work to release patches and updates. A zero-day vulnerability is a software fault that is not widely known and is unknown to the vendor; attackers work very quickly to exploit such vulnerabilities. These are major security threats with a high success rate because businesses lack the essential safeguards to detect and prevent them. This study focuses on the factors and techniques that can help detect zero-day attacks. There are various methods and techniques for detecting vulnerabilities, and several vendors offer penetration testing and smart vulnerability management solutions. As part of the study, we undertake a literature review of zero-day attacks and detection methods, as well as modeling approaches and simulations.

Keywords: zero-day attacks, exploitation, vulnerabilities

Procedia PDF Downloads 93
15444 Secure E-Voting Using Blockchain Technology

Authors: Barkha Ramteke, Sonali Ridhorkar

Abstract:

An election is an important event in every country. Traditional voting has several drawbacks, including the time and effort required for tallying and counting results and the cost of paper, arrangements, and everything else needed to complete a voting process. Many countries are now considering online e-voting systems, but traditional e-voting systems suffer from a lack of trust: it is not known whether a vote is counted correctly or has been tampered with. This lack of transparency means that voters have no assurance that their votes will be counted as cast. As blockchain technology grows in popularity, electronic voting systems are increasingly using it as an underlying storage mechanism to make the voting process more transparent and to ensure data immutability. The transparency, on the other hand, may reveal critical information about candidates, because all system users have the same entitlement to the data. Furthermore, because of blockchain's pseudo-anonymity, voters' privacy may be compromised, and third parties involved in the voting process, such as registration institutions, may be able to tamper with data. To overcome these difficulties, we apply Ethereum smart contracts to blockchain-based voting systems.
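As a minimal illustration of the immutability idea underlying such systems (not the Ethereum smart contracts the paper actually applies), the following sketch chains each stored vote to the hash of the previous block, so any later tampering with a recorded vote becomes detectable. All names and identifiers here are hypothetical placeholders.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 digest of a block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class VoteChain:
    """Append-only ledger: each block commits to its predecessor's hash."""
    def __init__(self):
        self.chain = [{"index": 0, "vote": None, "prev": "0" * 64}]  # genesis block
    def cast(self, voter_id_hash, candidate):
        self.chain.append({"index": len(self.chain),
                           "vote": {"voter": voter_id_hash, "candidate": candidate},
                           "prev": block_hash(self.chain[-1])})
    def verify(self):
        """Recompute every link; a tampered block breaks the chain after it."""
        return all(self.chain[i]["prev"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

vc = VoteChain()
vc.cast("a1b2c3", "candidate_A")   # voter IDs are already-hashed placeholders
vc.cast("d4e5f6", "candidate_B")
```

Editing any recorded vote changes that block's hash, so `verify()` fails at the next link, which is the property the abstract refers to as data immutability.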

Keywords: blockchain, AMV chain, electronic voting, decentralized

Procedia PDF Downloads 128
15443 MHC Class II DRB1 Gene Polymorphism in Lori Sheep Breed

Authors: Shahram Nanekarani, Majid Goodarzi, Majid Khosravi

Abstract:

The present study aimed at analyzing the second exon of the ovine major histocompatibility complex class II (Ovar II) DRB1 gene in the Lori sheep breed. The MHC plays a central role in the control of disease resistance and immunological response. Genomic DNA was extracted from blood samples of 124 sheep, and a 296 bp fragment of MHC exon 2 was amplified by polymerase chain reaction. PCR products were characterized by the restriction fragment length polymorphism technique using the Hin1I restriction enzyme. The PCR-RFLP patterns showed three genotypes, AA, AB and BB, with frequencies of 0.282, 0.573 and 0.145, respectively. There was no significant (P > 0.05) deviation from Hardy-Weinberg equilibrium for this locus in this population. The results of the present study indicate that exon 2 of the Ovar-DRB1 gene is highly polymorphic in Lori sheep and could serve as a marker for marker-assisted selection to improve immunity in sheep.
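The reported genotype frequencies (0.282, 0.573, 0.145 among 124 animals) imply counts of roughly 35 AA, 71 AB, and 18 BB. A minimal sketch of the chi-square test against Hardy-Weinberg proportions behind the "no significant deviation" finding might look like this; the counts are reconstructed from the rounded frequencies, not taken from the paper.

```python
def hardy_weinberg_chi2(n_aa, n_ab, n_bb):
    """Chi-square statistic of observed genotype counts vs. HWE expectations."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)       # frequency of allele A
    q = 1.0 - p                           # frequency of allele B
    observed = [n_aa, n_ab, n_bb]
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Counts reconstructed from the reported frequencies 0.282 / 0.573 / 0.145 of 124 sheep
chi2 = hardy_weinberg_chi2(35, 71, 18)
# One degree of freedom for a biallelic locus; the P = 0.05 critical value is 3.841,
# so a chi2 below that is consistent with Hardy-Weinberg equilibrium
```

With these counts the statistic is about 3.46, below the 3.841 threshold, which matches the study's P > 0.05 conclusion.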

Keywords: MHC-DRB1 gene, polymorphism, PCR-RFLP, lori sheep

Procedia PDF Downloads 403
15442 Design and Optimization of a Fire Alarm System to Protect Gas Condensate Reservoirs with the Use of Nanotechnology

Authors: Hefzollah Mohammadian, Ensieh Hajeb, Mohamad Baqer Heidari

Abstract:

In this paper, a new system for the protection, conservation, and firefighting of gas tanks (flammable materials) has been developed, motivated both by safety requirements and by the considerable economic value of the reservoirs. The system consists of several parts: sensors built with nanotechnology (nano sensors) to detect heat and fire; a barrier for isolation and protection between two electronic zones; an analyzer to detect and accurately locate the point of fire; and a main electronic board to announce the fire, diagnose faults in different locations, raise the relevant alarms, and activate the devices for fire extinguishing and announcement. An important feature of this system is the high speed and capability of its fire detection, which detects against an adjustable ambient temperature threshold. Another advantage is that the system is autonomous and does not require a human operator on site. Using nanotechnology, in addition to speeding up operation, reduces the cost of constructing the sensor as well as the notification and fire-extinguishing systems.

Keywords: analyser, barrier, heat resistance, general fault, general alarm, nano sensor

Procedia PDF Downloads 450
15441 Chemometric QSRR Evaluation of Behavior of s-Triazine Pesticides in Liquid Chromatography

Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević

Abstract:

This study considers the selection of the most suitable in silico molecular descriptors for the characterization of s-triazine pesticides. Suitable descriptors among topological, geometrical, and physicochemical ones were used to establish quantitative structure-retention relationship (QSRR) models. The models were obtained using linear regression (LR) and multiple linear regression (MLR) analysis; the MLR models were established while avoiding multicollinearity among the selected molecular descriptors. The statistical quality of the established models was evaluated by standard and cross-validation statistical parameters. For the detection of similarity or dissimilarity among the investigated s-triazine pesticides and for their classification, principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used and gave similar groupings. This study is financially supported by COST action TD1305.
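A minimal sketch of MLR model building with cross-validated evaluation, of the kind described above, is shown below. It uses synthetic descriptor data rather than the authors' s-triazine dataset, and the descriptor names in the comments are only illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_mlr(X, y):
    """Ordinary least squares with an intercept term."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def r_squared(y, y_hat):
    return 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2, a standard QSRR validation statistic."""
    preds = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c = fit_mlr(X[mask], y[mask])
        preds[i] = c[0] + X[i] @ c[1:]
    return r_squared(y, preds)

# Hypothetical data: 20 compounds described by 3 weakly correlated
# descriptors (think logP, polar surface area, molar refractivity)
X = rng.normal(size=(20, 3))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0.0, 0.1, 20)

coef = fit_mlr(X, y)
r2_train = r_squared(y, coef[0] + X @ coef[1:])
q2 = loo_q2(X, y)
```

Reporting both the fitted R² and the leave-one-out Q² is the "standard and cross-validation statistical parameters" pairing the abstract mentions; a large gap between them would flag overfitting.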

Keywords: chemometrics, classification analysis, molecular descriptors, pesticides, regression analysis

Procedia PDF Downloads 386
15440 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework that helps identify factors impacting the market share of a healthcare provider facility or hospital (from here on termed a facility) is of key importance. This pilot study aims at developing a data-driven machine learning regression framework that aids strategists in formulating key decisions to improve the facility's market share, which in turn improves the quality of healthcare services. The US (United States) healthcare business is chosen for the study, and the data span 60 key facilities in Washington State and about 3 years of history. In the current analysis, market share is defined as the ratio of the facility's encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, followed by a regression approach to evaluate and predict market share. A model-agnostic technique, SHAP, is leveraged to quantify the relative importance of features impacting market share. Typical techniques in the literature quantify the degree of competitiveness among facilities by empirically calculating a competitive factor to interpret the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (for example, quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified.
Leveraging the identified competitors, a Random Forest regression model was developed and fine-tuned to predict market share. To identify key drivers of market share at an overall level, permutation feature importance of the attributes was calculated. For the relative quantification of features at the facility level, SHAP (SHapley Additive exPlanations), a model-agnostic explainer, was incorporated; this helped to identify and rank the attributes impacting market share at each facility. The approach amalgamates two popular and efficient modeling practices, viz., machine learning with graphs and tree-based regression techniques, to reduce bias and to help drive strategic business decisions.
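Permutation feature importance, one of the attribution techniques named above, can be sketched as follows. The model, features, and data are hypothetical stand-ins (a linear model instead of the fine-tuned Random Forest) to keep the example self-contained; only the technique itself matches the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

def permutation_importance(predict, X, y, n_repeats=20):
    """Mean increase in MSE after shuffling each feature column."""
    base = np.mean((y - predict(X)) ** 2)
    imp = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        losses = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])   # break this feature's link to y
            losses.append(np.mean((y - predict(Xp)) ** 2))
        imp[j] = np.mean(losses) - base
    return imp

# Hypothetical facility attributes: e.g. encounter volume, bed count,
# distance to nearest competitor (the third is deliberately irrelevant)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 0.1, 200)

# A linear least-squares model stands in for the Random Forest regressor
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
imp = permutation_importance(lambda A: A @ coef, X, y)
# Ranking features by imp mirrors the key-driver analysis at the overall level
```

Because the procedure only needs a `predict` callable, it is model-agnostic in the same sense the abstract uses for SHAP: swapping in a trained Random Forest would require no other change.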

Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP

Procedia PDF Downloads 85
15439 Evaluation of Microbiological Quality and Safety of Two Types of Salads Prepared at Libyan Airline Catering Center in Tripoli

Authors: Elham A. Kwildi, Yahia S. Abugnah, Nuri S. Madi

Abstract:

This study was designed to evaluate the microbiological quality and safety of two types of salads prepared at a catering center affiliated with Libyan Airlines in Tripoli, Libya. Two hundred and twenty-one (221) samples (132 economy-class and 89 first-class) were used in this project, which lasted ten months. Biweekly microbiological tests were performed, including total plate count (TPC) and total coliforms (TCF), in addition to enumeration and/or detection of some pathogenic bacteria, mainly Escherichia coli, Staphylococcus aureus, Bacillus cereus, Salmonella sp., Listeria sp. and Vibrio parahaemolyticus, using conventional as well as compact dry methods. Results indicated that TPC of type 1 salad ranged between <10 and 62 × 10³ cfu/g and between <10 and 36 × 10³ cfu/g, while TCF were <10 to 41 × 10³ cfu/g and <10 to 66 × 10² cfu/g, using the two methods of detection respectively. On the other hand, TPC of type 2 salad was in the range of 1 × 10 to 52 × 10³ cfu/g and <10 to 55 × 10³ cfu/g (and 1 × 10 to 45 × 10³ cfu/g), and the TCF counts were between <10 and 55 × 10³ cfu/g and between <10 and 34 × 10³ cfu/g, using the first and the second methods of detection respectively. The pathogens mentioned above were also detected in both types of salads, but their levels varied according to the type of salad and the method of detection. The level of Staphylococcus aureus, for instance, was 17.4% using the conventional method versus 14.4% using the compact dry method. Similarly, E. coli was 7.6% and 9.8%, while Salmonella sp. recorded the lowest percentage, i.e., 3% and 3.8%, with the two methods respectively. First-class salads were also found to contain the same pathogens, but the level of E. coli was relatively higher in this case (14.6% and 16.9%, using conventional and compact dry methods respectively). Staphylococcus aureus came second (13.5% and 11.2%), followed by Salmonella (6.74% and 6.70%).
The lowest percentage was for Vibrio parahaemolyticus (4.9%), which was detected in the first-class salads only. The other two pathogens, Bacillus cereus and Listeria sp., were not detected in either salad. Finally, it is worth mentioning that there was a significant decline in TPC and TCF counts, in addition to the disappearance of pathogenic bacteria, after the 6th-7th month of the study, which coincided with the first trial of the HACCP system at the center. The ups and downs in the counts during the early stages of the study reveal a need for corrective measures, including more emphasis on training personnel to apply the HACCP system effectively.

Keywords: air travel, vegetable salads, foodborne outbreaks, Libya

Procedia PDF Downloads 316
15438 A Study on Inverse Determination of Impact Force on a Honeycomb Composite Panel

Authors: Hamed Kalhori, Lin Ye

Abstract:

In this study, an inverse method was developed to reconstruct the magnitude and duration of impact forces exerted on a rectangular carbon fibre-epoxy composite honeycomb sandwich panel. The dynamic signals captured by piezoelectric (PZT) sensors installed on the panel remotely from the impact locations were utilized to reconstruct the impact force generated by an instrumented hammer through an extended deconvolution approach. Two discretized forms of the convolution integral are considered: the traditional one with an explicit transfer function and a modified one without an explicit transfer function. Deconvolution, usually applied to reconstruct the time history (e.g. magnitude) of a stochastic force at a defined location, is extended to identify both the location and the magnitude of the impact force among a number of potential impact locations. It is assumed that impact forces are simultaneously exerted at all potential locations, but that the magnitude of all forces except one is zero, implying that the impact occurs at only one location. The extended deconvolution is then applied to determine the magnitude as well as the location (among the potential ones), incorporating the linear superposition of the responses resulting from impact at each potential location. The problem can be categorized as under-determined (fewer sensors than impact locations), even-determined (as many sensors as impact locations), or over-determined (more sensors than impact locations). The under-determined case considered here comprises three potential impact locations and one PZT sensor on the rectangular carbon fibre-epoxy composite honeycomb sandwich panel. Assessments are conducted to evaluate the factors affecting the precision of the reconstructed force.
Truncated Singular Value Decomposition (TSVD) and Tikhonov regularization are independently applied to regularize the problem, in order to find the most suitable method for this system. The selection of the optimal value of the regularization parameter is investigated through the L-curve and Generalized Cross Validation (GCV) methods. In addition, the effect of different widths of signal windows on the reconstructed force is examined. It is observed that the impact force generated by the instrumented hammer is sensitive to the impact location on the structure, with shapes ranging from a simple half-sine to more complicated pulses. The accuracy of the reconstructed impact force is evaluated using the correlation coefficient between the reconstructed force and the actual one. Based on this criterion, it is concluded that the forces reconstructed using the extended deconvolution without an explicit transfer function, together with Tikhonov regularization, match the actual forces well in terms of magnitude and duration.
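A minimal sketch of Tikhonov-regularized deconvolution on synthetic data is given below. The impulse response, noise level, and regularization parameter are hypothetical choices, not the panel's measured transfer function; the half-sine force and the correlation-coefficient check mirror the abstract's setup.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 200
t = np.arange(n)
# Hypothetical impulse response at the sensor location
# (a damped oscillation standing in for the measured transfer function)
h = np.exp(-t / 15.0) * np.cos(2.0 * np.pi * t / 25.0)
# True impact force: a half-sine pulse, as often produced by a modal hammer
f_true = np.zeros(n)
f_true[20:40] = np.sin(np.pi * np.arange(20) / 20.0)
# Discretized convolution: lower-triangular Toeplitz transfer matrix
H = np.array([[h[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])
y = H @ f_true + rng.normal(0.0, 0.01, n)   # simulated sensor signal with noise

# Tikhonov-regularized solve: minimize ||H f - y||^2 + lam ||f||^2.
# In practice lam is picked by the L-curve or GCV methods the abstract cites.
lam = 1e-3
f_rec = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)
corr = np.corrcoef(f_true, f_rec)[0, 1]
```

Without the `lam * np.eye(n)` term this is plain least-squares deconvolution, which amplifies sensor noise whenever `H` is ill-conditioned; the regularization trades a small bias for that stability.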

Keywords: honeycomb composite panel, deconvolution, impact localization, force reconstruction

Procedia PDF Downloads 529
15437 Human Resources Management Practices in Hospitality Companies

Authors: Dora Martins, Susana Silva, Cândida Silva

Abstract:

Human Resources Management (HRM) has been recognized by academics and practitioners as an important element in organizations. This paper explores best practices of HRM, seeks to understand the level of participation of human resources managers in the development of these practices in the hospitality industry, and compares it with other industries. Thus, the study compared the HRM practices of companies in the hospitality sector with those of companies in other sectors and identified the main differences between them. The results show that the most frequent HRM practices in all companies, independently of their sector of activity, are hiring and training. When comparing the hospitality sector with other sectors of activity, some differences were noticed, namely in the adoption of the practices of communication and information sharing, and of recruitment and selection. Based on these results, the paper discusses the major theoretical and practical implications. Suggestions for future research are also presented.

Keywords: exploratory study, human resources management practices, human resources manager, hospitality companies, Portuguese companies

Procedia PDF Downloads 476
15436 Predicting Machine-Down of Woodworking Industrial Machines

Authors: Matteo Calabrese, Martin Cimmino, Dimos Kapetis, Martina Manfrin, Donato Concilio, Giuseppe Toscano, Giovanni Ciandrini, Giancarlo Paccapeli, Gianluca Giarratana, Marco Siciliano, Andrea Forlani, Alberto Carrotta

Abstract:

In this paper we describe a machine learning methodology for Predictive Maintenance (PdM) applied to woodworking industrial machines. PdM is a prominent strategy consisting of all the operational techniques and actions required to ensure machine availability and to prevent a machine-down failure. One of the challenges of the PdM approach is to design and develop an embedded smart system that monitors the health status of the machine. The proposed approach allows the simultaneous screening of multiple connected machines, thus providing real-time monitoring that can be integrated with maintenance management. This is achieved by applying temporal feature engineering techniques and training an ensemble of classification algorithms to predict the Remaining Useful Lifetime of woodworking machines. The effectiveness of the methodology is demonstrated by testing it on an independent sample of additional woodworking machines that did not present a machine-down event.
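A toy sketch of the two ingredients named above, temporal feature engineering and an ensemble of classifiers, is shown below on a synthetic degradation signal. Simple per-feature threshold classifiers combined by majority vote stand in for the paper's unspecified ensemble; the signal, window sizes, and labels are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

def window_mean_features(signal, windows):
    """Temporal feature engineering: rolling means over several window sizes."""
    start = max(windows)
    return np.array([[signal[i - w:i].mean() for w in windows]
                     for i in range(start, len(signal))])

# Hypothetical vibration amplitude from one machine: 300 healthy samples,
# then 200 samples drifting toward a machine-down failure
healthy = rng.normal(0.0, 1.0, 300)
degrading = rng.normal(0.0, 1.0, 200) + np.linspace(0.0, 6.0, 200)
signal = np.concatenate([healthy, degrading])

windows = [10, 20, 40]
X = window_mean_features(signal, windows)
y = (np.arange(max(windows), len(signal)) >= 300).astype(int)  # 1 = degrading phase

# Ensemble of threshold classifiers, one per temporal feature,
# combined by majority vote across the three window sizes
thresholds = [(X[y == 0, j].mean() + X[y == 1, j].mean()) / 2.0
              for j in range(X.shape[1])]
votes = np.array([X[:, j] > thresholds[j] for j in range(X.shape[1])])
pred = (votes.sum(axis=0) >= 2).astype(int)
accuracy = (pred == y).mean()
```

The short window reacts quickly to the drift while the long window smooths out noise; voting across them is a miniature version of combining classifiers trained on different temporal features.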

Keywords: predictive maintenance, machine learning, connected machines, artificial intelligence

Procedia PDF Downloads 220
15435 Computational Fluid Dynamics Simulation Study of Flow near Moving Wall of Various Surface Types Using Moving Mesh Method

Authors: Khizir Mohd Ismail, Yu Jun Lim, Tshun Howe Yong

Abstract:

The study of flow behavior in an enclosed volume using Computational Fluid Dynamics (CFD) has been around for decades. However, owing to the limitations of adaptive grid methods, flow in an enclosed volume near a moving wall is less explored in CFD. Here, a CFD simulation of flow in an enclosed volume near a moving wall was demonstrated and studied by introducing a moving mesh method, modeled with the Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach. A static enclosed volume with a controlled opening size at the bottom was positioned against a moving, translational wall with sliding mesh features. Controlled variables such as smooth, creviced, and corrugated wall characteristics, the distance between the enclosed volume and the wall, and the speed of the moving wall relative to the enclosed chamber were varied to understand how the flow behaves and reacts between these two geometries. The simulations were validated against experimental results, and the good agreement with the experimental data provided confidence in the model. This study gives better insight into how flow behaves in an enclosed volume when various wall types in motion are introduced at various distances, and creates potential application opportunities involving adaptive grid methods in CFD.

Keywords: moving wall, adaptive grid methods, CFD, moving mesh method

Procedia PDF Downloads 140
15434 The Importance of Anthropometric Indices for Assessing the Physical Development and Physical Fitness of Young Athletes

Authors: Akbarova Gulnozakhon

Abstract:

Relevance. Physical exercise can prolong the function of the growth zones of long tubular bones, delaying the fusion of the epiphyses and diaphyses and thus increasing body growth. At the same time, intensive strength exercises can accelerate the ossification of bone growth zones and slow their growth in length. The influence of physical exercise on the process of biological maturation has also been noted: gymnastics, which requires intense speed and strength loads, delays puberty. On the other hand, the relatively slow puberty of gymnasts has been attributed to the selection into this sport of girls with a particular somatotype. It was found that the later onset of menstruation in female athletes does not have a negative effect on the maturation process or on fertility (the ability to procreate). Observations have also been made about the normalizing influence of sport on the puberty of girls. Purpose of the study. Our goal is to study the effect of physical activity of varying intensity on the formation of secondary sexual characteristics and the hormonal status of adolescent girls. No biological process in an organism is stationary; each fluctuates with a certain frequency. By duration, there are, for example, circadian cycles and infradian cycles, a typical example of the latter being the menstrual cycle. Materials, methods, and results. Menstrual dysfunction in athletes was detected through a questionnaire containing several sections recording passport data, anthropometric indicators (taking anthropometric indices into account), and information about the menstrual cycle. Of 135 female athletes aged 13 to 16 years engaged in various sports, menstrual dysfunction (primary or secondary amenorrhea, irregular menstrual cycle) was noted in 86.7% of gymnasts and in 57.1% of swimmers. The general condition also changes during the menstrual cycle.
In a large percentage of cases, athletes report increased irritability in the premenstrual (45%) and menstrual (36%) phases. During these phases, girls note increased fatigue in 46.5% and 58% of cases, respectively. Secondary sexual characteristics continue to form in girls during puberty, and the clearest indicator of the onset of puberty is the age of the first menstruation (menarche). Conclusions. 1. Physical exercise has a positive effect on all major systems of the body and thus promotes health. 2. Along with its beneficial effect on health, physical exercise can be harmful if the requirements of the sport are not observed.

Keywords: girls' health, anthropometric indices, physical development, reproductive health

Procedia PDF Downloads 100