Search results for: co-authorship network analysis
27755 Development and Psychometric Evaluation of the Malaysian Multi-Ethnic Discrimination Scale
Authors: Chua Bee Seok, Shamsul Amri Baharuddin, Ferlis Bahari, Jasmine Adela Mutang, Lailawati Madlan, Rosnah Ismail, Asong Joseph
Abstract:
Malaysia is a country famously known for its unique cultural and ethnic diversity. Despite the diversity of culture, customs and beliefs, Malaysia is still able to stand as a harmonious country. However, where attitudes of stereotyping, prejudice and discrimination exist among ethnic groups, they may seriously affect the solidarity between people in Malaysia. Thus, this study focuses on constructing a scale measuring the Malaysian experience, strategy and effect of ethnic discrimination. To develop a quantitative measure of ethnic discrimination directed against Malaysians, a three-step process is proposed: exploratory factor analysis, validity analysis, and internal consistency reliability analysis. Results, limitations, and implications of the study are discussed.
Keywords: test development, Malaysian multi-ethnic discrimination scale, exploratory factor analysis, validity, multi-ethnic, reliability, psychometrics
Procedia PDF Downloads 743
27754 Assessing the Influence of Station Density on Geostatistical Prediction of Groundwater Levels in a Semi-arid Watershed of Karnataka
Authors: Sakshi Dhumale, Madhushree C., Amba Shetty
Abstract:
The effect of station density on the geostatistical prediction of groundwater levels is of critical importance to ensure accurate and reliable predictions. Monitoring station density directly impacts the accuracy and reliability of geostatistical predictions by influencing the model's ability to capture localized variations and small-scale features in groundwater levels. This is particularly crucial in regions with complex hydrogeological conditions and significant spatial heterogeneity. Insufficient station density can result in larger prediction uncertainties, as the model may struggle to adequately represent the spatial variability and correlation patterns of the data. On the other hand, an optimal distribution of monitoring stations enables effective coverage of the study area and captures the spatial variability of groundwater levels more comprehensively. In this study, we investigate the effect of station density on the predictive performance of groundwater levels using the geostatistical technique of Ordinary Kriging. The research utilizes groundwater level data collected from 121 observation wells within the semi-arid Berambadi watershed, gathered over a six-year period (2010-2015) from the Indian Institute of Science (IISc), Bengaluru. The dataset is partitioned into seven subsets representing varying sampling densities, ranging from 15% (12 wells) to 100% (121 wells) of the total well network. The results obtained from different monitoring networks are compared against the existing groundwater monitoring network established by the Central Ground Water Board (CGWB). The findings of this study demonstrate that higher station densities significantly enhance the accuracy of geostatistical predictions for groundwater levels. The increased number of monitoring stations enables improved interpolation accuracy and captures finer-scale variations in groundwater levels. These results shed light on the relationship between station density and the geostatistical prediction of groundwater levels, emphasizing the importance of appropriate station densities to ensure accurate and reliable predictions. The insights gained from this study have practical implications for designing and optimizing monitoring networks, facilitating effective groundwater level assessments, and enabling sustainable management of groundwater resources.
Keywords: station density, geostatistical prediction, groundwater levels, monitoring networks, interpolation accuracy, spatial variability
Procedia PDF Downloads 66
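As a concrete illustration of the interpolation step, the sketch below implements ordinary kriging from scratch with NumPy. It is not the authors' code: the spherical variogram parameters, the synthetic well coordinates, and the 15% subsample are all assumptions made for the example. Repeating the last block over subsets of increasing size reproduces the density sweep the abstract describes.

```python
import numpy as np

def spherical(h, nugget=0.0, sill=25.0, a=4000.0):
    # Spherical semivariogram model; parameters are assumed, not fitted.
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def ordinary_kriging(xy, z, x0):
    # Solve the ordinary kriging system [G 1; 1' 0][w; mu] = [g0; 1].
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d)
    A[-1, -1] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z, w @ b        # (estimate, kriging variance)

# Emulate the density experiment: krige one location from a 15% subsample.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10_000, size=(121, 2))      # stand-in well coordinates [m]
z = rng.normal(800.0, 5.0, size=121)            # stand-in water levels [m a.s.l.]
idx = rng.choice(121, size=12, replace=False)   # 12 wells, ~15% of the network
est, var = ordinary_kriging(xy[idx], z[idx], np.array([5_000.0, 5_000.0]))
print(f"estimate = {est:.2f} m, kriging variance = {var:.2f}")
```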
27753 Object Oriented Fault Tree Analysis Methodology
Abstract:
Traditional safety, risk and reliability analysis approaches are problem-oriented, which creates a great workload when analyzing complicated and large systems; besides, much repetitive work must be done if the analyzed system is composed of many similar components. An object- and function-oriented approach is urgently needed to maintain high consistency with the problem domain. A new approach is proposed to overcome these shortcomings of traditional approaches: the concepts of class, abstraction, inheritance, polymorphism and encapsulation are introduced into FTA, and a professional class library is established containing the abstractions of physical objects in the real world, with four areas of relevant information proposed as a guide for establishing it. The interaction between classes is completed by internal or external methods that map the attributes to basic events through a full search of the knowledge base, which forms good encapsulation. An object-oriented fault tree analysis system is set up that analyzes and evaluates system safety and reliability according to the original appearance of the problem, in which classes and objects can be mapped directly to the problem domain of the fault tree analysis. All system failure situations can be analyzed through this bottom-up fault tree construction approach. Under this architecture, an FTA approach is developed that avoids the influence of the analyst on the analysis results. It reveals the inherent safety problems of the analyzed system itself and provides a new way of thinking and development for safety analysis, so that object-oriented technology is applied and developed in the safety field and contributes to innovation in safety theory.
Keywords: FTA, knowledge base, object-oriented technology, reliability analysis
Procedia PDF Downloads 250
27752 Detection of Important Biological Elements in Drug-Drug Interaction Occurrence
Authors: Reza Ferdousi, Reza Safdari, Yadollah Omidi
Abstract:
Drug-drug interactions (DDIs) are a main cause of adverse drug reactions, and the functional and molecular complexity of drug behavior in the human body makes them hard to prevent and treat. With the aid of new technologies derived from mathematical and computational science, the DDI problem can be addressed with minimum cost and effort. Market basket analysis is known as a powerful method to identify the co-occurrence of items and discover patterns and frequencies of elements. In this research, we used market basket analysis to identify important bio-elements in the occurrence of DDIs. For this, we collected all known DDIs from DrugBank. The obtained data were analyzed by the market basket analysis method. We investigated all drug-enzyme, drug-carrier, drug-transporter and drug-target associations. To determine the importance of the extracted bio-elements, the extracted rules were evaluated in terms of confidence and support. Market basket analysis of the over 45,000 known DDIs reveals more than 300 important rules that can be used to identify DDIs; the CYP450 family was the most frequent shared bio-element. We applied the extracted rules to over 2,000,000 unknown drug pairs, which led to the discovery of more than 200,000 potential DDIs. Analysis of the underlying reason behind the DDI phenomenon can help to predict and prevent DDI occurrence. Ranking the extracted rules by their strength can be a supportive tool to predict the outcome of an unknown DDI.
Keywords: drug-drug interaction, market basket analysis, rule discovery, important bio-elements
Procedia PDF Downloads 315
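To make the support/confidence machinery concrete, here is a minimal market basket sketch over invented transactions. Each transaction stands for one known DDI, represented by the set of bio-elements shared by the drug pair; the element names and the mapping are illustrative, not taken from DrugBank.

```python
from itertools import combinations

# Toy transactions: one per known DDI, listing shared bio-elements (invented).
transactions = [
    {"CYP3A4", "P-gp"}, {"CYP3A4"}, {"CYP2D6", "CYP3A4"},
    {"P-gp"}, {"CYP3A4", "P-gp"}, {"CYP2D6"},
]
n = len(transactions)

def support(itemset):
    # Fraction of transactions containing every item of the itemset.
    return sum(itemset <= t for t in transactions) / n

def confidence(antecedent, consequent):
    # P(consequent | antecedent), estimated from co-occurrence counts.
    return support(antecedent | consequent) / support(antecedent)

items = sorted(set().union(*transactions))
for a, b in combinations(items, 2):
    print(f"{{{a}}} -> {{{b}}}: support={support({a, b}):.2f}, "
          f"confidence={confidence({a}, {b}):.2f}")
```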
27751 Depth Estimation in DNN Using Stereo Thermal Image Pairs
Authors: Ahmet Faruk Akyuz, Hasan Sakir Bilge
Abstract:
Depth estimation using stereo images is a challenging problem in computer vision. Many different studies have been carried out to solve this problem. With advancing machine learning, tackling this problem is often done with neural network-based solutions. The images used in these studies are mostly in the visible spectrum. However, the need to use the infrared (IR) spectrum for depth estimation has emerged because it gives better results than the visible spectrum in some conditions. At this point, we recommend using thermal-thermal (IR) image pairs for depth estimation. In this study, we used two well-known networks (PSMNet, FADNet) with minor modifications to demonstrate the viability of this idea.
Keywords: thermal stereo matching, deep neural networks, CNN, depth estimation
Procedia PDF Downloads 282
27750 Analysis of CO₂ Two-Phase Ejector with Taguchi and ANOVA Optimization and Refrigerant Selection with Enviro-Economic Concerns by TOPSIS Analysis
Authors: Karima Megdouli, Bourhan Tachtouch
Abstract:
Ejector refrigeration cycles offer an alternative to conventional systems for producing cold from low-temperature heat. In this article, a thermodynamic model is presented. This model has the advantage of simplifying the calculation algorithm and describes the complex double-throttling mechanism that occurs in the ejector. The model assumptions and calculation algorithm are presented first. The impact of each efficiency is evaluated. Validation is performed on several data sets. The ejector model is then used to simulate a refrigeration ejector system (RES), to validate its robustness and suitability for use in predicting thermodynamic cycle performance. A Taguchi and ANOVA optimization is carried out on the RES. TOPSIS analysis was applied to select the optimum refrigerants with cost, safety, environmental and enviro-economic concerns along with thermophysical properties.
Keywords: ejector, velocity distribution, shock circle, Taguchi and ANOVA optimization, TOPSIS analysis
Procedia PDF Downloads 91
27749 An Analysis of the Movie “Sunset Boulevard” through the Transactional Analysis Paradigm
Authors: Borislava Dimitrova, Didem Kepir Savoly
Abstract:
Movie analysis offers a dynamic and multifaceted lens through which to explore and understand various aspects of human behavior, relationships, emotion, and cognition. Cinema therapy can be an important tool for counselor education and for counselors in therapy. Therefore, this paper aims to delve deeper into human relationships and individual behavior patterns and to analyze some of their most vivid aspects in light of transactional analysis and its main components. When describing certain human behaviors and emotional states in real life, it can sometimes be difficult even for mental health practitioners to become aware of the subtle social cues and hints that are being transmitted, often in a rushed and swift manner. To address this challenge, the current paper focuses on the relationship dynamics conveyed through the plot of the movie “Sunset Boulevard” and examines slightly exaggerated yet true-to-life examples. The movie was directed by Billy Wilder and written by Charles Brackett, Billy Wilder, and D.M. Marshman Jr. The scenes of interest were examined through transactional analysis concepts: the different ego states, strokes, the various kinds of transactions, the paradigm of games in transactional analysis, and lastly, the drama triangle. The addressed themes comprised mainly the way the main characters engaged in game playing, which eventually had a negative effect on the sequences of interactions between the individuals and on the desired payoffs that they craved as a result. Furthermore, counselor educators can use the results of this paper for educational purposes, such as teaching theoretical knowledge about transactional analysis and utilizing characters’ interactions and behaviors as real-life situations that can serve as case studies and role-playing activities. Finally, the paper aims to foster the use of movies as material for psychological analysis, which can assist in the training of new mental health professionals in the field.
Keywords: transactional analysis, movie analysis, drama triangle, games, ego-state
Procedia PDF Downloads 106
27748 An Analysis of the Relation between Need for Psychological Help and Psychological Symptoms
Authors: İsmail Ay
Abstract:
This study aimed to determine the relations between the need for psychological help and psychological symptoms. The sample of the study consists of 530 university students studying at Atatürk University in the 2015-2016 academic year. The Need for Psychological Help Scale and the Brief Symptom Inventory were used to collect data. In the data analysis, correlation analysis and a structural equation model with latent variables were used. Normality and homogeneity analyses were used to check the basic assumptions of parametric tests. The findings show that as psychological symptoms increase, the need for psychological help also increases. The findings were discussed in light of the literature.
Keywords: psychological symptoms, need for psychological help, structural equation model, correlation
Procedia PDF Downloads 373
27747 Analysis of Cereal Flours by Fluorescence Spectroscopy and PARAFAC
Authors: Lea Lenhardt, Ivana Zeković, Tatjana Dramićanin, Miroslav D. Dramićanin
Abstract:
Rapid and sensitive analytical technologies for food analysis are needed to respond to the growing public interest in food quality and safety. In this context, fluorescence spectroscopy offers several inherent advantages for the characterization of food products: it is highly sensitive, inexpensive, objective, relatively fast, and non-destructive. The objective of this work was to investigate the potential of fluorescence spectroscopy coupled with a multi-way technique for the characterization of cereal flours. Fluorescence landscape, also known as excitation-emission matrix (EEM) spectroscopy, utilizes multiple-color illumination, with the full fluorescence spectrum recorded for each excitation wavelength. EEMs were measured on various types of cereal flours (wheat, oat, barley, rye, corn, buckwheat and rice). The obtained spectra were analyzed using PARAllel FACtor analysis (PARAFAC) in order to decompose the spectra and identify the underlying fluorescent components. The results of the analysis indicated the presence of four fluorophores in cereal flours. It was observed that the relative concentration of fluorophores varies between different groups of flours. Based on these findings, we can conclude that the application of PARAFAC analysis to fluorescence data is a good foundation for further qualitative analysis of cereal flours.
Keywords: cereals, flours, fluorescence, PARAFAC
Procedia PDF Downloads 667
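A minimal sketch of the decomposition step using the tensorly library (assumed installed; recent versions return a (weights, factors) pair). The random data cube merely stands in for the measured EEM landscapes; the sample count and wavelength grids are invented.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Stand-in EEM cube: (flour samples x excitation x emission wavelengths).
X = tl.tensor(np.random.default_rng(0).random((7, 30, 50)))

# Decompose into 4 trilinear components, matching the number of fluorophores
# reported in the abstract.
weights, factors = parafac(X, rank=4, n_iter_max=200)
scores, excitation, emission = factors    # one loading matrix per mode
print(scores.shape, excitation.shape, emission.shape)   # (7, 4) (30, 4) (50, 4)
```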
27746 Finite Element Analysis of Connecting Rod
Authors: Mohammed Mohsin Ali H., Mohamed Haneef
Abstract:
The connecting rod transmits the piston load to the crank, causing the latter to turn and thus converting the reciprocating motion of the piston into the rotary motion of the crankshaft. Connecting rods are subjected to forces generated by mass and fuel combustion. This study investigates and compares the fatigue behavior of forged steel, powder forged, and ASTM A514 cold-quenched steel connecting rods. The objective is to suggest a new material with reduced weight and cost and increased fatigue life. This has entailed performing a detailed load analysis. Therefore, this study has dealt with two subjects: first, dynamic load and stress analysis of the connecting rod, and second, optimization for material, weight, and cost. In the first part of the study, the loads acting on the connecting rod as a function of time were obtained. Based on the observations of the dynamic FEA, static FEA, and the load analysis results, the load for the optimization study was selected. It is the conclusion of this study that the connecting rod can be designed and optimized under a load range comprising tensile load and compressive load. The tensile load corresponds to a 360° crank angle at the maximum engine speed; the compressive load corresponds to the peak gas pressure. Furthermore, the existing connecting rod can be replaced with a new connecting rod made of ASTM A514 cold-quenched steel that is 12% lighter and 28% cheaper.
Keywords: connecting rod, ASTM A514 cold-quenched material, static analysis, fatigue analysis, stress life approach
Procedia PDF Downloads 303
27745 Concept of Automation in Management of Electric Power Systems
Authors: Richard Joseph, Nerey Mvungi
Abstract:
An electric power system includes generation, transmission, distribution, and consumer subsystems. The electrical power network in Tanzania keeps growing larger by the day and becoming more complex, so that most utilities have long wished for real-time monitoring and remote control of electrical power system elements such as substations, intelligent devices, power lines, capacitor banks, feeder switches, fault analyzers, and other physical facilities. In this paper, the concept of automating the management of power systems from the generation level to the end-user level was demonstrated using the Power System Simulator for Engineering (PSS/E) version 30.3.2.
Keywords: automation, distribution subsystem, generating subsystem, PSS/E, TANESCO, transmission subsystem
Procedia PDF Downloads 678
27744 Cloud-Based Mobile-to-Mobile Computation Offloading
Authors: Ebrahim Alrashed, Yousef Rafique
Abstract:
Mobile devices have drastically changed the way we do things on the move. They are relied on heavily to perform tasks analogous to those of a desktop computer. There has been a rapid increase in the computational power of these devices; however, battery technology is still the bottleneck of their evolution. The primary modern approach to tackling this issue is offloading computation to the cloud, which proves to be latency-expensive and requires high network bandwidth. In this paper, we explore efforts to perform barter-based mobile-to-mobile offloading. We define a protocol and present an architecture to facilitate the development of such a system. We further highlight the deployment and security challenges.
Keywords: computational offloading, power conservation, cloud, sandboxing
Procedia PDF Downloads 389
27743 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning
Authors: Yangzhi Li
Abstract:
Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to robot autonomous recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and dataset deep learning, have not yet been explored in intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous recognition visual guidance technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance technology for industrial robots has a serious issue with camera calibration, and its use in industrial production imposes strict accuracy requirements; the precision challenges of visual recognition systems directly impact the effectiveness and standard of industrial production, necessitating further study of positioning precision in recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting the information at the boundary and corners of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, operational processing systems assign them to the same coordinate system based on their locations and postures. Inclination detection on the RGB image and verification against the depth image are used to determine the component's present posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
Keywords: robotic construction, robotic assembly, visual guidance, machine learning
Procedia PDF Downloads 88
27742 Time-Domain Analysis of Pulse Parameters Effects on Crosstalk in High-Speed Circuits
Authors: Loubna Tani, Nabih Elouzzani
Abstract:
Crosstalk among interconnects and printed-circuit board (PCB) traces is a major limiting factor of signal quality in high-speed digital and communication equipment, especially when fast data buses are involved. Such a bus is considered as a planar multiconductor transmission line. This paper will demonstrate how the finite difference time domain (FDTD) method provides an exact solution of the transmission-line equations to analyze the near-end and the far-end crosstalk. In addition, this study makes it possible to analyze the effect of rise time on the near- and far-end voltages of the victim conductor. The paper also discusses a statistical analysis, based upon a set of several simulations. Such analysis leads to a better understanding of the phenomenon and yields useful information.
Keywords: multiconductor transmission line, crosstalk, finite difference time domain (FDTD), printed-circuit board (PCB), rise time, statistical analysis
Procedia PDF Downloads 436
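As a concrete leapfrog example, the sketch below solves the telegrapher's equations for a single lossless line with 1D FDTD. It is a single-conductor simplification of the multiconductor bus in the paper; the per-unit-length parameters, rise time, and crude open-end boundary are assumptions for illustration only. Sweeping the rise time tr over many runs and recording the end voltages is the kind of statistical study the abstract describes; the multiconductor case couples the lines through full L and C matrices.

```python
import numpy as np

L_, C_ = 2.5e-7, 1.0e-10      # assumed per-unit-length L [H/m] and C [F/m]
nx, dx = 200, 1e-3            # 20 cm line discretized in 1 mm cells
dt = dx * np.sqrt(L_ * C_)    # "magic" time step at the Courant limit
V = np.zeros(nx + 1)          # node voltages
I = np.zeros(nx)              # branch currents (staggered half a cell)

def source(t, tr=1e-10):      # ramped step; tr is the rise time under study
    return min(t / tr, 1.0)

for n in range(400):
    I -= dt / (L_ * dx) * (V[1:] - V[:-1])        # dI/dt = -(1/L) dV/dx
    V[1:-1] -= dt / (C_ * dx) * (I[1:] - I[:-1])  # dV/dt = -(1/C) dI/dx
    V[0] = source(n * dt)                         # hard source at the near end
    V[-1] = V[-2]                                 # crude open-end boundary

print(f"far-end voltage after {400 * dt * 1e9:.2f} ns: {V[-1]:.3f} V")
```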
27741 Typology of Fake News Dissemination Strategies in Social Networks in Social Events
Authors: Mohadese Oghbaee, Borna Firouzi
Abstract:
The emergence of the Internet, and more specifically the formation of social media, has provided the ground for paying attention to new types of content dissemination. In recent years, social media users share information, communicate with others, and exchange opinions on social events in this space. Much of the information published in this space is suspicious and produced with the intention of deceiving others. Such content is often called "fake news". By mixing with correct information and misleading public opinion, fake news can endanger the security of countries and deprive audiences of the basic right of free access to real information; competing governments, opposition elements, profit-seeking individuals, and even rival organizations, aware of this capacity, act to distort and overturn the facts in the virtual space of target countries and communities on a large scale and to steer public opinion towards their goals. This process of extensive de-truthing of societies' information space has created a wave of harm and worry all over the world. These concerns have opened a new path of research for the timely containment and reduction of the destructive effects of fake news on public opinion. The expansion of this phenomenon has the potential to create serious problems for societies, and its impact on events such as the 2016 American elections, Brexit, the 2017 French elections, and the 2019 Indian elections has caused concern and led to the adoption of countermeasures. A simple look at the growth trend of research in Scopus shows a steady increase in research with the keyword "false information", which reached its peak of 524 items in 2020, while in 2015 only 30 scientific-research publications were produced in this field. Considering that one of the capabilities of social media is to provide a context for the dissemination of news and information, both true and false, this article investigates the classification of strategies for spreading fake news in social networks during social events. To achieve this goal, the thematic analysis research method was chosen. An extensive library study was first conducted on global sources. Then, in-depth interviews were conducted with 18 well-known specialists and experts in the field of news and media in Iran, selected by purposeful sampling. By analyzing the data using the thematic analysis method, strategies were obtained; the strategies identified so far (the research is in progress) include unrealistically strengthening/weakening the speed and content of the event, stimulating psycho-media movements, targeting emotional audiences such as women, teenagers and young people, strengthening public hatred, calling the reaction to events legitimate/illegitimate, incitement to physical conflict, simplification of violent protests, and the targeted publication of images and interviews.
Keywords: fake news, social network, social events, thematic analysis
Procedia PDF Downloads 65
27740 Green Wave Control Strategy for Optimal Energy Consumption by Model Predictive Control in Electric Vehicles
Authors: Furkan Ozkan, M. Selcuk Arslan, Hatice Mercan
Abstract:
Electric vehicles (EVs) are becoming increasingly popular as a sustainable alternative to traditional combustion engine vehicles. However, to fully realize the potential of EVs in reducing environmental impact and energy consumption, efficient control strategies are essential. This study explores the application of green wave control using model predictive control (MPC) for electric vehicles, coupled with energy consumption modeling using neural networks. The use of MPC allows for real-time optimization of the vehicle's energy consumption while considering dynamic traffic conditions. By leveraging neural networks for energy consumption modeling, the EV's performance can be further enhanced through accurate predictions and adaptive control. The integration of these advanced control and modeling techniques aims to maximize energy efficiency and range while navigating urban traffic scenarios. The findings of this research offer valuable insights into the potential of green wave control for electric vehicles and demonstrate the significance of integrating MPC and neural network modeling for optimizing energy consumption. This work contributes to the advancement of sustainable transportation systems and the widespread adoption of electric vehicles. To evaluate the effectiveness of the green wave control strategy in real-world urban environments, extensive simulations were conducted using a high-fidelity vehicle model and realistic traffic scenarios. The results indicate that the integration of model predictive control and energy consumption modeling with neural networks had a significant impact on the energy efficiency and range of electric vehicles. Through the use of MPC, the electric vehicle was able to adapt its speed and acceleration profile in real time to optimize energy consumption while maintaining travel time objectives. The neural network-based energy consumption modeling provided accurate predictions, enabling the vehicle to anticipate and respond to variations in traffic flow, further enhancing energy efficiency and range. Furthermore, the study revealed that the green wave control strategy not only reduced energy consumption but also improved the overall driving experience by minimizing abrupt acceleration and deceleration, leading to a smoother and more comfortable ride for passengers. These results demonstrate the potential for green wave control to revolutionize urban transportation by enhancing the performance of electric vehicles and contributing to a more sustainable and efficient mobility ecosystem.
Keywords: electric vehicles, energy efficiency, green wave control, model predictive control, neural networks
Procedia PDF Downloads 58
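A minimal linear MPC sketch of the green wave idea, using cvxpy (assumed installed): track a reference speed that meets the next green phase while penalizing acceleration effort as an energy proxy. The horizon, weights, limits, and kinematic model are illustrative assumptions; the paper's controller uses a high-fidelity vehicle model and a neural network energy model instead. In closed loop, only the first control move is applied and the problem is re-solved at the next step.

```python
import cvxpy as cp

N, dt = 20, 1.0          # horizon steps, step size [s] (assumed)
v_ref = 12.0             # speed that meets the next green phase [m/s]
v0 = 8.0                 # current speed [m/s]

v = cp.Variable(N + 1)   # speed trajectory
a = cp.Variable(N)       # acceleration (control input)
cost = cp.sum_squares(v[1:] - v_ref) + 0.5 * cp.sum_squares(a)
constraints = [v[0] == v0,
               v[1:] == v[:-1] + dt * a,   # simple kinematic model
               cp.abs(a) <= 1.5,           # comfort/actuator limit
               v >= 0, v <= 16.7]          # 0..60 km/h
cp.Problem(cp.Minimize(cost), constraints).solve()
print("first control move a0 =", round(float(a.value[0]), 3), "m/s^2")
```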
27739 The Extent of Virgin Olive-Oil Prices' Distribution Revealing the Behavior of Market Speculators
Authors: Fathi Abid, Bilel Kaffel
Abstract:
The olive tree, the olive harvest during the winter season, and the production of olive oil, better known to professionals as the crushing operation, have interested institutional traders such as olive-oil offices, private companies in the food industry refining and extracting pomace olive oil, and public and private export-import companies specializing in olive oil. The major problem facing producers of olive oil each winter campaign, contrary to what might be expected, is not whether the harvest will be good but whether the sale price will allow them to cover production costs and achieve a reasonable profit margin. These questions are entirely legitimate given the importance of the issue and the heavy complexity of the uncertainty, with competition made tougher by a high level of indebtedness and by the experience and expertise of speculators and producers whose objectives are sometimes conflicting. The aim of this paper is to study the formation mechanism of olive oil prices in order to learn about speculators' behavior and expectations in the market, how they contribute through their industry knowledge and financial alliances, and the size of the financial challenge that may be involved for them in building private information channels globally to take advantage. The methodology used in this paper is based on two stages: in the first stage, we study econometrically the formation mechanisms of the olive oil price in order to understand market participant behavior by implementing ARMA, SARMA, GARCH, and stochastic diffusion process models; the second stage is devoted to prediction purposes, for which we use a combined wavelet-ANN approach. Our main findings indicate that olive oil market participants interact with each other in a way that promotes the formation of stylized facts. The unstable behavior of participants creates volatility clustering, nonlinear dependence, and cyclicity phenomena. By imitating each other in some periods of the campaign, different participants contribute to the fat tails observed in the olive oil price distribution. The best prediction model for the olive oil price is based on a back-propagation artificial neural network approach with input information based on wavelet decomposition and recent past history.
Keywords: olive oil price, stylized facts, ARMA model, SARMA model, GARCH model, combined wavelet-artificial neural network, continuous-time stochastic volatility model
Procedia PDF Downloads 340
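A minimal sketch of the first-stage econometrics with statsmodels and the arch package (both assumed installed). The price series below is simulated only to make the snippet self-contained; the real olive oil data are not reproduced here.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

# Simulated stand-in for the olive oil price series (EUR/tonne).
rng = np.random.default_rng(1)
prices = 3000.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 500)))
returns = 100.0 * np.diff(np.log(prices))     # percent log-returns

arma = ARIMA(returns, order=(1, 0, 1)).fit()              # ARMA(1,1) mean model
garch = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")

print(arma.params)     # AR/MA coefficients of the mean dynamics
print(garch.params)    # omega, alpha[1], beta[1]: volatility clustering
```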
27738 Pattern Identification in Statistical Process Control Using Artificial Neural Networks
Authors: M. Pramila Devi, N. V. N. Indra Kiran
Abstract:
Control charts, predominantly in the form of the X-bar chart, are important tools in statistical process control (SPC). They are useful in determining whether a process is behaving as intended or whether there are some unnatural causes of variation. A process is out of control if a point falls outside the control limits or a series of points exhibits an unnatural pattern. In this paper, a study is carried out on four training algorithms for control chart pattern (CCP) recognition. For each algorithm, the optimal structure is identified; the algorithms are then compared on type I and type II errors and on generalization with and without early stopping, and the best one is proposed.
Keywords: control chart pattern recognition, neural network, backpropagation, generalization, early stopping
Procedia PDF Downloads 374
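A small scikit-learn sketch of the comparison the abstract describes: train a network on synthetic control chart patterns with early stopping switched on. The pattern generator, window length, and network size are assumptions; the paper's four training algorithms and full error study are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic control chart patterns: normal vs. upward trend (illustrative only).
rng = np.random.default_rng(0)
n, w = 1000, 30                          # samples per class, window length
t = np.arange(w)
normal = rng.normal(0, 1, (n, w))
trend = rng.normal(0, 1, (n, w)) + 0.1 * t     # add a linear drift
X = np.vstack([normal, trend])
y = np.r_[np.zeros(n), np.ones(n)]
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Early stopping holds out 10% of the training data and stops training when
# the validation score stops improving, the effect the abstract compares.
clf = MLPClassifier(hidden_layer_sizes=(20,), early_stopping=True,
                    validation_fraction=0.1, max_iter=500, random_state=0)
clf.fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```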
27737 A Systematic Review on Experimental, FEM Analysis and Simulation of Metal Spinning Process
Authors: Amol M. Jadhav, Sharad S. Chudhari, S. S. Khedkar
Abstract:
This review presents a thorough survey of published research on the experimental analysis, FEM analysis, and simulation of the metal spinning process. In this literature survey, all the papers are taken from Elsevier publications, most of them from the Journal of Materials Processing Technology. Over the last two decades or so, the metal spinning process has gradually been used as a chipless forming process for the production of engineering components in small to medium batch quantities. The review aims to provide insight into the experimentation and FEM analysis of various components and the simulation of the metal spinning process, and to act as a guide for researchers working on metal spinning processes. The review of existing work reveals several gaps in the current knowledge of metal spinning processes. The experimental evaluations cover the thickness strain, the spinning force, the twisting angle, and the surface roughness of the conventional and shear metal spinning processes; the FEM evaluation of metal spinning covers path definition with a sufficiently fine mesh to capture the behavior of the workpiece; and the roller feed rate, roller direction, and roller type are evaluated by simulation. The metal spinning process has the flexibility to produce a wider range of product shapes and to form more challenging materials.
Keywords: metal spinning, FEM analysis, simulation of metal spinning, mechanical engineering
Procedia PDF Downloads 388
27736 Social Media Mining with R. Twitter Analyses
Authors: Diana Codat
Abstract:
Tweet analysis is part of text mining. Each document is a written text, so it is possible to apply the usual text mining techniques, in particular by switching to the bag-of-words representation. But tweets have peculiarities. Some may enrich the analysis: their length is calibrated (at least as far as public messages are concerned), special characters make it possible to identify authors (@) and themes (#), and the tweet and retweet mechanisms make it possible to follow the diffusion of information. Conversely, other characteristics may disrupt the analyses. Because space is limited, authors often use abbreviations and emoticons to express feelings, and they do not pay much attention to spelling. All this creates noise that can complicate the task. Tweets carry a lot of potentially interesting information, and their exploitation is one of the main axes of the analysis of social networks. We show how to access Twitter-related messages, initiate a study of the properties of tweets, and follow up with the exploitation of the content of the messages. We work in R with the package 'twitteR'. The study of tweets is a strong focus of social network analysis because Twitter has become an important vector of communication. This example shows that it is easy to initiate an analysis from data extracted directly online. The data preparation phase is of great importance.
Keywords: data mining, language R, social networks, Twitter
Procedia PDF Downloads 185
27735 An Exploratory Research of Human Character Analysis Based on Smart Watch Data: Distinguish the Drinking State from Normal State
Authors: Lu Zhao, Yanrong Kang, Lili Guo, Yuan Long, Guidong Xing
Abstract:
Smart watches, as handy devices with rich functionality, have become one of the most popular wearable devices all over the world. Among their various functions, the most basic is health monitoring. The monitoring data can serve as effective evidence or a clue in the detection of crime cases. For instance, step counting data can help to determine whether the watch wearer was quiet or moving during a given time period. There is, however, still little research on the analysis of human character based on these data. The purpose of this research is to analyze health monitoring data to distinguish the drinking state from the normal state. The analysis result may play a role in cases involving drinking, such as drunk driving. The experiment mainly focused on finding the figures in smart watch health monitoring data that change with drinking and determining the extent of the change. The chosen subjects are mostly in their 20s, each of whom had been wearing the same smart watch for a week. Each subject drank several times during the week and noted down the beginning and end time points of the drinking. The researchers then extracted and analyzed the health monitoring data from the watch. According to the descriptive statistical analysis, it can be found that the heart rate changes when drinking: the average heart rate is about 10% higher than normal, and the coefficient of variation is less than about 30% of that in the normal state. Though more research needs to be carried out, this experiment and analysis provide a thought on the application of data from smart watches.
Keywords: character analysis, descriptive statistics analysis, drink state, heart rate, smart watch
Procedia PDF Downloads 168
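A toy illustration of the two reported statistics, with invented heart rate windows calibrated to match the abstract's findings (about +10% mean heart rate and a coefficient of variation below roughly 30% of normal):

```python
import numpy as np

# Invented heart-rate windows [bpm]; real monitoring data are not public.
normal = np.array([68, 71, 70, 69, 72, 70, 71, 69], dtype=float)
drinking = np.array([77, 77, 77, 78, 77, 77, 77, 77], dtype=float)

def describe(hr):
    mean = hr.mean()
    cv = hr.std(ddof=1) / mean        # coefficient of variation
    return mean, cv

m0, cv0 = describe(normal)
m1, cv1 = describe(drinking)
print(f"mean HR change: +{100 * (m1 / m0 - 1):.1f}%")   # ~ +10%
print(f"CV ratio vs. normal: {cv1 / cv0:.2f}")          # ~ 0.25
```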
27734 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis
Authors: Touila Ahmed, Elie Louis, Hamza Gharbi
Abstract:
State-of-the-art question answering models have recently shown impressive performance, especially in a zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to have a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models' performance in general, and their zero-shot performance in particular.
Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision
Procedia PDF Downloads 198
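A hedged sketch of zero-shot generative QA with the Hugging Face transformers pipeline (assumed installed). flan-t5-base is an arbitrary stand-in, not the authors' model, and the contract clause is invented:

```python
from transformers import pipeline

# Prompt a seq2seq LM with a clause and a question; no task-specific training.
qa = pipeline("text2text-generation", model="google/flan-t5-base")

contract = ("Le présent contrat prend effet le 1er mars 2023 et est conclu "
            "pour une durée de douze mois, renouvelable par tacite reconduction.")
question = "Quelle est la durée du contrat ?"

out = qa(f"question: {question} context: {contract}", max_new_tokens=20)
print(out[0]["generated_text"])
```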
27733 Optimal Design of Brush Roll for Semiconductor Wafer Using CFD Analysis
Authors: Byeong-Sam Kim, Kyoungwoo Park
Abstract:
This research quantitatively analyzes the structure of flat panel displays (FPD) such as LCDs through CFD analysis and studies model changes to minimize the defect rate and the loss of production caused by damage to large-scale plates in the wafer heating chamber of the semiconductor manufacturing process. In atmospheric-pressure and chemical vapor deposition equipment, glass panels and wafers must be transported and transferred, and robot hands must be able to handle plates that keep becoming longer and wider. A contact handling system presents several problems, including an increased potential for fracture or warping, and panel and wafer warping makes conventional contact handling difficult; a non-contact handling system is required to solve this. We propose a new non-contact transportation system combining air suction and blowout. Numerical analysis and experiments are therefore performed, and the results are compared with those achieved by non-contact solutions. This non-contact wafer/panel handler shows its strength in maintaining the high cleanliness levels required for semiconductor production processes.
Keywords: flat panel display, non-contact transportation, heat treatment process, CFD analysis
Procedia PDF Downloads 417
27732 The Use of Unmanned Aerial System (UAS) in Improving the Measurement System on the Example of Textile Heaps
Authors: Arkadiusz Zurek
Abstract:
The potential of drones is visible in many areas of logistics, especially their use for monitoring and controlling many processes. The technologies implemented in the last decade open new possibilities for companies that until now had not even considered them, such as warehouse inventories. Unmanned aerial vehicles are no longer seen as a revolutionary tool for Industry 4.0, but rather as tools in the daily work of factories and logistics operators. The research problem is to develop a method for measuring, by drone, the weight of goods in a selected link of the clothing supply chain. The purpose of this article is to analyze the causes of errors in traditional measurements and then to identify adverse events related to the use of drones for the inventory of a heap of textiles intended for production purposes. On this basis, it will be possible to develop guidelines to eliminate the causes of these events in the drone-based measurement process. In a real environment, work was carried out to determine the volume and weight of textiles, including weighing a textile sample to determine the average density of the assortment, establishing a local geodetic network, terrestrial laser scanning, and a photogrammetric flight using an unmanned aerial vehicle. As a result of the analysis of measurement data obtained at the facility, the volume and weight of the assortment and the accuracy of their determination were established. This article presents how such heaps are currently measured and what adverse events occur, describes the photogrammetric techniques of this type so far performed by external drones for the inventory of wind farms or construction sites, and compares them with the measurement of the aforementioned textile heap inside a large-format facility.
Keywords: drones, unmanned aerial system, UAS, indoor system, security, process automation, cost optimization, photogrammetry, risk elimination, industry 4.0
Procedia PDF Downloads 89
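The volume-times-density arithmetic at the core of the method, sketched with SciPy on a random stand-in point cloud. The density figure plays the role of the weighed textile sample's average, and a convex hull overestimates concave heaps, which is one source of the adverse events discussed:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Stand-in photogrammetric point cloud of a heap, in metres.
rng = np.random.default_rng(0)
points = rng.uniform([0, 0, 0], [6.0, 4.0, 2.0], size=(5000, 3))

volume = ConvexHull(points).volume      # m^3; the hull bounds the heap shape
density = 180.0                         # kg/m^3, assumed from the weighed sample
print(f"estimated heap weight: {volume * density:,.0f} kg")
```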
27731 Vibration Propagation in Body-in-White Structures Through Structural Intensity Analysis
Authors: Jamal Takhchi
Abstract:
The understanding of vibration propagation in complex structures such as an automotive body in white remains a challenging issue in car design with regard to NVH performance. Current analysis is limited to the low-frequency range, where modal concepts are dominant. Higher frequencies, between 200 and 1000 Hz, will become critical with the rise of electrification: the annoying sounds of EVs are mostly whines created by either gears or e-motors between 300 Hz and 2 kHz. Structural intensity analysis was tried a few years ago on finite element models. The application was promising but limited by the fact that the propagating 3D intensity vector field is masked by a rotational intensity field. This rotational field should be filtered using a differential operator, whose expression in the framework of finite element modeling is not yet known. The aim of the proposed work is to implement this operator in the current dynamic solver (NASTRAN) of Stellantis and to develop the expected methodology for the mid-frequency structural analysis of electrified vehicles.
Keywords: structural intensity, NVH, body in white, irrotational intensity
Procedia PDF Downloads 157
27730 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research deployed knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, and used artificial intelligence (an artificial neural network, ANN) and a statistical approach (autoregressive integrated moving average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA), to evaluate its performance. Both devices (standard and designed) were subjected to 180 days under the same atmospheric conditions for data collection (temperature, relative humidity, and pressure). The acquired data were used to train models in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root mean square error (RMSE), mean absolute error (MAE), coefficient of determination (R²), and mean percentage error (MPE) were deployed as standardized evaluation metrics to assess the performance of the models in the prediction of precipitation. The results from the operation of the developed device show that it has an efficiency of 96% and is also compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) revealed a disparity error of 1.59%, while ARIMA's is 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device
Procedia PDF Downloads 95
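The four evaluation metrics named above, in a short NumPy helper; the observed/predicted rainfall values are invented to make the snippet runnable:

```python
import numpy as np

def scores(obs, pred):
    # RMSE, MAE, R^2 (coefficient of determination), and MPE.
    err = pred - obs
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    mpe = 100.0 * np.mean(err / obs)
    return rmse, mae, r2, mpe

obs = np.array([12.0, 8.5, 0.4, 20.1, 5.3])    # rainfall [mm], invented
pred = np.array([11.5, 9.0, 0.5, 19.0, 5.0])
print(["%.3f" % v for v in scores(obs, pred)])
```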
27729 Response Analysis of a Steel Reinforced Concrete High-Rise Building during the 2011 Tohoku Earthquake
Authors: Naohiro Nakamura, Takuya Kinoshita, Hiroshi Fukuyama
Abstract:
The 2011 off the Pacific Coast of Tohoku Earthquake caused considerable damage to wide areas of eastern Japan. A large number of earthquake observation records were obtained at various places. To design more earthquake-resistant buildings and improve earthquake disaster prevention, it is necessary to utilize these data to analyze and evaluate the behavior of a building during an earthquake. This paper presents an earthquake response simulation analysis (hereafter, a seismic response analysis) that was conducted using data recorded during the main earthquake (hereafter, the main shock) as well as the earthquakes before and after it. The data were obtained at a high-rise steel-reinforced concrete (SRC) building in the bay area of Tokyo. We first give an overview of the building, along with the characteristics of the earthquake motion and the building during the main shock. The data indicate that there was a change in the natural period before and after the earthquake. Next, we present the results of our seismic response analysis. First, the analysis model and conditions are shown; then, the analysis results are compared with the observational records. Using the analysis results, we then study the effect of soil-structure interaction on the response of the building. By identifying the characteristics of the building during the earthquake (i.e., the first natural period and the first damping ratio) with an auto-regressive exogenous (ARX) model, we compare the analysis results with the observational records so as to evaluate the accuracy of the response analysis. In this study, a lumped-mass SR model was used to conduct a seismic response analysis using observational data as input waves. The main results of this study are as follows: 1) The observational records of the 3/11 main shock put it between a level 1 and a level 2 earthquake. The results of the ground response analysis showed that the maximum shear strain in the ground was about 0.1% and that the possibility of liquefaction was low. 2) During the 3/11 main shock, the observed waves showed that the eigenperiod of the building became longer; this behavior could be generally reproduced in the response analysis. This prolonged eigenperiod was due to the nonlinearity of the superstructure, and the effect of the nonlinearity of the ground seems to have been small. 3) As for the 4/11 aftershock, a continuous analysis was conducted in which the aftershock wave was input after the 3/11 main shock. The analyzed values generally corresponded well with the observed values, which means that the effect of the nonlinearity of the main shock was retained by the building; it is important to consider this when conducting the response evaluation. 4) The first period and the damping ratio during vibration were evaluated with an ARX model. Our results show that the response analysis model in this study is generally good at estimating changes in the response of the building during vibration.
Keywords: ARX model, response analysis, SRC building, the 2011 off the Pacific Coast of Tohoku Earthquake
Procedia PDF Downloads 165
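A minimal NumPy sketch of the ARX identification step: fit the model by least squares, then read the first natural period and damping ratio off the dominant AR pole pair. The synthetic oscillator stands in for the recorded building response; its 2.5 s period and 3% damping are assumptions for the demo.

```python
import numpy as np

def fit_arx(y, u, na=2, nb=2):
    # Least squares on y[t] = sum(a_i * y[t-i]) + sum(b_j * u[t-j]) + e[t].
    k = max(na, nb)
    Phi = np.column_stack([y[k - i:len(y) - i] for i in range(1, na + 1)] +
                          [u[k - j:len(u) - j] for j in range(1, nb + 1)])
    theta, *_ = np.linalg.lstsq(Phi, y[k:], rcond=None)
    return theta[:na], theta[na:]

def modal_params(a, dt):
    # Map the dominant discrete pole to continuous time: s = ln(p) / dt.
    poles = np.roots(np.r_[1.0, -a])
    s = np.log(poles[np.argmax(np.abs(poles))]) / dt
    wn = abs(s)
    return 2 * np.pi / wn, -s.real / wn     # (first period [s], damping ratio)

# Demo on a synthetic SDOF oscillator (2.5 s period, 3% damping, assumed).
dt = 0.01
u = np.random.default_rng(0).normal(size=6000)       # input motion proxy
wn, z = 2 * np.pi / 2.5, 0.03
a1 = 2 * np.exp(-z * wn * dt) * np.cos(wn * dt * np.sqrt(1 - z ** 2))
a2 = -np.exp(-2 * z * wn * dt)
y = np.zeros_like(u)
for i in range(2, y.size):
    y[i] = a1 * y[i - 1] + a2 * y[i - 2] + u[i - 1] * dt ** 2
a, b = fit_arx(y, u)
print(modal_params(a, dt))    # should recover roughly (2.5, 0.03)
```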
27728 ACOPIN: An ACO Algorithm with TSP Approach for Clustering Proteins in Protein Interaction Networks
Authors: Jamaludin Sallim, Rozlina Mohamed, Roslina Abdul Hamid
Abstract:
In this paper, we propose an Ant Colony Optimization (ACO) algorithm together with a Traveling Salesman Problem (TSP) approach to investigate the clustering problem in Protein Interaction Networks (PIN). We name this combination ACOPIN. The purpose of this work is two-fold: first, to test the efficacy of ACO in clustering PIN, and second, to propose a simple generalization of the ACO algorithm that might allow its application in clustering proteins in PIN. We split this paper into three main sections. First, we describe PIN and the clustering of proteins in PIN. Second, we discuss the steps involved in each phase of the ACO algorithm. Finally, we present some results of the investigation with the clustering patterns.
Keywords: ant colony optimization algorithm, searching algorithm, protein functional module, protein interaction network
Procedia PDF Downloads 614
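Not the ACOPIN implementation itself, but a compact Ant System for the plain TSP, showing the pheromone-biased tour construction and the evaporation/deposit cycle that the paper builds on; all hyperparameters are conventional defaults, assumed for illustration.

```python
import numpy as np

def aco_tsp(dist, n_ants=20, n_iter=100, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n = len(dist)
    tau = np.ones((n, n))                      # pheromone trails
    eta = 1.0 / (dist + np.eye(n))             # heuristic visibility
    best_len, best_tour = np.inf, None
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            while len(tour) < n:
                i = tour[-1]
                p = (tau[i] ** alpha) * (eta[i] ** beta)
                p[tour] = 0.0                  # forbid already-visited cities
                tour.append(int(rng.choice(n, p=p / p.sum())))
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        tau *= 1.0 - rho                       # evaporation
        for length, tour in tours:             # deposit, proportional to quality
            for k in range(n):
                tau[tour[k], tour[(k + 1) % n]] += 1.0 / length
    return best_tour, best_len

# Demo on random city coordinates (illustrative only).
pts = np.random.default_rng(1).random((15, 2))
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
tour, length = aco_tsp(dist)
print(f"best tour length: {length:.3f}")
```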
27727 Rescaled Range Analysis of Seismic Time-Series: Example of the Recent Seismic Crisis of Alhoceima
Authors: Marina Benito-Parejo, Raul Perez-Lopez, Miguel Herraiz, Carolina Guardiola-Albert, Cesar Martinez
Abstract:
Persistency, long-term memory, and randomness are intrinsic properties of time series of earthquakes. Rescaled Range Analysis (RS-Analysis) was introduced by Hurst in 1956 and modified by Mandelbrot and Wallis in 1964. This method is a simple and elegant analysis that determines the range of variation of one natural property (in this case, the seismic energy released) in a time interval. Despite its simplicity, there is complexity inherent in the property measured. The cumulative curve of the energy released in time follows the well-known fractal geometry of a devil's staircase. This geometry is used to determine the maximum and minimum values of the range, which is normalized by the standard deviation. The rescaled range obtained obeys a power law in time, and the exponent is the Hurst value. Depending on this value, a time series can be classified as having long-term or short-term memory. Hence, an algorithm has been developed for compiling the RS-Analysis for time series of earthquakes by day. Completeness of the time distribution and local stationarity of the time series are required. The interest of this analysis lies in its application to a complex seismic crisis, where different earthquakes take place in clusters over a short period. The Hurst exponent has therefore been obtained for the seismic crisis of Alhoceima (Mediterranean Sea) of January-March 2016, in which at least five medium-sized earthquakes were triggered. According to the values obtained from the Hurst exponent for each cluster, a different mechanical origin can be detected, corroborated by the focal mechanisms calculated by the official institutions. Therefore, this type of analysis not only allows an approach to a greater understanding of a seismic series but also makes it possible to discern different types of seismic origins.
Keywords: Alhoceima crisis, earthquake time series, Hurst exponent, rescaled range analysis
Procedia PDF Downloads 326
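A compact implementation of the R/S procedure the abstract outlines: split the series into windows, compute the range of the cumulative mean-adjusted sums rescaled by the window standard deviation, and fit the power-law slope. The white-noise demo (H close to 0.5) is illustrative; in the paper the input is the daily released seismic energy.

```python
import numpy as np

def hurst_rs(x, min_n=8):
    # Rescaled range analysis: slope of log(R/S) vs log(n) is the Hurst exponent.
    x = np.asarray(x, dtype=float)
    ns, rs = [], []
    n = min_n
    while n <= len(x) // 2:
        chunks = x[:len(x) // n * n].reshape(-1, n)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
        S = chunks.std(axis=1, ddof=0)
        ok = S > 0
        ns.append(n)
        rs.append((R[ok] / S[ok]).mean())
        n *= 2
    H, _ = np.polyfit(np.log(ns), np.log(rs), 1)
    return H

# White noise should give H ~ 0.5; persistent series give H > 0.5.
print(hurst_rs(np.random.default_rng(0).normal(size=4096)))
```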
27726 Sensitivity Analysis during the Optimization Process Using Genetic Algorithms
Authors: M. A. Rubio, A. Urquia
Abstract:
Genetic algorithms (GA) are applied to the solution of high-dimensional optimization problems. Additionally, sensitivity analysis (SA) is usually carried out to determine the effect on optimal solutions of changes in parameter values of the objective function. These two analyses (i.e., optimization and sensitivity analysis) are computationally intensive when applied to high-dimensional functions. The approach presented in this paper consists in performing the SA during the GA execution, by statistically analyzing the data obtained from running the GA. The advantage is that in this case SA does not involve making additional evaluations of the objective function, and consequently, this proposed approach requires less computational effort than conducting optimization and SA in two consecutive steps.
Keywords: optimization, sensitivity, genetic algorithms, model calibration
Procedia PDF Downloads 439
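A toy sketch of the idea behind the last entry: log every (parameters, fitness) pair that an ordinary GA evaluates anyway, then estimate sensitivities from the log with simple correlations, at no extra objective evaluations. The objective, the GA operators, and the correlation-based SA are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Toy objective: the first parameter matters far more than the second.
    return (x[:, 0] - 2) ** 2 + 0.1 * (x[:, 1] + 1) ** 2

pop = rng.uniform(-5, 5, (40, 2))
log_x, log_f = [], []
for _ in range(50):                                  # crude GA loop
    fit = f(pop)
    log_x.append(pop.copy())                         # log evaluations as they happen
    log_f.append(fit)
    parents = pop[np.argsort(fit)[:20]]              # truncation selection
    children = parents + rng.normal(0, 0.3, parents.shape)   # mutation
    pop = np.vstack([parents, children])

X = np.vstack(log_x)
F = np.hstack(log_f)
for j in range(X.shape[1]):
    # Parameters the objective is more sensitive to correlate more with fitness.
    r = np.corrcoef(X[:, j], F)[0, 1]
    print(f"param {j}: |corr with fitness| = {abs(r):.2f}")
```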