Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28321


26611 Road Accidents Bigdata Mining and Visualization Using Support Vector Machines

Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma

Abstract:

Useful information has been extracted from road accident data for the United Kingdom (UK), using data analytics methods, with the aim of avoiding possible accidents in rural and urban areas. The analysis makes use of several methodologies, such as data integration, support vector machines (SVM), correlation machines, and multinomial goodness of fit. The datasets were imported from the UK traffic department with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn help avoid unnecessary memory lapses. Since the data are expected to grow continuously over time, this work primarily proposes a new framework model which can be trained on and adapt itself to new data and make accurate predictions. This work also throws some light on the use of the SVM methodology as a text classifier for the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology, which make it appropriate for this kind of research work.
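The classification step the abstract describes can be sketched as follows. This is a minimal illustration with scikit-learn, assuming a toy feature matrix of invented accident attributes (speed limit, vehicles involved, hour of day) and illustrative severity labels; it is not the authors' pipeline or the actual DfT data.

```python
# Hypothetical sketch of SVM-based accident-severity classification.
# Features and labels below are invented for illustration only.
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Toy features: [speed_limit_mph, vehicles_involved, hour_of_day]
X = [[30, 2, 8], [60, 3, 23], [20, 1, 14], [70, 4, 2],
     [30, 1, 10], [60, 2, 22], [40, 2, 17], [70, 3, 3]]
y = [0, 1, 0, 1, 0, 1, 0, 1]  # 0 = slight, 1 = serious (illustrative)

# Scaling matters for RBF-kernel SVMs, hence the pipeline
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X, y)
print(model.predict([[65, 3, 1]]))  # classify a new, unseen record
```

Retraining such a model as new records arrive is one simple way to realise the "adapt itself to new data" requirement the abstract states.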

Keywords: support vector machines (SVM), machine learning (ML), Department for Transport (DfT)

Procedia PDF Downloads 267
26610 A Relational Data Base for Radiation Therapy

Authors: Raffaele Danilo Esposito, Domingo Planes Meseguer, Maria Del Pilar Dorado Rodriguez

Abstract:

As far as we know, no commercial solution is yet available that allows managing, in an open way configurable to user needs, the huge amount of data generated in a modern Radiation Oncology Department. Currently available information management systems are mainly focused on Record & Verify and clinical data, and only to a small extent on physical data. This results in a partial and limited use of the information actually available. In the present work, we describe the implementation at our department of a centralized information management system based on a web server. Our system manages both the information generated during patient planning and treatment and information of general interest for the whole department (e.g., treatment protocols, quality assurance protocols, etc.). Our objective is to be able to analyze all the available data in a simple and efficient way and thus obtain quantitative evaluations of our treatments, which would allow us to improve our workflows and protocols. To this end, we have implemented a relational database that allows us to use all the available information in a practical and efficient way. Throughout, we use only license-free software.
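A minimal sketch of such a relational design, using Python's built-in SQLite (license-free, as the abstract requires). The table and column names here are assumptions for illustration, not the authors' actual schema.

```python
# Hypothetical two-table schema: patients and their treatment plans,
# linked by a foreign key, plus one quantitative evaluation query.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE patient (
    id   INTEGER PRIMARY KEY,
    code TEXT NOT NULL
);
CREATE TABLE treatment_plan (
    id                 INTEGER PRIMARY KEY,
    patient_id         INTEGER NOT NULL REFERENCES patient(id),
    technique          TEXT,
    prescribed_dose_gy REAL
);
""")
cur.execute("INSERT INTO patient (code) VALUES (?)", ("P001",))
cur.execute(
    "INSERT INTO treatment_plan (patient_id, technique, prescribed_dose_gy) "
    "VALUES (?, ?, ?)", (1, "VMAT", 60.0))
conn.commit()

# Example of the kind of quantitative evaluation a relational DB enables:
# mean prescribed dose per treatment technique
for row in cur.execute(
        "SELECT technique, AVG(prescribed_dose_gy) FROM treatment_plan "
        "GROUP BY technique"):
    print(row)
```

In a real deployment the same schema would sit behind the web server, with department-wide documents (protocols, QA records) held in further tables.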

Keywords: information management system, radiation oncology, medical physics, free software

Procedia PDF Downloads 232
26609 A Study of Safety of Data Storage Devices of Graduate Students at Suan Sunandha Rajabhat University

Authors: Komol Phaisarn, Natcha Wattanaprapa

Abstract:

This survey research study aimed to assess the safety of the data storage devices used by graduate students of the 2013 academic year at Suan Sunandha Rajabhat University. Data were collected by a questionnaire on the safety of data storage devices according to the CIA (confidentiality, integrity, availability) principles. A sample of 81 students was drawn from the population by purposive sampling. The results show that most graduate students of the 2013 academic year at Suan Sunandha Rajabhat University use USB flash ('handy') drives to store their data, and that the safety level of these devices is good.

Keywords: security, safety, storage devices, graduate students

Procedia PDF Downloads 349
26608 Changing Behaviour in the Digital Era: A Concrete Use Case from the Domain of Health

Authors: Francesca Spagnoli, Shenja van der Graaf, Pieter Ballon

Abstract:

Humans do not behave rationally. We are emotional and easily influenced by others, as well as by our context. The study of human behaviour has become a central endeavour within many academic disciplines, including economics, sociology, and clinical and social psychology. Understanding what motivates humans, what triggers them to perform certain activities, and what it takes to change their behaviour is central for researchers and companies, as well as for policy makers seeking to implement efficient public policies. While numerous theoretical approaches have been developed for diverse domains such as health, retail, and the environment, the methodological models guiding the evaluation of such research have long since reached their limits. Within this context, digitisation, information and communication technologies (ICT), wearables, the Internet of Things (IoT) connecting networks of devices, and new possibilities to collect and analyse massive amounts of data have made it possible to study behaviour from a realistic perspective, as never before. Digital technologies make it possible to (1) capture data in real-life settings, (2) regain control over data by capturing the context of behaviour, and (3) analyse huge sets of information through continuous measurement. Against this background, this paper describes a new framework for initiating behavioural change that capitalises on these digital developments in applied research projects and is applicable to academia, enterprises, and policy makers alike. By applying this model, behavioural research can be conducted to address the issues of different domains, such as mobility, environment, health, or media. The Modular Behavioural Analysis Approach (MBAA) is described here and first validated through a concrete use case within the domain of health.
The results gathered show that disclosing information about health in connection with the use of digital health apps can be a lever for changing behaviour, but it is only a first component requiring further follow-up actions. To this end, a clear definition of different 'behavioural profiles', towards which several typologies of interventions can be addressed, is essential to effectively enable behavioural change. The refined version of the MBAA will therefore focus strongly on defining a methodology for shaping such 'behavioural profiles' and the related interventions, as well as on evaluating side-effects on the creation of new business models and sustainability plans.

Keywords: behavioural change, framework, health, nudging, sustainability

Procedia PDF Downloads 216
26607 Making the Invisible Visible: Exploring Immersion Teacher Perceptions of Online Content and Language Integrated Learning Professional Development Experiences

Authors: T. J. O Ceallaigh

Abstract:

Subject-matter-driven programs such as immersion programs are increasingly popular across the world. These programs have allowed for extensive experimentation in the realm of second language teaching and learning and have been at the centre of many research agendas since their inception. Even though immersion programs are successful, especially in terms of second language development, they remain complex to implement and are not always as successful as we would hope them to be. Among all the challenges these varied programs face, research indicates that the primary issue lies in the difficulty of creating well-balanced programs in which both content instruction and language/literacy instruction are targeted simultaneously. Initial teacher education and professional development experiences are key drivers of successful language immersion education globally. They are critical to supplying teachers with the mandatory linguistic and cultural competencies, as well as the associated pedagogical practices, required to ensure learners’ success. However, there is a significant dearth of research on the professional development experiences of immersion teachers. We lack an understanding of the nature of their expertise and of their needs in terms of professional development, as well as of their perceptions of the primary challenges they face as they attempt to formulate a coherent pedagogy of integrated language and content instruction. Such an understanding is essential if their specific needs are to be addressed appropriately and the overall quality of immersion programs thus improved. This paper reports on immersion teacher perceptions of online professional development experiences that have a positive impact on their ability to facilitate language and content connections in instruction. Twenty Irish-medium immersion teachers engaged in the instructional integration of language and content in a systematic and developmental way during a year-long online professional development program.
Data were collected from a variety of sources, e.g., an extensive online questionnaire, individual interviews, reflections, assignments, and focus groups. This study provides compelling evidence of the potential of online professional development experiences as a pedagogical framework for understanding the complex and interconnected knowledge demands that arise in content and language integration in immersion. Findings illustrate several points of access to classroom research and pedagogy and uncover core aspects of high-impact online experiences. Teachers identified aspects such as experimentation and risk-taking, authenticity and relevance, collegiality and collaboration, motivation and challenge, and teacher empowerment. The potential of the online experiences to foster teacher language awareness was also identified as a contributory factor to success. The paper concludes with implications for designing meaningful and effective online CLIL professional development experiences.

Keywords: content and language integrated learning, immersion pedagogy, professional development, teacher language awareness

Procedia PDF Downloads 178
26606 Simulation of a Cost Model for Responses to Replication Requests in a Data Grid Environment

Authors: Kaddi Mohammed, A. Benatiallah, D. Benatiallah

Abstract:

The data grid is a technology whose emergence has brought new challenges, such as the heterogeneity and geographic distribution of the available resources, fast data access, latency minimisation, and fault tolerance. Researchers interested in this technology address problems shared with industrial systems, such as task scheduling, load balancing, and replication. The latter is an effective way to achieve good performance in terms of data access time, grid resource usage, and data availability at reasonable cost. In a system with replication, a coherence protocol is used to impose some degree of synchronization between the various copies and some order on updates. In this work, we present an approach for placing replicas so as to minimize the cost of responding to read and write requests, and we implement our model in a simulation environment. The placement techniques are based on a cost model which depends on several factors, such as bandwidth, data size, and the storage nodes.
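The shape of such a cost model can be sketched as follows: the cost of serving a request from a candidate site is approximated by transfer time (data size over bandwidth) plus a storage-load penalty, and the replica goes to the cheapest site. The factor names and the weight are assumptions for illustration, not the paper's actual model.

```python
# Hedged sketch of a replica-placement cost model for a data grid.
def response_cost(data_size_mb, bandwidth_mbps, storage_load,
                  storage_weight=0.1):
    """Modelled cost (seconds) of serving one request from a site."""
    transfer_time = data_size_mb * 8 / bandwidth_mbps  # Mb over Mb/s
    return transfer_time + storage_weight * storage_load

def best_replica_site(data_size_mb, sites):
    """Pick the candidate site minimising the modelled response cost."""
    return min(sites, key=lambda s: response_cost(
        data_size_mb, s["bandwidth_mbps"], s["storage_load"]))

sites = [
    {"name": "A", "bandwidth_mbps": 100, "storage_load": 0.9},
    {"name": "B", "bandwidth_mbps": 50,  "storage_load": 0.2},
]
print(best_replica_site(500, sites)["name"])
```

In a simulation environment, the same function would be evaluated per request against every node holding (or able to hold) a replica, with read and write requests weighted differently.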

Keywords: response time, query, consistency, bandwidth, storage capacity, CERN

Procedia PDF Downloads 266
26605 Dynamic Modeling of Advanced Wastewater Treatment Plants Using BioWin

Authors: Komal Rathore, Aydin Sunol, Gita Iranipour, Luke Mulford

Abstract:

Advanced wastewater treatment plants have complex biological kinetics, time-variant influent flow rates, and long processing times. These factors complicate the modeling and operational control of advanced wastewater treatment plants. However, developing a robust model for such plants has become necessary in order to increase plant efficiency, reduce energy costs, and meet the discharge limits set by the government. A dynamic model was designed for several wastewater treatment plants in Hillsborough County, Florida, using BioWin, a platform from EnviroSim (Canada). Control strategies for parameters such as mixed liquor suspended solids, recycle activated sludge, and waste activated sludge were developed so that the models matched plant performance. The models were tuned using both influent and effluent data from the plants and their laboratories. The plant SCADA systems were used to obtain the influent wastewater rates and concentration profiles as a function of time, and the kinetic parameters were tuned based on sensitivity analysis and trial-and-error methods. The dynamic models were validated using experimental data for influent and effluent parameters, and dissolved oxygen measurements were taken to validate the models by coupling them with Computational Fluid Dynamics (CFD) models. The BioWin models were able to closely mimic plant performance and predict effluent behavior over extended periods. The models are useful for plant engineers and operators, who can make decisions beforehand by predicting plant performance with the BioWin models. One important finding from the model was the effect of the recycle and wastage ratios on the mixed liquor suspended solids. The model was also useful in determining the significant kinetic parameters for biological wastewater treatment systems.

Keywords: BioWin, kinetic modeling, flowsheet simulation, dynamic modeling

Procedia PDF Downloads 148
26604 Prompt Design for Code Generation in Data Analysis Using Large Language Models

Authors: Lu Song Ma Li Zhi

Abstract:

With the rapid advancement of artificial intelligence technology, large language models (LLMs) have become a milestone in the field of natural language processing, demonstrating remarkable capabilities in semantic understanding, intelligent question answering, and text generation. These models are gradually penetrating various industries, particularly showcasing significant application potential in the data analysis domain. However, retraining or fine-tuning these models requires substantial computational resources and ample downstream task datasets, which poses a significant challenge for many enterprises and research institutions. Without modifying the internal parameters of the large models, prompt engineering techniques can rapidly adapt these models to new domains. This paper proposes a prompt design strategy aimed at leveraging the capabilities of large language models to automate the generation of data analysis code. By carefully designing prompts, data analysis requirements can be described in natural language, which the large language model can then understand and convert into executable data analysis code, thereby greatly enhancing the efficiency and convenience of data analysis. This strategy not only lowers the threshold for using large models but also significantly improves the accuracy and efficiency of data analysis. Our approach includes requirements for the precision of natural language descriptions, coverage of diverse data analysis needs, and mechanisms for immediate feedback and adjustment. Experimental results show that with this prompt design strategy, large language models perform exceptionally well in multiple data analysis tasks, generating high-quality code and significantly shortening the data analysis cycle. This method provides an efficient and convenient tool for the data analysis field and demonstrates the enormous potential of large language models in practical applications.
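The prompt-design strategy described above can be illustrated with a small template: a natural-language analysis request is wrapped with explicit constraints so that the model returns executable code rather than prose. The template text, field names, and constraints below are assumptions for illustration, not the paper's actual prompts.

```python
# Hypothetical sketch of a prompt template for LLM-driven
# data analysis code generation.
PROMPT_TEMPLATE = """You are a data analysis assistant.
Dataset columns: {columns}
Task (natural language): {request}
Constraints:
- Return only runnable Python using pandas, no explanations.
- Name the final result variable `result`.
"""

def build_prompt(columns, request):
    """Fill the template with a concrete dataset description and task."""
    return PROMPT_TEMPLATE.format(columns=", ".join(columns),
                                  request=request)

prompt = build_prompt(["date", "region", "sales"],
                      "Compute total sales per region, sorted descending.")
print(prompt)
```

The "immediate feedback and adjustment" mechanism the abstract mentions would then execute the returned code, catch errors, and re-prompt with the error message appended.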

Keywords: large language models, prompt design, data analysis, code generation

Procedia PDF Downloads 23
26603 Comparison of Different Methods to Produce Fuzzy Tolerance Relations for Rainfall Data Classification in the Region of Central Greece

Authors: N. Samarinas, C. Evangelides, C. Vrekos

Abstract:

The aim of this paper is to compare three different methods of producing fuzzy tolerance relations for rainfall data classification: the correlation coefficient, cosine amplitude, and max-min methods. The data were obtained from seven rainfall stations in the region of central Greece and consist of 20-year time series of average monthly rainfall height. Each of the three methods was used to express these data as a fuzzy relation, and the resulting fuzzy tolerance relation was then transformed into an equivalence relation by max-min composition. From the equivalence relation, the rainfall stations were categorized and classified according to the degree of confidence. The classification shows the similarities among the rainfall stations: stations with high similarity can be used interchangeably in water resource management scenarios, or to augment data from one another. Given the complexity of the calculations, it is important to determine which of the methods is computationally simpler and needs fewer compositions in order to give reliable results.
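The cosine-amplitude construction and the max-min composition step can be sketched numerically. The three "stations" and four monthly values below are invented toy data, not the Greek series; the composition is iterated until the relation stops changing, i.e. until it is transitive and hence an equivalence relation.

```python
# Sketch: cosine-amplitude fuzzy tolerance relation + max-min closure.
import numpy as np

X = np.array([[55.0, 40.0, 12.0, 3.0],    # station 1, monthly means (toy)
              [60.0, 42.0, 15.0, 4.0],    # station 2
              [20.0, 70.0, 50.0, 30.0]])  # station 3

n = len(X)
# Cosine amplitude: r_ij = |x_i . x_j| / sqrt(|x_i|^2 |x_j|^2)
R = np.array([[abs(X[i] @ X[j]) / np.sqrt((X[i] @ X[i]) * (X[j] @ X[j]))
               for j in range(n)] for i in range(n)])

def max_min(A, B):
    """Max-min composition: C[i,j] = max_k min(A[i,k], B[k,j])."""
    return np.array([[np.max(np.minimum(A[i], B[:, j]))
                      for j in range(B.shape[1])] for i in range(A.shape[0])])

# Compose R with itself until it stabilises (transitive closure)
T = R.copy()
while True:
    T2 = max_min(T, T)
    if np.allclose(T2, T):
        break
    T = T2
print(np.round(T, 3))  # equivalence relation; lambda-cuts give the classes
```

Classifying by "degree of confidence" then amounts to taking lambda-cuts of T: stations i, j fall in the same class at level lambda when T[i, j] >= lambda.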

Keywords: classification, fuzzy logic, tolerance relations, rainfall data

Procedia PDF Downloads 311
26602 The Effects of Consumer Inertia and Emotions on New Technology Acceptance

Authors: Chyi Jaw

Abstract:

Prior literature on innovation diffusion and acceptance has almost exclusively concentrated on consumers’ positive attitudes and behaviors toward new products and services. Consumers’ negative attitudes or behaviors toward innovations have received relatively little marketing attention, yet they occur frequently in practice. This study examines the consumer psychological factors at play when consumers try to learn or use new technologies. According to recent research, technological innovation acceptance should be considered a dynamic, mediated process. This research argues that consumers can experience inertia and emotions in the initial use of new technologies. Given such consumer psychology, the question arises whether including consumer inertia (routine seeking and cognitive rigidity) and emotions increases the predictive power of a new technology acceptance model. The data from the empirical study indicate that the process can change consumer emotions (independently of performance benefits) because of technology complexity and consumer inertia, and that these factors significantly impact innovative technology use. Finally, the study demonstrates the superior predictive power of the hypothesized model, which lets managers better predict and influence the successful diffusion of complex technological innovations.

Keywords: cognitive rigidity, consumer emotions, new technology acceptance, routine seeking, technology complexity

Procedia PDF Downloads 289
26601 Reinforcement Learning for Robust Missile Autopilot Design: TRPO Enhanced by Schedule Experience Replay

Authors: Bernardo Cortez, Florian Peter, Thomas Lausenhammer, Paulo Oliveira

Abstract:

Designing missile autopilot controllers has been a complex task, given the extensive flight envelope and the nonlinear flight dynamics, and a solution that excels both in nominal performance and in robustness to uncertainties is still to be found. While Control Theory often resorts to parameter-scheduling procedures, Reinforcement Learning has presented interesting results in ever more complex tasks, from video games to robotic tasks with continuous action domains; however, it still lacks clear insights on how to find adequate reward functions and exploration strategies. To the best of our knowledge, this work is a pioneer in proposing Reinforcement Learning as a framework for flight control. It aims at training a model-free agent that can control the longitudinal nonlinear flight dynamics of a missile, achieving the target performance and robustness to uncertainties. To that end, under TRPO’s methodology, the collected experience is augmented according to HER (Hindsight Experience Replay), stored in a replay buffer, and sampled according to its significance. Not only does this work enhance the concept of prioritized experience replay into BPER, but it also reformulates HER, activating both only when the training progress converges to suboptimal policies, in what is proposed as the SER methodology. The results show that it is possible both to achieve the target performance and to improve the agent’s robustness to uncertainties (with little damage to nominal performance) by further training it in non-nominal environments, thereby validating the proposed approach and encouraging future research in this field.
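The "sampled according to its significance" step rests on prioritized experience replay, which can be sketched as follows. The priority definition here (absolute TD-error plus a small constant) is the standard one and an assumption with respect to the paper's exact BPER/SER scheme.

```python
# Minimal sketch of priority-based sampling from a replay buffer.
import random

class PrioritizedReplayBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data, self.priorities = [], []

    def add(self, transition, td_error):
        """Store a transition with priority proportional to its TD-error."""
        if len(self.data) >= self.capacity:
            self.data.pop(0)
            self.priorities.pop(0)
        self.data.append(transition)
        self.priorities.append(abs(td_error) + 1e-6)  # never exactly zero

    def sample(self, batch_size):
        """Draw transitions with probability proportional to priority."""
        return random.choices(self.data, weights=self.priorities,
                              k=batch_size)

buf = PrioritizedReplayBuffer(capacity=100)
for i in range(10):
    buf.add((f"state{i}", "action", 0.0), td_error=float(i))
batch = buf.sample(4)
```

HER would additionally relabel stored transitions with achieved goals before they enter the buffer, so that near-misses still provide a learning signal.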

Keywords: reinforcement learning, flight control, HER, missile autopilot, TRPO

Procedia PDF Downloads 258
26600 Insights Into Serotonin-Receptor Binding and Stability via Molecular Dynamics Simulations: Key Residues for Electrostatic Interactions and Signal Transduction

Authors: Arunima Verma, Padmabati Mondal

Abstract:

Serotonin-receptor binding plays a key role in several neurological and biological processes, including mood, sleep, hunger, cognition, learning, and memory. In this article, we performed molecular dynamics simulations to examine the key residues that play an essential role in the binding of serotonin to the G-protein-coupled 5-HT₁ᴮ receptor (5-HT₁ᴮR) via electrostatic interactions. An end-point free energy calculation method (MM-PBSA) determines the stability of the 5-HT₁ᴮR due to serotonin binding. Single-point mutation of the polar or charged amino acid residues (Asp129, Thr134) at the binding site and calculation of the binding free energy validate the importance of these residues in the stability of the serotonin-receptor complex. Principal component analysis indicates that the serotonin-bound 5-HT₁ᴮR is more stabilized than the apo-receptor in terms of dynamical changes. The difference dynamic cross-correlation map shows the correlation between the transmembrane domain and mini-Go, which indicates that signal transduction happens between mini-Go and the receptor. Allosteric communication analysis reveals the key nodes for signal transduction in 5-HT₁ᴮR. These results provide useful insights into the signal transduction pathways and into mutagenesis studies to regulate the functionality of the complex. The developed protocols can be applied to study local non-covalent interactions and long-range allosteric communications in any protein-ligand system for computer-aided drug design.

Keywords: allostery, CADD, MD simulations, MM-PBSA

Procedia PDF Downloads 80
26599 Worldwide GIS Based Earthquake Information System/Alarming System for Microzonation/Liquefaction and Its Application for Infrastructure Development

Authors: Rajinder Kumar Gupta, Rajni Kant Agrawal, Jaganniwas

Abstract:

One of the most frightening phenomena of nature is the occurrence of earthquakes, as they have terrible and disastrous effects. Many earthquakes occur every day worldwide, and there is a need for knowledge of the trends in earthquake occurrence worldwide. The recording and interpretation of data obtained from the establishment of a worldwide network of seismological stations has made this possible. From the analysis of recorded earthquake data, the earthquake parameters and source parameters can be computed and earthquake catalogues can be prepared. These catalogues provide information on origin time, epicentre locations (in terms of latitude and longitude), focal depths, magnitudes, and other related details of the recorded earthquakes, and they are used for seismic hazard estimation. Manual interpretation and analysis of these data is tedious and time-consuming. A geographical information system (GIS) is a computer-based system designed to store, analyze, and display geographic information. The implementation of integrated GIS technology provides an approach which permits rapid evaluation of complex inventory databases under a variety of earthquake scenarios and allows the user to interactively view results almost immediately. GIS technology provides a powerful tool for displaying outputs and permits users to see the graphical distribution of the impacts of different earthquake scenarios and assumptions. An endeavour has been made in the present study to compile the earthquake data for the whole world in Visual Basic on the ArcGIS platform so that they can be used easily for further analysis by earthquake engineers. The basic data on the time of occurrence, location, and size of earthquakes have been compiled for querying based on various parameters, and a preliminary analysis tool is provided in the user interface to interpret earthquake recurrence in a region.
The user interface also includes the seismic hazard information already worked out under the GSHAP program: the seismic hazard, in terms of the probability of exceedance for given return periods, is provided for the whole world. The seismic zones of the Indian region from IS 1893-2002, the code on earthquake-resistant design of buildings, are also included in the user interface. City-wise satellite images have been inserted in the map, and based on actual data the following information can be extracted in real time:
• Analysis of soil parameters and their effects
• Microzonation information
• Seismic hazard and strong ground motion
• Soil liquefaction and its effects on the surrounding area
• Impacts of liquefaction on buildings and infrastructure
• Occurrence of future earthquakes and their effect on existing soil
• Propagation of ground vibration due to the occurrence of an earthquake
The GIS-based earthquake information system has been prepared for the whole world in Visual Basic on the ArcGIS platform and further extended to the micro level based on actual soil parameters. Individual tools have been developed for liquefaction, earthquake frequency, etc. All of this information can be used in real time for present and future infrastructure development, i.e. multi-storey structures, irrigation dams and their components, hydropower plants, etc.

Keywords: GIS based earthquake information system, microzonation, analysis and real time information about liquefaction, infrastructure development

Procedia PDF Downloads 313
26598 Methodology of Automation and Supervisory Control and Data Acquisition for Restructuring Industrial Systems

Authors: Lakhoua Najeh

Abstract:

Introduction: In most situations, an existing industrial system, conditioned by its history, its culture, and its context, faces difficulty in restructuring itself within an organizational and technological environment in perpetual evolution. This is why all restructuring operations first of all require a diagnosis based on a functional analysis. After a presentation of the functionality of a supervisory system for complex processes, we present the concepts of industrial automation and supervisory control and data acquisition (SCADA). Methods: This global analysis exploits the various available documents on the one hand and, on the other hand, takes into consideration the various testimonies gathered through investigations, interviews, and collective workshops; it is also based on observations made during visits and even during specific operations. The exploitation of this diagnosis then enables us to elaborate the restructuring project. Starting from systems analysis for the restructuring of industrial systems, and after a technical diagnosis based on visits, an analysis of the various technical and management documents, and targeted interviews, the various levels of analysis were detailed according to a general methodology. Results: The methodology adopted to contribute to the restructuring of industrial systems, owing to its participative and systemic character and drawing on a broad consultation of both human and documentary resources, has led to the proposal of various innovative actions. These actions fall within a TQM (Total Quality Management) approach, requiring the quantification of applicable parameters and processing that makes the most of the available information. The new management environment will enable us to institute an information and communication system, with the possibility of migration toward an ERP system.
Conclusion: Technological advancements in process monitoring, control, and industrial automation over the past decades have contributed greatly to improving the productivity of virtually all industrial systems throughout the world. This paper attempts to identify the principal characteristics of process monitoring, control, and industrial automation in order to provide tools that help in the decision-making process.

Keywords: automation, supervision, SCADA, TQM

Procedia PDF Downloads 168
26597 Customer Satisfaction and Effective HRM Policies: Customer and Employee Satisfaction

Authors: S. Anastasiou, C. Nathanailides

Abstract:

The purpose of this study is to examine the possible link between employee and customer satisfaction. The service provided by employees helps to build a good relationship with customers and can increase customer loyalty. Published data on job satisfaction and indicators of customer service were gathered from relevant published works, which included data from five different countries. The reviewed data indicate a significant correlation between indicators of customer and employee satisfaction in the banking sector (Pearson correlation, R²=0.52, P<0.05), providing practical evidence of a link between these two parameters.

Keywords: job satisfaction, job performance, customer service, banks, human resources management

Procedia PDF Downloads 317
26596 Terrorist Financing through Illegal Fintech Hacking: Case Study of Rizki Gunawan

Authors: Ishna Indika Jusi, Rifana Meika

Abstract:

Terrorism financing methods in Indonesia are developing at an alarming rate, to the point that they are now more complex than before. Terrorists traditionally use conventional methods such as robberies, charities, and courier services to fund their activities; today, terrorists are able to utilize modern financing methods due to the rapid development of financial technology, one example being the hacking of an illegal fintech company. This research was therefore conducted in order to explain and analyze the considerations behind the use of an illegal fintech company to finance terrorist activities, and how to prevent it. The analysis in this research applies Michael Freeman’s theory of how terrorists choose their financing methods. The method used is a case study, namely the 2011 hacking of speedline.com by Rizki Gunawan to finance terrorism. Research data were acquired from interviews with the perpetrators, experts from INTRAC (PPATK), and Special Detachment 88, as well as from reports and journals relevant to the research. This study found that the priority aspects in terrorist financing are security, quantity, and simplicity of obtaining funds.

Keywords: Fintech, illegal, Indonesia, technology, terrorism financing

Procedia PDF Downloads 167
26595 Evaluation of Australian Open Banking Regulation: Balancing Customer Data Privacy and Innovation

Authors: Suman Podder

Abstract:

As Australian ‘Open Banking’ allows customers to share their financial data with accredited Third-Party Providers (‘TPPs’), it is necessary to evaluate whether the regulators have achieved a balance between protecting customer data privacy and promoting data-related innovation. Recognising the need to increase customers’ influence over their own data, and the benefits of data-related innovation, the Australian Government introduced the ‘Consumer Data Right’ (‘CDR’) to the banking sector through Open Banking regulation. Under Open Banking, TPPs can access customers’ banking data, which allows the TPPs to tailor their products and services to meet customer needs at a more competitive price. This facilitated access and use of customer data will promote innovation by providing opportunities for new products and business models to emerge and grow. However, the success of Open Banking depends on the willingness of customers to share their data, so the regulators have augmented the protection of data by introducing new privacy safeguards to instill confidence and trust in the system. The dilemma in policymaking is that, on the one hand, lenient data privacy laws will help the flow of information, but at the risk of individuals’ loss of privacy; on the other hand, stringent laws that adequately protect privacy may dissuade innovation. Using theoretical and doctrinal methods, this paper examines whether the privacy safeguards under Open Banking will add to the compliance burden of the participating financial institutions, resulting in the undesirable effect of stifling other policy objectives such as innovation. The contribution of this research is three-fold. In the emerging field of customer data sharing, this research is one of the few academic studies on the objectives and impact of Open Banking in the Australian context.
Additionally, Open Banking is still in the early stages of implementation, so this research traces the evolution of Open Banking through policy debates regarding the desirability of customer data-sharing. Finally, the research focuses not only on the customers’ data privacy and juxtaposes it with another important objective of promoting innovation, but it also highlights the critical issues facing the data-sharing regime. This paper argues that while it is challenging to develop a regulatory framework for protecting data privacy without impeding innovation and jeopardising yet unknown opportunities, data privacy and innovation promote different aspects of customer welfare. This paper concludes that if a regulation is appropriately designed and implemented, the benefits of data-sharing will outweigh the cost of compliance with the CDR.

Keywords: consumer data right, innovation, open banking, privacy safeguards

Procedia PDF Downloads 138
26594 Generation of Automated Alarms for Plantwide Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

Early detection of incipient abnormal operation is essential in plant-wide process management in order to improve product quality and process safety, and generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on real-time analysis of massive process data. Such big data include measurement noise and unwanted variations unrelated to true process behavior, so these unnecessary patterns are eliminated in a data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that detection speed and accuracy improved significantly, irrespective of the size and location of abnormal events.
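The abstract does not disclose the underlying detection algorithm. As a minimal illustration of how alarm limits learned from fault-free operating data can turn a measurement stream into alarms, the sketch below applies simple Shewhart-style 3-sigma control limits to a synthetic univariate signal; the function names, the deterministic "noise" and the fault magnitude are illustrative assumptions, not the paper's method.

```python
import math
import statistics

def control_limits(training, k=3.0):
    # Learn lower/upper alarm limits from fault-free training data.
    mu = statistics.fmean(training)
    sd = statistics.pstdev(training)
    return mu - k * sd, mu + k * sd

def alarm_indices(signal, limits):
    # Indices where the measurement leaves the normal operating band.
    lo, hi = limits
    return [i for i, x in enumerate(signal) if not lo <= x <= hi]

# Synthetic measurements: bounded deterministic "noise" around a zero setpoint.
noise = lambda i: 0.5 * math.sin(1.7 * i)
training = [noise(i) for i in range(300)]

# Test signal: 100 normal samples, then a +3.0 step fault.
test = [noise(i) for i in range(100)] + [3.0 + noise(i) for i in range(100, 200)]

limits = control_limits(training)
alarms = alarm_indices(test, limits)
print(alarms[0])  # first alarm fires at the fault onset, sample 100
```

A real plant-wide scheme would replace these univariate limits with the paper's multivariate nonlinear statistics, but the alarm-generation step has this general shape.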

Keywords: detection, monitoring, process data, noise

Procedia PDF Downloads 244
26593 Meanings and Concepts of Standardization in Systems Medicine

Authors: Imme Petersen, Wiebke Sick, Regine Kollek

Abstract:

In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expressions, metabolic pathways and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the resulting problems of systematizing, standardizing and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been paid to data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and integrated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate on standardization in science and technology studies (STS), which rather emphasizes the dynamics, contexts and negotiations of standard operating procedures. Based on empirical work on research consortia in Germany that explore the molecular profiles of diseases to establish systems medical approaches in the clinic, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and what consequences for knowledge production (e.g., modeling) arise from them. Hence, different concepts and meanings of standardization are explored to gain deeper insight into standard operating procedures, not only in systems medicine but also beyond.

Keywords: data, science and technology studies (STS), standardization, systems medicine

Procedia PDF Downloads 334
26592 Ambulatory Care Utilization of Individuals with Cerebral Palsy in Taiwan: A Country with Universal Coverage and No Gatekeeper Regulation

Authors: Ming-Juei Chang, Hui-Ing Ma, Tsung-Hsueh Lu

Abstract:

Introduction: Because of advances in medical care (e.g., ventilation techniques and gastrostomy feeding), more and more children with CP live to adulthood. However, little is known about the use of health care services by individuals with CP from childhood to adulthood. Patterns of ambulatory care utilization are heavily influenced by insurance coverage and primary care gatekeeper regulation. The purpose of this study was to examine patterns of ambulatory care utilization among individuals with CP in Taiwan, a country with universal coverage and no gatekeeper regulation. Methods: A representative sample of one million patients (about 1/23 of the total population) covered by Taiwan’s National Health Insurance was used to analyze ambulatory care utilization by individuals with CP. Data were analyzed for 3 age groups (children, youth and adults) during 2000 to 2003. Participants were identified by the presence of a CP diagnosis made by pediatricians or physicians of physical and rehabilitation medicine and stated at least three times in claims data. Results: Annual rates of outpatient physician visits were 31,680 for children, 16,492 for youth, and 28,617 for adults with CP (per 1,000 persons). Individuals with CP received over 50% of their outpatient care from hospital outpatient departments. Higher use of specialist physician services was found among children (54.7%) than among the other two age groups (28.4% among youth and 18.8% among adults). Diseases of the respiratory system were the most frequent diagnoses for visits by both children and youth with CP. Diseases of the circulatory system were the main reason (24.3%) that adults with CP visited hospital outpatient departments or clinics. Conclusion: This study showed different patterns of ambulatory care utilization among different age groups. It appears that youth and adults with CP continue to have complex health issues and rely heavily on the health care system. Additional studies are needed to determine the factors that influence ambulatory care utilization among individuals with CP.

Keywords: cerebral palsy, health services, lifespan, universal coverage

Procedia PDF Downloads 373
26591 Integrated On-Board Diagnostic-II and Direct Controller Area Network Access for Vehicle Monitoring System

Authors: Kavian Khosravinia, Mohd Khair Hassan, Ribhan Zafira Abdul Rahman, Syed Abdul Rahman Al-Haddad

Abstract:

The CAN (controller area network) bus is a multi-master, message-broadcast system. Messages sent on the CAN bus communicate state information, referred to as signals, between different ECUs, which provides data consistency in every node of the system. OBD-II dongles based on the request-and-response method are the widespread solution among researchers for extracting sensor data from cars. Unfortunately, most past research does not consider the resolution and quantity of the input data extracted through OBD-II technology. The maximum feasible scan rate is only 9 queries per second, which provides 8 data points per second with the well-known ELM327 OBD-II dongle. This study aims to develop and design a programmable, latency-sensitive vehicle data acquisition system that improves modularity and flexibility to extract exact, trustworthy, and fresh car sensor data at higher frequency rates. Furthermore, the researcher must break apart, thoroughly inspect, and observe the internal network of the vehicle, which may cause severe damage to the expensive ECUs of the vehicle due to intrinsic vulnerabilities of the CAN bus during initial research. Desired sensor data were collected from various vehicles utilizing a Raspberry Pi 3 as the computing and processing unit, using the OBD (request-response) and direct CAN methods at the same time. Two types of data were collected for this study: first, CAN bus frame data, collected for each line of hex data sent from an ECU, and second, OBD data, representing the limited data that can be requested from an ECU under standard conditions. The proposed system is a reconfigurable, human-readable and multi-task telematics device that can be fitted into any vehicle with minimum effort and minimum time lag in the data extraction process. A standard-operating-procedure experimental vehicle network test bench was developed and can be used for future vehicle network testing experiments.
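The paper's acquisition software is not published, so the stdlib-only sketch below only illustrates the kind of decoding involved: parsing a candump-style CAN log line and extracting the standard OBD-II mode-01 PID 0x0D (vehicle speed) from a response frame. The log-line format and the 0x7E8 response ID are illustrative assumptions.

```python
def parse_candump(line):
    # Split a candump-style record such as '7E8#03410D32' into the
    # arbitration ID and the raw data bytes.
    ident, _, payload = line.partition('#')
    return int(ident, 16), bytes.fromhex(payload)

def decode_obd_speed(data):
    # OBD-II mode-01 response for PID 0x0D (vehicle speed):
    # single frame [length, 0x41, 0x0D, speed_in_km_h].
    if len(data) >= 4 and data[1] == 0x41 and data[2] == 0x0D:
        return data[3]  # km/h, one byte, no scaling
    return None

arb_id, data = parse_candump('7E8#03410D32')
print(hex(arb_id), decode_obd_speed(data))  # 0x7e8 50  (0x32 = 50 km/h)
```

Direct CAN access sees every such frame as it appears on the bus, whereas the OBD request-response path must first issue a mode-01 query and wait for this reply, which is the rate limitation the abstract describes.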

Keywords: CAN bus, OBD-II, vehicle data acquisition, connected cars, telemetry, Raspberry Pi3

Procedia PDF Downloads 195
26590 Big Data in Construction Project Management: The Colombian Northeast Case

Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez

Abstract:

In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics, and indicator results have made their collection, analysis, traceability, and dissemination essential for project managers. In this sense, there are current trends to facilitate efficient decision-making through emerging technologies such as Machine Learning, Data Analytics, Data Mining, and Big Data; the latter is of most interest in this project. This research is part of the thematic line Construction methods and project management. Many authors note the relevance that the use of emerging technologies such as Big Data has gained in recent years in project management in the construction sector, with the main focus being the optimization of time, scope, and budget and, in general, the mitigation of risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (Big Data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of Big Data and other emerging technologies is very low, and also that there is interest in modernizing project management. There is evidence of a correlation between interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia.

Keywords: big data, building information modeling, technology, project management

Procedia PDF Downloads 124
26589 Occurrence and Habitat Status of Osmoderma barnabita in Lithuania

Authors: D. Augutis, M. Balalaikins, D. Bastyte, R. Ferenca, A. Gintaras, R. Karpuska, G. Svitra, U. Valainis

Abstract:

The Osmoderma species complex (consisting of Osmoderma eremita, O. barnabita, O. lassallei and O. cristinae) is a group of scarab beetles serving as indicator species in nature conservation. Osmoderma inhabits cavities containing a sufficient volume of wood mould, usually caused by brown rot, in veteran deciduous trees. As species with high demands on habitat quality, they indicate the suitability of the habitat for a number of other specialized saproxylic species. Since the typical habitat needed by Osmoderma and other species associated with hollow veteran trees is rapidly declining, the species complex is protected under various legislation, such as the Bern Convention, the EU Habitats Directive and the Red Lists of many European states. Natura 2000 sites are the main tool for conservation of O. barnabita in Lithuania; currently, 17 Natura 2000 sites are designated for the species, where monitoring is implemented once every 3 years according to the approved methodologies. Despite these monitoring efforts, the species reports provided to the EU according to Article 17 of the Habitats Directive state, at the national level, that the overall assessment of O. barnabita is inadequate and future prospects are poor. Therefore, research on the distribution and habitat status of O. barnabita was launched on the national level in 2016, complemented by preparatory actions of the LIFE OSMODERMA project. The research was implemented in areas equally distributed across the whole territory of Lithuania where O. barnabita was previously not observed, or not observed in the last 10 years. 90 areas, such as habitats of European importance (9070 Fennoscandian wooded pastures, 9180 Tilio-Acerion forests of slopes, screes, and ravines), woodland key habitats (B1 broad-leaved forest, K1 single giant tree) and old manor parks, were chosen for the research after a review of habitat data from the existing national databases. The first part of the field inventory of the habitats was carried out in the autumn and winter seasons of 2016 and 2017, when the relative abundance of O. barnabita was estimated from larval faecal pellets in tree cavities or around the trees. The state of the habitats was evaluated according to the density of suitable and potential trees, the percentage of trees not overshadowed, and the amount of undergrowth. The second part of the field inventory was carried out in the summer with pheromone traps baited with (R)-(+)-γ-decalactone. Results of the research show not only the occurrence and habitat status of O. barnabita, but also help to clarify the species' habitat requirements in Lithuania and to define habitat size, structure and distribution. They also allow comparison of habitat needs between regions in Lithuania, and inside and outside the Natura 2000 areas designated for the species.

Keywords: habitat status, insect conservation, Osmoderma barnabita, veteran trees

Procedia PDF Downloads 135
26588 Studying and Solving Highly Complex Nonlinear Differential Equations Applied in the Engineering Field by the New Analytical Approach AGM

Authors: Mohammadreza Akbari, Sara Akbari, Davood Domiri Ganji, Pooya Solimani, Reza Khalili

Abstract:

In this paper, three complicated nonlinear differential equations (PDEs and ODEs) from the field of engineering, including non-vibration problems, are analyzed and solved completely by a new method that we have named the Akbari-Ganji Method (AGM). As previously published papers show, investigating this kind of equation is a very hard task, and the solutions obtained by other approaches are often not accurate and reliable, an issue that emerges when those solutions are compared with numerical ones. Based on comparisons between the solutions obtained by AGM and by a numerical method (fourth-order Runge-Kutta), AGM can be successfully applied to various differential equations, particularly difficult ones. A summary of the advantages of this method over other approaches is as follows: the results indicate that the approach is very effective and easy to use, so it can be applied to other kinds of nonlinear equations, not only in vibrations but also in other fields of science such as fluid mechanics, solid mechanics and chemical engineering, and a solution with high precision is acquired. Consequently, the process of solving nonlinear equations is very easy and convenient in comparison with other methods. Another important point explored in this paper is that, for trigonometric and exponential terms in a differential equation, AGM needs no Taylor series expansion to enhance the precision of the result.
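AGM itself is not specified here in enough detail to reproduce, but the fourth-order Runge-Kutta baseline the authors compare against is standard. Below is a minimal sketch on a nonlinear ODE with a known closed-form solution (y' = -y², y(0) = 1, so y(t) = 1/(1+t)); the step size and test equation are illustrative choices.

```python
def rk4_step(f, t, y, h):
    # One classical fourth-order Runge-Kutta step for y' = f(t, y).
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Nonlinear test problem with exact solution y(t) = 1 / (1 + t).
f = lambda t, y: -y * y
n, h = 100, 0.01
y = 1.0
for i in range(n):
    y = rk4_step(f, i * h, y, h)
print(abs(y - 0.5))  # global error at t = 1, far below 1e-6
```

Comparisons of the kind reported in the abstract amount to running such a numerical baseline alongside the analytical AGM solution and checking the discrepancy.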

Keywords: new method (AGM), complex non-linear partial differential equations, damping ratio, energy lost per cycle

Procedia PDF Downloads 461
26587 Stereotypical Motor Movement Recognition Using Microsoft Kinect with Artificial Neural Network

Authors: M. Jazouli, S. Elhoufi, A. Majda, A. Zarghili, R. Aalouane

Abstract:

Autism spectrum disorder is a complex developmental disability defined by a certain set of behaviors. Persons with Autism Spectrum Disorders (ASD) frequently engage in stereotyped and repetitive motor movements. The objective of this article is to propose a method to automatically detect this unusual behavior. Our study provides a clinical tool that facilitates the diagnosis of ASD for doctors. We focus on the automatic identification, in real time, of five repetitive gestures among autistic children: body rocking, hand flapping, fingers flapping, hand on the face and hands behind back. In this paper, we present a gesture recognition system for children with autism, which consists of three modules: model-based movement tracking, feature extraction, and gesture recognition using an artificial neural network (ANN). The first module uses the Microsoft Kinect sensor, the second chooses points of interest from the 3D skeleton to characterize the gestures, and the last proposes a neural connectionist model to perform the supervised classification of the data. The experimental results show that our system can achieve a recognition rate above 93.3%.
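The ANN architecture is not detailed in the abstract. As a stand-in, the sketch below trains the simplest neural classifier, a single-layer perceptron, on hypothetical two-dimensional skeleton features (e.g. normalized hand-to-head and hand-to-hand distances); the features, clusters and labels are invented for illustration and are not the paper's data.

```python
import random

def predict(w, x):
    # w[-1] is the bias term.
    s = w[-1] + sum(wj * xj for wj, xj in zip(w, x))
    return 1 if s > 0 else 0

def train_perceptron(samples, labels, epochs=50, lr=0.1):
    # Classic perceptron rule: nudge weights by the prediction error.
    w = [0.0] * (len(samples[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            err = y - predict(w, x)
            for j, xj in enumerate(x):
                w[j] += lr * err * xj
            w[-1] += lr * err
    return w

random.seed(1)
# Two well-separated clusters of hypothetical gesture feature vectors.
flapping = [[0.8 + random.uniform(-0.1, 0.1), 0.2 + random.uniform(-0.1, 0.1)]
            for _ in range(20)]
rocking = [[0.2 + random.uniform(-0.1, 0.1), 0.8 + random.uniform(-0.1, 0.1)]
           for _ in range(20)]
X, y = flapping + rocking, [1] * 20 + [0] * 20
w = train_perceptron(X, y)
# Separable data: the perceptron converges to perfect training accuracy.
accuracy = sum(predict(w, x) == t for x, t in zip(X, y)) / len(y)
print(accuracy)  # 1.0
```

A multi-layer network, as implied by the paper, would replace the single threshold unit with hidden layers, but the supervised-classification step follows the same train-then-predict shape.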

Keywords: ASD, artificial neural network, kinect, stereotypical motor movements

Procedia PDF Downloads 302
26586 Modelling Ibuprofen with Human Albumin

Authors: U. L. Fulco, E. L. Albuquerque, José X. Lima Neto, L. R. Da Silva

Abstract:

The binding of the nonsteroidal anti-inflammatory drug ibuprofen (IBU) to human serum albumin (HSA) is investigated using density functional theory (DFT) calculations within a fragmentation strategy. Crystallographic data for the IBU–HSA supramolecular complex show that the ligand is confined to a large cavity at the subdomain IIIA and at the interface between the subdomains IIA and IIB, whose binding sites are FA3/FA4 and FA6, respectively. The interaction energy between the IBU molecule and each amino acid residue of these HSA binding pockets was calculated using the Molecular Fractionation with Conjugate Caps (MFCC) approach employing a dispersion-corrected exchange–correlation functional. Our investigation shows that the total interaction energy of IBU bound to HSA at the binding sites of the fatty acids FA3/FA4 (FA6) converges only for a pocket radius of at least 8.5 Å, mainly due to the action of residues Arg410, Lys414 and Ser489 (Lys351, Ser480 and Leu481) and residues in nonhydrophobic domains, namely Ile388, Phe395, Phe403, Leu407, Leu430, Val433, and Leu453 (Phe206, Ala210, Ala213, and Leu327), which is unusual. Our simulations are valuable for a better understanding of the binding mechanism of IBU to albumin and can lead to the rational design and development of novel IBU-derived drugs with improved potency.
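For readers unfamiliar with the fragmentation strategy, the MFCC pairwise energies referred to above have, in the usual formulation of the method, the following shape (a sketch of the standard scheme, not a transcription of the authors' equations); C_{i-1} and C_{i+1} denote the conjugate caps of residue R_i:

```latex
% Interaction energy between the ligand (IBU) and the i-th capped residue
E^{\mathrm{int}}_{i} =
    E\left(\mathrm{IBU} + C_{i-1}R_{i}C_{i+1}\right)
  - E\left(C_{i-1}R_{i}C_{i+1}\right)
  - E\left(\mathrm{IBU}\right)

% Total pocket interaction energy, summed over residues within radius r;
% its convergence with growing r is the quantity analyzed in the study
E^{\mathrm{int}}(r) = \sum_{i\,:\,d(R_{i},\,\mathrm{IBU}) \le r} E^{\mathrm{int}}_{i}
```

The reported 8.5 Å threshold is the radius r at which this sum stops changing appreciably as more residues are included.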

Keywords: ibuprofen, human serum albumin, density functional theory, binding energies

Procedia PDF Downloads 342
26585 Food Traceability for Small and Medium Enterprises Using Blockchain Technology

Authors: Amit Kohli, Pooja Lekhi, Gihan Adel Amin Hafez

Abstract:

Blockchain is a distributed ledger technology that has extended to different fields and proved a remarkable success. Blockchain technology is a vital technique for improving the food supply chain traceability process. Tracing is at the core of the food supply chain, and a traceability system mitigates the exceptional risks of food contamination, foodborne illness, food waste, and food fraud. In addition, the upsurge of variance and variety in food supply chain data requires a completely transparent, secure, steadfast, sustainable, and efficient approach to face the challenges of the food supply chain. On the other hand, the technical aspects of blockchain, together with a detailed implementation plan and its advantages and challenges for food traceability, have not been much elucidated for small and medium enterprises (SMEs). This paper demonstrates the advantages and challenges of applying blockchain in SMEs, combined with success stories of firms implementing blockchain, to cover this gap. Moreover, it presents a blockchain architecture for SMEs and shows how the technology, organization, and environment frameworks can support the success of blockchain implementation.

Keywords: blockchain technology, small and medium enterprises, food traceability, blockchain architecture

Procedia PDF Downloads 181
26584 Achieving Quality of Life and Sustainability in Mexican Cities, the Case of the Housing Complex “Villa del Campo”, Tijuana, Mexico

Authors: María de los Ángeles Zárate López, Juan Antonio Pitones Rubio

Abstract:

Quality of life and sustainability in cities are among the most important challenges faced by designers, city planners and urban managers. The Mexican city of Tijuana has particular population dynamics, accelerated by its border-city condition, which put to the test the ability of the authorities to provide the population with the services necessary to aspire to a decent quality of life. In the recent history of Tijuana, housing policy and the solutions presented by private housing developers have fallen far short of providing the best living conditions for end users, thereby adding to current social problems that impact the whole metropolitan area, including damage to the natural environment. This research therefore presents a case study on the situation of a suburban housing development near Tijuana named “Villa del Campo” and exposes the problems of this specific project (originally labelled as a “sustainable” proposal), demonstrating that, once built, the place does not deliver the quality of life it promised as a project. Currently, this housing development has a number of problematic issues, such as faulty operating conditions of public utilities and serious cases of crime inside the neighborhood. The intention is not only to expose the negative side of this case study, but also to explore alternatives that could help solve its most serious problems, considering possible architectural and landscape interventions within the housing complex to help achieve the conditions of livability and sustainability required by its inhabitants.

Keywords: suburban, housing, quality of life, sustainability, Tijuana, demographics

Procedia PDF Downloads 381
26583 Minimum Data of a Speech Signal as Special Indicators of Identification in Phonoscopy

Authors: Nazaket Gazieva

Abstract:

Voice biometric data associated with physiological, psychological and other factors are widely used in forensic phonoscopy. There are various methods for identifying and verifying a person by voice. This article explores minimum speech signal data as individual parameters of a speech signal. Monozygotic twins are believed to be genetically identical. Using the minimum data of the speech signal, we came to the conclusion that the voice imprint of even monozygotic twins is individual. From the experiment, we conclude that the minimum indicators of the speech signal are more stable and reliable for phonoscopic examinations.

Keywords: phonogram, speech signal, temporal characteristics, fundamental frequency, biometric fingerprints

Procedia PDF Downloads 137
26582 Process Evaluation for a Trienzymatic System

Authors: C. Müller, T. Ortmann, S. Scholl, H. J. Jördening

Abstract:

Multienzymatic catalysis can be used as an alternative to chemical synthesis or hydrolysis of polysaccharides for the production of high-value oligosaccharides from cheap resources such as sucrose. However, the development of multienzymatic processes is complex, especially with respect to conditions suitable for enzymes originating from different organisms. Furthermore, an optimal configuration of the catalysts in a reaction cascade has to be found. These challenges can be approached by design of experiments. The system investigated in this study is a trienzymatically catalyzed reaction that produces laminaribiose from sucrose and comprises covalently immobilized sucrose phosphorylase (SP), glucose isomerase (GI) and laminaribiose phosphorylase (LP). Operational windows determined with design of experiments, together with kinetic data of the enzymes, were used to optimize the enzyme ratio for maximum product formation and minimal formation of byproducts. After adjustment of the enzyme activity ratio to 1:1.74:2.23 (SP:LP:GI), different process options were investigated in silico. The options considered included substrate dependency, the use of glucose as a co-substrate, and substitution of glucose isomerase by glucose addition. Modeling of batch operation in a stirred tank reactor led to yields of 44.4%, whereas operation in a continuous stirred tank reactor resulted in product yields of 22.5%. The maximum yield in a bienzymatic system comprising sucrose phosphorylase and laminaribiose phosphorylase was 67.7%, with sucrose and different amounts of glucose as substrates. The experimental data were in good agreement with the process model for batch operation; continuous operation will be investigated in further studies. Simulation of the process options enabled us to compare various operational modes regarding aspects such as cost efficiency with a minimum of expensive and time-consuming practical experiments. This gives us more flexibility in process implementation and allows us, for example, to change the production goal from laminaribiose to higher oligosaccharides.
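The process model itself is not given in the abstract. As a generic, heavily simplified illustration of the kind of in-silico batch evaluation described, the sketch below Euler-integrates a two-step Michaelis-Menten cascade S → I → P in a batch stirred tank; the kinetic form and every parameter value are invented for illustration and do not represent the actual SP/GI/LP system.

```python
def michaelis_menten(vmax, km, c):
    # Classic saturation kinetics for a single enzymatic step.
    return vmax * c / (km + c)

def simulate_batch(s0, vmax1, km1, vmax2, km2, t_end=10.0, dt=0.001):
    # Explicit-Euler integration of substrate S, intermediate I, product P.
    s, i, p = s0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        v1 = michaelis_menten(vmax1, km1, s)   # S -> I
        v2 = michaelis_menten(vmax2, km2, i)   # I -> P
        s -= v1 * dt
        i += (v1 - v2) * dt
        p += v2 * dt
    return s, i, p

# Hypothetical concentrations (mM) and rates; not fitted to the paper.
s, i, p = simulate_batch(s0=100.0, vmax1=10.0, km1=20.0, vmax2=8.0, km2=15.0)
print(round(100.0 * p / 100.0, 1))  # product yield in percent
```

Screening process options in silico, as the abstract describes, amounts to re-running such a model under different configurations (batch vs. continuous, with or without co-substrate) and comparing the resulting yields before committing to experiments.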

Keywords: design of experiments, enzyme kinetics, multi-enzymatic system, in silico process development

Procedia PDF Downloads 329