Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28307

27257 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups

Authors: Naushad Mamode Khan

Abstract:

The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), as it handles equi-, over-, and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify and thus restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is rather computationally intensive and may even fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments; it is shown to yield estimates as efficient as GQL-I while being far more computationally stable.
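To make the shape of a GQL-type estimator concrete, the minimal sketch below iterates the estimating equation sum_i D_i' S_i^{-1} (y_i - mu_i) = 0 for a log-linear count mean. It assumes a plain Poisson-style mean and a diagonal working covariance; the paper's CMP/INAR(1) covariance structure is not reproduced here, so this is only an illustration of the estimating-equation machinery, not the proposed GQL-III.

```python
import numpy as np

def gql_estimate(X, y, n_iter=50, tol=1e-8):
    """Illustrative GQL-style Newton iteration for mu_i = exp(x_i' beta).
    Uses a diagonal working covariance Var(y_i) ~ mu_i (Poisson-like);
    NOT the paper's CMP/INAR(1) structure."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        D = X * mu[:, None]           # derivative of the mean, d mu / d beta
        S_inv = np.diag(1.0 / mu)     # inverse working covariance
        score = D.T @ S_inv @ (y - mu)           # single score vector
        info = D.T @ S_inv @ D                   # quasi-information matrix
        step = np.linalg.solve(info, score)
        beta += step
        if np.linalg.norm(step) < tol:
            break
    return beta
```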

Keywords: longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL

Procedia PDF Downloads 352
27256 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser

Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett

Abstract:

Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the Hydrogen Economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency. This concerns in particular the two-phase flow through the membrane, gas diffusion layers (GDL) and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility of analysing the processes within a PEMWE in a non-destructive, fast and cheap in-situ way. This work describes the generation and analysis of AE data from a PEM water electrolyser for, to the best of our knowledge, the first time in the literature. Several experiments are carried out, each designed so that only specific physical processes occur and AE related solely to one process can be measured. A range of experimental conditions is therefore used to induce different flow regimes within the flow channels and GDL. The resulting AE data is first separated into events, which are defined by exceeding the noise threshold. Each acoustic event consists of a number of consecutive peaks and ends when the wave diminishes below the noise threshold. For each acoustic event the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, peaks before the maximum, average intensity of a peak, and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal Component Analysis is performed on the resulting data, ordering the criteria by the eigenvalues of their covariance matrix. This provides an easy way of determining which criteria convey the most information about the acoustic data. The data is then ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. The AE data produced in this way can be used to train a self-learning algorithm and to develop an analytical tool for diagnosing different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
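The event segmentation and feature pipeline described above can be sketched as follows. The thresholding rule, feature set, and PCA projection mirror the abstract; everything else (sampling rate handling, peak detection via scipy) is an assumption for illustration only.

```python
import numpy as np
from scipy.signal import find_peaks

def ae_event_features(signal, fs, noise_threshold):
    """Split a raw AE waveform into events (contiguous runs above the
    noise threshold) and extract the six attributes named in the abstract."""
    above = np.abs(signal) > noise_threshold
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    feats = []
    for s, e in zip(starts, ends):
        seg = np.abs(signal[s:e])
        if len(seg) < 2:
            continue
        peaks, _ = find_peaks(seg)
        if len(peaks) == 0:
            peaks = np.array([int(np.argmax(seg))])
        i_max = int(np.argmax(seg))
        feats.append([
            seg.max(),                   # maximum peak amplitude
            (e - s) / fs,                # duration [s]
            len(peaks),                  # number of peaks
            int((peaks < i_max).sum()),  # peaks before the maximum
            seg[peaks].mean(),           # average peak intensity
            i_max / fs,                  # time until the maximum
        ])
    return np.array(feats)

def pca_project(F, n_components=3):
    """Normalize each criterion, then project events onto the leading
    principal axes (eigenvectors of the covariance matrix)."""
    Z = (F - F.mean(axis=0)) / F.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:n_components].T
```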

Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser

Procedia PDF Downloads 151
27255 Data Recording for Remote Monitoring of Autonomous Vehicles

Authors: Rong-Terng Juang

Abstract:

Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars may not arrive in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, self-driving vehicles (e.g., shuttle buses) have to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. However, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. This paper therefore proposes a data compression method for in-vehicle networks to support remote monitoring of autonomous vehicles. First, the time-series data are rearranged into a multi-dimensional signal space; upon arrival, new controller area network (CAN) data are mapped onto a two-dimensional time-data space associated with the specific CAN identity. Second, the data are sampled using differential sampling. Finally, the whole data set is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house-built autonomous vehicle. The test results show that the amount of data can be reduced to as little as one-seventh of the raw data.
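The differential-sampling-plus-entropy-coding chain can be illustrated with a minimal sketch. The delta step and the choice of Huffman coding come from the abstract; the specific CAN channel, values, and code-book construction are assumptions for illustration.

```python
import heapq
from collections import Counter

def delta_encode(samples):
    """Differential sampling: keep the first value, then successive
    differences, which cluster near zero for smooth sensor channels."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def huffman_code(symbols):
    """Build a Huffman code book for a symbol stream (a stand-in for the
    paper's Huffman/arithmetic/codebook encoding stage)."""
    heap = [[w, i, {s: ''}] for i, (s, w) in enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    idx = len(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        book = {s: '0' + c for s, c in lo[2].items()}
        book.update({s: '1' + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], idx, book])
        idx += 1
    return heap[0][2]

# e.g. a smooth (hypothetical) speed channel: deltas are small and repetitive
speeds = [100, 101, 101, 102, 102, 102, 103, 103]
deltas = delta_encode(speeds)
book = huffman_code(deltas)
bits = ''.join(book[d] for d in deltas)   # compressed bitstream
```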

Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), lidar

Procedia PDF Downloads 157
27254 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE

Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao

Abstract:

For impact monitoring of distributed structures, the traditional positioning methods are based on time difference of arrival; they include the four-point arc positioning method and the triangulation positioning method. In actual operation, however, both methods exhibit errors. In this paper, the multi-agent blackboard coordination principle is used to combine the two methods. The fusion proceeds in four steps: (1) the four-point arc locating agent calculates the initial point and records it in the blackboard module; (2) the triangulation agent obtains its initial parameters by accessing this initial point; (3) the triangulation agent repeatedly accesses the blackboard module to update its initial parameters and also logs its calculated point onto the blackboard; (4) when the subsequent calculated point and the initial calculated point agree within the allowable error, the coordination fusion process is finished (a sketch of this loop is given below). This paper thus presents a multi-agent collaboration method whose agent framework is JADE. The JADE platform consists of several agent containers, with agents running in each container. Thanks to JADE's management and debugging tools, it is convenient to deal with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion can reduce the error of the two methods.
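The coordination loop itself is simple enough to sketch. JADE agents are written in Java and exchange ACL messages; the plain-Python fragment below only mimics the blackboard control flow, with arc_locate and triangulate as hypothetical stand-ins for the two positioning agents.

```python
def fuse_on_blackboard(arc_locate, triangulate, tol=1e-3, max_rounds=50):
    """Steps (1)-(4) of the fusion, on a shared dict standing in for the
    blackboard module. arc_locate() -> (x, y); triangulate(init) -> (x, y)."""
    blackboard = {}
    blackboard['point'] = arc_locate()            # (1) initial point
    for _ in range(max_rounds):
        prev = blackboard['point']
        new = triangulate(init=prev)              # (2)-(3) refine via blackboard
        blackboard['point'] = new
        if abs(new[0] - prev[0]) < tol and abs(new[1] - prev[1]) < tol:
            return new                            # (4) within allowable error
    return blackboard['point']
```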

Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE

Procedia PDF Downloads 171
27253 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory

Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan

Abstract:

Data fusion technology can be the best way to extract useful information from multiple sources of data, and it has been widely applied in various applications. This paper presents a multimedia data fusion approach for event detection in Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. Two types of data enter the fusion. The first is features extracted from text using the bag-of-words method, weighted by term frequency-inverse document frequency (TF-IDF). The second is visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments indicate that, compared with approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy for event detection. The experimental results showed that the proposed method achieved a high accuracy of 0.97, compared with 0.93 using text only and 0.86 using images only.
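Dempster's rule of combination, the fusion step named here, is compact enough to show directly. The mass values assigned to the text and image classifiers below are hypothetical; only the combination rule itself is standard.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: m12(A) = sum over B&C=A of m1(B)m2(C), normalised
    by 1-K where K is the total conflicting mass. Focal elements are
    frozensets over the frame of discernment."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    return {a: w / (1.0 - conflict) for a, w in combined.items()}

# hypothetical masses from the text (TF-IDF) and image (SIFT) classifiers
theta = frozenset({'event', 'no_event'})                 # total ignorance
m_text = {frozenset({'event'}): 0.7, theta: 0.3}
m_image = {frozenset({'event'}): 0.6, frozenset({'no_event'}): 0.1, theta: 0.3}
fused = dempster_combine(m_text, m_image)                # fused belief masses
```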

Keywords: data fusion, Dempster-Shafer theory, data mining, event detection

Procedia PDF Downloads 407
27252 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016

Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi

Abstract:

This paper aims to explore major legal issues arising from the collection and processing of Big Health Data in the light of the new European secondary legislation for the protection of the personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or in other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of the personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, with sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers to data subjects’ rights while embracing the opportunities that Big Health Data have to offer. In addition, particular attention is paid to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.

Keywords: big health data, data subject rights, GDPR, pandemic

Procedia PDF Downloads 123
27251 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems

Authors: Yong-Kyu Jung

Abstract:

The fast growth in information technology has led to increasing demands to access and process data. Cyber-physical systems (CPSs) depend heavily on the timing of hardware/software operations and on communication over the network (i.e., real-time/parallel operations in CPSs, e.g., autonomous vehicles). Data processing is an important means of reducing the gap between technological growth on the one hand and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. The ADAC, implemented as an accelerator and/or as apps for servers and smart connected devices, adaptively rescales digital content (by 62.8% on average) and reduces data processing/access time and energy, as well as encryption/decryption overheads, in AI/ML applications (e.g., facial ID/recognition).

Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity

Procedia PDF Downloads 75
27250 Mesoporous Na2Ti3O7 Nanotube-Constructed Materials with Hierarchical Architecture: Synthesis and Properties

Authors: Neumoin Anton Ivanovich, Opra Denis Pavlovich

Abstract:

Materials based on titanium oxide compounds are widely used in areas such as solar energy, photocatalysis, the food industry and hygiene products, biomedical technologies, etc. Demand for them has also emerged in the battery industry (an example is the commercialization of Li4Ti5O12), where much attention has recently been paid to the development of next-generation systems and technologies, such as sodium-ion batteries. This dictates the need to search for new materials with improved characteristics, as well as for preparation routes that meet the requirements of scalability. One way to address these problems is the creation of nanomaterials, which often possess physicochemical properties that differ radically from those of their micro- or macroscopic counterparts. At the same time, it is important to control the texture (specific surface area, porosity) of such materials. In view of the above, the hydrothermal technique seems particularly suitable, as it allows wide control over the synthesis conditions. In the present study, a method was developed for the preparation of mesoporous nanostructured sodium trititanate (Na2Ti3O7) with a hierarchical architecture. The materials were synthesized by hydrothermal processing and exhibit a complex, hierarchically organized two-level architecture: at the first level of the hierarchy, the materials are represented by particles with a rough surface, and at the second level, by one-dimensional nanotubes. The products were found to have a high specific surface area and porosity, with a narrow pore size distribution (about 6 nm). As is well known, specific surface area and porosity are important characteristics of functional materials, which largely determine the possibilities and directions of their practical application. Electrochemical impedance spectroscopy data show that the resulting sodium trititanate has sufficiently high electrical conductivity. The synthesized, complexly organized nanoarchitecture based on porous sodium trititanate is therefore expected to be of practical use, for example, in next-generation electrochemical energy storage and conversion devices.

Keywords: sodium trititanate, hierarchical materials, mesoporosity, nanotubes, hydrothermal synthesis

Procedia PDF Downloads 105
27249 Effects of Medium Composition on the Production of Biomass and a Carbohydrate Isomerase by a Novel Strain of Lactobacillus

Authors: M. Miriam Hernández-Arroyo, Ivonne Caro-Gonzales, Miguel Ángel Plascencia-Espinosa, Sergio R. Trejo-Estrada

Abstract:

A large biodiversity of Lactobacillus strains has been detected in traditional foods and beverages from Mexico. A selected strain, Lactobacillus sp. PODI-20, obtained from an artisanal fermented beverage, was cultivated on different carbon sources in a complex medium in order to determine which carbon source most effectively induced the isomerization of arabinose by cell fractions obtained from the fermentation. Four carbon sources were tested in a medium containing peptone, yeast extract, and mineral salts: glucose, galactose, arabinose, and lactose, each at three concentrations (3.5, 6, and 10% w/v). The biomass yield ranged from 1.72 to 17.6 g/L. The cell pellet was processed by mechanical homogenization, and both fractions, the cellular debris and the lysis supernatant, were tested for their ability to isomerize arabinose into ribulose. The highest isomerization yield, 12%, was obtained in the supernatant fractions, whereas up to 9.3% was obtained using the cell debris. The isomerization of arabinose has great significance for the production of lactic acid by fermentation of complex carbohydrate hydrolysates.

Keywords: isomerase, tagatose, aguamiel, isomerization

Procedia PDF Downloads 341
27248 The Study of Hydro Physical Complex Characteristic of Clay Soil-Ground of Colchis Lowland

Authors: Paata Sitchinava

Abstract:

The water-physical (hydrophysical) behaviour of the heavy clay soils of the Colchis lowland was studied, taking into account their mineralogical composition and specific hydrophysical properties, according to the various categories and forms of pore water; this will serve as the basis for the methods used in engineering practice and for evaluating reclamation effectiveness. Based on the clay-ground data, three research base sections were chosen in the central part of the lowland, where investigation works were carried out under a special program. It was established that the three cuts are broadly identical, and the layers separated on morphological grounds differ in quality. Suitable laboratory experiments were carried out on samples taken from the cuts, and on this basis a classification of physical-technical characteristics was created, which forms the basis for the corresponding hydrophysical calculations.

Keywords: Colchis lowland, drainage, water, soil-ground

Procedia PDF Downloads 177
27247 3D Elasticity Analysis of Laminated Composite Plate Using State Space Method

Authors: Prathmesh Vikas Patil, Yashaswini Lomte Patil

Abstract:

Laminated composite materials have received considerable attention in various engineering applications due to their exceptional strength-to-weight ratio and mechanical properties. The analysis of laminated composite plates in three-dimensional (3D) elasticity is a complex problem, as it requires accounting for the orthotropic, anisotropic nature of the material and the interactions between multiple layers. Conventional approaches, such as classical plate theory, provide simplified solutions but cannot deliver an exact analysis of the plate. To address this challenge, the state space method emerges as a powerful numerical technique for modeling the behavior of laminated composites in 3D. The state space method involves transforming the governing equations of elasticity into a state-space representation, enabling the analysis of complex structural systems in a systematic manner. Here, an effort is made to perform a 3D elasticity analysis of plates with cross-ply and angle-ply laminates using the state space approach. The state space approach is used in this study because it is a mixed formulation technique that yields the displacements and stresses simultaneously, with the same level of accuracy.
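For orientation, the generic shape of such a state-space formulation can be written as follows (the notation is illustrative, not the authors'): the state vector collects the displacements and the transverse stresses, the thickness coordinate z plays the role of "time", and continuity of the state vector across ply interfaces chains the per-layer solutions together.

```latex
\[
\frac{\partial \boldsymbol{\delta}}{\partial z} = \mathbf{A}(z)\,\boldsymbol{\delta},
\qquad
\boldsymbol{\delta} =
\begin{Bmatrix} u & v & w & \sigma_{zz} & \tau_{xz} & \tau_{yz} \end{Bmatrix}^{\mathsf{T}},
\]
\[
\boldsymbol{\delta}(z_k + h_k) = e^{\mathbf{A}_k h_k}\,\boldsymbol{\delta}(z_k),
\qquad
\boldsymbol{\delta}_{\mathrm{top}}
= \Big(\prod_{k=N}^{1} e^{\mathbf{A}_k h_k}\Big)\,\boldsymbol{\delta}_{\mathrm{bottom}},
\]
```

Here each layer k has constant system matrix A_k and thickness h_k, so the through-thickness solution is a product of matrix exponentials, which is what makes the layer interactions tractable without the kinematic assumptions of plate theory.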

Keywords: cross ply laminates, angle ply laminates, state space method, three-dimensional elasticity analysis

Procedia PDF Downloads 100
27246 Algorithm for Predicting Cognitive Exertion and Cognitive Fatigue Using a Portable EEG Headset for Concussion Rehabilitation

Authors: Lou J. Pino, Mark Campbell, Matthew J. Kennedy, Ashleigh C. Kennedy

Abstract:

A concussion is complex and nuanced, and cognitive rest is a key component of recovery. Cognitive overexertion during rehabilitation from a concussion is associated with delayed recovery. However, daily living imposes cognitive demands that may be unavoidable and difficult to quantify. Therefore, a portable tool capable of alerting patients before cognitive overexertion occurs could allow patients to maintain their quality of life while preventing symptoms and recovery setbacks. EEG allows for a sensitive measure of cognitive exertion, but clinical 32-lead EEG headsets are not practical for day-to-day concussion rehabilitation management. There are, however, now commercially available and affordable portable EEG headsets, which can potentially be used to continuously monitor cognitive exertion during mental tasks and alert the wearer to overexertion, with the aim of preventing symptoms and speeding recovery. The objective of this study was to test an algorithm for predicting cognitive exertion from EEG data collected with a portable headset. EEG data were acquired from 10 participants (5 males, 5 females). Each participant wore a portable 4-channel EEG headband while completing 10 tasks: rest (eyes closed), rest (eyes open), three logic puzzles of increasing difficulty, three multiplication tasks of increasing difficulty, rest (eyes open), and rest (eyes closed). After each task, the participant reported their perceived level of cognitive exertion using the NASA Task Load Index (TLX). Each participant then completed a second session on a different day. A customized machine learning model was created for each participant using data from the first session, and the performance of each model was then tested using data from the second session. The mean correlation coefficient between TLX scores and predicted cognitive exertion was 0.75 ± 0.16. The results support the efficacy of the algorithm for predicting cognitive exertion and demonstrate that the algorithms developed in this study, used with portable EEG devices, have the potential to aid the concussion recovery process by monitoring and warning patients of cognitive overexertion. Preventing cognitive overexertion during recovery may reduce the number of symptoms a patient experiences and may help speed the recovery process.
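A personalised train-on-session-1, test-on-session-2 pipeline of this kind could look like the sketch below. The band definitions, sampling rate, and regressor are assumptions (the abstract does not specify them); only the session split and the correlation metric come from the text.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor

BANDS = {'theta': (4, 8), 'alpha': (8, 13), 'beta': (13, 30)}  # assumed bands

def band_powers(eeg, fs):
    """eeg: (n_channels, n_samples) from the 4-channel headband.
    Returns one band-power feature per (band, channel)."""
    f, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    return np.hstack([psd[:, (f >= lo) & (f < hi)].mean(axis=1)
                      for lo, hi in BANDS.values()])

def evaluate_participant(tasks_s1, tlx_s1, tasks_s2, tlx_s2, fs=256):
    """Train a per-participant model on session 1, test on session 2."""
    X1 = np.array([band_powers(t, fs) for t in tasks_s1])
    X2 = np.array([band_powers(t, fs) for t in tasks_s2])
    model = RandomForestRegressor(n_estimators=200).fit(X1, tlx_s1)
    pred = model.predict(X2)
    r, _ = pearsonr(pred, tlx_s2)
    return r   # the paper reports a mean r of 0.75 +/- 0.16 across participants
```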

Keywords: cognitive activity, EEG, machine learning, personalized recovery

Procedia PDF Downloads 216
27245 Use of Simultaneous Electron Backscatter Diffraction and Energy Dispersive X-Ray Spectroscopy Techniques to Characterize High-Temperature Oxides Formed on Nickel-Based Superalloys Exposed to Super-Critical Water Environment

Authors: Mohsen Sanayei, Jerzy Szpunar, Sami Penttilä

Abstract:

Exposure of nickel-based superalloys to high temperatures and harsh environments, such as super-critical water (SCW), leads to the formation of oxide scales composed of multiple, complex phases that are difficult to differentiate with conventional analysis techniques. In this study, we used simultaneous electron backscatter diffraction (EBSD) and energy dispersive X-ray spectroscopy (EDS) to analyze the complex oxide scales formed on several nickel-based superalloys exposed to high-temperature SCW. Multi-layered structures of iron, nickel, chromium, and molybdenum oxides and spinels were clearly identified using the simultaneous EBSD-EDS analysis technique. Furthermore, the orientation relationship between the oxide scales and the substrate was investigated.

Keywords: electron backscatter diffraction, energy dispersive x-ray spectroscopy, superalloy, super-critical water

Procedia PDF Downloads 312
27244 Evaluating Antifungal Potential of Respiratory Inhibitors against Phyto-Pathogenic Fungi

Authors: Sehrish Iftikhar, Ahmad Ali Shahid, Kiran Nawaz, Waheed Anwar

Abstract:

The discovery and development of new compounds require intense studies in chemistry and biochemistry; numerous experiments under laboratory, greenhouse, and field conditions must be performed to select suitable candidates and to understand their full potential. Novel fungicides are fundamental to combating plant diseases, and Fusarium solani is an important plant pathogen. In this study, new broad-spectrum foliar fungicides against complex II were designed. Complex II, namely succinate dehydrogenase (SDH) or succinate quinone oxidoreductase (SQR), is a multi-subunit enzyme at the crossroads of the TCA cycle and the electron transport chain at the inner mitochondrial membrane. The need for new and innovative fungicides is driven by resistance management, regulatory hurdles, and increasing customer expectations, among other factors. Fungicidal activity was assessed via the effect on mycelial growth and spore germination of the fungi, using fungicide-amended media assays. In the mycelial growth assay, compounds C10 and C6 were highly active against all the isolates, while compounds C1 and C10 were the most potent in the spore germination test. This confirmed that the SDHIs designed in this study display good inhibitory effects against Fusarium solani.

Keywords: wilt, Fusarium, SDH, antifungal

Procedia PDF Downloads 254
27243 Efficient Semi-Systolic Finite Field Multiplier Using Redundant Basis

Authors: Hyun-Ho Lee, Kee-Won Kim

Abstract:

Arithmetic operations over GF(2^m) have been used extensively in error-correcting codes and public-key cryptography schemes. Finite field arithmetic includes addition, multiplication, division, and inversion. Addition is very simple and can be implemented with an extremely simple circuit; the other operations are much more complex. Multiplication is the most important operation for cryptosystems such as the elliptic curve cryptosystem, since exponentiation, division, and multiplicative inversion can all be performed by iterated multiplication. In this paper, we present a parallel computation algorithm that performs Montgomery multiplication over a finite field using a redundant basis. Based on this multiplication algorithm, we also present an efficient semi-systolic multiplier over the finite field. The multiplier has lower space and time complexity than related multipliers: compared with the corresponding existing structures, it saves at least 5% area, 50% time, and 53% area-time (AT) complexity. Accordingly, it is well suited to VLSI implementation and can easily be applied as a basic component for computing complex operations over the finite field, such as inversion and division.
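The point that inversion and division reduce to iterated multiplication is easy to demonstrate in software. The sketch below uses a plain polynomial basis, not the paper's redundant-basis Montgomery systolic design; it only shows the underlying GF(2^m) arithmetic.

```python
def gf2m_mul(a, b, m, poly):
    """Multiplication in GF(2^m), polynomial basis. `poly` is the
    irreducible polynomial with the x^m bit set, e.g. 0x11B for GF(2^8)."""
    r = 0
    for i in range(m):                 # carry-less (XOR) schoolbook product
        if (b >> i) & 1:
            r ^= a << i
    for i in range(2 * m - 2, m - 1, -1):   # reduce modulo poly
        if (r >> i) & 1:
            r ^= poly << (i - m)
    return r

def gf2m_inv(a, m, poly):
    """Inversion via Fermat: a^(2^m - 2), computed by square-and-multiply,
    i.e. purely by iterated multiplication as the abstract notes."""
    result, base, exp = 1, a, (1 << m) - 2
    while exp:
        if exp & 1:
            result = gf2m_mul(result, base, m, poly)
        base = gf2m_mul(base, base, m, poly)
        exp >>= 1
    return result
```

For example, in GF(2^3) with poly = 0b1011 (x^3 + x + 1), gf2m_mul(0b010, 0b100, 3, 0b1011) returns 0b011, reflecting x^3 = x + 1 modulo the field polynomial.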

Keywords: finite field, Montgomery multiplication, systolic array, cryptography

Procedia PDF Downloads 289
27242 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a technique for real-time visualization and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers according to the classification attribute stored within the LiDAR data sets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with level-of-detail (LOD) optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, executed entirely on the GPU and implemented using programmable shaders.
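The layer filter and a distance-based LOD pass can be sketched on the CPU as follows; the paper's GPU/shader pipeline is only mimicked here, and the class codes and decimation rule are illustrative (the paper's LOD scheme is lossless, unlike this simplification).

```python
import numpy as np

# hypothetical ASPRS-style class codes: 2 = ground, 5 = high vegetation, 6 = building
def filter_layers(points, classes, visible_layers):
    """points: (N, 3) array; classes: (N,) classification attribute stored
    in the LiDAR data set. Keep only the layers toggled on by the user."""
    return points[np.isin(classes, list(visible_layers))]

def lod_decimate(points, observer, near=50.0, keep_far=0.25):
    """Toy distance-based LOD: keep all points within `near` metres of the
    observer and a random fraction of the rest."""
    d = np.linalg.norm(points - observer, axis=1)
    far = d >= near
    drop = far & (np.random.rand(len(points)) > keep_far)
    return points[~drop]
```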

Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization

Procedia PDF Downloads 301
27241 Effectiveness of Integrative Behavioral Couples Therapy on the Communication Patterns of Couples Applying for Divorce

Authors: Sakineh Abbasi Bourondaragh

Abstract:

The aim of this research is to assess the effectiveness of integrative behavioral couples therapy on the communication patterns of couples applying for divorce. We selected reports (N = 20) from the Tabriz Family Judicial Complex (FJC) of couples in conflict in their marital relationships; all reports were filed during 2012. The couples were randomly divided into an experimental and a control group, and all were given a pre-test. Both groups participated in twelve therapy sessions; the experimental group was exposed to the experimental intervention, while the control group was not. At the end of treatment, a post-test was administered to the subjects in both groups. The results showed that integrative behavioral couple therapy could increase and improve communication patterns. The findings also showed that integrative behavioral couples therapy increased the mutual constructive pattern and decreased the demand/withdraw and mutual avoidance patterns of the CPQ sub-scales. The steady change indicator showed that the difference is clinically meaningful.

Keywords: integrative behavioral couple therapy, communication patterns, cognitive sciences, Family Judicial Complex

Procedia PDF Downloads 314
27240 Natural Language Processing; the Future of Clinical Record Management

Authors: Khaled M. Alhawiti

Abstract:

This paper investigates the future of medicine and the use of natural language processing (NLP). The importance of having correct clinical information available online is remarkable; improving patient care at affordable cost could be achieved by using automated applications that exploit this online clinical information. The major challenge in retrieving such vital information is to have it appropriately coded. The majority of online patient reports are not coded and are not readily accessible, as they are recorded as natural-language text. NLP provides a feasible solution by retrieving and organizing clinical information available as text and transforming it into clinical data that is ready for use. NLP systems are rather complex to construct, as they entail considerable knowledge; however, significant progress has been made. Newly developed NLP systems have been tested and have shown promising performance, and they are considered practical clinical applications.
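The coding step described above can be illustrated, in its very simplest form, as a dictionary lookup over free text. The mini-lexicon and codes below are hypothetical; real clinical NLP systems draw on far richer terminologies (e.g., UMLS or SNOMED CT) and use statistical models rather than bare patterns.

```python
import re

# hypothetical mini-lexicon mapping surface forms to ICD-10-style codes
LEXICON = {
    r'\btype 2 diabetes\b': 'E11',
    r'\bhypertension\b': 'I10',
    r'\bmyocardial infarction\b|\bheart attack\b': 'I21',
}

def code_report(text):
    """Map free-text clinical narrative to structured codes by dictionary
    lookup -- the simplest possible stand-in for the retrieval/encoding
    step the abstract describes."""
    found = {code for pattern, code in LEXICON.items()
             if re.search(pattern, text, flags=re.IGNORECASE)}
    return sorted(found)

print(code_report("Patient with hypertension and a prior heart attack."))
# -> ['I10', 'I21']
```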

Keywords: clinical information, information retrieval, natural language processing, automated applications

Procedia PDF Downloads 400
27239 Selecting the Best Software Product Using Analytic Hierarchy Process and Fuzzy-Analytic Hierarchy Process Modules

Authors: Anas Hourani, Batool Ahmad

Abstract:

Software applications play an important role within any institution; they are employed to manage all processes and to store entity-related data. Choosing the right software product that meets an institution's requirements is therefore not an easy decision, in view of the multiple criteria, different points of view, and many standards to be considered. As a case study, Mutah University, located in Jordan, is in essential need of customized software, and several companies presented software products that are very similar in quality. In this research, an analytic hierarchy process (AHP) model and a fuzzy analytic hierarchy process (Fuzzy-AHP) model are proposed to identify the most suitable, best-fit software product that meets the institution's requirements. The results indicate that both models are able to help decision-makers reach a decision, especially in complex decision problems.
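The core AHP computation, deriving a priority vector from a pairwise-comparison matrix via its principal eigenvector, can be sketched as follows; the comparison values are hypothetical, and the fuzzy extension is not shown.

```python
import numpy as np

def ahp_priorities(M):
    """Saaty's AHP: priority weights are the principal right eigenvector of
    the pairwise-comparison matrix M, normalised to sum to one. Also returns
    the consistency index CI = (lambda_max - n) / (n - 1)."""
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = M.shape[0]
    ci = (vals[k].real - n) / (n - 1)
    return w, ci

# hypothetical comparison of three software products on one criterion
M = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
weights, ci = ahp_priorities(M)   # weights rank the products; small CI = consistent
```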

Keywords: analytic hierarchy process, decision modeling, fuzzy analytic hierarchy process, software product

Procedia PDF Downloads 385
27238 Theory of Constraints: Approach for Performance Enhancement and Boosting Overhaul Activities

Authors: Sunil Dutta

Abstract:

Synchronization is defined as 'the sequencing and re-sequencing of all relative and absolute activities in time and space, and the continuous alignment of those actions with a purposeful objective in a complex and dynamic atmosphere'. In a complex and dynamic production/maintenance setup, no single group can work in isolation for long. In addition, many project activities take place simultaneously, and the work of every section/group is interwoven with the work of others. The various activities and interactions in production/overhaul workshops are interlinked because of physical requirements (information, material, workforce, equipment, and space) and dependencies. Activity sequencing is determined by the physical dependencies of the various departments/sections/units (e.g., inventory availability must be ensured before stripping and disassembling equipment), whereas resource dependencies do not determine it. The theory of constraints facilitates identification, analysis, and exploitation of the constraint in a methodical manner. These constraints (equipment, manpower, policies, etc.) prevent the departments/sections/units from achieving optimum exploitation of available resources. The significance of the theory of constraints for achieving synchronization in an overhaul workshop is illustrated in this paper.

Keywords: synchronization, overhaul, throughput, obsolescence, uncertainty

Procedia PDF Downloads 346
27237 Estimating Destinations of Bus Passengers Using Smart Card Data

Authors: Hasik Lee, Seung-Young Kho

Abstract:

Nowadays, automatic fare collection (AFC) systems are widely used in many countries. However, smart card data from many cities do not contain the alighting information that is necessary to build origin-destination (OD) matrices; in order to utilize such data, the destinations of passengers must be estimated. In this paper, kernel density estimation was used to forecast the probabilities of the alighting stations of bus passengers, and the method was applied to smart card data from Seoul, Korea, which contain both boarding and alighting information, allowing validation against actual data. In some cases, the stochastic method was more accurate than the deterministic method. Overall, the estimates are sufficiently accurate to be used to build OD matrices.
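A minimal version of this estimator fits a kernel density over historically observed alighting positions and scores each downstream stop; the station coordinates below are hypothetical, and the real method conditions on far richer information (boarding stop, time of day, transfer behaviour).

```python
import numpy as np
from scipy.stats import gaussian_kde

def alighting_probabilities(observed_alight_km, candidate_stops_km):
    """Kernel density over past alighting positions (distance along the
    route, km), evaluated at the downstream stops and normalised into a
    probability distribution over those stops."""
    kde = gaussian_kde(observed_alight_km)
    scores = kde(candidate_stops_km)
    return scores / scores.sum()

history = np.array([2.1, 2.3, 5.0, 5.2, 5.1, 8.4, 8.6])  # past alightings
stops = np.array([2.2, 5.1, 8.5, 11.0])                  # downstream stops
p = alighting_probabilities(history, stops)
est = stops[np.argmax(p)]     # deterministic choice; sampling from p is the
                              # stochastic variant mentioned in the abstract
```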

Keywords: destination estimation, kernel density estimation, smart card data, validation

Procedia PDF Downloads 347
27236 Comparative Analysis of Reinforcement Learning Algorithms for Autonomous Driving

Authors: Migena Mana, Ahmed Khalid Syed, Abdul Malik, Nikhil Cherian

Abstract:

In recent years, advancements in deep learning have enabled researchers to tackle the problem of self-driving cars. Car companies use huge datasets to train their deep learning models to make autonomous cars a reality. However, this approach has a drawback: the space of possible driving situations is so huge that there cannot be a dataset for every possible road scenario. To overcome this problem, the concept of reinforcement learning (RL) is investigated in this research. Since the problem of autonomous driving can be modeled in a simulation, it lends itself naturally to the domain of reinforcement learning. The advantage of this approach is that different and complex road scenarios can be modeled in a simulation without having to deploy in the real world; the autonomous agent learns to drive by finding the optimal policy, and the learned model can then easily be deployed in a real-world setting. In this project, we focus on three RL algorithms: Q-learning, Deep Deterministic Policy Gradient (DDPG), and Proximal Policy Optimization (PPO). To model the environment, we used TORCS (The Open Racing Car Simulator), which provides a strong foundation for testing our models. The inputs to the algorithms are the sensor data provided by the simulator, such as velocity, distance from the side pavement, etc. The outcome of this research project is a comparative analysis of these algorithms, in which the PPO algorithm gives the best results: with PPO, the reward is greater, and the acceleration, steering angle, and braking are more stable than with the other algorithms, meaning that the agent learns to drive in a better and more efficient way. Additionally, we compiled a dataset from the training of the agent with the DDPG and PPO algorithms; it contains every step of the agent during one full training run in the form (all input values, acceleration, steering angle, brake, loss, reward). This study can serve as a base for further complex road scenarios, and it can be extended into the field of computer vision, using images to find the best policy.
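Of the three algorithms, tabular Q-learning is the simplest to sketch. The environment wrapper API below (reset/step/sample_action) is hypothetical; in the paper's setting, the continuous TORCS sensor readings would first have to be discretised into states, which is exactly why DDPG and PPO, which handle continuous spaces natively, tend to fare better.

```python
import numpy as np

def q_learning(env, n_states, n_actions, episodes=500,
               alpha=0.1, gamma=0.99, eps=0.1):
    """Minimal epsilon-greedy tabular Q-learning loop."""
    Q = np.zeros((n_states, n_actions))
    for _ in range(episodes):
        s = env.reset()
        done = False
        while not done:
            # explore with probability eps, otherwise act greedily
            a = env.sample_action() if np.random.rand() < eps else int(np.argmax(Q[s]))
            s2, r, done = env.step(a)   # hypothetical TORCS-style wrapper API
            # temporal-difference update toward r + gamma * max_a' Q(s', a')
            Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) * (not done) - Q[s, a])
            s = s2
    return Q
```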

Keywords: autonomous driving, DDPG (deep deterministic policy gradient), PPO (proximal policy optimization), reinforcement learning

Procedia PDF Downloads 139
27235 The Role of the Russian as a Foreign Language (RFL) Textbook in the RFL System

Authors: Linda Torresin

Abstract:

This paper is devoted to the Russian as a Foreign Language (RFL) textbook, understood as a fundamental element of the RFL system. The aim of the study is to explore the role of the RFL textbook in modern RFL teaching theories and practices. It is suggested that the RFL textbook is not a secondary factor but contributes to the advancement and rewriting of both RFL theories and practices. The study applies recent pedagogical developments in education to RFL textbook theory. The RFL system is therefore conceived as a complex adaptive system whose elements (teacher, textbook, students, etc.) interact in a dynamic network of interconnections. In particular, the author shows that the textbook plays a central role in the RFL system, since it may change and even renew RFL teaching from both theoretical and practical perspectives. On the one hand, the use of an RFL textbook may affect teaching theories: the textbook may either consolidate preexisting theories or launch new approaches. On the other hand, the RFL textbook may also influence teaching practices, by reinforcing existing practices or by encouraging teachers to try new strategies. All this allows the RFL textbook, within the RFL complex adaptive system, to exert an influence on the specific teaching contexts in which Russian is taught, interacting with the other elements of the system itself. Through its findings, this paper contributes to the advancement of research on RFL textbook theory.

Keywords: adaptive system, foreign language textbook, teaching Russian as a foreign language, textbook of Russian as a foreign language

Procedia PDF Downloads 89
27234 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'

Authors: Anthony Coogan

Abstract:

Born’s probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author's suggestion is that the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his particle-in-a-box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels, and the proposed bar chart would be populated by reflected photons. Expanding on these basic ideas: part of Schrödinger's particle-in-a-box theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate rather than instantaneously. However, there may be one notable exception: supposedly, the uncertainty principle was derived from the theory, so may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that such waveforms did not exist. Complex wave forms representing a particle are usually assumed to be continuous. The actual observations made were x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events. Born's interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g., collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moves toward the observer after the electron has moved away. Astronomers may say that they 'look out into the universe', but this is logic opposed to the views of Newton, Hooke, and many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.

Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle

Procedia PDF Downloads 196
27233 Regional Changes under Extreme Meteorological Events

Authors: Renalda El Samra, Elie Bou-Zeid, Hamza Kunhu Bangalath, Georgiy Stenchikov, Mutasem El Fadel

Abstract:

The regional-scale impact of climate change over complex terrain was examined through high-resolution dynamic downscaling conducted using the Weather Research and Forecasting (WRF) model, with initial and boundary conditions taken from a High-Resolution Atmospheric Model (HiRAM). The analysis was conducted over the eastern Mediterranean, with a focus on Lebanon, which is characterized by challenging, complex topography that magnifies the effect of orographic precipitation. Four year-long WRF simulations, selected on the basis of the HiRAM time series, were performed to generate future projections of extreme temperature and precipitation over the study area under the Representative Concentration Pathway (RCP) 4.5. One past WRF simulation year, 2008, was selected as a baseline to capture the dry extremes of the system. The results indicate that the study area might be exposed to a temperature increase of between 1.0 and 3.0 °C in summer mean values by 2050 compared with 2008; for extreme years, the decrease in average annual precipitation may exceed 50% at certain locations relative to 2008.

Keywords: HiRAM, regional climate modeling, WRF, Representative Concentration Pathway (RCP)

Procedia PDF Downloads 391
27232 Evaluated Nuclear Data Based Photon Induced Nuclear Reaction Model of GEANT4

Authors: Jae Won Shin

Abstract:

We develop an evaluated-nuclear-data-based photonuclear reaction model for GEANT4 to enable more accurate simulation of photon-induced neutron production. The evaluated photonuclear data libraries from ENDF/B-VII.1 are taken as input, and incident photon energies up to 140 MeV, the threshold energy for pion production, are considered. To check the validity of the data-based model, we calculate the photoneutron production cross-sections and yields and compare them with experimental data. The results obtained from the developed model are in good agreement with the experimental data for (γ,xn) reactions.

Keywords: ENDF/B-VII.1, GEANT4, photoneutron, photonuclear reaction

Procedia PDF Downloads 271
27231 Quality Control of Automotive Gearbox Based On Vibration Signal Analysis

Authors: Nilson Barbieri, Bruno Matos Martins, Gabriel de Sant'Anna Vitor Barbieri

Abstract:

In more complex systems, such as an automotive gearbox, rigorous treatment of the data is necessary because there are several moving parts (gears, bearings, shafts, etc.) and, consequently, several possible sources of error and noise. The basic objective of this work is the detection of damage in automotive gearboxes. The detection methods used are the wavelet method, the bispectrum, advanced (selective) filtering of vibration signals, and mathematical morphology. Vibration tests were performed on gearboxes (in good condition and with defects) from the production line of a large vehicle assembler. The vibration signals were obtained using five accelerometers at different positions on the sample. The results obtained using kurtosis, the bispectrum, wavelets, and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes.
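Of the indicators used, kurtosis is the simplest: a healthy gearbox produces a near-Gaussian vibration signal (excess kurtosis near zero), while localised gear or bearing faults inject impulses that drive the kurtosis up. A minimal sketch, with a hypothetical decision threshold:

```python
import numpy as np
from scipy.stats import kurtosis

def channel_kurtosis(signals):
    """Excess kurtosis of each accelerometer channel.
    signals: (n_channels, n_samples) array from the five accelerometers."""
    return kurtosis(signals, axis=1, fisher=True)

def flag_defective(signals, threshold=3.0):
    """Flag the gearbox if any channel looks impulsive; the threshold here
    is illustrative and would be calibrated on known-good units."""
    return bool(np.any(channel_kurtosis(signals) > threshold))
```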

Keywords: automotive gearbox, mathematical morphology, wavelet, bispectrum

Procedia PDF Downloads 469
27230 Optimizing Communications Overhead in Heterogeneous Distributed Data Streams

Authors: Rashi Bhalla, Russel Pears, M. Asif Naeem

Abstract:

In this 'information explosion' era, data are a critical commodity, and mining knowledge from vertically distributed data streams incurs a huge communication cost. However, efforts to decrease communication in a distributed environment tend to adversely affect classification accuracy; the research challenge therefore lies in maintaining a balance between transmission cost and accuracy. This paper proposes a method based on Bayesian inference that reduces the communication volume in a heterogeneous distributed environment while retaining prediction accuracy. Our experimental evaluation reveals that a significant reduction in communication can be achieved across a diverse range of dataset types.
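The abstract does not spell out the mechanism, so the following is only one plausible reading, with all details ours: each site holding a vertical partition computes a Bayesian log-likelihood ratio from its local attributes and transmits it only when it is informative; a silent site is read by the coordinator as contributing roughly zero evidence, which is close to what it would have sent.

```python
def should_transmit(local_log_odds, threshold=1.0):
    """Ship the local Bayesian evidence only when its magnitude exceeds a
    threshold; weak evidence is suppressed to save communication."""
    return abs(local_log_odds) >= threshold

def coordinator_predict(prior_log_odds, received):
    """Naive-Bayes-style fusion: log-odds from independent sites add up."""
    return prior_log_odds + sum(received) > 0.0

sites = [3.1, -0.4, 0.9]        # local log-odds at three vertical partitions
sent = [lo for lo in sites if should_transmit(lo)]   # only [3.1] is shipped
label = coordinator_predict(prior_log_odds=0.2, received=sent)
```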

Keywords: big data, Bayesian inference, distributed data stream mining, heterogeneous distributed data

Procedia PDF Downloads 156
27229 Obstacles to Innovation for SMEs: Evidence from Germany

Authors: Natalia Strobel, Jan Kratzer

Abstract:

Achieving effective innovation is a complex task, and during this process firms (especially SMEs) often face obstacles. However, research into obstacles to innovation focusing on SMEs is very scarce. In this study, we propose a theoretical framework for describing these obstacles to innovation and investigate their influence on the innovative performance of SMEs. Data were collected in 2013 through face-to-face interviews with executives of 49 technology SMEs from Germany. The semi-structured interviews were designed on the basis of scales for measuring innovativeness, financial/competitive performance, and obstacles to innovation, alongside purely open questions. We find that the internal obstacles 'lack of know-how', 'capacity overloading', and 'unclear roles and tasks', as well as the external obstacle 'governmental bureaucracy', negatively influence the innovative performance of SMEs. However, in contrast to prior findings, this study shows that firms' cooperation ties may also negatively influence their innovative performance.

Keywords: innovation, innovation process, obstacles, SME

Procedia PDF Downloads 348
27228 A Temporal QoS Ontology For ERTMS/ETCS

Authors: Marc Sango, Olimpia Hoinaru, Christophe Gransart, Laurence Duchien

Abstract:

Ontologies offer a means of representing and sharing information in many domains, particularly complex ones. For example, they can be used to represent and share the information of the System Requirement Specification (SRS) of complex systems, such as the SRS of ERTMS/ETCS, which is written in natural language. Since this is a real-time, critical system, generic ontologies such as OWL, as well as generic ERTMS ontologies, provide minimal support for modeling the temporal information omnipresent in these SRS documents. To support the modeling of temporal information, one of the challenges is to enable the representation of dynamic features evolving in time within a generic ontology, with minimal redesign of that ontology. Separating temporal information from other information can help to predict system runtime behavior and to design and implement it properly. In addition, it is helpful to provide reasoning and querying techniques over the temporal information represented in the ontology in order to detect potential temporal inconsistencies. Indeed, a user operation, such as adding a new constraint to existing planning constraints, can cause temporal inconsistencies, which can lead to system failures. To address this challenge, we propose a lightweight three-layer temporal Quality of Service (QoS) ontology for representing, reasoning over, and querying temporal and non-temporal information in a complex domain ontology. Representing QoS entities in separate layers clarifies the distinction between non-QoS entities and QoS entities in the ontology. The upper, generic layer of the proposed ontology provides intuitive knowledge of domain components, especially ERTMS/ETCS components. Separating the intermediate QoS layer from the lower QoS layer allows us to focus on specific QoS characteristics, such as temporal or integrity characteristics. In this paper, we focus on temporal information that can be used to predict system runtime operation. To evaluate our approach, an example of the proposed domain ontology for the handover operation, as well as a reasoning rule over temporal relations in this domain-specific ontology, is given.
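As a loose illustration of the kind of temporal-consistency check such reasoning rules enable (the vocabulary below is ours, not the ontology's, and only a coarse subset of Allen's interval relations is used):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """A temporal constraint attached to an ontology individual, e.g. the
    time window in which a handover operation must complete."""
    name: str
    start: float
    end: float

def allen_relation(a, b):
    # a few of Allen's interval relations, enough to flag clashes
    if a.end < b.start:
        return 'before'
    if a.end == b.start:
        return 'meets'
    if a.start > b.end:
        return 'after'
    return 'overlaps'

def check_consistency(disjoint_pairs):
    """Report pairs declared temporally disjoint (e.g. by a user-added
    planning constraint) that in fact overlap -- the kind of inconsistency
    the reasoning rules are meant to detect before it causes failures."""
    return [(a.name, b.name) for a, b in disjoint_pairs
            if allen_relation(a, b) == 'overlaps']
```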

Keywords: system requirement specification, ERTMS/ETCS, temporal ontologies, domain ontologies

Procedia PDF Downloads 414