Search results for: longitudinal data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25018

24448 Speed Characteristics of Mixed Traffic Flow on Urban Arterials

Authors: Ashish Dhamaniya, Satish Chandra

Abstract:

Speed and traffic volume data are collected on different sections of four-lane and six-lane roads in three metropolitan cities in India. Statistical distributions are fitted to the speed data of individual vehicle classes and to the combined speed data of all vehicles. It is noted that the speed data of an individual vehicle class generally follow a normal distribution, but the combined speed data of all vehicles at a section of urban road may or may not follow a normal distribution, depending upon the composition of the traffic stream. A new term, the Speed Spread Ratio (SSR), is introduced in this paper; it is the ratio of the difference between the 85th and 50th percentile speeds to the difference between the 50th and 15th percentile speeds. If the SSR is unity, the speed data are truly normally distributed. It is noted that on six-lane urban roads, speed data follow a normal distribution only when the SSR is in the range of 0.86–1.11. This SSR range is also validated on four-lane roads.
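As an illustration of the SSR definition above, a minimal sketch of how it might be computed from observed spot speeds is shown below; the sample data and the normality range check are illustrative only, not the paper's data.

```python
import numpy as np

def speed_spread_ratio(speeds):
    """Compute the Speed Spread Ratio (SSR) from individual vehicle speeds.

    SSR = (V85 - V50) / (V50 - V15), where Vxx is the xx-th percentile speed.
    A value of 1 indicates a symmetric, normal-like speed distribution.
    """
    v15, v50, v85 = np.percentile(speeds, [15, 50, 85])
    return (v85 - v50) / (v50 - v15)

# Hypothetical spot speeds (km/h) observed at a six-lane urban road section.
speeds = np.random.normal(loc=45, scale=8, size=500)
ssr = speed_spread_ratio(speeds)
# Per the abstract, speeds on six-lane roads behave normally when 0.86 <= SSR <= 1.11.
print(f"SSR = {ssr:.2f}, normal-like: {0.86 <= ssr <= 1.11}")
```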

Keywords: normal distribution, percentile speed, speed spread ratio, traffic volume

Procedia PDF Downloads 407
24447 An Exploratory Analysis of Brisbane's Commuter Travel Patterns Using Smart Card Data

Authors: Ming Wei

Abstract:

Over the past two decades, Location Based Service (LBS) data have been increasingly applied to urban and transportation studies due to their comprehensiveness and consistency. However, compared to other LBS data, including mobile phone data, GPS traces and social networking platforms, smart card data collected from public transport users have arguably yet to be fully exploited in urban systems analysis. By using five weekdays of passenger travel transaction data taken from go card – Southeast Queensland’s transit smart card – this paper analyses the spatiotemporal distribution of passenger movement with regard to land use patterns in Brisbane. Work and residential places for public transport commuters were identified after extracting journey-to-work patterns. Our results show that the workplace locations identified from the go card data and the residential suburbs are largely consistent with those marked in the land use map. However, the intensity of some residential locations, in terms of population or commuter density, does not match well between the map and the values derived from the go card data. This indicates a degree of misalignment between residential areas and workplaces, shedding light on how enhancements to service management and infrastructure expansion might be undertaken.
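One simple way of extracting home and work stops from smart card transactions of this kind is sketched below; the boarding/alighting rules, field names, and example records are hypothetical and do not reproduce the exact procedure used in the paper.

```python
import pandas as pd

# Hypothetical go card transactions: one row per boarding/alighting event.
tx = pd.DataFrame({
    "card_id":  ["A", "A", "A", "A", "B", "B"],
    "stop_id":  ["S1", "S9", "S9", "S1", "S3", "S7"],
    "event":    ["board", "alight", "board", "alight", "board", "alight"],
    "datetime": pd.to_datetime([
        "2015-03-02 07:40", "2015-03-02 08:20",
        "2015-03-02 17:10", "2015-03-02 17:55",
        "2015-03-02 08:05", "2015-03-02 08:50"]),
})
tx["hour"] = tx["datetime"].dt.hour

# Home stop: modal morning boarding; work stop: modal morning alighting.
boardings = tx[(tx["event"] == "board") & (tx["hour"] < 10)]
alightings = tx[(tx["event"] == "alight") & (tx["hour"] < 10)]
home = boardings.groupby("card_id")["stop_id"].agg(lambda s: s.mode().iloc[0])
work = alightings.groupby("card_id")["stop_id"].agg(lambda s: s.mode().iloc[0])
print(pd.DataFrame({"home_stop": home, "work_stop": work}))
```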

Keywords: big data, smart card data, travel pattern, land use

Procedia PDF Downloads 276
24446 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process

Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek

Abstract:

As big data analysis becomes more important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect the final test results using die-map-based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analyses have limitations, as the test data are less directly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than the existing die data. The study consists of three phases. In the first phase, a die map is created from the fail-bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. The third phase finds the patterns that affect the final test results. Finally, the proposed three phases are applied to actual industrial data, and the experimental results show the potential for field application.
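A toy sketch of the three phases under stated assumptions: synthetic fail-bit die maps stand in for the industrial data, and k-means is used as the clustering step purely for illustration (the paper does not specify this algorithm).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Phase 1 (illustrative): each die map is a grid of fail-bit counts per sub-area.
n_dies, grid = 200, (8, 8)
die_maps = rng.poisson(lam=2.0, size=(n_dies, *grid)).astype(float)
die_maps[:50, :4, :] += rng.poisson(lam=6.0, size=(50, 4, 8))  # an "edge" failure pattern

# Phase 2: extract simple features (flattened maps) and cluster the dies.
features = die_maps.reshape(n_dies, -1)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Phase 3 (illustrative): relate clusters to a hypothetical final-test outcome.
totals = die_maps.sum(axis=(1, 2))
final_fail = (totals > np.median(totals)).astype(int)
for k in range(3):
    print(f"cluster {k}: mean final-test fail rate = {final_fail[labels == k].mean():.2f}")
```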

Keywords: die-map clustering, feature extraction, pattern recognition, semiconductor manufacturing process

Procedia PDF Downloads 391
24445 Dynamic Test and Numerical Analysis of Twin Tunnel

Authors: Changwon Kwak, Innjoon Park, Dongin Jang

Abstract:

Seismic loads broadly affect the behavior of underground structures such as tunnels. Seismic soil-structure interaction can play an important role in the dynamic behavior of a tunnel. In this research, a twin tunnel with a flexible joint was physically modeled, and dynamic centrifuge tests were performed to investigate its seismic behavior. Seismic waves with different frequencies were applied, and the response characteristics were obtained from the tests. The test results demonstrated the amplification of peak acceleration in the longitudinal direction under seismic waves. The effect of the flexible joint was also verified. Additionally, a 3-dimensional finite difference dynamic analysis was conducted, and the analysis results exhibited good agreement with the test results.

Keywords: 3-dimensional finite difference dynamic analysis, dynamic centrifuge test, flexible joint, seismic soil-structure interaction

Procedia PDF Downloads 243
24444 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data are the fundamental tool utilized by exploration companies to identify potential hydrocarbons. However, the importance of seismic trace data will be undermined unless the geospatial component of the data is understood. Deriving a proposed well to be drilled from data with positional ambiguity will jeopardize business decisions and the millions of dollars of investment that every oil and gas company would like to protect. A spatial integrity QC workflow has been introduced in PETRONAS to ensure that positional errors within the seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data are referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying geometry loading. The direct outcome of the workflow implementation is improved reliability and integrity of the sub-surface geological models produced by geoscientists, and it provides important input to potential hazard assessments where positional accuracy is crucial. This workflow development initiative is part of a larger geospatial integrity management effort, given that nearly eighty percent of oil and gas data are location-dependent.
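A minimal sketch of the kind of positional checks such a QC workflow might run on trace headers; the EPSG code, survey extent, and header fields below are hypothetical placeholders, not PETRONAS's actual workflow.

```python
# Illustrative spatial-integrity QC checks on seismic navigation/trace headers.
PROJECT_EPSG = 32648                      # assumed project CRS (a UTM zone, WGS 84)
EASTING_RANGE = (300_000.0, 700_000.0)    # assumed survey extent in metres
NORTHING_RANGE = (400_000.0, 800_000.0)

def qc_trace_header(header):
    """Return a list of positional-integrity issues found in one header record."""
    issues = []
    if header.get("epsg") != PROJECT_EPSG:
        issues.append(f"CRS mismatch: {header.get('epsg')} != {PROJECT_EPSG}")
    if not (EASTING_RANGE[0] <= header["easting"] <= EASTING_RANGE[1]):
        issues.append("easting outside survey extent")
    if not (NORTHING_RANGE[0] <= header["northing"] <= NORTHING_RANGE[1]):
        issues.append("northing outside survey extent")
    return issues

print(qc_trace_header({"epsg": 32648, "easting": 512_340.0, "northing": 612_700.0}))
print(qc_trace_header({"epsg": 4326, "easting": 101.7, "northing": 3.1}))
```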

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 209
24443 Single-Cell Visualization with Minimum Volume Embedding

Authors: Zhenqiu Liu

Abstract:

Visualizing the heterogeneity within cell populations in single-cell RNA-seq data is crucial for studying the functional diversity of a cell. However, because of the high levels of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low-dimensional space. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, solving the underlying semi-definite program (SDP) is computationally inefficient when the sample size is large, so MVE is not applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm based on an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations in single-cell data sets more accurately than other existing dimension reduction methods.
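The paper's MVE objective is not reproduced here, but the generic engine it relies on, an accelerated (FISTA-style) proximal gradient loop, can be sketched as follows; the toy PSD-constrained least-squares problem merely stands in for the embedding objective.

```python
import numpy as np

def project_psd(M):
    """Project a symmetric matrix onto the positive semidefinite cone."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return (V * np.clip(w, 0, None)) @ V.T

def accelerated_proximal_gradient(grad, prox, X0, step=1.0, iters=100):
    """Generic FISTA-style accelerated proximal gradient loop."""
    X, Y, t = X0.copy(), X0.copy(), 1.0
    for _ in range(iters):
        X_new = prox(Y - step * grad(Y))
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)
        X, t = X_new, t_new
    return X

# Toy stand-in problem: find the PSD matrix K closest to a similarity matrix S,
# i.e. minimise 0.5 * ||K - S||_F^2 subject to K >= 0 (positive semidefinite).
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 30))
S = (A + A.T) / 2                     # symmetric but indefinite "similarity" matrix
K_hat = accelerated_proximal_gradient(grad=lambda K: K - S,
                                      prox=project_psd,
                                      X0=np.zeros_like(S))
print(np.linalg.norm(K_hat - project_psd(S)))   # ~0: solver reaches the PSD projection of S
```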

Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method

Procedia PDF Downloads 218
24442 Cloud Data Security Using Map/Reduce Implementation of Secret Sharing Schemes

Authors: Sara Ibn El Ahrache, Tajje-eddine Rachidi, Hassan Badir, Abderrahmane Sbihi

Abstract:

Recently, there has been increasing confidence in the beneficial use of big data drawn from the huge amount of information deposited in cloud computing systems. Data kept on such systems can be retrieved through the network at the user’s convenience. However, the data that users send include private information, and therefore information leakage from these data is now a major social problem. The use of secret sharing schemes for cloud computing has lately been shown to be relevant: users distribute their data across several servers. Notably, in a (k,n) threshold scheme, data security is assured if and only if, throughout the whole life of the secret, the opponent cannot compromise k or more of the n servers. In fact, a number of secret sharing algorithms have been suggested to deal with these security issues. In this paper, we present a MapReduce implementation of Shamir’s secret sharing scheme to increase its performance and to achieve optimal security for cloud data. Different tests were run, and they demonstrate the contributions of the proposed approach, which are quite considerable in terms of both security and performance.
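To make the threshold property concrete, here is a minimal, single-machine sketch of Shamir's (k, n) scheme over a prime field; the MapReduce distribution described in the paper is not reproduced, and the prime and secret are toy values.

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for the toy secret below

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it ((k, n) threshold)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(PRIME)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, k=3, n=5)
print(reconstruct(shares[:3]))   # any 3 of the 5 shares recover the secret
```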

Keywords: cloud computing, data security, Mapreduce, Shamir's secret sharing

Procedia PDF Downloads 292
24441 Stationary Energy Partition between Waves in a Carbyne Chain

Authors: Svetlana Nikitenkova, Dmitry Kovriguine

Abstract:

The stationary energy partition between waves in a one-dimensional carbyne chain at ambient temperatures is investigated. The study is carried out by standard asymptotic methods of nonlinear dynamics in the framework of classical mechanics, based on a simple mathematical model that takes into account central and noncentral interactions between carbon atoms. Within the first-order nonlinear approximation, triple-mode resonant ensembles of quasi-harmonic waves are revealed. Each resonant triad consists of a single primary high-frequency longitudinal mode and a pair of secondary low-frequency transverse modes of oscillation. In general, the motion of the carbyne chain is described by a superposition of resonant triads of various spectral scales. It is found that the stationary energy distribution obeys the classical Rayleigh–Jeans law, due to the proportional amplitude dispersion, except for a shift of the frequency band upwards in the spectrum.

Keywords: resonant triplet, Rayleigh–Jeans law, amplitude dispersion, carbyne

Procedia PDF Downloads 430
24440 The Effects of Qigong Exercise Intervention on the Cognitive Function in Aging Adults

Authors: D. Y. Fong, C. Y. Kuo, Y. T. Chiang, W. C. Lin

Abstract:

Objectives: Qigong is an ancient Chinese practice in pursuit of a healthier body and a more peaceful mindset. It emphasizes the restoration of vital energy (Qi) in body, mind, and spirit. The practice combines gentle movements and mild breathing, which help practitioners reach a condition of tranquility. On account of these features of Qigong, we first use a cross-sectional methodology to compare the differences in cognitive function among varied levels of Qigong practitioners using event-related potentials (ERP) and electroencephalography (EEG). Second, we use a longitudinal methodology to explore the pretest-to-posttest effects on Qigong trainees on ERP and EEG. The current study adopts the Attentional Network Test (ANT) to examine the participants’ cognitive function; aging-related research has demonstrated a declining trend in cognition in older adults, and exercise might ameliorate the deterioration. Qigong exercise integrates physical posture (muscle strength), breathing technique (aerobic ability), and focused intention (attention), so researchers hypothesize that it might improve cognitive function in aging adults. Method: Sixty participants were involved in this study, including 20 young adults (21.65±2.41 y) with normal physical activity (YA), 20 Qigong experts (60.69±12.42 y) with over 7 years of Qigong practice experience (QE), and 20 normal, healthy adults (52.90±12.37 y) with no Qigong practice experience as the experimental group (EG). The EG participants took Qigong classes twice a week, 2 hours per session, for 24 weeks, in order to examine the effect of the Qigong intervention on cognitive function. ANT tasks (alerting, orienting, and executive control networks) were adopted to evaluate participants’ cognitive function via the ERP P300 component and P300 amplitude topography. Results: Behavioral data: 1. The reaction time (RT) of YA was faster than that of the other two groups, and EG was faster than QE in the cue and flanker conditions of the ANT task. 2. The RT at posttest was faster than at pretest in EG in the cue and flanker conditions. 3. There was no difference among the three groups on the orienting, alerting, and executive control networks. ERP data: 1. P300 amplitude in QE was larger than in EG at the Fz electrode in the orienting, alerting, and executive control networks. 2. P300 amplitude in EG was larger at pretest than at posttest on the orienting network. 3. P300 latency revealed no difference among the three groups in the three networks. Conclusion: Taken together, these findings provide neuro-electrical evidence that older adults involved in Qigong practice may develop a more comprehensive compensatory mechanism that also benefits behavioral performance.

Keywords: Qigong, cognitive function, aging, event-related potential (ERP)

Procedia PDF Downloads 387
24439 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills

Authors: Kyle De Freitas, Margaret Bernard

Abstract:

Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. The paper proposes a framework for developing an environment that caters both for educators who have few technical data mining skills and for more advanced users with some data mining expertise. This framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that allows future researchers to focus on developing specific areas within the EDM process. Finally, the paper highlights a strategy for enabling analysis through either predefined questions or a guided data mining process, and shows how the developed questions and the analyses conducted can be reused and extended over time.

Keywords: educational data mining, learning management system, learning analytics, EDM framework

Procedia PDF Downloads 314
24438 Using Audit Tools to Maintain Data Quality for ACC/NCDR PCI Registry Abstraction

Authors: Vikrum Malhotra, Manpreet Kaur, Ayesha Ghotto

Abstract:

Background: Cardiac registries such as the ACC Percutaneous Coronary Intervention (PCI) Registry require high-quality data to be abstracted, including data elements such as nuclear cardiology, diagnostic coronary angiography, and PCI. Introduction: The audit tool created is used by data abstractors to provide data audits and assess the accuracy and inter-rater reliability of abstraction performed by the abstractors for a health system. This audit tool solution has been developed across 13 registries, including the ACC/NCDR registries (PCI), STS, and Get with the Guidelines. Methodology: The data audit tool was used to audit internal registry abstraction for all data elements, including stress test performed, type of stress test, date of stress test, results of stress test, risk/extent of ischemia, diagnostic catheterization detail, and PCI data elements for the ACC/NCDR PCI registry. It is used internally across 20 hospital systems, providing abstraction and audit services for them. Results: The data audit tool achieved data accuracy and inter-rater reliability (IRR) scores greater than 95% for 50 PCI registry cases in 2021. Conclusion: The tool is being used internally for surgical societies and across hospital systems. The audit tool enables the abstractor to be assessed by an external abstractor and includes all of the data dictionary fields for each registry.
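A minimal sketch of the kind of accuracy / inter-rater agreement score such an audit tool might report: simple per-field percent agreement between an abstractor and an auditor. The field names and records below are hypothetical, not the ACC/NCDR data dictionary.

```python
def percent_agreement(abstractor_records, auditor_records, fields):
    """Percent of audited fields where the two abstractions agree exactly."""
    matches = total = 0
    for a, b in zip(abstractor_records, auditor_records):
        for f in fields:
            total += 1
            matches += (a.get(f) == b.get(f))
    return 100.0 * matches / total

fields = ["stress_test_performed", "stress_test_type", "stress_test_result"]
abstractor = [{"stress_test_performed": "yes", "stress_test_type": "nuclear",
               "stress_test_result": "positive"}]
auditor    = [{"stress_test_performed": "yes", "stress_test_type": "nuclear",
               "stress_test_result": "negative"}]
print(f"{percent_agreement(abstractor, auditor, fields):.1f}% agreement")
```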

Keywords: abstraction, cardiac registry, cardiovascular registry, registry, data

Procedia PDF Downloads 95
24437 Artificial Intelligence Based Comparative Analysis for Supplier Selection in Multi-Echelon Automotive Supply Chains via GEP and ANN Models

Authors: Seyed Esmail Seyedi Bariran, Laysheng Ewe, Amy Ling

Abstract:

Since supplier selection is a vital decision, selecting suppliers in the best and most accurate way is highly important for enterprises. In this study, a new artificial intelligence approach is applied to remove the weaknesses of supplier selection. The paper has three parts. The first part chooses the appropriate criteria for assessing the suppliers’ performance. The second collects the data set based on expert judgment. The data set is then divided into two parts, a training data set and a testing data set. The training data set is used to select the best structures of the GEP and ANN models, and the testing data set is used to evaluate the power of these methods. The results obtained show that the accuracy of GEP is higher than that of ANN. Moreover, unlike ANN, GEP provides a mathematical equation for the supplier selection.

Keywords: supplier selection, automotive supply chains, ANN, GEP

Procedia PDF Downloads 616
24436 Increasing the Apparent Time Resolution of Tc-99m Diethylenetriamine Pentaacetic Acid Galactosyl Human Serum Albumin Dynamic SPECT by Use of a 180-Degree Interpolation Method

Authors: Yasuyuki Takahashi, Maya Yamashita, Kyoko Saito

Abstract:

In general, dynamic SPECT data acquisition needs a few minutes for one rotation. Thus, the time-activity curve (TAC) derived from dynamic SPECT is relatively coarse. In order to effectively shorten the interval between data points, we adopted a 180-degree interpolation method, which is already used for reconstruction of X-ray CT data. In this study, we applied this 180-degree interpolation method to SPECT and investigated its effectiveness. To briefly describe the method: the 180-degree data in the second half of one rotation are combined with the 180-degree data in the first half of the next rotation to generate a 360-degree data set appropriate for the time halfway between the two rotations. In both a phantom study and a patient study, the data points from the interpolated images were in good agreement with the data points tracking the accumulation of 99mTc activity over time for the appropriate regions of interest. We conclude that data derived from interpolated images improve the apparent time resolution of dynamic SPECT.
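The half-rotation recombination described above is essentially an array-slicing operation; a minimal sketch (with a toy acquisition in place of real projection data) might look like this:

```python
import numpy as np

def interpolate_half_rotations(rotations):
    """Combine the second half (180 degrees) of rotation i with the first half of
    rotation i+1 to form an extra 360-degree projection set for the time halfway
    between the two rotations.

    `rotations` has shape (n_rotations, n_views, n_bins); the first n_views // 2
    views are assumed to cover 0-180 degrees and the rest 180-360 degrees.
    """
    half = rotations.shape[1] // 2
    interpolated = []
    for i in range(rotations.shape[0] - 1):
        combined = np.concatenate(
            [rotations[i + 1, :half],   # 0-180 degrees, from the start of rotation i+1
             rotations[i, half:]],      # 180-360 degrees, from the end of rotation i
            axis=0)
        interpolated.append(combined)
    return np.stack(interpolated)

# Toy dynamic acquisition: 4 rotations x 60 views x 128 bins.
data = np.random.poisson(5.0, size=(4, 60, 128)).astype(float)
extra = interpolate_half_rotations(data)
print(extra.shape)   # (3, 60, 128): one interpolated time point between each pair
```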

Keywords: dynamic SPECT, time resolution, 180-degree interpolation method, 99mTc-GSA.

Procedia PDF Downloads 487
24435 Creep Effect on Composite Beam with Perfect Steel-Concrete Connection

Authors: Souici Abdelaziz, Tehami Mohamed, Rahal Nacer, Said Mohamed Bekkouche, Berthet Jean-Fabien

Abstract:

In this paper, the influence of concrete slab creep on the initial deformability of a bent composite beam is modelled. This deformability depends on the rate of creep, which governs the rise in value of the longitudinal strain εc(x,t), the deflection Deflec(x,t), and the strain energy E(t). The variation of these three parameters can easily affect the appearance and serviceability of the structure negatively. Therefore, an analytical approach is designed to track the deformability of the beam at instant t. This approach is based on Boltzmann’s superposition principle and, in particular, on the irreversible law of deformation. For this, two compatibility conditions and two static equilibrium equations are adopted. The first two conditions are set according to the rheological equation of Dischinger. After some mathematical rearrangement, we reach a system of two differential equations whose integration allows us to find the mathematical expression of each generalized internal force in terms of the ability of the concrete slab to creep.
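For context, one standard textbook form of Dischinger's rate-of-creep law, relating the creep strain rate of the concrete to its stress through the creep coefficient φ(t), is sketched below; this is a generic formulation and not necessarily the exact equation used by the authors.

```latex
\frac{d\varepsilon_{cc}(t)}{dt} \;=\; \frac{\sigma_c(t)}{E_c}\,\frac{d\varphi(t)}{dt},
\qquad
\varepsilon_c(t) \;=\; \frac{\sigma_c(t)}{E_c} \;+\; \varepsilon_{cc}(t),
```

where εcc is the creep strain, σc the concrete stress, and Ec the concrete modulus. Substituting a law of this form into the compatibility conditions is what produces the pair of differential equations mentioned above.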

Keywords: composite section, concrete, creep, deformation, differential equation, time

Procedia PDF Downloads 370
24434 Compressive Response of Unidirectional Basalt Fiber/Epoxy/MWCNTs Composites

Authors: Reza Eslami-Farsani, Hamed Khosravi

Abstract:

The aim of this work is to study the influence of multi-walled carbon nanotube (MWCNT) addition at various contents with respect to the matrix (0-0.5 wt.% in steps of 0.1 wt.%) on the compressive response of unidirectional basalt fiber (UD-BF)/epoxy composites. Toward this end, MWCNTs were first functionalized with 3-glycidoxypropyltrimethoxysilane (3-GPTMS) to improve their dispersion state and interfacial compatibility with the epoxy. Subsequently, UD-BF/epoxy and multiscale 3-GPTMS-MWCNTs/UD-BF/epoxy composites were prepared. The mechanical properties of the composites were determined by quasi-static compression tests. The compressive strength of the composites was obtained by performing compression tests on off-axis specimens and extracting their longitudinal compressive strength. The results demonstrated that the highest compressive strength was attained at 0.4 wt.% MWCNTs, a 41% increase compared to the UD-BF/epoxy composite. Potential mechanisms behind these results are suggested.

Keywords: multiscale polymeric composites, unidirectional basalt fibers, multi-walled carbon nanotubes, surface modification, compressive properties

Procedia PDF Downloads 293
24433 Simulation of Acoustic Properties of Borate and Tellurite Glasses

Authors: M. S. Gaafar, S. Y. Marzouk, I. S. Mahmoud, S. Al-Zobaidi

Abstract:

The Makishima and Mackenzie model was used to theoretically simulate the acoustic properties (longitudinal and shear ultrasonic wave velocities and elastic moduli) of many tellurite and borate glasses. The model relies mainly on the values of the experimentally measured density, which were obtained previously. In this work, we attempt to obtain the density values of amorphous glasses (as the density depends on the geometry of the network structure of these glasses). In addition, the problem of simulating the slope of the linear regression between the experimentally determined bulk modulus and the product of the packing density and the experimental Young's modulus was solved in this work. The results showed good agreement between the experimentally measured densities and ultrasonic wave velocities and those determined theoretically.

Keywords: glasses, ultrasonic wave velocities, elastic modulus, Makishima & Mackenzie Model

Procedia PDF Downloads 369
24432 Relationship between Institutional Perspective and Safety Performance: A Case on Ready-Made Garments Manufacturing Industry

Authors: Fahad Ibrahim, Raphaël Akamavi

Abstract:

Bangladesh has encountered several industrial disasters (e.g., fire and building collapse tragedies) leading to the loss of valuable human lives. Although various institutions have made efforts to improve the safety situation, industry compliance and safety behaviour have not yet improved. Hence, one question remains: to what extent are institutional elements effective in improving safety behaviours? Thus, this study explores the relationship between the institutional perspective and safety performance. Structural equation modelling results, using survey data from 256 RMG workers in 128 garment manufacturing factories in Bangladesh, show that institutional facets strongly influence management safety commitment, which induces workers’ participation in safety activities and reduces workplace accident rates. The study also found that, by upholding industrial standards and inspecting safety situations, institutional facets significantly and directly affect workers’ involvement in safety participation and the rate of workplace accidents. Additionally, workers’ involvement in safety practices significantly predicts the safety environment of the workplace. Subsequently, our findings demonstrate that institutional culture, norms, and regulations play an important role in altering management commitment to setting up a safer workplace environment. As a result, when workers perceive their management as having a high level of commitment to safety, they are inspired to be more involved in safety practices, which significantly alters the workplace safety situation and lessens injury experiences. Because institutions have a strong influence on management commitment, legislative members should endorse, regulate, and strictly monitor workplace safety laws to be exercised by factory owners. Further, management should take the initiative in adopting OHS features and conceiving strategic directions (i.e., setting up safety committees, risk assessments, and innovative training) to promote a positive safety climate and provide a safe workplace environment. Arguably, an inclusive public-private partnership is recommended for ensuring a better and safer workplace for RMG workers. However, as our data were collected under a cross-sectional design, the respondents’ perceptions might change over a period of time, and hence a longitudinal study is recommended. Finally, further research is needed to determine the impact of improvement mechanisms on workplace safety performance, such as how workplace design, safety training programs, and institutional enforcement policies protect the well-being of workers.

Keywords: institutional perspective, management commitment, safety participation, work injury, safety performance, occupational health and safety

Procedia PDF Downloads 196
24431 Genetic Data of Deceased People: Solving the Gordian Knot

Authors: Inigo de Miguel Beriain

Abstract:

Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is for several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from being satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue. Consequently, each EU member state offers very different solutions. For instance, Denmark considers the data as personal data of the deceased person for a set period of time, while some others, such as Spain, do not consider these data as such but have introduced specifically focused regulations on this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives. Hence, the general regime provided for in the GDPR may apply to them. As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and on the legal bases included in Article 6. This may be complicated in practice, given that we are dealing with data that refer to several data subjects, and it may therefore be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. In this contribution, it is shown, however, that none of these objections is of sufficient substance to delegitimize the argument presented. Therefore, the conclusion of this contribution is that we can indeed build a general framework on the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although it is true that some clarifications will be necessary for its practical application.

Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people

Procedia PDF Downloads 142
24430 Long Wavelength GaInNAs Based Hot Electron Light Emission VCSOAs

Authors: Faten Adel Ismael Chaqmaqchee

Abstract:

Optical, electrical, and optical-electrical characterisations of surface-emitting VCSOA devices are reported. The hot electron light emission and lasing in semiconductor heterostructure vertical cavity semiconductor optical amplifier (HELLISH VCSOA) device is a surface emitter based on longitudinal injection of electron and hole pairs into their respective channels. Ga0.35In0.65N0.02As0.08/GaAs was used as the active material for operation in the 1.3 μm window of optical communications. The device has undoped distributed Bragg reflectors (DBRs), and the current is injected longitudinally, directly into the active layers, without involving the DBRs. Therefore, problems associated with refractive index contrast and current injection through the DBR layers, which are common with the doped DBRs in conventional VCSOAs, are avoided. The highest gain, of around 4 dB, is obtained for operation at the 1300 nm wavelength.

Keywords: HELLISH, VCSOA, GaInNAs, luminescence, gain

Procedia PDF Downloads 352
24429 Long Short-Term Memory Neural Networks for Human Driving Behavior Modelling

Authors: Lu Zhao, Nadir Farhi, Yeltsin Valero, Zoi Christoforou, Nadia Haddadou

Abstract:

In this paper, a long short-term memory (LSTM) neural network model is proposed to replicate car-following and lane-changing behaviors simultaneously in road networks. By combining two kinds of LSTM layers and three input designs of the neural network, six variants of the LSTM model have been created. These models were trained and tested on the NGSIM 101 dataset, and the results were evaluated in terms of longitudinal speed and lateral position, respectively. We then compared the LSTM model with a classical car-following model, the intelligent driver model (IDM), for the speed decision part. In addition, the LSTM model is compared with a model using classical neural networks. The comparison shows that the LSTM model achieves higher accuracy than the physical IDM model in terms of car-following behavior and displays better performance with regard to both car-following and lane-changing behavior than the classical neural network model.
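A minimal sketch of an LSTM that maps a short history of driving features to the next longitudinal speed and lateral position; the feature count, hidden size, and toy data are assumptions, and this is not one of the six variants described in the paper.

```python
import torch
import torch.nn as nn

class DrivingLSTM(nn.Module):
    """LSTM over a window of past driving features, predicting the next step."""
    def __init__(self, n_features=4, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)   # [next longitudinal speed, next lateral position]

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # predict from the last time step

# Toy training loop on random data shaped like NGSIM-style sequences.
model = DrivingLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 20, 4)                 # 32 sequences, 20 time steps, 4 features
y = torch.randn(32, 2)
for _ in range(5):
    loss = nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print(loss.item())
```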

Keywords: traffic modeling, neural networks, LSTM, car-following, lane-change

Procedia PDF Downloads 237
24428 Modeling Reflection and Transmission of Elastodiffusive Waves at a Semiconductor Interface

Authors: Amit Sharma, J. N. Sharma

Abstract:

This paper deals with the reflection and transmission characteristics of acoustic waves at the interface of a semiconductor half-space and an elastic solid. The amplitude ratios (reflection and transmission coefficients) of the reflected and transmitted waves to that of the incident wave have been examined as functions of the angle of incidence for the case of a quasi-longitudinal incident wave. The special cases of normal and grazing incidence have also been derived with the help of the Gauss elimination method. The mathematical model, consisting of the governing partial differential equations of motion and of charge carrier diffusion for n-type semiconductors and an elastic solid, has been solved both analytically and numerically. The numerical computation of the reflection and transmission coefficients has been carried out using MATLAB for a silicon (Si) semiconductor and a copper elastic solid. The computer-simulated results have been plotted graphically for the Si semiconductor. The study may be useful in semiconductors, geology, and seismology, in addition to surface acoustic wave (SAW) devices.

Keywords: quasilongitudinal, reflection and transmission, semiconductors, acoustics

Procedia PDF Downloads 382
24427 Internal Corrosion Rupture of a 6-in Gas Line Pipe

Authors: Fadwa Jewilli

Abstract:

A sudden leak in a 6-inch gas line pipe was observed after the pipe had been in service for one year. The pipe had been designed to transport dry gas. The failure took place at the 6 o’clock position at the stage discharge of the flow process. Laboratory investigations were conducted to find the cause of the pipe rupture. Visual and metallographic observations confirmed that the pipe split was due to a crack that initiated in the circumferential direction and then turned into the longitudinal direction. Severe wall thickness reduction was noticed on the internal pipe surface. Scanning electron microscopy observations of the fracture surface revealed features of a ductile fracture mode. Corrosion product analysis showed traces of iron carbonate and iron sulphate. The laboratory analysis led to the conclusion that the pipe failed because wet fluid (condensate) caused severe wall thickness dissolution, so that the pipe could not withstand continued in-service working conditions.

Keywords: gas line pipe, corrosion prediction, ductile fracture, failure analysis

Procedia PDF Downloads 73
24426 Steps towards the Development of National Health Data Standards in Developing Countries

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray

Abstract:

The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and increasing proprietary interests. The government's role in, and support of, standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps required to be undertaken by governments towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the different issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of experts in the areas of data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, in particular in developing countries.

Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia

Procedia PDF Downloads 327
24425 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City

Procedia PDF Downloads 420
24424 The Effect of "Trait" Variance of Personality on Depression: Application of the Trait-State-Occasion Modeling

Authors: Pei-Chen Wu

Abstract:

Both existing cross-sectional and longitudinal studies of the personality-depression relationship have suffered from one main limitation: they ignored that the stability of the constructs of interest (e.g., personality and depression) can be expected to influence the estimate of the association between personality and depression. To address this limitation, Trait-State-Occasion (TSO) modeling was adopted to analyze the sources of variance of the focal constructs. TSO modeling operates by partitioning a state variance into time-invariant (trait) and time-variant (occasion) components. Within a TSO framework, it is possible to predict change in the part of the construct that really changes (i.e., the time-variant variance) while controlling for the trait variance. 750 high school students were followed for 4 waves over six-month intervals. The baseline data (T1) were collected in senior high schools (participants aged 14 to 15 years). Participants completed the Beck Depression Inventory and the Big Five Inventory at each assessment. TSO modeling revealed that 70–78% of the variance in personality (five constructs) was stable over the follow-up period, whereas 57–61% of the variance in depression was stable. For the personality constructs, 7.6% to 8.4% of the total variance came from the autoregressive occasion factors; for the depression construct, 15.2% to 18.1% of the total variance came from the autoregressive occasion factors. Additionally, the results showed that, when controlling for initial symptom severity, the time-invariant components of all five dimensions of personality were predictive of change in depression (Extraversion: B = .32, Openness: B = -.21, Agreeableness: B = -.27, Conscientiousness: B = -.36, Neuroticism: B = .39). Because the five dimensions of personality share some variance, models in which all five dimensions were used simultaneously to predict change in depression were also investigated. The time-invariant components of the five dimensions remained significant predictors of change in depression (Extraversion: B = .30, Openness: B = -.24, Agreeableness: B = -.28, Conscientiousness: B = -.35, Neuroticism: B = .42). In sum, the majority of the variability in personality was stable over 2 years. Individuals with a greater tendency towards Extraversion and Neuroticism have higher degrees of depression; individuals with a greater tendency towards Openness, Agreeableness, and Conscientiousness have lower degrees of depression.
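One common way of writing the trait-state-occasion decomposition referred to above is sketched below; the notation is illustrative and not necessarily the exact specification estimated in the study.

```latex
S_t \;=\; T + O_t, \qquad
O_t \;=\; \beta\, O_{t-1} + \zeta_t, \qquad
Y_{it} \;=\; \lambda_i\, S_t + \varepsilon_{it},
```

where T is the time-invariant trait factor, O_t the autoregressive occasion factor at wave t, and Y_it the i-th indicator at wave t. The "stable" share of variance reported above corresponds to Var(T) / (Var(T) + Var(O_t)).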

Keywords: assessment, depression, personality, trait-state-occasion model

Procedia PDF Downloads 169
24423 Experimental Characterization of Anisotropic Mechanical Properties of Textile Woven Fabric

Authors: Rym Zouari, Sami Ben Amar, Abdelwaheb Dogui

Abstract:

This paper presents an experimental characterization of the anisotropic mechanical behavior of four textile woven fabrics with different weaves (Twill 3, Plain, Twill 4, and Satin 4) by off-axis tensile testing. The tests are applied along seven directions oriented in 15° increments with respect to the warp direction, using both fixed and articulated jaws. The experimental results are analyzed at global (effort/elongation curves) and local scales. Global anisotropy was studied from the effort/elongation curves: shape, breaking load (Frup), tensile elongation (EMT), tensile energy (WT), and linearity index (LT). Local anisotropy was studied from the measurement of the strain tensor components in the central area of the specimen as a function of test orientation and effort: longitudinal strain ɛL, transverse strain ɛT, and shear strain ɛLT. The effect of the jaws used is also analyzed.

Keywords: anisotropy, off-axis tensile test, strain fields, textile woven fabric

Procedia PDF Downloads 347
24422 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations have structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level, these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest for business intelligence, since they are the base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks, and genomics) and facilitate corporate decision making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP, and AUP. It also draws on object-relational and spatial data models, and on the baseline of data modeling under UML and big data, in order to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally incorporates processes for information analysis, visualization, and data mining, particularly for pattern generation and for models derived from the structured fact objects.

Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse

Procedia PDF Downloads 393
24421 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. To address this issue, this paper examines how the various statistical models presented in earlier work respond when their data are mixed with noise, and whether they are applicable in South Korea. Three major types of models are studied; where data are presented in the original papers, we add noise to the data and analyze how the model response changes. Moreover, we generate data from the models in the papers and analyse the effect of noise on them. From this, we can determine the robustness of each model and its applicability in Korea.
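The noise-perturbation idea above can be sketched as follows; a plain regression stands in for the paper's proportional hazard / survival models, and the covariates and coefficients are purely hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical pipe records: age (years) and diameter (mm) driving a deterioration score.
X = np.column_stack([rng.uniform(5, 60, 300), rng.uniform(100, 600, 300)])
y = 0.8 * X[:, 0] - 0.01 * X[:, 1] + rng.normal(0, 2, 300)

def fitted_coefs(X, y):
    """Fit the stand-in model and return its coefficients."""
    return LinearRegression().fit(X, y).coef_

for noise_level in [0.0, 0.05, 0.20]:
    Xn = X * (1 + rng.normal(0, noise_level, X.shape))   # multiplicative measurement noise
    print(noise_level, np.round(fitted_coefs(Xn, y), 3))
# A robust model keeps its coefficients (and hence predictions) stable as the noise grows.
```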

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 730
24420 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

The number of mobile applications is increasing significantly, each addressing the requirements of many users. However, quick development and enhancement result in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. However, it is difficult for the programmer to manually test this issue for all activities. The result is data loss: the data entered by the user are not saved when there is an interruption. This issue can degrade the user experience because the user needs to re-enter the information after each interruption. Automated testing to detect such data loss is important to improve the user experience. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 applications with data loss issues. The approach has proved highly accurate and reliable in finding apps with this defect, and it can be used by Android developers to avoid such errors.

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 224
24419 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to show their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back towards a more rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 403