Search results for: data interpolating empirical orthogonal function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29453

26993 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

The number of mobile applications is growing rapidly, each addressing the requirements of many users. However, rapid development and frequent enhancement leave many underlying defects. Android apps create and handle a large variety of 'instance' data that must persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. It is difficult for a programmer to test this behaviour manually across all activities, so data entered by the user may silently fail to survive an interruption. This degrades the user experience, because the user must re-enter the information after every interruption; automated testing to detect such data loss is therefore important. This research proposes DroidDL, a data-loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with data-loss defects. The approach proved highly accurate and reliable at finding apps with this defect and can be used by Android developers to avoid such errors.
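
A DroidDL-style check can be sketched as a loop that fills an activity's editable fields, forces a stop/kill/restart cycle, and verifies that the entered values survive. The sketch below is illustrative only: the driver object and every method on it are hypothetical stand-ins for a UI-automation backend, since the paper does not publish its API.

    # Illustrative sketch, not DroidDL's actual implementation. `driver`
    # and all of its methods are hypothetical stand-ins for a
    # UI-automation backend.
    PROBE = "droiddl-probe"

    def check_activity_for_data_loss(driver, activity):
        driver.launch(activity)
        for field in driver.find_editable_fields():
            field.set_text(PROBE)            # enter instance data
        driver.press_home()                  # pause / background the app
        driver.kill_app_process()            # simulate the OS killing it
        driver.relaunch(activity)            # restart; state should restore
        return [f.id for f in driver.find_editable_fields()
                if f.get_text() != PROBE]    # non-empty => data-loss defect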

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 223
26992 Similarity Solutions of Nonlinear Stretched Biomagnetic Flow and Heat Transfer with Signum Function and Temperature Power Law Geometries

Authors: M. G. Murtaza, E. E. Tzirtzilakis, M. Ferdows

Abstract:

Biomagnetic fluid dynamics is an interdisciplinary field comprising engineering, medicine, and biology, directed towards finding and developing solutions to some human body-related diseases and disorders. This article describes the flow and heat transfer of a two-dimensional, steady, laminar, viscous, and incompressible biomagnetic fluid over a nonlinear stretching sheet in the presence of a magnetic dipole. Our model treats blood as a biomagnetic fluid, in the framework of biomagnetic fluid dynamics (BFD), and is based on the principles of ferrohydrodynamics (FHD). The temperature at the stretching surface is assumed to follow a power-law variation, and the stretching velocity is assumed to have a nonlinear form involving the signum (sign) function. The governing boundary layer equations with boundary conditions are reduced to coupled higher-order equations using the usual similarity transformations. Numerical solutions for the governing momentum and energy equations are obtained by an efficient numerical technique based on the common finite difference method with central differencing, tridiagonal matrix manipulation, and an iterative procedure. Computations are performed for a wide range of the governing parameters, such as the magnetic field parameter and the temperature power-law exponent, and the effect of these parameters on the velocity and temperature fields is presented. It is observed that increasing the magnetic parameter decreases the velocity distribution while increasing the temperature distribution. The finite difference results for the skin-friction coefficient and the rate of heat transfer are also discussed. The study bears on magnetic targeting: for high targeting efficiency, a high magnetic field is required in the targeted body compartment.
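
The numerical core described above, central differencing leading to a tridiagonal system solved inside an iterative procedure, rests on the Thomas algorithm. A minimal sketch (variable names are illustrative, not the authors' code):

    import numpy as np

    def thomas_solve(a, b, c, d):
        """Solve a tridiagonal system: sub-diagonal a, diagonal b,
        super-diagonal c, right-hand side d (a[0] and c[-1] unused)."""
        n = len(d)
        cp, dp = np.empty(n), np.empty(n)
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):                      # forward elimination
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = np.empty(n)
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):             # back substitution
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x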

Keywords: biomagnetic fluid, FHD, MHD, nonlinear stretching sheet

Procedia PDF Downloads 147
26991 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data, ranging from shifting social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to explain their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 399
26990 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data-processing part is still under development. In this paper, we formulate and solve a data-selection problem that enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data that barely reduce the confidence intervals of the estimated parameters and thus can be neglected. Based on sensitivity analysis, we both solve the problem of optimal data-space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem establishing that the integrated-data approach is less precise than the full-data case is proven; i.e., the data set represented by the FRAP recovery curve leads to larger confidence intervals than the spatio-temporal (full) data.
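
The sensitivity-based reasoning can be sketched as follows: given a sensitivity matrix S (rows = data points, columns = derivatives of the model output with respect to each parameter) and a noise level sigma, the Fisher information matrix determines the asymptotic confidence intervals, and a subset of the data is "irrelevant" if dropping it barely widens them. A minimal sketch with illustrative names, not the paper's code:

    import numpy as np

    def ci_halfwidths(S, sigma, z=1.96):
        fim = S.T @ S / sigma**2           # Fisher information matrix
        cov = np.linalg.inv(fim)           # asymptotic parameter covariance
        return z * np.sqrt(np.diag(cov))   # ~95% confidence half-widths

    def is_irrelevant(S, keep, sigma, tol=1e-3):
        """True if dropping the rows outside `keep` widens no confidence
        interval by more than `tol` -- those rows are an irrelevant set."""
        return np.all(ci_halfwidths(S[keep], sigma)
                      - ci_halfwidths(S, sigma) <= tol)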

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 262
26989 On the Grid Technique by Approximating the Derivatives of the Solution of the Dirichlet Problems for (1+1) Dimensional Linear Schrodinger Equation

Authors: Lawrence A. Farinola

Abstract:

Four-point implicit schemes are constructed for approximating the first and pure second-order derivatives, with respect to the time variable t, of the solution of the Dirichlet problem for the one-dimensional Schrödinger equation. Special four-point implicit difference boundary value problems are also proposed for the first and pure second derivatives of the solution with respect to the spatial variable x, and the grid method is further applied to the mixed second derivative of the solution of the linear time-dependent Schrödinger equation. It is assumed that the initial function belongs to the Hölder space C^(8+α), 0 < α < 1, that the coefficient function given in the Schrödinger equation is from the Hölder space C_(x,t)^(6+α, 3+α/2), that the boundary functions are from C^(4+α), and that conjugation conditions of orders q = 0, 1, 2, 3, 4 are satisfied between the initial and boundary functions. It is proven that the solution of the proposed difference schemes converges uniformly on the grid at the rate O(h² + k), where h is the step size in x and k is the step size in time. Numerical experiments are presented to support the analysis.
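
For orientation, an implicit step of the generic first-order-in-time, second-order-in-space kind, consistent with the O(h² + k) rate proved in the paper, can be sketched as below for the free equation i u_t = -u_xx with Dirichlet boundaries. This is a standard implicit scheme, not the paper's exact four-point scheme.

    import numpy as np
    from scipy.linalg import solve_banded

    def implicit_step(u, h, k):
        """One backward-Euler step for i u_t = -u_xx on a Dirichlet grid."""
        n = len(u)
        r = 1j * k / h**2
        ab = np.zeros((3, n), dtype=complex)   # banded matrix storage
        ab[0, 1:] = -r                         # super-diagonal
        ab[1, :] = 1 + 2 * r                   # main diagonal
        ab[2, :-1] = -r                        # sub-diagonal
        ab[1, 0] = ab[1, -1] = 1.0             # identity rows keep the
        ab[0, 1] = ab[2, -2] = 0.0             # boundary values fixed
        return solve_banded((1, 1), ab, u.copy())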

Keywords: approximation of derivatives, finite difference method, Schrödinger equation, uniform error

Procedia PDF Downloads 110
26988 Representing Data without Loss of Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate the uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those that handle the uncertainty while minimizing the loss of compression properties.

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 417
26987 Haematological Responses to an Amateur Cycling Stage Race

Authors: Renato André S. Silva, Nana L. F. Sampaio, Carlos J. G. Cruz, Bruno Vianna, Flávio O. Pires

Abstract:

Multiple-stage bicycle races impose high physiological loads on professional cyclists. Such demands can lead to immunosuppression and health problems. However, in this type of competition little is known about the physiological effects on amateur athletes, who generally receive less medical support. This study therefore analyzes the hematological effects of a multiple-stage bicycle race on amateur cyclists. Seven Brazilian national amateur cyclists (34 ± 4.21 years) underwent a laboratory test to evaluate VO2max (69.89 ± 7.43 ml·kg⁻¹·min⁻¹). Six days later, these volunteers raced in the Tour of Goiás, riding five races in four days (435 km) of competition. Arterial blood samples were collected one day before and one day after the competition. The Kolmogorov-Smirnov test was used to evaluate the data distribution and the Wilcoxon test to compare the two time points (p < 0.05). The results show: red cells ↓ 7.8% (5.1 ± 0.28 vs 4.7 ± 0.37 × 10⁶/mm³, p = 0.01); hemoglobin ↓ 7.9% (15.1 ± 0.31 vs 13.9 ± 0.27 g/dL, p = 0.01); leukocytes ↑ 9.5% (4946 ± 553 vs 5416 ± 1075/mm³, p = 0.17); platelets ↓ 7.0% (200.2 ± 51.5 vs 186.1 ± 39.5/mm³, p = 0.01); LDH ↑ 11% (164.4 ± 28.5 vs 182.5 ± 20.5 U/L, p = 0.17); CK ↑ 13.5% (290.7 ± 206.1 vs 330.1 ± 90.5 U/L, p = 0.39); CK-MB ↑ 2% (15.7 ± 3.9 vs 20.1 ± 2.9 U/L, p = 0.06); cortisol ↓ 13.5% (12.1 ± 2.4 vs 9.9 ± 1.9 μg/dL, p = 0.01); total testosterone ↓ 7% (453.6 ± 120.1 vs 421.7 ± 74.3 ng/dL, p = 0.12); IGF-1 ↓ 15.1% (213.8 ± 18.8 vs 181.5 ± 34.7 ng/mL, p = 0.04). These results indicate significant reductions in O₂ transport capacity, in platelet-mediated response to vascular injury, and in skeletal muscle anabolism, along with maintenance and/or slight elevation of immune function, glucose and lipid metabolism, and markers of myocardial damage. Overall, the results suggest that no abnormal health effect was identified among the athletes after participating in the Tour of Goiás.
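
The statistical step is a paired pre/post comparison on seven athletes; a minimal sketch with placeholder values (not the study's data):

    import numpy as np
    from scipy import stats

    pre_hb = np.array([15.3, 14.8, 15.0, 15.5, 14.9, 15.2, 15.0])   # g/dL, illustrative
    post_hb = np.array([14.1, 13.7, 13.8, 14.3, 13.6, 14.0, 13.9])

    # Kolmogorov-Smirnov check of the distribution, as in the study
    ks_stat, ks_p = stats.kstest((pre_hb - pre_hb.mean()) / pre_hb.std(), "norm")

    # Wilcoxon signed-rank test comparing the two time points (p < 0.05)
    w, p = stats.wilcoxon(pre_hb, post_hb)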

Keywords: cycling, health effects, cycling stages races, haematology

Procedia PDF Downloads 190
26986 Optimization of Machining Parameters of Wire Electric Discharge Machining (WEDM) of Inconel 625 Super Alloy

Authors: Amitesh Goswami, Vishal Gulati, Annu Yadav

Abstract:

In this paper, WEDM is used to investigate the machining characteristics of Inconel 625 alloy. The machining characteristics, namely material removal rate (MRR) and surface roughness (SR), are investigated, along with microstructure analysis of the machined surface using SEM and compositional analysis using EDS. Taguchi's L27 orthogonal array design is used, considering six input parameters, viz. pulse-on time (Ton), pulse-off time (Toff), spark gap set voltage (SV), peak current (IP), wire feed (WF), and wire tension (WT), for the responses of interest. It is found that pulse-on time (Ton) and spark gap set voltage (SV) are the most significant parameters affecting material removal rate (MRR) and surface roughness (SR). Microstructure analysis of the workpiece was also carried out using a scanning electron microscope (SEM). It was observed that variations in pulse-on time and pulse-off time cause varying discharge energy, as a result of which deep craters, micro-cracks, and larger or smaller amounts of debris are formed. These observations help explain the effects of pulse-on time and pulse-off time on MRR and SR. Energy dispersive spectrometry (EDS) was also performed to check the composition of the material, and it was observed that copper and zinc, which were initially not present in the Inconel 625, migrated to the material surface from the brass wire electrode during machining.
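
The significance ranking in an L27 analysis comes from Taguchi signal-to-noise (S/N) ratios: larger-the-better for MRR and smaller-the-better for SR. A short sketch of these standard formulas:

    import numpy as np

    def sn_larger_is_better(y):            # e.g., for MRR
        y = np.asarray(y, dtype=float)
        return -10 * np.log10(np.mean(1.0 / y**2))

    def sn_smaller_is_better(y):           # e.g., for SR
        y = np.asarray(y, dtype=float)
        return -10 * np.log10(np.mean(y**2))

    # Averaging the S/N ratio over the runs at each level of a factor
    # (e.g., the nine runs at each Ton level) and comparing the spread
    # of the level means ranks the factors by significance.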

Keywords: MRR, SEM, SR, Taguchi method, wire electric discharge machining

Procedia PDF Downloads 338
26985 Data Mining as a Tool for Knowledge Management: A Review

Authors: Maram Saleh

Abstract:

Knowledge has become an essential resource in today's economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through all the knowledge management stages: knowledge creation, knowledge storage, knowledge sharing, and knowledge use. Research on data mining has continued to grow in recent years in both business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown, but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. We identify the relationship between data mining and knowledge management and then introduce some applications of data mining techniques in knowledge management for several real-life domains.

Keywords: data mining, knowledge management, knowledge discovery, knowledge creation

Procedia PDF Downloads 192
26984 PPPs as a Panacea for the Delivery of Public Sector Construction Projects in Zimbabwe

Authors: Ringisai Abigail Mawondo-Dhliwayo, Kahilu Kajimo-Shakantu

Abstract:

Due to the financial challenges that governments in general face, it is becoming more difficult for many to continually use their limited resources to undertake infrastructural development. Governments increasingly need other delivery approaches, in particular public-private partnerships (PPPs), which make it possible for the public sector to achieve infrastructural development with minimal or no upfront cost. The literature reviewed indicates that the benefits of PPPs include the timely delivery of quality projects within cost limits. The empirical study comprised six interviews with, and sixty questionnaires administered to, construction consultants and government officials involved in PPP projects. The results showed that PPPs are not widely used in Zimbabwe, although the need for their use exists. The study also found several challenges that prevent or slow the uptake of PPPs, the primary one being political influence. It is concluded that, despite their limitations, PPPs remain the most effective and viable option for the delivery of government projects. The study recommends that a policy and framework for the implementation of PPPs be developed. More useful information could have been obtained if final users of PPP projects had been included in the sample for data collection.

Keywords: construction projects, procurement, public private partnerships, public sector

Procedia PDF Downloads 240
26983 Anomaly Detection Based on Fuzzy k-Modes Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within statistics since the 1800s, and over time a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies: it is the process of grouping data so that points within a cluster are as similar as possible while points in different clusters are dissimilar. Many traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes; to detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering datasets with categorical values. It is a form of soft clustering in which each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets using the FKM clustering algorithm. A notable feature of the approach is that, in contrast to numerous anomaly detection algorithms, the FKM algorithm determines anomalies together with their degree of abnormality. According to the results, the FKM clustering algorithm performed well in detecting anomalies in data containing either a single anomaly or multiple anomalies.
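
The ingredients can be sketched compactly: simple-matching dissimilarity between categorical records, fuzzy memberships with fuzzifier m, and an abnormality degree. The scoring rule below (one minus the largest membership) is one reasonable choice, not necessarily the paper's exact definition:

    import numpy as np

    def mismatch(record, mode):
        """Simple matching dissimilarity for categorical attributes."""
        return sum(r != m for r, m in zip(record, mode))

    def memberships(record, modes, m=1.5):
        d = np.array([mismatch(record, mo) for mo in modes], dtype=float)
        if np.any(d == 0):                        # record coincides with a mode
            return (d == 0) / np.sum(d == 0)
        w = d ** (-1.0 / (m - 1.0))               # standard fuzzy weighting
        return w / w.sum()

    def abnormality(record, modes, m=1.5):
        return 1.0 - memberships(record, modes, m).max()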

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 40
26982 Systematic Identification and Quantification of Substrate Specificity Determinants in Human Protein Kinases

Authors: Manuel A. Alonso-Tarajano, Roberto Mosca, Patrick Aloy

Abstract:

Protein kinases participate in a myriad of cellular processes of major biomedical interest. The in vivo substrate specificity of these enzymes is determined by several factors and, despite many years of research on the topic, is still far from being totally understood. In the present work, we have quantified the contributions to kinase substrate specificity of (i) the phosphorylation sites and their surrounding residues in the sequence and (ii) the association of kinases with adaptor or scaffold proteins. We used position-specific scoring matrices (PSSMs) to represent the stretches of sequence phosphorylated by 93 families of kinases. We found negative correlations between the number of sequences from which a PSSM is generated and both the statistical significance and the performance of that PSSM. Using a subset of 22 statistically significant PSSMs, we identified specificity-determinant residues (SDRs) for 86% of the corresponding kinase families. Our results suggest that different SDRs can function as positive or negative elements of substrate recognition by the different families of kinases. Additionally, we found that human proteins with known function as adaptors or scaffolds (kAS) tend to interact with a significantly large fraction of the substrates of the kinases to which they associate. Based on this characteristic, we identified a set of 279 potential adaptors/scaffolds (pAS) for human kinases, which is enriched in Pfam domains and functional terms tightly related to the proposed function. Moreover, our results show that for 74.6% of the kinase–pAS associations found, the pAS colocalizes with the substrates of the kinase it is associated with. Finally, we found evidence suggesting that the association of kinases with adaptors and scaffolds may contribute significantly to diminishing the in vivo substrate cross-specificity of protein kinases. In general, our results indicate the relevance of several SDRs for both the positive and negative selection of phosphorylation sites by kinase families and also suggest that the association of kinases with pAS proteins may be an important factor in localizing the enzymes with their sets of substrates.
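
The PSSM construction itself is standard: align equal-length windows centred on the phosphorylation sites and convert position-wise residue frequencies to log-odds scores. A minimal sketch with illustrative defaults (pseudocount, uniform background), not the authors' exact procedure:

    import numpy as np

    AMINO = "ACDEFGHIKLMNPQRSTVWY"

    def build_pssm(sites, pseudocount=1.0, background=0.05):
        """Log-odds PSSM from equal-length phosphosite windows."""
        L = len(sites[0])
        counts = np.full((len(AMINO), L), pseudocount)
        for s in sites:
            for j, aa in enumerate(s):
                if aa in AMINO:
                    counts[AMINO.index(aa), j] += 1
        freqs = counts / counts.sum(axis=0)       # per-position frequencies
        return np.log2(freqs / background)

    def score(pssm, window):
        return sum(pssm[AMINO.index(a), j]
                   for j, a in enumerate(window) if a in AMINO)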

Keywords: kinase, phosphorylation, substrate specificity, adaptors, scaffolds, cellular colocalization

Procedia PDF Downloads 330
26981 Sales-Based Dynamic Investment and Leverage Decisions: A Longitudinal Study

Authors: Rihab Belguith, Fathi Abid

Abstract:

The paper develops a system-based approach to investigate the dynamic adjustment of the debt structure and investment policies of Dow-Jones index firms. This approach enables the assessment of relations among sales, debt, and investment opportunities by considering the simultaneous effect of changes in the market environment and future growth opportunities. We integrate firm-specific sales variance into the model to capture industry conditions. Empirical results were obtained from a panel data set of firms in different sectors. The analysis supports the view that environmental change does not affect all industries equally, since operating leverage, and hence the sensitivity to sales variance, differs among industries. Including the adjusted firm-specific variance, we find that there is no monotonic relation between leverage, sales, and investment. A firm may choose a low debt level in response to high sales variance but high leverage to attenuate the negative relation between sales variance and the current level of investment. We further find that, while the overall effect of debt maturity on leverage is unaffected by the level of growth opportunities, the shorter the maturity of debt, the smaller the direct effect of sales variance on investment.

Keywords: dynamic panel, investment, leverage decision, sales uncertainty

Procedia PDF Downloads 233
26980 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in the Cloud: computing without decrypting the encrypted data. It thereby meets the aspiration for a computational encryption model that enhances the security of big data with respect to privacy, confidentiality, availability, and integrity of the data and of the user. The cryptographic model applied for computation on the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of the high-level computational processes, grounded in number theory and abstract algebra, that can be integrated and leveraged in a cloud computing interface, with detailed mathematical links to fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on cryptographically secure algorithms.
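
The principle of computing on ciphertexts can be illustrated with textbook RSA, which is multiplicatively homomorphic; a fully homomorphic scheme extends this to arbitrary circuits via bootstrapping. The toy parameters below are insecure and purely for demonstration, and this is a sketch of the homomorphic principle, not an FHE implementation:

    # Toy demonstration of the homomorphic principle (textbook RSA).
    n, e, d = 3233, 17, 2753          # n = 61 * 53; insecure toy key pair

    def enc(m):
        return pow(m, e, n)

    def dec(c):
        return pow(c, d, n)

    c1, c2 = enc(7), enc(6)
    c_prod = (c1 * c2) % n            # server computes without decrypting
    assert dec(c_prod) == (7 * 6) % n # Dec(c1 * c2) = m1 * m2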

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 464
26979 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure

Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer

Abstract:

The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed on the basis of video data, with the aim of uncovering so-called "multimodal gestalts": patterns of linguistic and embodied conduct that recur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using video) have so far depended on time- and resource-intensive manual transcription of each component of the video material. Automating these tasks requires advanced programming skills, which are often outside the scope of IL, and the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, suitable for qualitative analysis but insufficient for generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated on one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the current instance of VIAN-DH, we focus on gesture extraction (pointing gestures in particular), making use of existing models created for sign language and adapting them for this purpose. To view and search the data, VIAN-DH will provide a unified format, enable the import of the main existing formats of annotated video data and export to other formats used in the field, and integrate different data source formats so that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search across many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (one that can also extend to other fields using video material). It will allow large amounts of data to be processed automatically and quantitative analyses to be implemented alongside the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken, and grammatical information from videos, correlate those different levels, and perform queries and analyses.

Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition

Procedia PDF Downloads 93
26978 The Digital Living Archive and the Construction of a Participatory Cultural Memory in the DARE-UIA Project: Digital Environment for Collaborative Alliances to Regenerate Urban Ecosystems in Middle-Sized Cities

Authors: Giulia Cardoni, Francesca Fabbrii

Abstract:

Living archives perform a function of social memory sharing, which contributes to building social bonds, communities, and identities. This potential lies in the ability of living archives to combine an archival function, which allows the conservation and transmission of memory, with an artistic, performative, and creative function linked to the present. As part of the DARE-UIA (Digital environment for collaborative alliances to regenerate urban ecosystems in middle-sized cities) project, the creation of a living digital archive made it possible to build a narrative consolidating the cultural memory of the Darsena district of the city of Ravenna. The aim of the project is to stimulate the urban regeneration of a suburban area of the city, enhancing its cultural memory and identity heritage through digital heritage tools. The methodology involves various digital storytelling actions that feed the overall narrative, using georeferencing systems (GIS), story maps, and 3D reconstructions for a transversal narration of historical content, such as personal and institutional historical photos, and for enhancing the industrial archaeology heritage of the neighborhood. The goal is an interactive narrative replicable in contexts similar to the Darsena district. The living archive, which holds all the digital content, manifests itself outwardly as a museum spread throughout the neighborhood: the content can be accessed on smartphones via QR codes and on-site totems, creating thematic itineraries through the district. The construction of an interactive and engaging digital narrative has made it possible to enhance the material and immaterial heritage of the neighborhood by recreating the community that has historically always distinguished it.

Keywords: digital living archive, digital storytelling, GIS, 3D, open-air museum, urban regeneration, cultural memory

Procedia PDF Downloads 91
26977 Decision Quality as an Antecedent to Export Performance: Empirical Evidence under a Contingency Theory Lens

Authors: Evagelos Korobilis-Magas, Adekunle Oke

Abstract:

The constantly increasing tendency towards a global economy, and the subsequent growth in exporting, has inevitably led to a growing interest in the topic of export success. Numerous studies, particularly in the past three decades, have examined a plethora of determinants of export performance. However, to the authors' best knowledge, no study to date has considered decision quality as a potential antecedent of export success by testing the relationship between decision quality and export performance. This is a surprising gap, given that the export marketing literature has long suggested that quality decisions are the crucial intervening variable between sound decision-making and export performance. This study integrates the different definitions of decision quality proposed in the literature, and the key themes incorporated therein, and adapts them to an export context. Apart from laying the conceptual foundations for delineating this elusive but very important construct, this study is the first to test the relationship between decision quality and export performance. Based on survey data from a sample of 189 British export decision-makers, and within a contingency theory framework, the results reveal a direct, positive link between decision quality and export performance. This finding opens significant future research avenues and has important implications for both theory and practice.

Keywords: export performance, decision quality, mixed methods, contingency theory

Procedia PDF Downloads 77
26976 Measurement of Influence of the COVID-19 Pandemic on Efficiency of Japan’s Railway Companies

Authors: Hideaki Endo, Mika Goto

Abstract:

The global outbreak of the COVID-19 pandemic has seriously affected railway businesses. The number of railway passengers decreased as commuters and business travelers avoided crowded trains and the number of inbound tourists visiting Japan dropped sharply. This has affected not only railway businesses but also related businesses, including hotels, leisure businesses, and retail businesses in station buildings. By 2021, the companies had split into profitable and loss-making groups, which suggests that railway companies, particularly the loss-making ones, needed to reduce operational inefficiency. To measure the impact of COVID-19 and discuss sustainable management strategies for railway companies, we examine the cost inefficiency of Japanese listed railway companies by applying stochastic frontier analysis (SFA) to their operational and financial data. First, we employ the stochastic frontier cost function approach to measure inefficiency. The cost frontier is formulated as a Cobb-Douglas function, and we estimate its parameters together with the inefficiency terms. The study uses panel data on 26 Japanese listed railway companies from 2005 to 2020. This period includes several events that deteriorated the business environment, such as the financial crisis of 2007-2008 and the Great East Japan Earthquake of 2011, and we compare their impacts with that of the COVID-19 pandemic after 2020. Second, we identify the characteristics of the best-practice railway companies and examine the drivers of cost inefficiency. Third, we analyze the factors influencing cost inefficiency by comparing the profiles of the top 10 railway companies with the others, before and during the pandemic. Finally, we examine the relationship between cost inefficiency and the efficiency measures implemented by each railway company. We obtained four findings. First, most Japanese railway companies showed their lowest cost inefficiency (most efficient) in 2014 and their highest (least efficient) in 2020, during the COVID-19 pandemic; the second-worst year was 2009, reflecting the financial crisis. However, we did not observe a significant impact of the 2011 Great East Japan Earthquake, because no railway company except JR-EAST was affected by the earthquake in its operating area. Second, the best-practice railway companies are KEIO and TOKYU; the main reason for their good performance is that both operate in or near the densely populated Tokyo metropolitan area. Third, non-best-practice companies experienced a larger decrease in passenger-kilometers than best-practice companies, indicating that passengers made fewer long-distance trips as they refrained from inter-prefectural travel during the pandemic. Finally, companies that implemented more efficiency-improvement measures had higher cost efficiency and used their customer databases effectively through proactive DX (digital transformation) investments in marketing and asset management.
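
For reference, a Cobb-Douglas stochastic cost frontier of the kind estimated here can be written (with illustrative notation, not the authors' exact specification) as

    ln C_it = β₀ + Σ_j β_j ln y_jit + Σ_k γ_k ln w_kit + v_it + u_it,   u_it ≥ 0,

where C_it is the operating cost of company i in year t, y the outputs, w the input prices, v_it symmetric noise, and u_it the one-sided term measuring cost inefficiency.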

Keywords: COVID-19 pandemic, stochastic frontier analysis, railway sector, cost efficiency

Procedia PDF Downloads 59
26975 Design and Optimization for a Compliant Gripper with Force Regulation Mechanism

Authors: Nhat Linh Ho, Thanh-Phong Dao, Shyh-Chour Huang, Hieu Giang Le

Abstract:

This paper presents the design and optimization of a compliant gripper. The gripper is constructed based on the concept of a compliant mechanism with flexure hinges. A passive force regulation mechanism is presented to control the grasping force on a micro-sized object instead of using a force sensor. The force regulation mechanism is designed using planar springs. The gripper is expected to achieve a large range of displacement in order to handle objects of various sizes. First, the statics and dynamics of the gripper are investigated using finite element analysis in ANSYS. Then, the design parameters of the gripper are optimized via the Taguchi method: an L9 orthogonal array is used to establish the experimental matrix, and the signal-to-noise ratio is analyzed to find the optimal solution. Finally, response surface methodology is employed to model the relationship between the design parameters and the output displacement of the gripper, and the design-of-experiments method is used to analyze the sensitivity and determine the effect of each parameter on the displacement. The results showed that the compliant gripper can move with a large displacement of 213.51 mm, and the force regulation mechanism is expected to be useful for high-precision positioning systems.
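
The response-surface step amounts to fitting a quadratic model to the nine L9 runs by least squares. A sketch with coded factor levels and placeholder responses (not the paper's data); interaction terms are omitted so that the model stays identifiable from nine runs:

    import numpy as np

    # Standard L9 design: three factors coded at levels -1, 0, 1
    X9 = np.array([[-1, -1, -1], [-1, 0, 0], [-1, 1, 1],
                   [ 0, -1, 0], [ 0, 0, 1], [ 0, 1, -1],
                   [ 1, -1, 1], [ 1, 0, -1], [ 1, 1, 0]])
    y9 = np.array([162., 171., 180., 175., 190., 168., 205., 185., 198.])  # placeholders

    def design_matrix(X):
        x1, x2, x3 = X.T                 # main effects + pure quadratics
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1**2, x2**2, x3**2])

    coef, *_ = np.linalg.lstsq(design_matrix(X9), y9, rcond=None)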

Keywords: flexure hinge, compliant mechanism, compliant gripper, force regulation mechanism, Taguchi method, response surface methodology, design of experiment

Procedia PDF Downloads 313
26974 The Use of Support Vector Machines and Back-Propagation Neural Networks for the Prediction of Daily Tidal Levels along the Jeddah Coast, Saudi Arabia

Authors: E. A. Mlybari, M. S. Elbisy, A. H. Alshahri, O. M. Albarakati

Abstract:

Sea level rise threatens to increase the impact of future storms and hurricanes on coastal communities. Accurate prediction and supplementation of sea level records is an important task for planning construction and human activities in coastal and oceanic areas. In this study, support vector machines (SVMs) are proposed to predict daily tidal levels along the Jeddah Coast, Saudi Arabia. The optimal parameter values of the kernel function are determined using a genetic algorithm. The SVM results are compared with the field data and with a back-propagation neural network (BPNN). Among the models, the SVM is superior to the BPNN and has better generalization performance.
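
The regression step can be sketched with lagged tidal levels as features. The paper tunes the kernel parameters with a genetic algorithm; the sketch below substitutes a plain grid search as a stand-in, and the synthetic series is a placeholder for the Jeddah tide-gauge data:

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import GridSearchCV

    rng = np.random.default_rng(42)
    levels = np.sin(0.5 * np.arange(400)) + rng.normal(0, 0.05, 400)  # placeholder tides

    X = np.column_stack([levels[i:i - 3] for i in range(3)])  # 3 lagged values
    y = levels[3:]                                            # next-day level

    search = GridSearchCV(SVR(kernel="rbf"),
                          {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0],
                           "epsilon": [0.01, 0.1]}, cv=5)
    search.fit(X, y)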

Keywords: tides, prediction, support vector machines, genetic algorithm, back-propagation neural network, risk, hazards

Procedia PDF Downloads 452
26973 Composing Method of Decision-Making Function for Construction Management Using Active 4D/5D/6D Objects

Authors: Hyeon-Seung Kim, Sang-Mi Park, Sun-Ju Han, Leen-Seok Kang

Abstract:

As BIM (Building Information Modeling) application continually expands, the visual simulation techniques used for facility design and construction process information are becoming increasingly advanced and diverse. For building structures, BIM application is design-oriented, utilizing 3D objects for conflict management, whereas for civil engineering structures, the usability of nD object-oriented construction stage simulation is important in construction management. Simulations of 5D and 6D objects, in which cost and resources are linked to the process simulation of 4D objects, are commonly used, but they do not provide a decision-making function for process management problems that occur on site because they mostly focus on the visual representation of the current status of process information. In this study, an nD CAD system is constructed that facilitates an optimized schedule simulation that minimizes process conflict, a construction duration reduction simulation according to execution progress status, an optimized process plan simulation according to project cost change by year, and an optimized resource simulation for field resource mobilization capability. Through this system, the usability of conventional simple simulation objects is expanded to that of active simulation objects with which decision-making is possible. Furthermore, to close the gap between field process situations and planned 4D process objects, a technique is developed to facilitate a comparative simulation through the coordinated synchronization of an actual video object acquired by an on-site web camera and a VR-concept 4D object. This synchronization and simulation technique can also be applied to smartphone video objects captured in the field in order to increase the usability of the 4D object. Because yearly project costs change frequently in civil engineering construction, the annual process plan should be recomposed appropriately according to project cost decreases/increases compared with the plan. In the 5D CAD system provided in this study, an active 5D object utilization concept is introduced to perform a simulation in an optimized process planning state, by finding a process optimized for the changed project cost without changing the construction duration, through a technique such as a genetic algorithm. Furthermore, in resource management, an active 6D object utilization function is introduced that can analyze and simulate an optimized process plan within the possible scope of moving resources, by considering those resources that can be moved under a given field condition, instead of using a simple resource change simulation by schedule. The introduction of active BIM functions is expected to increase the field utilization of conventional nD objects.

Keywords: 4D, 5D, 6D, active BIM

Procedia PDF Downloads 261
26972 Bright, Dark N-Soliton Solution of Fokas-Lenells Equation Using Hirota Bilinearization Method

Authors: Sagardeep Talukdar, Riki Dutta, Gautam Kumar Saharia, Sudipta Nandy

Abstract:

In nonlinear optics, the Fokas-Lenells equation (FLE) is a well-known integrable equation that describes how ultrashort pulses propagate through optical fiber. Like any other integrable equation, it admits localized wave solutions. We apply the Hirota bilinearization method to obtain soliton solutions of the FLE; the proposed bilinearization makes use of an auxiliary function. We first apply the method to the FLE with a vanishing boundary condition to obtain bright soliton solutions. We obtain bright 1-soliton and 2-soliton solutions and propose a scheme for obtaining the N-soliton solution; an additional parameter controls the shift in the position of the soliton. Further analysis of the 2-soliton solution is carried out by asymptotic analysis. For the non-vanishing boundary condition, we obtain the dark 1-soliton solution. We find that the suggested bilinearization approach, which makes use of the auxiliary function, greatly simplifies the process while still producing the desired outcome. We believe that the present analysis will be helpful in understanding the use of the FLE in nonlinear optics and other areas of physics.
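
For reference, one commonly cited form of the FLE in fibre optics is

    i u_t − ν u_tx + γ u_xx + σ |u|² (u + i ν u_x) = 0,   σ = ±1,

where u(x, t) is the pulse envelope; the exact normalization used in the paper may differ. In the Hirota method, the dependent variable is written as a ratio of functions (here supplemented by an auxiliary function) whose bilinear equations are then solved order by order in a formal expansion parameter.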

Keywords: asymptotic analysis, Fokas-Lenells equation, Hirota bilinearization method, soliton

Procedia PDF Downloads 94
26971 Crude Oil and Stock Markets: Prices and Uncertainty Transmission Analysis

Authors: Kamel Malik Bensafta, Gervasio Semedo

Abstract:

The purpose of this paper is to investigate the relationship between oil prices and stock markets. The empirical analysis is conducted within the context of multivariate GARCH models, using a transformed version of the so-called BEKK parameterization. We show that the mean and uncertainty of the US market are transmitted to the oil market and the European market. We also identify an important transmission from WTI prices to Brent prices.
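
For reference, the standard bivariate BEKK(1,1) recursion underlying such spillover analyses (the paper uses a transformed version of this parameterization) is

    H_t = C'C + A' ε_{t−1} ε'_{t−1} A + B' H_{t−1} B,

where H_t is the conditional covariance matrix of the oil and stock returns, C is upper triangular, and the off-diagonal elements of A and B capture shock and volatility transmission between the two markets.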

Keywords: oil volatility, stock markets, MGARCH, transmission, structural break

Procedia PDF Downloads 510
26970 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

This research aims to approximate the amount of daily rainfall by using a pixel-value data approach. Daily rainfall maps from the Thailand Meteorological Department, covering January to December 2013, were used as the data in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
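
The pixel-value idea can be sketched as a nearest-colour lookup from the map legend followed by an RMSE check against gauge observations; the legend values below are placeholders, not the Thai Meteorological Department's scale:

    import numpy as np

    legend = {(0, 0, 255): 5.0, (0, 255, 0): 20.0,
              (255, 255, 0): 50.0, (255, 0, 0): 90.0}   # RGB -> mm/day (illustrative)

    def pixel_to_rain(rgb):
        colours = np.array(list(legend))
        nearest = np.argmin(np.linalg.norm(colours - np.array(rgb), axis=1))
        return list(legend.values())[int(nearest)]      # nearest legend colour

    def rmse(pred, obs):
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        return float(np.sqrt(np.mean((pred - obs) ** 2)))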

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 379
26969 Cultural Transformation in Interior Design in Commercial Space in India

Authors: Siddhi Pedamkar, Reenu Singh

Abstract:

This report examines how culture transforms from one era to another in commercial space. This transformation is observed in commercial as well as residential spaces: spaces have specific colour concepts, surface detailing, furniture, and function-specific layouts. Yet the cultural impact is rarely visible in commercial spaces, mostly because the interior is driven by function to a large extent. Information was collected from books and research papers, and a quantitative survey was conducted to understand people's perceptions of the impact of culture on design entities and of how culture dictates different types of space and their character. The survey also highlights the impact of interior lighting types, colour schemes, and furniture types on the interior environment. The questionnaire survey helped in framing design parameters for contemporary interior design, which are used to propose design options for new-age furniture suitable for co-working spaces. For new, contemporary working spaces, new-age furniture and interior elements such as visual partitions, semi-visual partitions, lighting, and layout can be transformed by cultural changes in the working styles of people and organizations.

Keywords: commercial space, culture, environment, furniture, interior

Procedia PDF Downloads 93
26968 Acoustic Blood Plasmapheresis in Polymeric Resonators

Authors: Itziar Gonzalez, Pilar Carreras, Alberto Pinto, Roque Ruben Andres

Abstract:

Acoustophoretic separation of plasma from blood is based on a collection process of the blood cells driven by an acoustic radiation force. The number of cells, their concentration, and the sample hydrodynamics are involved in this process, but their influence on the acoustic response of blood has not yet been reported in the literature. Addressing this, the paper presents an experimental study of blood samples exposed to ultrasonic standing waves at different hematocrit levels and hydrodynamic conditions. The experiments were performed in a glass capillary (700 µm square cross-section) actuated by a piezoelectric ceramic at 1 MHz, hosting 2D orthogonal half-wavelength resonances transverse to the channel length, with a single pressure node along the central axis where the cells collect under the acoustic radiation force. Four blood dilutions in PBS, of 1:20, 1:10, 1:5, and 1:2, were tested at eight flow rates, Q = 0-120 µL/min. The 1:5 dilution (H = 9%) proved optimal for plasmapheresis at all the flow rates analyzed, requiring the shortest times to obtain cell-free plasma. The study opens new possibilities for optimizing plasmapheresis by ultrasound at different hematocrit conditions in future personalized diagnoses/treatments involving blood samples.
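
The collection mechanism is governed by the standard expression for the acoustic radiation force on a small spherical particle in a 1D standing wave, given here for orientation (the paper's 2D resonance adds a second, orthogonal term of the same form):

    F_rad = 4π Φ(κ̃, ρ̃) k a³ E_ac sin(2kx),   Φ = (1/3) [ (5ρ̃ − 2)/(2ρ̃ + 1) − κ̃ ],

where a is the cell radius, k the wavenumber, E_ac the acoustic energy density, and ρ̃, κ̃ the cell-to-medium density and compressibility ratios; blood cells have Φ > 0 and therefore migrate to the pressure node on the channel axis.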

Keywords: ultrasounds, microfluidics, flow rate, acoustophoresis, polymeric resonators

Procedia PDF Downloads 123
26967 The Effect of Measurement Distribution on System Identification and the Detection of Nonlinear Behavior in Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from dynamical systems. We investigate different distributions of the output measurements of several dynamical systems. By processing the variance of the experimental data, we locate the region of nonlinearity in the data; identification of the output section is then applied under different conditions and data distributions. Finally, the effect of the spread of the measurements (such as their variance) on identification, and the limitations of this approach, are explained.

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 497
26966 Optimization of Fermentation Parameters for Bioethanol Production from Waste Glycerol by Microwave Induced Mutant Escherichia coli EC-MW (ATCC 11105)

Authors: Refal Hussain, Saifuddin M. Nomanbhay

Abstract:

Glycerol is a valuable raw material for the production of industrially useful metabolites. Among the many promising applications for glycerol is its bioconversion to high value-added compounds, such as bioethanol, through microbial fermentation. Bioethanol is an important industrial chemical with emerging potential as a biofuel to replace vanishing fossil fuels. The yield of liquid fuel in this process is greatly influenced by various parameters, viz. temperature, pH, glycerol concentration, organic source concentration, and agitation speed, all of which were considered here. The present study investigates the optimum parameters for bioethanol production from raw glycerol by a mutant Escherichia coli (E. coli) (ATCC 11105) strain immobilized on glutaraldehyde-crosslinked chitosan, optimized by the Taguchi statistical method in shake flasks. Each of the parameters was set at four levels, and an L16 (4⁵) orthogonal array layout was used. The optimized operational fermentation parameters were a temperature of 38 °C, medium pH of 6.5, initial glycerol concentration of 250 g/l, and organic source concentration of 5 g/l. Fermentation with the optimized parameters was carried out in a custom-fabricated shake flask. The predicted bioethanol production under optimized conditions was 118.13 g/l. Immobilized cells are used mainly for the economic benefits of continuous production or repeated use in continuous as well as batch mode.

Keywords: bioethanol, Escherichia coli, immobilization, optimization

Procedia PDF Downloads 638
26965 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premises or in the cloud, users can leverage the power of R at scale without having to move their data around.

Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R

Procedia PDF Downloads 361
26964 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand presents different aspects of the concept of trust and describes the information asymmetry among the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors related to trust, problematic aspects of the current approach are examined in interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and point to potential research areas.

Keywords: trust, data mining, CRISP-DM, stakeholder management

Procedia PDF Downloads 83