Search results for: cointegration approach in panel data

33692 Development of a Data Security Model Using Steganography

Authors: Terungwa Simon Yange, Agana Moses A.

Abstract:

This paper studied steganography and designed a simple steganographic tool for hiding information in image files, with the aim of addressing data security challenges by concealing data from unauthorized users. The Structured Systems Analysis and Design Method (SSADM) was used in this work. The system was developed using the Java Development Kit (JDK) 1.7.0_10 with MySQL Server as its backend. It was tested with hypothetical health records, which demonstrated that data can be protected from unauthorized users by concealing it so that its existence cannot easily be recognized by fraudulent users. It further strengthens the confidentiality of patient records kept by medical practitioners in health settings. In conclusion, this work produced a user-friendly steganography tool that is fast to install and easy to operate, ensuring the privacy and secrecy of sensitive data. The image carrying the secret message was indistinguishable from the original image when the two were compared.
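
The abstract does not state which embedding technique the tool uses; as an illustration only, the sketch below shows least-significant-bit (LSB) hiding of a text message in an image with Pillow. The function name, the NUL-terminator convention and the use of LSB itself are assumptions made for the example, not details taken from the paper.

```python
# Illustrative sketch only: least-significant-bit (LSB) embedding with Pillow.
# The paper does not state its embedding method; LSB is assumed here.
from PIL import Image

def embed_message(cover_path, stego_path, message):
    """Hide a UTF-8 message in the lowest bit of each pixel channel."""
    img = Image.open(cover_path).convert("RGB")
    bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))
    bits += "0" * 8  # NUL terminator so an extractor knows where to stop
    pixels = list(img.getdata())
    if len(bits) > len(pixels) * 3:
        raise ValueError("Message too long for this cover image")
    flat = [channel for px in pixels for channel in px]
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | int(bit)  # overwrite the lowest bit only
    stego = Image.new("RGB", img.size)
    stego.putdata(list(zip(flat[0::3], flat[1::3], flat[2::3])))
    stego.save(stego_path)
```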

Keywords: steganography, cryptography, encryption, decryption, secrecy

Procedia PDF Downloads 267
33691 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes

Authors: Nadarajah I. Ramesh

Abstract:

Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent developments on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data combines an hourly stochastic rainfall simulator with a disaggregator that makes use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). Since the likelihood function of this process can be obtained by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach is that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
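
To illustrate the DSPP idea (not the authors' fitted model), the sketch below simulates bucket-tip times from a two-state Markov-modulated Poisson process: the hidden Markov state alternates between a light-rain and a heavy-rain regime, each with its own Poisson tip rate. All rate values are arbitrary placeholders.

```python
# Illustrative sketch: simulate a two-state Markov-modulated Poisson process,
# a simple doubly stochastic Poisson process. Rate values are arbitrary.
import numpy as np

rng = np.random.default_rng(42)

def simulate_mmpp(T, switch_rates=(0.2, 1.0), event_rates=(0.05, 4.0)):
    """Return event (tip) times on [0, T] for a 2-state MMPP.

    switch_rates[i]: rate of leaving state i (per hour)
    event_rates[i]:  Poisson rate of bucket tips while in state i (per hour)
    """
    t, state, times = 0.0, 0, []
    while t < T:
        sojourn = rng.exponential(1.0 / switch_rates[state])  # time spent in this state
        end = min(t + sojourn, T)
        n = rng.poisson(event_rates[state] * (end - t))        # tips during the sojourn
        times.extend(rng.uniform(t, end, size=n))              # uniform given the count
        t, state = end, 1 - state                              # switch regime
    return np.sort(np.array(times))

tips = simulate_mmpp(T=240.0)  # ten days at hourly time units
print(f"{tips.size} simulated bucket tips")
```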

Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model

Procedia PDF Downloads 279
33690 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach

Authors: Kanika Gupta, Ashok Kumar

Abstract:

Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms show high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem in healthcare and other industries dealing with microorganisms. The information, both stated and hidden, in the biofilm literature is growing exponentially; it is therefore not possible for researchers and practitioners to extract and relate information from different written resources. The current work proposes and discusses the use of text mining techniques for the extraction of information from a biofilm literature corpus containing 34306 documents. It is very difficult and expensive to obtain annotated material for biomedical literature, as the literature is unstructured, i.e. free text. We therefore adopted an unsupervised approach, in which no annotated training data are necessary, and developed a system that classifies the text on the basis of growth and development, drug effects, radiation effects, classification and physiology of biofilms. A two-step structure was used: the first step extracts keywords from the biofilm literature using a metathesaurus and standard natural language processing tools such as RapidMiner v5.3, and the second step discovers relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR v1.0.11. Unsupervised learning, the machine learning task of inferring a function to describe hidden structure in unlabeled data, was applied to the extracted datasets to develop classifiers using WinPython 64-bit v3.5.4.0Qt5 and RStudio v0.99.467, which automatically classify the text into the mentioned categories. The developed classifiers were tested on a large set of biofilm literature, which showed that the proposed unsupervised approach is promising and well suited for semi-automatic labeling of the extracted relations. The entire information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their searches easy and efficient, as the keywords and genes can be mapped directly to the documents used for database development.
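
The study's pipeline uses RapidMiner and pubmed.mineR; as a stand-in illustration of the unsupervised grouping step only, the sketch below clusters a few placeholder abstracts with TF-IDF features and k-means in scikit-learn. The tiny corpus, the number of clusters and the library choice are assumptions made for the example.

```python
# Illustrative stand-in (not the authors' RapidMiner / pubmed.mineR pipeline):
# unsupervised grouping of abstracts into topical clusters with TF-IDF + k-means.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "biofilm growth and development on medical implants",
    "antibiotic drug effects on mature biofilms",
    "radiation effects on biofilm matrix structure",
    "physiology of extracellular polymeric substances in biofilms",
]  # placeholder corpus; the study used 34306 documents

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(documents)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for doc, label in zip(documents, kmeans.labels_):
    print(label, doc[:50])
```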

Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database

Procedia PDF Downloads 172
33689 Labour Standards and Bilateral Migration Flows in ASEAN

Authors: Rusmawati Said, N. Kar Yee, Asmaddy Haris

Abstract:

This study employs a panel data set of ASEAN member states, 17 European Union (EU) countries, 7 American countries and 11 other Asia Pacific countries (Mainland China and Hong Kong SAR are treated as two separate countries) to investigate the role of labour standards in explaining the pattern of bilateral migration flows in ASEAN. Using pooled Ordinary Least Squares (OLS), this study found mixed results, which vary with the indicators used to measure the level of labour standards in the empirical analysis. On one side, better labour standards (represented by the number of strikes and weekly average working hours) promote bilateral migration among the selected countries. On the other side, an increase in cases of occupational injuries leads to an increase in bilateral migration, indicating that worsening working conditions do not deter workers from moving. These findings are important for policy makers, as the issue of massive flows of low-skilled workers has a significant impact on the role of labour standards in shaping migration flows.
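
For readers unfamiliar with the estimator, the sketch below shows a pooled OLS specification in statsmodels. The data are randomly generated and the variable names (strikes, weekly_hours, occupational_injuries, wage_gap, log_migration) merely echo the indicators mentioned in the abstract; the paper's actual regressors and specification may differ.

```python
# Illustrative sketch of a pooled OLS specification on synthetic panel data;
# variable names mirror the abstract but the data here are randomly generated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # origin-destination-year observations
panel = pd.DataFrame({
    "strikes": rng.poisson(3, n),
    "weekly_hours": rng.normal(45, 5, n),
    "occupational_injuries": rng.poisson(10, n),
    "wage_gap": rng.normal(0, 1, n),
})
# synthetic bilateral migration flow (log), for demonstration only
panel["log_migration"] = (0.1 * panel["strikes"] - 0.02 * panel["weekly_hours"]
                          + 0.03 * panel["occupational_injuries"]
                          + 0.5 * panel["wage_gap"] + rng.normal(0, 0.5, n))

X = sm.add_constant(panel[["strikes", "weekly_hours",
                           "occupational_injuries", "wage_gap"]])
pooled_ols = sm.OLS(panel["log_migration"], X).fit()
print(pooled_ols.summary())
```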

Keywords: labour standard, migration, ASEAN, economics and financial engineering

Procedia PDF Downloads 415
33688 Hotel and Service Industry in USA: Is It Leveraged? Case Study of Seven Important Hotel Chains

Authors: Azadeh Shahbazi

Abstract:

This study tries to identify the determinants of capital structure in the hotel industry, using seven important hotel chains in the USA over the period 2000 to 2012. The study uses a pooled panel regression to examine the relations among the variables. Results show that the variables that drive changes in the capital structure of these firms are the non-debt tax shield and tangibility.

Keywords: capital structure, service industry, hospitality, finance

Procedia PDF Downloads 471
33687 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.

Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making

Procedia PDF Downloads 75
33686 A Two-Step Framework for Unsupervised Speaker Segmentation Using BIC and Artificial Neural Network

Authors: Ahmad Alwosheel, Ahmed Alqaraawi

Abstract:

This work proposes a new speaker segmentation approach for two speakers. It is an online approach that does not require prior information about speaker models. It has two phases: a conventional unsupervised BIC-based approach is used in the first phase to detect speaker changes and train a neural network, while in the second phase the trained parameters output by the neural network are used to predict speaker changes in the next incoming audio stream. Using this approach, accuracy comparable to similar BIC-based approaches is achieved with a significant improvement in computation time.
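
The first phase relies on the standard delta-BIC statistic for speaker change detection; a minimal numpy sketch of that statistic is given below. The penalty weight lambda_, the use of full-covariance Gaussians and the feature choice (e.g. MFCC frames) are assumptions of the sketch, not details from the paper.

```python
# Illustrative sketch of the delta-BIC statistic used in unsupervised BIC-based
# speaker change detection; lambda_ is a tunable penalty weight (assumed here).
import numpy as np

def delta_bic(window, split, lambda_=1.0):
    """window: (N, d) array of acoustic feature frames (e.g. MFCCs).
    Returns delta-BIC for a speaker change at frame index `split`;
    positive values favour the two-speaker hypothesis.
    `split` must leave enough frames on each side for a full-rank covariance."""
    def logdet_cov(x):
        sign, logdet = np.linalg.slogdet(np.cov(x, rowvar=False))
        return logdet

    n, d = window.shape
    n1, n2 = split, n - split
    penalty = 0.5 * (d + 0.5 * d * (d + 1)) * np.log(n)  # extra parameters of 2 models
    return (0.5 * n * logdet_cov(window)
            - 0.5 * n1 * logdet_cov(window[:split])
            - 0.5 * n2 * logdet_cov(window[split:])
            - lambda_ * penalty)
```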

Keywords: artificial neural network, diarization, speaker indexing, speaker segmentation

Procedia PDF Downloads 505
33685 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand

Authors: Manit Pollar

Abstract:

Identifying dengue epidemic periods early would be helpful for taking the necessary actions to prevent dengue outbreaks. An accurate prediction of dengue epidemic seasons allows local authorities sufficient time to take the necessary decisions and actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand, using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the monthly data collected between 2003 and 2011 and validated the models using data collected between January and September 2012. The results revealed that the SARIMA(1,1,0)(1,2,1)12 model closely described the trends and seasonality of dengue incidence and confirmed the presence of dengue fever cases in Nakhon Si Thammarat for the years 2003-2011. The study showed that the one-step-ahead approach to predicting dengue incidences provided significantly more accurate predictions than the twelve-step-ahead approach. The model, even if based purely on statistical data analysis, can provide a useful basis for the allocation of resources for disease prevention.
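
For reference, the reported SARIMA(1,1,0)(1,2,1)12 specification can be fitted with statsmodels as sketched below. The monthly series here is synthetic and stands in for the 2003-2011 case counts, which are not reproduced in the abstract.

```python
# Illustrative sketch: fitting the reported SARIMA(1,1,0)(1,2,1)12 specification
# with statsmodels; `monthly_cases` is a synthetic stand-in for the 2003-2011 data.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
index = pd.date_range("2003-01", periods=108, freq="MS")        # 2003-2011, monthly
seasonal = 50 + 30 * np.sin(2 * np.pi * index.month / 12)
monthly_cases = pd.Series(rng.poisson(seasonal), index=index)   # synthetic counts

model = SARIMAX(monthly_cases, order=(1, 1, 0), seasonal_order=(1, 2, 1, 12))
fit = model.fit(disp=False)
print(fit.summary())

# forecast the next 9 months (a validation horizon like January-September 2012)
print(fit.forecast(steps=9))
```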

Keywords: SARIMA, time series model, dengue cases, Thailand

Procedia PDF Downloads 360
33684 Recent Advances in Data Warehouse

Authors: Fahad Hanash Alzahrani

Abstract:

This paper describes some recent advances in the quickly developing area of data storage and processing based on data warehouses and data mining techniques, covering the software, hardware, data mining algorithms and visualisation techniques that share common features across the specific problems and tasks of their implementation.

Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing

Procedia PDF Downloads 404
33683 Optical Multicast over OBS Networks: An Approach Based on Code-Words and Tunable Decoders

Authors: Maha Sliti, Walid Abdallah, Noureddine Boudriga

Abstract:

In this work, we present an optical multicasting approach based on optical code-words. Our approach associates, in the edge node, an optical code-word with a group multicast address. In the core node, a set of tunable decoders is used to send traffic data to multiple destinations based on the received code-word. The use of code-words, which correspond to the combination of an input port and a set of output ports, allows the implementation of an optical switching matrix. At the reception of a burst, it is delayed in an optical memory, and the received optical code-word is split across a set of tunable optical decoders. When it matches a configured code-word, the delayed burst is switched to the corresponding set of output ports.

Keywords: optical multicast, optical burst switching networks, optical code-words, tunable decoder, virtual optical memory

Procedia PDF Downloads 608
33682 How to Use Big Data in Logistics Issues

Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy

Abstract:

Big Data stands for today’s cutting-edge technology. As the technology becomes widespread, so does data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Among the many areas of Big Data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what big data is and how it is used in both military and commercial logistics.

Keywords: big data, logistics, operational efficiency, risk management

Procedia PDF Downloads 642
33681 Landscape Classification in North of Jordan by Integrated Approach of Remote Sensing and Geographic Information Systems

Authors: Taleb Odeh, Nizar Abu-Jaber, Nour Khries

Abstract:

The southern part of the Wadi Al Yarmouk catchment area covers the north of Jordan. It lies between latitudes 32° 20’ to 32° 45’ N and longitudes 35° 42’ to 36° 23’ E and has an area of about 1426 km2. It has high-relief topography, with elevations varying between 50 and 1100 meters above sea level. The variations in topography cause different units of landforms, climatic zones, land covers and plant species, and as a result different landscape units exist in the region. Spatial planning is a major challenge in such a vital area for Jordan and could not be achieved without determining landscape units. An integrated approach of remote sensing and geographic information systems (GIS) is a well-suited tool to investigate and map the landscape units of such a complex area. Remote sensing has the capability to collect different land surface data over large landscape areas, accurately and across different time periods. GIS has the ability to store these land surface data, analyze them spatially and present them in the form of professional maps. We generated geo-referenced land surface data that include land cover, rock units, soil units, plant species and a digital elevation model using an ASTER image and Google Earth, while spatial analysis of the geo-data was done with ArcGIS 10.2 software. We found that there are twenty-two different landscape units in the study area, which have to be considered in any spatial planning in order to avoid environmental problems.

Keywords: landscape, spatial planning, GIS, spatial analysis, remote sensing

Procedia PDF Downloads 529
33680 Surveillance of Super-Extended Objects: Bimodal Approach

Authors: Andrey V. Timofeev, Dmitry Egorov

Abstract:

This paper describes an effective solution to the task of remote monitoring of super-extended objects (oil and gas pipelines, railways, national frontiers). The suggested solution is based on the principle of simultaneously monitoring the seismoacoustic and optical/infrared physical fields. The principle of simultaneous monitoring of those fields is not new, but in contrast to known solutions the suggested approach allows super-extended objects to be controlled at very limited operational cost. So-called C-OTDR (Coherent Optical Time Domain Reflectometer) systems are used to monitor the seismoacoustic field, and far-CCTV systems are used to monitor the optical/infrared field. Simultaneous processing of the data provided by both systems allows effective detection and classification of target activities that appear in the monitored object's vicinity. The results of practical usage have shown the high effectiveness of the suggested approach.

Keywords: C-OTDR monitoring system, bimodal processing, LPboost, SVM

Procedia PDF Downloads 471
33679 Generalized Vortex Lattice Method for Predicting Characteristics of Wings with Flap and Aileron Deflection

Authors: Mondher Yahyaoui

Abstract:

A generalized vortex lattice method for complex lifting surfaces with flap and aileron deflection is formulated. The method is not restricted by the linearized theory assumption and accounts for all standard geometric lifting surface parameters: camber, taper, sweep, washout and dihedral, in addition to flap and aileron deflection. Thickness is not accounted for, since the physical lifting body is replaced by a lattice of panels located on the mean camber surface. This panel lattice setup and the treatment of different wake geometries are what distinguish the present work from the overwhelming majority of previous solutions based on the vortex lattice method. A MATLAB code implementing the proposed formulation is developed and validated by comparing our results to existing experimental and numerical ones, and good agreement is demonstrated. It is then used to study the accuracy of the widely used classical vortex lattice method. It is shown that the classical approach gives good agreement in the clean configuration but is off by as much as 30% when a flap or aileron deflection of 30° is imposed. This discrepancy is mainly due to the linearized theory assumption associated with the conventional method. A comparison of the effect of four different wake geometries on the values of the aerodynamic coefficients was also carried out, and it was found that the choice of wake shape had very little effect on the results.
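
The elementary building block of any vortex lattice code, classical or generalized, is the velocity induced by a straight vortex filament segment via the Biot-Savart law. The sketch below shows that kernel in Python (not the author's MATLAB implementation); the cut-off parameter eps and the example coordinates are assumptions of the sketch.

```python
# Illustrative sketch (not the author's MATLAB code): velocity induced at a point
# by a straight vortex filament segment via the Biot-Savart law, the elementary
# building block of a vortex lattice influence-coefficient matrix.
import numpy as np

def segment_induced_velocity(p, a, b, gamma, eps=1e-10):
    """Velocity induced at point p by a vortex segment from a to b
    carrying circulation gamma (all arguments are arrays of shape (3,))."""
    r1, r2, r0 = p - a, p - b, b - a
    cross = np.cross(r1, r2)
    cross_sq = np.dot(cross, cross)
    if cross_sq < eps:            # point lies (nearly) on the filament axis
        return np.zeros(3)
    factor = np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))
    return gamma / (4.0 * np.pi) * cross / cross_sq * factor

# example: velocity at a control point due to a unit-strength bound vortex
v = segment_induced_velocity(np.array([0.25, 0.5, 0.0]),
                             np.array([0.0, 0.0, 0.0]),
                             np.array([0.0, 1.0, 0.0]),
                             gamma=1.0)
print(v)
```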

Keywords: aileron deflection, camber-surface-bound vortices, classical VLM, generalized VLM, flap deflection

Procedia PDF Downloads 435
33678 Task Based Language Learning: A Paradigm Shift in ESL/EFL Teaching and Learning: A Case Study Based Approach

Authors: Zehra Sultan

Abstract:

The study is based on the task-based language teaching (TBLT) approach, which is found to be very effective in the EFL/ESL classroom. This approach engages learners in acquiring authentic language skills by interacting with the real world through a sequence of pedagogical tasks, and the use of technology enhances its effectiveness. This study throws light on the historical background of TBLT and its efficacy in the EFL/ESL classroom. In addition, it describes the implementation of this approach in the General Foundation Programme of Muscat College, Oman, and furnishes the list of pedagogical tasks embedded in the language curriculum of the General Foundation Programme (GFP), which are skillfully aligned with the College Graduate Attributes. Moreover, the study discusses the challenges pertaining to this approach from the point of view of teachers, students, and its classroom application. The operational success of this methodology is gauged through the formative assessments of the GFP, which is apparent in the students’ progress.

Keywords: task-based language teaching, authentic language, communicative approach, real world activities, ESL/EFL activities

Procedia PDF Downloads 126
33677 Identification of a Panel of Epigenetic Biomarkers for Early Detection of Hepatocellular Carcinoma in Blood of Individuals with Liver Cirrhosis

Authors: Katarzyna Lubecka, Kirsty Flower, Megan Beetch, Lucinda Kurzava, Hannah Buvala, Samer Gawrieh, Suthat Liangpunsakul, Tracy Gonzalez, George McCabe, Naga Chalasani, James M. Flanagan, Barbara Stefanska

Abstract:

Hepatocellular carcinoma (HCC), the most prevalent type of primary liver cancer, is the second leading cause of cancer death worldwide. Late onset of clinical symptoms in HCC results in late diagnosis and poor disease outcome. Approximately 85% of individuals with HCC have underlying liver cirrhosis. However, not all cirrhotic patients develop cancer. Reliable early detection biomarkers that can distinguish cirrhotic patients who will develop cancer from those who will not are urgently needed and could increase the cure rate from 5% to 80%. We used the Illumina-450K microarray to test whether blood DNA, an easily accessible source of DNA, bears site-specific changes in DNA methylation in response to HCC before diagnosis with conventional tools (pre-diagnostic). The top 11 differentially methylated sites were selected for validation by pyrosequencing. The diagnostic potential of the 11 pyrosequenced probes was tested in blood samples from a prospective cohort of cirrhotic patients. We identified 971 differentially methylated CpG sites in pre-diagnostic HCC cases as compared with healthy controls (P < 0.05, paired Wilcoxon test, ICC ≥ 0.5). Nearly 76% of differentially methylated CpG sites showed lower levels of methylation in cases vs. controls (P = 2.973E-11, Wilcoxon test). Classification of the CpG sites according to their location relative to CpG islands and transcription start sites revealed that those hypomethylated loci are located in regulatory regions important for gene transcription, such as CpG island shores, promoters, and 5’UTR, at higher frequency than hypermethylated sites. Among 735 CpG sites hypomethylated in cases vs. controls, 482 sites were assigned to gene coding regions, whereas 236 hypermethylated sites corresponded to 160 genes. Bioinformatics analysis using the GO, KEGG and DAVID knowledgebases indicates that differentially methylated CpG sites are located in genes associated with functions that are essential for gene transcription, cell adhesion, cell migration, and regulation of signal transduction pathways. Taking into account the magnitude of the difference, statistical significance, location, and consistency across the majority of matched case-control pairs, we selected 11 CpG loci corresponding to 10 genes for further validation by pyrosequencing. We established that methylation of CpG sites within 5 out of those 10 genes distinguishes cirrhotic patients who subsequently developed HCC from those who stayed cancer free (cirrhotic controls), demonstrating potential as biomarkers of early detection in populations at risk. The best predictive value was detected for CpGs located within BARD1 (AUC = 0.70, asymptotic significance < 0.01). Using an additive logistic regression model, we further showed that 9 CpG loci within those 5 genes, which were covered by the pyrosequenced probes, constitute a panel with high diagnostic accuracy (AUC = 0.887; 95% CI: 0.80-0.98). The panel was able to distinguish pre-diagnostic cases from cirrhotic controls free of cancer with 88% sensitivity at 70% specificity. Using blood as a minimally invasive material and pyrosequencing as a straightforward quantitative method, the established biomarker panel has high potential to be developed into a routine clinical test after validation in larger cohorts. This study was supported by the Showalter Trust, the American Cancer Society (IRG#14-190-56), and the Purdue Center for Cancer Research (P30 CA023168), granted to BS.
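
To illustrate the final modelling step, the sketch below fits an additive logistic regression over a 9-feature methylation panel and scores it with ROC AUC. The data are synthetic (beta-value-like methylation levels with cases slightly hypomethylated) and do not reproduce the study's cohort or its reported AUC.

```python
# Illustrative sketch: an additive logistic regression over a panel of CpG
# methylation values scored by ROC AUC. Data are synthetic, not the study cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n_cases, n_controls, n_cpg = 60, 60, 9          # 9 CpG loci in the final panel

controls = rng.normal(0.60, 0.08, size=(n_controls, n_cpg))   # beta-like methylation
cases = rng.normal(0.52, 0.08, size=(n_cases, n_cpg))         # hypomethylated in cases
X = np.vstack([controls, cases]).clip(0, 1)
y = np.array([0] * n_controls + [1] * n_cases)

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"in-sample AUC: {auc:.3f}")   # the study reports AUC = 0.887 on its cohort
```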

Keywords: biomarker, DNA methylation, early detection, hepatocellular carcinoma

Procedia PDF Downloads 306
33676 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model

Authors: Didier Auroux, Vladimir Groza

Abstract:

This work is part of the STEEP Marie-Curie ITN project, and it focuses on the identification of unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The necessity of studying this problem comes from industrial milling applications, where the ability to predict and model the final surface with high accuracy is one of the primary tasks in the absence of any knowledge of the model parameters that should be used. In this framework, we propose the identification of model parameters by minimizing a cost function measuring the difference between experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, gives the opportunity to find the unknowns of the AWJM model and their optimal values, which could be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strictly depends on the input measurements. Regularization terms can be used effectively to deal with the presence of data noise and to improve the correctness of the identification. Based on this approach, we present 2D and 3D results for the identification of the model parameters and for the surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain acceptable results for manufacturing and to expect the proper identification of the unknowns. This approach also gives us the ability to extend the research to more complex cases and to consider different types of model and measurement errors, as well as a 3D time-dependent model with variations of the jet feed speed.
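
The general idea, minimizing a regularized least-squares misfit between a forward model and noisy measurements, can be sketched in a few lines. The toy Gaussian trench profile, the Tikhonov weight alpha and the use of scipy's L-BFGS-B optimizer below are assumptions of the sketch; the paper itself uses an adjoint computed with TAPENADE rather than a generic optimizer.

```python
# Illustrative sketch of the general idea only (not the authors' adjoint/TAPENADE
# code): identify model parameters by minimizing a regularized least-squares
# misfit between a simple forward model and noisy "measurements".
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
x = np.linspace(-1.0, 1.0, 200)                      # cross-trench coordinate

def forward(params, x):
    depth, width = params                            # toy Gaussian trench profile
    return -depth * np.exp(-(x / width) ** 2)

true_params = np.array([0.8, 0.3])
data = forward(true_params, x) + rng.normal(0, 0.02, x.size)   # noisy observations

def cost(params, alpha=1e-3):
    misfit = forward(params, x) - data
    return 0.5 * np.sum(misfit ** 2) + 0.5 * alpha * np.sum(params ** 2)  # Tikhonov term

result = minimize(cost, x0=np.array([0.5, 0.5]), method="L-BFGS-B",
                  bounds=[(1e-3, None), (1e-3, None)])
print("identified parameters:", result.x)
```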

Keywords: Abrasive Waterjet Milling, inverse problem, model parameters identification, regularization

Procedia PDF Downloads 317
33675 Identification of Ideal Plain Sufu (Fermented Soybean Curds) Based on Ideal Profile Method and Assessment of the Consistency of Ideal Profiles Obtained from Consumers

Authors: Yan Ping Chen, Hau Yin Chung

Abstract:

The Ideal Profile Method (IPM) is a newly developed descriptive sensory analysis conducted by consumers without previous training. To perform this test, both the perceived and the ideal intensities of products’ attributes, as judged by consumers, as well as their hedonic ratings, were collected to formulate an ideal product (the most liked one). In addition, Ideal Profile Analysis (IPA) was conducted to check the consistency of the ideal data at both the panel and consumer levels. In this test, 12 commercial plain sufus bought from the Hong Kong local market were evaluated by 113 consumers according to the IPM and rated on 22 attributes. Principal component analysis was used to profile the perceived and the ideal spaces of the tested products. The consistency of the ideal data was then checked by IPA. The results showed that most consumers shared a common ideal and that the sensory product space and the ideal product space were structurally similar: their first dimensions both opposed products with an intense fermentation-related aroma to products with a weaker fermentation-related aroma. The predicted ideal profile (with an estimated liking score of around 7.0 on a 9.0-point scale) received a higher hedonic score than the tested products (average liking score around 6.0 on a 9.0-point scale). For the majority of consumers (95.2%), the stated ideal product was considered a potential ideal, based on the R2 coefficient value. Among all tested products, sample 6 was the most popular, with a consumer liking percentage of around 30%. This product, with a less fermented and moldy flavour and an easier melt-in-the-mouth texture, had a sensory profile close to the ideal product. This experiment validated that data from untrained consumers can provide useful guidance, and the appreciated sensory characteristics can serve as a reference in the optimization of commercial plain sufu.

Keywords: ideal profile method, product development, sensory evaluation, sufu (fermented soybean curd)

Procedia PDF Downloads 188
33674 A Method for Identifying Unusual Transactions in E-commerce Through Extended Data Flow Conformance Checking

Authors: Handie Pramana Putra, Ani Dijah Rahajoe

Abstract:

The proliferation of smart devices and advancements in mobile communication technologies have permeated various facets of life with the widespread influence of e-commerce. Detecting abnormal transactions holds paramount significance in this realm due to the potential for substantial financial losses. Moreover, the fusion of data flow and control flow assumes a critical role in the exploration of process modeling and data analysis, contributing significantly to the accuracy and security of business processes. This paper introduces an alternative approach to identify abnormal transactions through a model that integrates both data and control flows. Referred to as the Extended Data Petri net (DPNE), our model encapsulates the entire process, encompassing user login to the e-commerce platform and concluding with the payment stage, including the mobile transaction process. We scrutinize the model's structure, formulate an algorithm for detecting anomalies in pertinent data, and elucidate the rationale and efficacy of the comprehensive system model. A case study validates the responsive performance of each system component, demonstrating the system's adeptness in evaluating every activity within mobile transactions. Ultimately, the results of anomaly detection are derived through a thorough and comprehensive analysis.

Keywords: database, data analysis, DPNE, extended data flow, e-commerce

Procedia PDF Downloads 57
33673 Using a Quantitative Reasoning Framework to Help Students Understand Arc Measure Relationships

Authors: David Glassmeyer

Abstract:

Quantitative reasoning is necessary to robustly understand mathematical concepts ranging from elementary to university levels. Quantitative reasoning involves identifying and representing quantities and the relationships between these quantities. Without reasoning quantitatively, students often resort to memorizing formulas and procedures, which have negative impacts when they encounter mathematical topics in the future. This study investigated how high school students’ quantitative reasoning could be fostered within a unit on arc measure and angle relationships. Arc measure, or the measure of a central angle that cuts off a portion of a circle’s circumference, is often confused with arclength. In this study, the researcher redesigned an activity to clearly distinguish arc measure and arc length by using a quantitative reasoning framework. Data were collected from high school students to determine if this approach impacted their understanding of these concepts. Initial data indicates the approach was successful in supporting students’ quantitative reasoning of these topics. Implications for the work are that teachers themselves may also benefit from considering mathematical definitions from a quantitative reasoning framework and can use this activity in their own classrooms.
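
The quantitative distinction the activity targets can be made explicit with a short worked example (values chosen here purely for illustration): the arc measure is the central angle itself and does not depend on the circle's size, whereas the arc length is a distance that scales with the radius.

```latex
% Worked example: a central angle of 60 degrees cutting arc AB on a circle of radius r = 6 cm.
% The arc measure is an angle (independent of r); the arc length is a distance (depends on r).
\[
  m\overset{\frown}{AB} = \theta = 60^\circ,
  \qquad
  s = \frac{\theta}{360^\circ}\cdot 2\pi r
    = \frac{60}{360}\cdot 2\pi (6\,\text{cm})
    = 2\pi\,\text{cm} \approx 6.28\,\text{cm}.
\]
```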

Keywords: arc length, arc measure, quantitative reasoning, student content knowledge

Procedia PDF Downloads 258
33672 The Approach of Male and Female Spectators about the Presence of Female Spectators in Sport Stadiums of Iran

Authors: Mohammad Reza Boroumand Devlagh, Seyed Mohammad Hosein Razavi, Fatemeh Ahmadi, Azam Fazli Darzi

Abstract:

The issue of female presence in Iranian stadiums has long been considered and debated by governmental experts and authorities; however, no conclusion has yet been reached. Thus, the present study was conducted with the aim of investigating the approach of male and female spectators to the presence of female spectators in Iranian stadiums. The statistical population of the study includes all male and female spectators who have not experienced watching men’s championship matches live in stadiums. A sample of 224 subjects was selected from the statistical population through stratified random sampling. For data collection, a researcher-made questionnaire was used, whose validity was confirmed by university professors and whose reliability was studied and confirmed through a preliminary study (r = 0.81). Data analysis was done using descriptive and inferential statistics at P < 0.05. The results showed that both male and female spectators significantly agreed with female presence in stadiums and that there is no significant difference between male and female approaches concerning female spectators’ presence in the sport stadiums of Iran (sig = 0.867).

Keywords: male, female spectators, Iran, sport stadiums, population

Procedia PDF Downloads 550
33671 Feedback Preference and Practice of English Majors’ in Pronunciation Instruction

Authors: Claerchille Jhulia Robin

Abstract:

This paper discusses the perspective of ESL learners towards pronunciation instruction. It sought to determine how these learners view the type of feedback their speech teacher gives and its impact on their own classroom practice of providing feedback. This study utilized a quantitative-qualitative approach. The respondents were Education students majoring in English. A survey questionnaire and an interview guide were used for data gathering. The data from the survey were tabulated using frequency counts, and the interview data were transcribed and analyzed. Results showed that ESL learners favor immediate corrective feedback and do not find any issue in being corrected in front of their peers. They also practice the same corrective technique in their own classrooms.

Keywords: ESL, feedback, learner perspective, pronunciation instruction

Procedia PDF Downloads 235
33670 A Novel Approach of Power Transformer Diagnostic Using 3D FEM Parametrical Model

Authors: M. Brandt, A. Peniak, J. Makarovič, P. Rafajdus

Abstract:

This paper deals with a novel approach to power transformer diagnostics. This approach identifies the exact location and range of a fault in the transformer and helps to reduce the operating costs related to handling the faulty transformer, its disassembly and repair. The advantage of the approach is the possibility of simulating the healthy transformer and also all faults that can occur in the transformer during its operation, without disassembling it, which is very expensive in practice. The approach is based on creating the frequency-dependent impedance of the transformer from sweep frequency response analysis measurements and from 3D FE parametrical modeling of the fault in the transformer. The parameters of the 3D FE model are the position and the range of the axial short circuit. Then, by comparing the frequency-dependent impedances of the parametrical models with the measured ones, the location and range of the fault are identified. The approach was tested on a real transformer and showed high coincidence between the real fault and the simulated one.

Keywords: transformer, parametrical model of transformer, fault, sweep frequency response analysis, finite element method

Procedia PDF Downloads 483
33669 Elevated Celiac Antibodies and Abnormal Duodenal Biopsies Associated with IBD Markers: Possible Role of Altered Gut Permeability and Inflammation in Gluten Related Disorders

Authors: Manav Sabharwal, Ruda Rai Md, Candace Parker, James Ridley

Abstract:

Wheat, one of the most commonly consumed grains worldwide, contains gluten. Nowadays, gluten intake is considered to be a trigger for gluten-related disorders (GRDs), including Celiac disease (CD), a common genetic disease affecting 1% of the US population, non-celiac gluten sensitivity (NCGS) and wheat allergy. NCGS is being recognized as an acquired gluten-sensitive enteropathy that is prevalent across age, ethnic and geographic groups. The cause of this entity is not fully understood, and recent studies suggest that it is more common in participants with irritable bowel syndrome (IBS) or iron deficiency anemia and symptoms of fatigue, and that it has considerable overlap in symptoms with IBS and Crohn’s disease. However, these studies lacked complete serologies, imaging tests and/or pan-endoscopy. We performed a prospective study of 745 adult patients who presented to an outpatient clinic for evaluation of chronic upper gastro-intestinal symptoms and subsequently underwent an upper endoscopic (EGD) examination as standard of care. Evaluation comprised a comprehensive celiac antibody panel, inflammatory bowel disease (IBD) serologic markers, duodenal biopsies and Small Bowel Video Capsule Endoscopy (VCE), when available. At least 6 biopsy specimens were obtained from the duodenum and proximal jejunum during EGD, CD3+ intraepithelial lymphocytes (IELs) and villous architecture were evaluated by a single experienced pathologist, and VCE was performed by a single experienced gastroenterologist. Of the 745 patients undergoing EGD, 12% (93/745) showed elevated CD3+ IELs in the duodenal biopsies. 52% (387/745) completed a comprehensive CD panel and 7.2% (28/387) were positive for at least 1 CD antibody, with tissue transglutaminase (tTG) being the most common antibody (65%, 18/28). Of these patients, 18% (5/28) showed increased duodenal CD3+ IELs, but none showed villous blunting or distortion meeting criteria for CD. Surprisingly, 43% (12/28) were positive for at least 1 IBD serology (ASCA, ANCA or expanded IBD panel (LabCorp)). Of these 28 patients, 29% (8/28) underwent an SB VCE, of which 100% (8/8) showed significant jejuno-ileal mucosal lesions diagnostic for IBD. Findings of abnormal CD antibodies (7.2%, 28/387) and increased CD3+ IELs on duodenal biopsy (12%, 93/745) were observed frequently in patients with UGI symptoms undergoing EGD in an outpatient clinic. None met criteria for CD, and a high proportion (43%, 12/28) showed evidence of overlap with IBD. This suggests a potential causal link of acquired GRDs to underlying inflammation and gut mucosal barrier disruption. Further studies to investigate a role for abnormal antigen presentation of dietary gluten to gut-associated lymphoid tissue as a cause are justified. This may explain the high prevalence of GRDs in the population and their correlation with IBS, IBD and other gut inflammatory disorders.

Keywords: celiac, gluten sensitive enteropathy, lymphocitic enteritis, IBS, IBD

Procedia PDF Downloads 169
33668 The Extent of Land Use Externalities in the Fringe of Jakarta Metropolitan: An Application of Spatial Panel Dynamic Land Value Model

Authors: Rahma Fitriani, Eni Sumarminingsih, Suci Astutik

Abstract:

In a fast growing region, conversion of agricultural lands that are surrounded by new development sites will occur sooner than expected. This phenomenon has been experienced by many regions in Indonesia, especially the fringe of Jakarta (BoDeTaBek). As this is Indonesia’s capital city, rapid conversion of land in the area is an unavoidable process. The land conversion expands spatially into the fringe regions, which were initially dominated by agricultural land or conservation sites. Without proper control or growth management, this activity will invite greater costs than benefits. The current land use is the use that maximizes the land's value. In order to maintain land for agricultural activity or conservation, some effort is needed to keep the land value of this activity as high as possible. In this case, knowledge of the functional relationship between land value and its driving forces is necessary. In a fast growing region, development externalities are the assumed dominant driving force. Land value is the product of past decisions about its use, and it is also affected by local characteristics and by the observed surrounding land use (externalities) from the previous period. The effect of each factor on land value has dynamic and spatial dimensions, so an empirical spatial dynamic land value model will be more useful to capture them. The model will be useful to test and to estimate the extent of land use externalities on land value in the short run as well as in the long run, and it serves as a basis to formulate an effective urban growth management policy. This study applies the model to the case of land value in the fringe of Jakarta Metropolitan. The model is used further to predict the effect of externalities on land value, in the form of a prediction map. For the case of Jakarta’s fringe, there is some evidence of the significance of neighborhood urban activity (negative externalities), the previous land value and local accessibility for land value. The effects accumulate dynamically over years, but they fully affect the land value only after six years.
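
The abstract does not state the model's functional form. As an illustration of the kind of specification described (lagged land value, spatially weighted neighbouring values, local characteristics), a generic spatial dynamic panel equation would read as follows; this assumed form is not taken from the paper.

```latex
% Generic spatial dynamic panel specification (assumed form, not taken from the paper).
% V_{it}: land value of location i in year t; W = (w_{ij}): spatial weight matrix;
% x_{it}: local characteristics; \mu_i: location effect.
\[
  V_{it} = \tau\, V_{i,t-1}
         + \rho \sum_{j \neq i} w_{ij}\, V_{jt}
         + \eta \sum_{j \neq i} w_{ij}\, V_{j,t-1}
         + x_{it}'\beta + \mu_i + \varepsilon_{it}
\]
```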

Keywords: growth management, land use externalities, land value, spatial panel dynamic

Procedia PDF Downloads 257
33667 Classification Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach

Authors: H. J. Wattimanela, U. S. Passaribu, N. T. Puspito, S. W. Indratno

Abstract:

The Banda Sea collision zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian and Pacific plates. It is located in the eastern part of Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the mean square error (MSE). Based on these results, we identify whether the distribution of earthquakes in the BSCZ follows a Poisson distribution, using a point process approach. The chi-square test and the Anscombe test are applied in the process of identifying a Poisson distribution in the partitioned areas. The data used are earthquakes with magnitude ≥ 6 on the Richter scale over the period 1964-2013, sourced from BMKG Jakarta. This research is expected to contribute to the Moluccas Province and surrounding local governments in preparing spatial plan documents related to disaster management.
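
To illustrate the chi-square step, the sketch below tests yearly earthquake counts against a Poisson distribution with the rate estimated from the data. The counts are synthetic placeholders (the 1964-2013 catalogue is not reproduced here), and the binning of the upper tail is a choice made for the example.

```python
# Illustrative sketch: chi-square goodness-of-fit test of yearly earthquake counts
# (M >= 6) against a Poisson distribution with estimated rate; counts are synthetic.
import numpy as np
from scipy.stats import poisson, chisquare

rng = np.random.default_rng(5)
yearly_counts = rng.poisson(2.4, size=50)            # placeholder for 1964-2013 counts
n, lam = yearly_counts.size, yearly_counts.mean()    # lambda estimated from the data

bins = [0, 1, 2, 3, 4]                               # counts of 0, 1, 2, 3, then ">= 4"
observed = np.array([np.sum(yearly_counts == k) for k in bins[:-1]]
                    + [np.sum(yearly_counts >= bins[-1])])
expected = np.array(list(poisson.pmf(bins[:-1], lam))
                    + [poisson.sf(bins[-1] - 1, lam)]) * n

stat, p = chisquare(observed, expected, ddof=1)      # one extra df lost for lambda-hat
print(f"lambda = {lam:.2f}, chi-square = {stat:.2f}, p = {p:.3f}")
```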

Keywords: molluca banda sea collision zone, earthquakes, mean square error, poisson distribution, chi-square test, anscombe test

Procedia PDF Downloads 301
33666 Feature Extraction and Classification Based on the Bayes Test for Minimum Error

Authors: Nasar Aldian Ambark Shashoa

Abstract:

Classification with dimension reduction based on a Bayesian approach is proposed in this paper. The first step is to generate a sample (parameter) of the fault-free mode class and the faulty mode class. Second, in order to obtain good classification performance, a selection of important features is done with the discrete Karhunen-Loève expansion. Next, the Bayes test for minimum error is used to classify the classes. Finally, the results for simulated data demonstrate the capabilities of the proposed procedure.

Keywords: analytical redundancy, fault detection, feature extraction, Bayesian approach

Procedia PDF Downloads 528
33665 A Human Activity Recognition System Based on Sensory Data Related to Object Usage

Authors: M. Abdullah, Al-Wadud

Abstract:

Sensor-based activity recognition systems usually account only for which sensors have been activated to perform an activity. The system then combines the conditional probabilities of those sensors to represent different activities and makes its decision on that basis. However, information about the sensors that are not activated may also be of great help in deciding which activity has been performed. This paper proposes an approach where the sensory data related to both usage and non-usage of objects are utilized to classify activities. Experimental results show the promising performance of the proposed method.
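
One simple way to model non-usage explicitly is a Bernoulli naive Bayes classifier over binary object-usage vectors, since it includes the probability of each feature being 0 as well as 1. The sketch below is an illustration of that idea with scikit-learn; the object names, activity labels and tiny data set are invented for the example and are not the paper's data or exact model.

```python
# Illustrative sketch: a Bernoulli naive Bayes classifier over binary object-usage
# vectors. Unlike a model that only looks at activated sensors, BernoulliNB also
# uses the probability of each object *not* being used. Data here are synthetic.
import numpy as np
from sklearn.naive_bayes import BernoulliNB

objects = ["kettle", "cup", "fridge", "toothbrush", "towel"]
# rows: observed episodes; columns: 1 if the object's sensor fired, else 0
X = np.array([
    [1, 1, 1, 0, 0],   # making tea
    [1, 1, 0, 0, 0],   # making tea
    [0, 0, 0, 1, 1],   # brushing teeth
    [0, 0, 0, 1, 0],   # brushing teeth
])
y = ["make_tea", "make_tea", "brush_teeth", "brush_teeth"]

clf = BernoulliNB(alpha=1.0).fit(X, y)
print(clf.predict([[1, 0, 1, 0, 0]]))   # kettle + fridge used, toothbrush not used
```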

Keywords: Naïve Bayesian-based classification, activity recognition, sensor data, object-usage model

Procedia PDF Downloads 323
33664 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning

Authors: Tanvi P. Patel, Warish D. Patel

Abstract:

Association rule mining forms the core of data mining and is one of its best-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is used to mine patterns and then generate rules from the obtained patterns. For efficient targeted query processing, frequent pattern discovery and itemset mining, an efficient itemset tree structure named the Memory Efficient Itemset Tree (MEIT) can be generated. The memory-efficient itemset tree is efficient for storing itemsets but takes more time compared to the traditional itemset tree. The proposed strategy generates maximal frequent itemsets from the memory-efficient itemset tree by using levelwise pruning: first, pre-pruning of items based on the minimum support count is carried out, followed by itemset tree reconstruction. By keeping only maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced memory-efficient itemset tree approach proposed here helps to optimize main memory overhead as well as reduce processing time.
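
The pre-pruning step can be illustrated in a few lines: items whose support count falls below the threshold are removed from every transaction before the itemset tree is rebuilt. The sketch below shows only this step on toy transactions; it does not implement the MEIT data structure itself, and the items and threshold are placeholders.

```python
# Illustrative sketch of the minimum-support pre-pruning step only (not the MEIT
# data structure itself): infrequent items are dropped from every transaction
# before the itemset tree is reconstructed.
from collections import Counter

transactions = [
    {"bread", "milk", "eggs"},
    {"bread", "milk"},
    {"milk", "eggs", "beer"},
    {"bread", "diapers"},
]
min_support_count = 2

item_counts = Counter(item for t in transactions for item in t)
frequent_items = {item for item, c in item_counts.items() if c >= min_support_count}

pruned = [t & frequent_items for t in transactions]   # keep only frequent items
print(frequent_items)
print(pruned)
```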

Keywords: association rule mining, itemset mining, itemset tree, meit, maximal frequent pattern

Procedia PDF Downloads 372
33663 An Approach from Fichte as a Response to the Kantian Dualism of Subject and Object: The Unity of the Subject and Object in Both Theoretical and Ethical Possibility

Authors: Mengjie Liu

Abstract:

This essay aims to respond to Kant’s arguments on how to fit the self-caused subject into the deterministic object that follows natural laws. It mainly adopts the approach abstracted from Fichte’s “Wissenschaftslehre” (Doctrine of Science) to picture a possible solution to the reconciliation of Kantian dualism. The Fichte approach is based on the unity of theoretical and practical reason, which can be understood as a philosophical abstraction from ordinary experience combining both subject and object. The essay discusses the general problem of Kantian dualism and Fichte’s unity approach in the first part. The second section then elaborates on the achievement of this unity of subject and object through Fichte’s “the I posits itself” process. The third section concerns the ethical unity of subject and object based on the Fichte approach. The essay also discusses the limitations of Fichte’s approach from two perspectives: (1) the theoretical possibility of the existence of the pure I, and (2) Schelling’s statement that the Absolute I is a result rather than the originating act. This essay demonstrates a possible approach to unifying the subject and object, supported by Fichte’s “Absolute I” and ethical theories, and also points out the limitations of Fichte’s theories.

Keywords: Fichte, identity, Kantian dualism, Wissenschaftslehre

Procedia PDF Downloads 94