Search results for: ERA-5 analysis data
40629 A Data-Driven Agent Based Model for the Italian Economy
Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio
Abstract:
We develop a data-driven agent-based model (ABM) for the Italian economy. We calibrate the model for the initial conditions and parameters. As a preliminary step, we replicate the Monte-Carlo simulation for the Austrian economy. Then, we evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency in terms of disequilibrium patterns arising in the search and matching process for final goods, capital, intermediate goods, and credit markets. In this perspective, we use a randomized initial condition approach. We perform a robustness analysis by perturbing the system under different parameter setups. We explore the empirical properties of the model using a rolling window forecast exercise from 2010 to 2022 to observe the model’s forecasting ability in the wake of the COVID-19 pandemic. We also analyze the properties of the model with different numbers of agents, that is, with different scales of the model compared to the real economy. The model generally displays transient dynamics that properly fit macroeconomic data in terms of forecasting ability. We stress the model with a large set of shocks, namely interest rate policy, fiscal policy, and exogenous factors such as external foreign demand for exports. In this way, we can explore the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors and, consequently, the underlying input-output sectoral interdependence to stress the economy and observe the long-run projections. In this way, the model can generate endogenous crises due to the implied structural change, technological unemployment, and a potential lack of aggregate demand, creating the conditions for cyclical endogenous crises in this artificial economy.
Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data
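The rolling window forecast exercise described above can be illustrated with a minimal sketch; the quarterly series, window length, and the mean-forecast placeholder below are assumptions for illustration, not the authors' actual agent-based model.

```python
import numpy as np

def rolling_window_forecast(series, window=12, horizon=1):
    """Evaluate one-step-ahead forecasts over a rolling window.

    `series` is assumed to be a 1-D array of quarterly observations
    (e.g., GDP growth from 2010 to 2022); the naive mean forecast
    stands in for a run of the calibrated agent-based model.
    """
    errors = []
    for start in range(len(series) - window - horizon + 1):
        history = series[start:start + window]           # calibration window
        actual = series[start + window + horizon - 1]    # value to be forecast
        forecast = history.mean()                         # placeholder for an ABM run
        errors.append(actual - forecast)
    errors = np.array(errors)
    return np.sqrt(np.mean(errors ** 2))                  # RMSE over all windows

# Example with synthetic data standing in for macroeconomic observations
rng = np.random.default_rng(0)
gdp_growth = 0.3 + 0.5 * rng.standard_normal(52)          # 52 quarters, 2010-2022
print("rolling RMSE:", rolling_window_forecast(gdp_growth))
```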
40628 Creative Mapping Landuse and Human Activities: From the Inventories of Factories to the History of the City and Citizens
Authors: R. Tamborrino, F. Rinaudo
Abstract:
Digital technologies offer possibilities to effectively convert historical archives into instruments of knowledge able to provide a guide for the interpretation of historical phenomena. Digital conversion and management of those documents allow the possibility to add other sources in a unique and coherent model that permits the intersection of different data able to open new interpretations and understandings. Urban history uses, among other sources, the inventories that register human activities in a specific space (e.g. cadastres, censuses, etc.). The geographic localisation of that information inside cartographic supports allows for the comprehension and visualisation of specific relationships between different historical realities registering both the urban space and the peoples living there. These links that merge the different nature of data and documentation through a new organisation of the information can suggest a new interpretation of other related events. In all these kinds of analysis, the use of GIS platforms today represents the most appropriate answer. The design of the related databases is the key to realise the ad-hoc instrument to facilitate the analysis and the intersection of data of different origins. Moreover, GIS has become the digital platform where it is possible to add other kinds of data visualisation. This research deals with the industrial development of Turin at the beginning of the 20th century. A census of factories realized just prior to WWI provides the opportunity to test the potentialities of GIS platforms for the analysis of urban landscape modifications during the first industrial development of the town. The inventory includes data about location, activities, and people. GIS is shaped in a creative way linking different sources and digital systems aiming to create a new type of platform conceived as an interface integrating different kinds of data visualisation. The data processing allows linking this information to an urban space, and also visualising the growth of the city at that time. The sources, related to the urban landscape development in that period, are of a different nature. The emerging necessity to build, enlarge, modify and join different buildings to boost the industrial activities, according to their fast development, is recorded by different official permissions delivered by the municipality and now stored in the Historical Archive of the Municipality of Turin. Those documents, which are reports and drawings, contain numerous data on the buildings themselves, including the block where the plot is located, the district, and the people involved such as the owner, the investor, and the engineer or architect designing the industrial building. All these collected data offer the possibility to firstly re-build the process of change of the urban landscape by using GIS and 3D modelling technologies thanks to the access to the drawings (2D plans, sections and elevations) that show the previous and the planned situation. Furthermore, they give access to information for different queries of the linked dataset that could be useful for different research targets, such as economic, biographical, architectural, or demographic studies. By superimposing a layer of the present city, the past meets the present industrial heritage, and people meet urban history.
Keywords: digital urban history, census, digitalisation, GIS, modelling, digital humanities
40627 Ergonomical Study of Hand-Arm Vibrational Exposure in a Gear Manufacturing Plant in India
Authors: Santosh Kumar, M. Muralidhar
Abstract:
The term ‘ergonomics’ is derived from two Greek words: ‘ergon’, meaning work, and ‘nomoi’, meaning natural laws. Ergonomics is the study of how working conditions, machines and equipment can be arranged in order that people can work with them more efficiently. In this research communication, an attempt has been made to study the effect of hand-arm vibrational exposure on the workers of a gear manufacturing plant by comparing potential Carpal Tunnel Syndrome (CTS) symptoms and the effect of different exposure levels of vibration on the occurrence of CTS in an actual industrial environment. The Chi-square test and correlation analysis have been considered for statistical analysis. From the Chi-square test, it has been found that the occurrence of potential CTS symptoms is significantly dependent on the level of vibrational exposure. Data analysis indicates that 40.51% of workers having potential CTS symptoms are exposed to vibration. Correlation analysis reveals that potential CTS symptoms are significantly correlated with exposure to the level of vibration from handheld tools and with repetitive wrist movements.
Keywords: CTS symptoms, hand-arm vibration, ergonomics, physical tests
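A chi-square test of independence of the kind described can be run as in the sketch below; the 2x2 contingency counts are invented for illustration and do not reproduce the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table:
# rows = vibration exposure (exposed / not exposed),
# columns = potential CTS symptoms (present / absent)
observed = np.array([[32, 47],
                     [12, 67]])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value (e.g., < 0.05) suggests that CTS symptom occurrence
# depends on the level of vibrational exposure.
```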
40626 Comprehensive Profiling and Characterization of Untargeted Extracellular Metabolites in Fermentation Processes: Insights and Advances in Analysis and Identification
Authors: Marianna Ciaccia, Gennaro Agrimi, Isabella Pisano, Maurizio Bettiga, Silvia Rapacioli, Giulia Mensa, Monica Marzagalli
Abstract:
Objective: Untargeted metabolomic analysis of extracellular metabolites is a powerful approach that focuses on comprehensively profiling metabolites in the extracellular space. In this study, we applied extracellular metabolomic analysis to investigate the metabolism of two probiotic microorganisms with health benefits that extend far beyond the digestive tract and the immune system. Methods: Analytical techniques employed in extracellular metabolomic analysis encompass various technologies, including mass spectrometry (MS), which enables the identification of metabolites present in the fermentation media, as well as the comparison of metabolic profiles under different experimental conditions. Multivariate statistical analysis techniques like principal component analysis (PCA) or partial least squares-discriminant analysis (PLS-DA) play a crucial role in uncovering metabolic signatures and understanding the dynamics of metabolic networks. Results: Different types of supernatants from fermentation processes, such as dairy-free and non-dairy-free media, and media with no cells or pasteurized media, were subjected to metabolite profiling; these contained a complex mixture of metabolites, including substrates, intermediates, and end-products. This profiling provided insights into the metabolic activity of the microorganisms. The integration of advanced software tools has facilitated the identification and characterization of metabolites in different fermentation conditions and microorganism strains. Conclusions: In conclusion, untargeted extracellular metabolomic analysis, combined with software tools, allowed the study of the metabolites consumed and produced during the fermentation processes of probiotic microorganisms. Ongoing advancements in data analysis methods will further enhance the application of extracellular metabolomic analysis in fermentation research, leading to improved bioproduction and the advancement of sustainable manufacturing processes.
Keywords: biotechnology, metabolomics, lactic bacteria, probiotics, postbiotics
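As a sketch of the multivariate step, PCA on a metabolite intensity matrix might look like the following; the matrix shape, scaling choice, and group labels are assumptions, not the study's actual MS data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical intensity matrix: 12 fermentation supernatants x 200 metabolite features
rng = np.random.default_rng(42)
X = rng.lognormal(mean=2.0, sigma=0.5, size=(12, 200))
groups = ["dairy"] * 6 + ["dairy_free"] * 6          # assumed condition labels

# Autoscaling (mean-centering and unit variance) is a common metabolomics choice
X_scaled = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(X_scaled)

for label, (pc1, pc2) in zip(groups, scores):
    print(f"{label:>10s}  PC1={pc1:6.2f}  PC2={pc2:6.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_)
```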
40625 Numerical Modeling for Water Engineering and Obstacle Theory
Authors: Mounir Adal, Baalal Azeddine, Afifi Moulay Larbi
Abstract:
Numerical analysis is a branch of mathematics devoted to the development of iterative matrix calculation techniques. Our objective is the optimization of operations to calculate and solve systems of equations of order n, saving time and energy for the computers that are used to calculate and analyze big data by solving matrix equations. Furthermore, this scientific discipline produces results with a margin of approximation error, expressed as error rates. Thus, the results obtained from numerical analysis techniques implemented in computer software such as MATLAB or Simulink offer a preliminary diagnosis of the situation of the environment or the target space. In this way, we can offer the technical procedures needed for engineering or scientific studies exploitable by water engineers.
Keywords: numerical analysis methods, obstacles solving, engineering, simulation, numerical modeling, iteration, computer, MATLAB, water, underground, velocity
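A minimal example of the iterative matrix techniques mentioned is the Jacobi method below; the small diagonally dominant system is made up for illustration and is not taken from the paper.

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=500):
    """Solve A x = b iteratively with the Jacobi method.

    Convergence is guaranteed for strictly diagonally dominant A;
    the residual norm serves as the approximation error discussed
    in numerical analysis.
    """
    x = np.zeros_like(b, dtype=float)
    D = np.diag(A)
    R = A - np.diagflat(D)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

A = np.array([[10.0, -1.0, 2.0],
              [-1.0, 11.0, -1.0],
              [2.0, -1.0, 10.0]])
b = np.array([6.0, 25.0, -11.0])
x = jacobi(A, b)
print("x =", x)
print("residual norm =", np.linalg.norm(A @ x - b))
```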
40624 Analysis of Computer Science Papers Conducted by Board of Intermediate and Secondary Education at Secondary Level
Authors: Ameema Mahroof, Muhammad Saeed
Abstract:
The purpose of this study was to analyze the papers of computer science conducted by the Board of Intermediate and Secondary Education with reference to Bloom’s taxonomy. The present study has two parts. First, the analysis is done on the papers conducted by the Board of Intermediate and Secondary Education on the basis of basic rules of item construction, especially Bloom’s (1956). Second, item analysis is done to improve the psychometric properties of the tests. The sample included the question papers of computer science of higher secondary classes (XI-XII) for the years 2011 and 2012. For item analysis, the data was collected from 60 students through convenient sampling. Findings of the study revealed that in the papers by the Board of Intermediate and Secondary Education the maximum focus was on the knowledge and understanding levels, and very little focus was on application, analysis, and synthesis. Furthermore, the item analysis of the question papers reveals that the item difficulty of most of the questions did not show a balanced paper; some items were very difficult while most of the items were too easy (measuring knowledge and understanding abilities). Likewise, most of the items were not truly discriminating between high and low achievers; four items were even negatively discriminating. The researchers also analyzed the items of the paper through the software ConQuest. These results show that the papers conducted by the Board of Intermediate and Secondary Education were not well constructed. It was recommended that paper setters should be trained in developing question papers that can measure various cognitive abilities of students, so that a good paper in computer science assesses all cognitive abilities of students.
Keywords: Bloom’s taxonomy, question paper, item analysis, cognitive domain, computer science
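The item difficulty and discrimination indices referred to can be computed as in the sketch below; the 0/1 response matrix and the 27% grouping rule are illustrative assumptions rather than the study's data.

```python
import numpy as np

def item_analysis(responses, group_fraction=0.27):
    """Classical item analysis on a binary (0/1) response matrix.

    `responses` has shape (n_students, n_items). Difficulty is the
    proportion of correct answers; discrimination is the difference
    between the upper and lower scoring groups (here the top and
    bottom 27% of students, a common convention).
    """
    totals = responses.sum(axis=1)
    order = np.argsort(totals)
    n_group = max(1, int(round(group_fraction * len(totals))))
    low, high = responses[order[:n_group]], responses[order[-n_group:]]

    difficulty = responses.mean(axis=0)                    # p-value of each item
    discrimination = high.mean(axis=0) - low.mean(axis=0)  # D index of each item
    return difficulty, discrimination

rng = np.random.default_rng(1)
responses = (rng.random((60, 10)) > 0.4).astype(int)       # 60 students, 10 items
p, d = item_analysis(responses)
for i, (pi, di) in enumerate(zip(p, d), start=1):
    flag = " (check: negative discrimination)" if di < 0 else ""
    print(f"item {i:2d}: difficulty={pi:.2f}, discrimination={di:+.2f}{flag}")
```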
40623 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme
Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara
Abstract:
This paper describes the problem of building secure computational services for encrypted information in the Cloud, computing without decrypting the encrypted data. It therefore meets the aspiration of a computational encryption algorithmic model that could enhance the security of big data with respect to privacy or confidentiality, availability, and integrity of the data and the user’s security. The cryptographic model applied for the computational process on the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes that are based on number theory derivable from abstract algebra, which can easily be integrated and leveraged in the Cloud computing interface, with detailed theoretic mathematical concepts for the fully homomorphic encryption models. This contribution enhances the full implementation of big data analytics based on a cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme
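The idea of computing on ciphertexts can be hinted at with a textbook RSA example, which is multiplicatively homomorphic; this is only a toy illustration with insecure parameters and is not the Fully Homomorphic Encryption Scheme (with bootstrapping) that the paper builds on.

```python
# Toy demonstration of a homomorphic property: unpadded RSA is
# multiplicatively homomorphic, i.e. Enc(m1) * Enc(m2) decrypts to m1 * m2.
# A real FHE scheme supports both addition and multiplication on ciphertexts
# and is far more involved than this sketch. Requires Python 3.8+ for pow(e, -1, phi).

p, q = 61, 53                      # small textbook primes (insecure, for illustration)
n = p * q                          # 3233
phi = (p - 1) * (q - 1)            # 3120
e = 17                             # public exponent
d = pow(e, -1, phi)                # private exponent (2753)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

m1, m2 = 12, 7
c1, c2 = encrypt(m1), encrypt(m2)

# Multiply the ciphertexts without ever decrypting the inputs
c_product = (c1 * c2) % n
print(decrypt(c_product))          # 84 == m1 * m2
```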
40622 Efforts to Revitalize Piipaash Language: An Explorative Study to Develop Culturally Appropriate and Contextually Relevant Teaching Materials for Preschoolers
Authors: Shahzadi Laibah Burq, Gina Scarpete Walters
Abstract:
Piipaash, representing one large family of North American languages, Yuman, is reported as one of the seriously endangered languages in the Salt River Pima-Maricopa Indian Community of Arizona. In a collaborative venture between Arizona State University (ASU) and the Salt River Pima-Maricopa Indian Community (SRPMIC), efforts have been made to revitalize and preserve the Piipaash language and its cultural heritage. The present study is one example of several other language documentation and revitalization initiatives that the Humanities Lab at ASU has taken. This study was approved to receive a “Beyond the lab” grant after the researchers successfully created a Teaching Guide for an Early Childhood Piipaash storybook during their time working in the Humanities Lab. The current research is an extension of the previous project and focuses on creating customized teaching materials and tools for the teachers and parents of the students of the Early Enrichment Program at SRPMIC. To determine and maximize the usefulness of the teaching materials with regard to their reliability, validity, and practicality in the given context, this research aims to conduct Environmental Analysis and Needs Analysis: Environmental Analysis to evaluate the Early Enrichment Program situation, and Needs Analysis to investigate the specific and situated requirements of the teachers to assist students in building target language skills. The study employs a qualitative methods approach for the collection of the data. Multiple data collection strategies are used concurrently to gather information from the participants. The research tools include semi-structured interviews with the program administrators and teachers, classroom observations, and teacher shadowing. The researchers utilize triangulation of the data to maintain validity in the process of data interpretation. The preliminary results of the study show a need for culturally appropriate materials that can further the learning of students of the target language as well as the culture, i.e., clay pots and basket-making materials. It was found that the course and teachers focus on developing the listening and speaking skills of the students. Moreover, to assist the young learners beyond the classroom, the teachers could make use of send-home teaching materials to reinforce the learning (e.g., coloring books including illustrations of culturally relevant animals, food, and places). Audio language resources are also identified as helpful additional materials for the parents to assist the learning of the kids.
Keywords: indigenous education, materials development, needs analysis, Piipaash language revitalization
40621 Bacteriological Analysis of Logan's Branch Rowan County, Kentucky Utilizing Membrane Filtration Method
Authors: Elizabeth G. Hereford, Geoffrey W. Gearner
Abstract:
Logan’s Branch, within the Triplett Creek Watershed of Rowan County, Kentucky, is a waterway located near important agricultural and residential areas. Part of Logan’s Branch flows over an exposed black shale formation with elevated radioactivity and heavy metals. Three sites were chosen in relation to the formation and sampled five times over a thirty-day period during the recreational season. A fourth site in North Fork in Rowan County, Kentucky was also sampled periodically as it too has contact with the shale formation. These sites were then sampled monthly. All samples are analyzed for concentrations of Escherichia coli, heterotrophic bacteria, and total coliform bacteria utilizing the membrane filtration method and various culture media. Current data suggests that the radioactivity of the shale formation influences the bacteriological growth present in the waterway; however, further data will be collected and compared with that of my colleagues to confirm this trend.
Keywords: bacteriological analysis, Escherichia coli, heterotrophic bacteria, radioactive black shale formation, water quality
40620 A Study of Student Satisfaction of the University TV Station
Authors: Prapoj Na Bangchang
Abstract:
This research aimed to study the satisfaction of university students with the Suan Sunandha Rajabhat University television station. The sample was 250 undergraduate students from Year 1 to Year 4. The tool used to collect data was a questionnaire. Statistics used in data analysis were percentage, mean, and standard deviation. The results showed that student satisfaction with the University's television station location received a high score, followed by the number of devices, while the content presented received the lowest score. Most students want the content of the programs to be improved, especially entertainment content, followed by sports content.
Keywords: student satisfaction, university TV channel, media, broadcasting
40619 Multi-Temporal Analysis of Vegetation Change within High Contaminated Watersheds by Superfund Sites in Wisconsin
Authors: Punwath Prum
Abstract:
Superfund sites are publicly recognized as a severe environmental problem for surrounding communities and biodiversity due to their hazardous chemical waste from industrial activities. They contaminate soil and water and are also a leading potential point source of pollution affecting the ecosystem in watershed areas through chemical substances. The risks of Superfund sites to a watershed can be effectively measured by utilizing publicly available data and geospatial analysis with free and open-source applications. This study analyzed the vegetation change within high-risk contaminated watersheds in Wisconsin. The high-risk watersheds were identified by the number of Superfund sites each watershed contained. The study identified two potential risk watersheds in Lafayette and analyzed the temporal changes of vegetation within these areas based on Normalized Difference Vegetation Index (NDVI) analysis. Raster statistics were used to compare the change of NDVI values over the period. The analysis results showed that the NDVI values within the Superfund sites' boundaries are significantly lower than in the nearby surroundings, which provides an analogy for the environmental hazard caused by the chemical contamination in Superfund sites.
Keywords: soil contamination, spatial analysis, watershed
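NDVI for the multi-temporal comparison can be computed from red and near-infrared bands as sketched below; the band arrays and site mask are synthetic stand-ins for the rasters such a study would use.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    red = red.astype(float)
    nir = nir.astype(float)
    denom = np.where((nir + red) == 0, 1.0, nir + red)   # avoid division by zero
    return (nir - red) / denom

# Synthetic reflectance rasters standing in for two acquisition dates
rng = np.random.default_rng(7)
red_t1, nir_t1 = rng.random((100, 100)), rng.random((100, 100))
red_t2, nir_t2 = rng.random((100, 100)), rng.random((100, 100))

# Hypothetical mask of pixels inside the Superfund site boundary
site_mask = np.zeros((100, 100), dtype=bool)
site_mask[40:60, 40:60] = True

change = ndvi(red_t2, nir_t2) - ndvi(red_t1, nir_t1)
print("mean NDVI change inside site:  ", change[site_mask].mean())
print("mean NDVI change outside site: ", change[~site_mask].mean())
```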
40618 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach
Authors: Sarisa Pinkham, Kanyarat Bussaban
Abstract:
The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with RMSE = 3.343.
Keywords: daily rainfall, image processing, approximation, pixel value data
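The pixel-value approach and the reported error measure can be reproduced conceptually with the short sketch below; the pixel-class-to-rainfall lookup table and the sample arrays are assumptions, since the abstract does not give its calibration data.

```python
import numpy as np

# Hypothetical lookup: map rainfall-map pixel colour classes to mm of rain
pixel_class_to_mm = {0: 0.0, 1: 5.0, 2: 15.0, 3: 35.0, 4: 70.0}

def estimate_rainfall(pixel_classes):
    """Convert pixel classes extracted from a daily rainfall map to mm."""
    return np.array([pixel_class_to_mm[c] for c in pixel_classes])

# Assumed gauge observations and the classes read from the map at those gauges
observed_mm = np.array([0.0, 4.2, 17.5, 30.1, 68.0])
estimated_mm = estimate_rainfall([0, 1, 2, 3, 4])

rmse = np.sqrt(np.mean((estimated_mm - observed_mm) ** 2))
print(f"RMSE = {rmse:.3f} mm")
```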
40617 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management
Authors: Kenneth Harper
Abstract:
The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks using Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups
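The cryptographic hashing used for data lineage can be sketched as a simple hash chain; this toy example uses Python's hashlib and is not the platform's Cosmos SDK / Substrate implementation.

```python
import hashlib
import json

def record_hash(record: dict, previous_hash: str) -> str:
    """Chain each data record to its predecessor, giving tamper-evident lineage."""
    payload = json.dumps(record, sort_keys=True) + previous_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Hypothetical ingestion of three records into the lineage chain
chain = []
prev = "0" * 64                                  # genesis value
for record in [{"patient": "A-01", "reading": 7.4},
               {"patient": "A-02", "reading": 6.9},
               {"patient": "A-03", "reading": 8.1}]:
    prev = record_hash(record, prev)
    chain.append({"record": record, "hash": prev})

for link in chain:
    print(link["hash"][:16], link["record"])
# Any change to an earlier record changes every later hash,
# which is what makes the lineage tamper-evident.
```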
40616 Fostering Students' Engagement with Historical Issues Surrounding the Field of Graphic Design
Authors: Sara Corvino
Abstract:
The aim of this study is to explore the potential of inclusive learning and assessment strategies to foster students' engagement with historical debates surrounding the field of graphic design. The goal is to respond to the diversity of L4 Graphic Design students at Nottingham Trent University in a way that, instead of 'lowering standards', can benefit everyone. This research tests, measures, and evaluates the impact of a specific intervention, an assessment task, to develop students' critical visual analysis skills and stimulate a deeper engagement with the subject matter. Within the action research approach, this work has followed a case study research method to understand students' views and perceptions of a specific project. The primary methods of data collection have been an anonymous electronic questionnaire and a paper-based anonymous critical incident questionnaire. The NTU College of Business, Law and Social Sciences Research Ethics Committee granted ethical approval for this research in November 2019. Other methods used to evaluate the impact of this assessment task have been the EvaSys report and students' performance. In line with the constructivist paradigm, this study embraces an interpretative and contextualized analysis of the collected data within the triangulation analytical framework. The evaluation of both qualitative and quantitative data demonstrates that active learning strategies and the disruption of thinking patterns can foster greater student engagement and can lead to meaningful learning.
Keywords: active learning, assessment for learning, graphic design, higher education, student engagement
40615 Rainstorm Characteristics over the Northeastern Region of Thailand: Weather Radar Analysis
Authors: P. Intaracharoen, P. Chantraket, C. Detyothin, S. Kirtsaeng
Abstract:
Radar reflectivity data from the Phimai weather radar station of DRRAA (Department of Royal Rainmaking and Agricultural Aviation) were used to analyze rainstorm characteristics via the Thunderstorm Identification Tracking Analysis and Nowcasting (TITAN) algorithm. The Phimai weather radar station is situated in Nakhon Ratchasima province, northeastern Thailand. The data from 277 days of rainstorm events occurring from May 2016 to May 2017 were used to investigate the temporal distribution characteristics of individual convective rainclouds. The important storm properties, structures, and behaviors were analyzed through 9 variables: storm number, storm duration, storm volume, storm area, storm top, storm base, storm speed, storm orientation, and maximum storm reflectivity. The rainstorm characteristics were also examined by separating the data into two periods, wet and dry season, following the announcement of the TMD (Thai Meteorological Department), under the influence of the southwest monsoon (SWM) and the northeast monsoon (NEM). According to the rainstorm characteristics results, rainstorms under the SWM influence were found to be the most potential rainstorms over the northeastern region of Thailand. The SWM rainstorms show a larger number of storms (404 vs. 140 no./day), storm area (34.09 vs. 26.79 km²) and storm volume (95.43 vs. 66.97 km³) than the NEM rainstorms, respectively. For the storm duration, the average individual storm duration during the SWM and NEM showed only a minor difference between the two periods (47.6 vs. 48.38 min), and almost all storm durations in both periods were less than 3 hours. The storm velocity did not exceed 15 km/hr (13.34 km/hr for SWM and 10.67 km/hr for NEM). For the rainstorm reflectivity, little difference was found between the wet and dry seasons (43.08 dBz for SWM and 43.72 dBz for NEM). It is assumed that rainstorms occurring in both seasons have the same raindrop size.
Keywords: rainstorm characteristics, weather radar, TITAN, Northeastern Thailand
40614 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data
Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri
Abstract:
In this paper, we consider and apply parametric modeling to experimental data from a dynamical system. In this study, we investigated different distributions of the output measurements from dynamical systems. Also, by processing the variance of the experimental data, we obtained the region of nonlinearity in the experimental data, and identification of the output section is then applied in different situations and for different data distributions. Finally, the effect of the span of the measurements, such as the variance, on identification is explained, along with the limitations of this approach.
Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification
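One way to read "variance processing" is a sliding-window variance over the output measurements, as in the sketch below; the synthetic signal, window length, and median threshold are assumptions used only to show the idea, not the authors' rule.

```python
import numpy as np

def rolling_variance(signal, window=50):
    """Variance of the output measurement over a sliding window."""
    return np.array([signal[i:i + window].var()
                     for i in range(len(signal) - window + 1)])

# Synthetic output: linear response first, then a saturating (nonlinear) region
rng = np.random.default_rng(3)
u = np.linspace(0, 10, 1000)
y = np.where(u < 5, 2.0 * u, 10.0 + np.tanh(u - 5)) + 0.2 * rng.standard_normal(u.size)

var_track = rolling_variance(y)
threshold = np.median(var_track)
# Assumed heuristic: the window where the variance profile drops below the
# median marks a change in output behavior (here, the saturating region).
first_low = int(np.argmax(var_track < threshold))
print("first window index where variance drops below the median:", first_low)
```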
40613 Twitter Ego Networks and the Capital Markets: A Social Network Analysis Perspective of Market Reactions to Earnings Announcement Events
Authors: Gregory D. Saxton
Abstract:
Networks are everywhere: lunch ties among co-workers, golfing partnerships among employees, interlocking board-of-director connections, Facebook friendship ties, etc. Each network varies in terms of its structure: its size, how inter-connected network members are, and the prevalence of sub-groups and cliques. At the same time, within any given network, some network members will have a more important, more central position on account of their greater number of connections or their capacity as “bridges” connecting members of different network cliques. The logic of network structure and position is at the heart of what is known as social network analysis, and this paper applies this logic to the study of the stock market. Using an array of data analytics and machine learning tools, this study will examine 17 million Twitter messages discussing the stocks of the firms in the S&P 1,500 index in 2018. Each of these 1,500 stocks has a distinct Twitter discussion network that varies in terms of core network characteristics such as size, density, influence, norms and values, level of activity, and embedded resources. The study’s core proposition is that the ultimate effect of any market-relevant information is contingent on the characteristics of the network through which it flows. To test this proposition, this study operationalizes each of the core network characteristics and examines their influence on market reactions to 2018 quarterly earnings announcement events.
Keywords: data analytics, investor-to-investor communication, social network analysis, Twitter
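The core network characteristics (size, density, centrality as a "bridge") can be computed with networkx as in the sketch below; the toy edge list stands in for one stock's Twitter discussion network, which the study builds from millions of tweets.

```python
import networkx as nx

# Hypothetical discussion network for one ticker: edges are reply/mention ties among users
edges = [("ego", "a"), ("ego", "b"), ("ego", "c"), ("ego", "d"),
         ("a", "b"), ("b", "c"), ("c", "d"), ("d", "e"), ("e", "f")]
G = nx.Graph(edges)

size = G.number_of_nodes()
density = nx.density(G)                         # how inter-connected members are
degree_centrality = nx.degree_centrality(G)     # importance by number of ties
betweenness = nx.betweenness_centrality(G)      # importance as a "bridge"

print(f"size = {size}, density = {density:.2f}")
print("most connected member:", max(degree_centrality, key=degree_centrality.get))
print("strongest bridge:", max(betweenness, key=betweenness.get))
```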
40612 Teacher Professional Development – Current Practices in a Secondary School in Brunei Darussalam
Authors: Shanthi Thomas
Abstract:
This research paper presents the current practices of teacher professional development, perceived as beneficial by teachers themselves, in a private secondary school in Brunei Darussalam. This is part of the findings of a larger qualitative study on teacher empowerment, using ethnographic methods for data collection, i.e. participant observation, interviews and document analysis. The field work was carried out over a period of six months in 2013. An analysis of the field data revealed multiple pathways of teacher professional development existing in the school. The results indicate that school leaders, the teacher community in the school, students, and the teachers themselves were the agents in a school that facilitated teacher empowerment. Besides contributing to the knowledge base on teacher professional development, the results of this study provide directions for educational policy makers in their efforts to enhance professional development in secondary schools of similar characteristics. For school leaders and the teacher community, these findings offer guidelines for maximizing the opportunities for these professional development practices, by strengthening collegiality and by using the existing structures optimally for the benefit of all concerned.
Keywords: colleagues and the wider teacher community, school leaders, self-driven professional development, teacher professional development
40611 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R
Authors: Jaya Mathew
Abstract:
Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premise or in the cloud, users can leverage the power of R at scale without having to move their data around.
Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R
40610 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders
Authors: Sven Gehrke, Johannes Ruhland
Abstract:
Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand shows different aspects of the concept of trust and describes the information asymmetry between the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors in relation to trust, problematic aspects of the current approach are verified using various interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and show potential research areas.
Keywords: trust, data mining, CRISP DM, stakeholder management
40609 The Efficiency of AFLP and ISSR Markers in Genetic Diversity Estimation and Gene Pool Classification of Iranian Landrace Bread Wheat (Triticum Aestivum L.) Germplasm
Authors: Reza Talebi
Abstract:
Wheat (Triticum aestivum) is one of the most important food staples in Iran. Understanding genetic variability among the landrace wheat germplasm is important for breeding. Landraces endemic to Iran are a genetic resource that is distinct from other wheat germplasm. In this study, 60 Iranian landrace wheat accessions were characterized with AFLP and ISSR markers. Twelve AFLP primer pairs detected 128 polymorphic bands among the sixty genotypes. The mean polymorphism rate based on AFLP data was 31%; however, a wide polymorphism range among primer pairs was observed (22–40%). The polymorphic information content (PIC value), calculated to assess the informativeness of each marker, ranged from 0.28 to 0.4, with a mean of 0.37. According to the AFLP molecular data, cluster analysis grouped the genotypes in five distinct clusters. ISSR markers generated 68 bands (an average of 6 bands per primer), of which 31 were polymorphic (45%) across the 60 wheat genotypes. The polymorphism information content (PIC) value for ISSR markers was calculated in the range of 0.14 to 0.48 with an average of 0.33. Based on the data obtained by ISSR-PCR, cluster analysis grouped the genotypes in three distinct clusters. Both AFLP and ISSR markers showed a high level of genetic diversity in the Iranian landrace wheat accessions, which have maintained a relatively constant level of genetic diversity during the last years.
Keywords: wheat, genetic diversity, AFLP, ISSR
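The polymorphic information content values reported can be computed per marker as sketched below; the band-presence input and the simplified formula PIC = 1 − (p² + q²) = 2pq, often used for dominant markers scored as band present/absent, are assumptions for illustration rather than the authors' exact procedure.

```python
import numpy as np

def pic_dominant(band_presence):
    """Polymorphic information content for a dominant (present/absent) marker.

    `band_presence` is a 0/1 vector over genotypes. Using the simplified form
    PIC = 1 - (p^2 + q^2) = 2pq, whose maximum for a biallelic dominant
    marker is 0.5.
    """
    p = np.mean(band_presence)          # frequency of band presence
    q = 1.0 - p
    return 1.0 - (p ** 2 + q ** 2)

# Hypothetical scoring of one AFLP band across 60 accessions
rng = np.random.default_rng(5)
band = (rng.random(60) < 0.4).astype(int)
print(f"PIC = {pic_dominant(band):.2f}")
```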
40608 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is made manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and by using a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performances, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
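A minimal version of the multilayer perceptron classification step could look like the sketch below; the feature matrix is random and the layer sizes are assumptions, not the configuration produced by the Training Builder application.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Hypothetical features extracted from sliding windows of EEG: one row per window,
# label 1 for windows overlapping a seizure, 0 otherwise.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 24))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(2000) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

accuracy = clf.score(scaler.transform(X_test), y_test)
print(f"test accuracy: {accuracy:.3f}")
```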
40607 Study of the Persian Gulf’s and Oman Sea’s Numerical Tidal Currents
Authors: Fatemeh Sadat Sharifi
Abstract:
In this research, a barotropic model was employed to carry out tidal studies in the Persian Gulf and the Oman Sea, where the only forcing applied was the tidal force. To do so, a finite-difference, free-surface model called the Regional Ocean Modeling System (ROMS) was employed on data over the Persian Gulf and the Oman Sea. To analyze the flow patterns of the region, the results of a limited-area model, the Finite Volume Community Ocean Model (FVCOM), were used. These two regions were chosen since both are among the most critical water bodies in terms of economy, biology, fishery, shipping, navigation, and petroleum extraction. The OSU Tidal Prediction Software (OTPS) tide and observation data validated the modeled results. Next, tidal elevation and speed, and the tidal analysis, were interpreted. Preliminary results show significant accuracy in the tidal height compared with observation and OTPS data, and indicate that tidal currents are highest in the Strait of Hormuz and the narrow and shallow region between the Iranian coasts and islands. Furthermore, the tidal analysis clarifies that the M2 component has the most significant value. Finally, the Persian Gulf tidal currents are divided into two branches: the first branch runs from the south toward Qatar and, via the United Arab Emirates, turns toward the Strait of Hormuz. The second branch, in the north and west, extends to the upper end of the Persian Gulf and turns counterclockwise at the head of the Gulf.
Keywords: numerical model, barotropic tide, tidal currents, OSU tidal prediction software, OTPS
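The dominance of the M2 constituent mentioned in the tidal analysis can be illustrated by a least-squares harmonic fit; the synthetic sea-level series below is an assumption, and a real analysis would use the ROMS output or observations.

```python
import numpy as np

M2_PERIOD_H = 12.4206012        # principal lunar semidiurnal period, hours
omega = 2.0 * np.pi / M2_PERIOD_H

# Synthetic hourly sea level with an M2 signal plus noise (stand-in for model output)
t = np.arange(0, 30 * 24, 1.0)                           # 30 days, hourly
rng = np.random.default_rng(11)
eta = 0.8 * np.cos(omega * t - 0.6) + 0.05 * rng.standard_normal(t.size)

# Least-squares fit of eta(t) ~ A cos(omega t) + B sin(omega t)
design = np.column_stack([np.cos(omega * t), np.sin(omega * t)])
(A, B), *_ = np.linalg.lstsq(design, eta, rcond=None)

amplitude = np.hypot(A, B)
phase = np.arctan2(B, A)
print(f"M2 amplitude = {amplitude:.3f} m, phase = {phase:.3f} rad")
```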
40606 Wireless Transmission of Big Data Using Novel Secure Algorithm
Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha
Abstract:
This paper presents a novel algorithm for secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination by deploying security at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selection of the data transmitting region, segmenting the selected region, determining the probability ratio for each node (capture node, non-capture node and eavesdropper node) in every segment, and evaluating the probability using a binary-based evaluation. If the transmission is secure, the two-hop transmission of big data resumes; otherwise, the attackers are prevented by the cooperative jamming scheme and the data is transmitted in two hops.
Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance
40605 Comparative Analysis of Enzyme Activities Concerned in Decomposition of Toluene
Authors: Ayuko Itsuki, Sachiyo Aburatani
Abstract:
In recent years, pollution of the environment by toxic substances has become a serious problem. While there are many methods of environmental clean-up, methods based on microorganisms are considered reasonable and safe for the environment. Compost is known to catabolize malodorous substances in its production process; however, the mechanism of its catabolizing system is not yet known. In the catabolization process, organic matter is turned into inorganic matter by the enzymes released from the many microorganisms that live in compost. In other words, the cooperation of activated enzymes in the compost decomposes malodorous substances. Thus, clarifying the interaction among enzymes is important for revealing the catabolizing system of malodorous substances in compost. In this study, we utilized a statistical method to infer the interaction among enzymes. We developed a method which combines partial correlation with cross correlation to estimate the relevance between enzymes, especially from time series data of few variables. Because cross correlation is used, we can estimate not only the associative structure but also the reaction pathway. We applied the developed method to the measured enzyme data and estimated the interactions among the enzymes in the decomposition mechanism of toluene.
Keywords: enzyme activities, comparative analysis, compost, toluene
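The use of cross correlation to recover lag (and hence a candidate reaction pathway) between enzyme activities can be sketched as below; the two synthetic activity series and the simple lag search are illustrative assumptions, not the authors' measured data or their full partial-correlation method.

```python
import numpy as np

def best_lag(x, y, max_lag=10):
    """Lag (in samples) at which y best follows x, via normalized cross-correlation."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = list(range(-max_lag, max_lag + 1))
    corrs = [np.corrcoef(x[max(0, -k):len(x) - max(0, k)],
                         y[max(0, k):len(y) - max(0, -k)])[0, 1] for k in lags]
    k = int(np.argmax(corrs))
    return lags[k], corrs[k]

# Synthetic enzyme activities: enzyme B responds to enzyme A with a delay of 3 samples
rng = np.random.default_rng(2)
enzyme_a = np.cumsum(rng.standard_normal(200))
enzyme_b = np.roll(enzyme_a, 3) + 0.3 * rng.standard_normal(200)

lag, corr = best_lag(enzyme_a, enzyme_b)
print(f"best lag = {lag} samples, correlation = {corr:.2f}")
# A positive lag with high correlation suggests A acts upstream of B
# in the decomposition pathway.
```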
40604 The Inherent Flaw in the NBA Playoff Structure
Authors: Larry Turkish
Abstract:
Introduction: The NBA is an example of mediocrity, and this will be evident in the following paper. The study examines and evaluates the characteristics of the NBA champions. As divisions and playoff teams increase, there is an increase in the probability that the champion originates from the mediocre category. Since its inception in 1947, the league has been mediocre and continues to be to this day. Why does a professional league allow any team with a less than 50% winning percentage into the playoffs? As long as the finances flow into the league, owners will not change the current algorithm. The objective of this paper is to determine if the regular season has meaning in finding an NBA champion. Statistical Analysis: The data originates from the NBA website. The following variables are part of the statistical analysis: Rank, the rank of a team relative to other teams in the league based on the regular season win-loss record; Winning Percentage of a team based on the regular season; Divisions, the number of divisions within the league; and Playoff Teams, the number of playoff teams relative to a particular season. The following statistical applications are applied to the data: Pearson product-moment correlation, analysis of variance, factor and regression analysis. Conclusion: The results indicate that the divisional structure and number of playoff teams have a negative effect on the winning percentage of playoff teams. They also prevent teams with higher winning percentages from accessing the playoffs. Recommendations: 1. Teams that have a winning percentage greater than 1 standard deviation above the regular season mean will have access to the playoffs (eliminates mediocre teams). 2. Eliminate divisions (eliminates weaker teams from access to the playoffs). 3. Eliminate conferences (eliminates weaker teams from access to the playoffs). 4. Have a balanced regular season schedule (reduces the number of regular season games, creates equilibrium, reduces bias), which will also reduce the need for load management.
Keywords: alignment, mediocrity, regression, z-score
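Recommendation 1 (admit only teams more than one standard deviation above the mean winning percentage, i.e., a z-score above 1) can be expressed directly; the winning percentages below are invented for illustration.

```python
import numpy as np

# Hypothetical regular-season winning percentages for a 30-team league
rng = np.random.default_rng(4)
win_pct = np.clip(rng.normal(0.5, 0.12, 30), 0.15, 0.85)

mean, std = win_pct.mean(), win_pct.std()
z_scores = (win_pct - mean) / std

# Playoff access only for teams with z-score > 1 (the paper's first recommendation)
playoff_teams = np.where(z_scores > 1.0)[0]
print("teams qualifying under the z-score rule:", playoff_teams)
print("their winning percentages:", np.round(win_pct[playoff_teams], 3))
```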
40603 Identification of Factors and Impacts on the Success of Implementing Extended Enterprise Resource Planning: Case Study of Manufacturing Industries in East Java, Indonesia
Authors: Zeplin Jiwa Husada Tarigan, Sautma Ronni Basana, Widjojo Suprapto
Abstract:
ERP integrates all data from the various departments within a company into one database. One department inputs the data, and many other departments can access and use the data through the connected information system. As many manufacturing companies in Indonesia implement ERP technology, many adjustments have to be made to align it with the business processes in the companies, especially the management policy and the competitive advantages. Companies that are successful in the initial implementation still have to maintain the process so that the initial success can develop along with the changing business processes of the company. Likewise, companies which have already implemented ERP successfully still need to maintain the system so that it can keep up with business development and changes. The continued success of the extended ERP implementation aims to achieve efficient and effective performance for the company. This research distributed 100 questionnaires to manufacturing companies in East Java, Indonesia, which have implemented ERP and have been live for over five years. Ninety questionnaires were returned, of which ten were disqualified because they came from companies that had implemented ERP for less than five years. Only 80 questionnaires were used as data, a response rate of 80%. Based on the data results and analysis with PLS (Partial Least Squares), it is found that organizational commitment impacts user effectiveness and provides adequate IT infrastructure, and that user effectiveness impacts adequate IT infrastructure. The information quality of the company enhances the implementation of the extended ERP in manufacturing companies in East Java, Indonesia.
Keywords: organization commitment, adequate IT infrastructure, information quality, extended ERP implementation
40602 Analysis and Re-Design Ergonomic Mineral Water Gallon Trolley
Authors: Dessy Laksyana Utami
Abstract:
Manual material handling activities often expose workers to muscle injury due to incorrect posture, so workers need tools to facilitate their activities. One tool to assist in the transportation of materials is a trolley. This tool is very useful because it can carry many items without requiring much extra energy to operate, and it is very comfortable to use; trolleys are widely used in the community. However, workers still have complaints about the old design because of its lack of grip and capacity. After posture analysis with the REBA method, the risk value obtained indicated that the tool needs to be improved. The redesign uses Indonesian anthropometric data at the 50th percentile.
Keywords: material handling, REBA method, postural assessment, trolley
40601 End-User Behavior: Analysis of Their Role and Impacts on Energy Savings Achievements
Authors: Margarida Plana
Abstract:
End-user behavior has become one of the main aspects to be addressed in energy efficiency projects. Especially in the residential sector, end-users have a direct impact that affects the achievement of energy saving targets. This paper focuses on presenting and quantifying the impact of end-user behavior on the basis of the analysis of real project data. The analysis studies the role of buildings' occupants, how their behavior can change the success of energy efficiency projects, and how to limit their impact. The results obtained show two main conclusions. The first one is the easiest to solve: we need to control and limit the end-users' interaction with the equipment operation to be able to reach the fixed targets. The second one: as plugged-in equipment is increasing exponentially in the residential sector, big dissemination efforts are needed in order to explain to citizens the impact of their day-by-day actions through dissemination campaigns.
Keywords: end-users impacts, energy efficiency, energy savings, impact limitations
40600 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques
Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian
Abstract:
Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses and material damage. Traditional studies of road traffic accidents in urban zones require a very high investment of time and money; additionally, the results are not current. However, nowadays in many countries, crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studies of road traffic accidents and the urban congestion caused by them. In this article, we identified the zones, roads and specific times in CDMX (Mexico City) in which the largest number of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was knowledge discovery in databases (KDD) for the discovery of patterns in the accident reports, using data mining techniques with the help of Weka. The selected algorithms were Expectation-Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as a grouping method. Finally, the results were visualized with the Geographic Information System QGIS.
Keywords: data mining, k-means, road traffic accidents, Waze, Weka
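The clustering step (EM to pick the number of clusters, then k-means to group accident reports) can be approximated with scikit-learn as below; the coordinates are synthetic, and a Gaussian mixture with BIC stands in for Weka's EM when choosing the number of clusters.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Synthetic accident report coordinates (latitude, longitude) around CDMX
rng = np.random.default_rng(8)
centers = np.array([[19.43, -99.13], [19.36, -99.18], [19.48, -99.10]])
points = np.vstack([c + 0.01 * rng.standard_normal((300, 2)) for c in centers])

# Choose the number of clusters with a Gaussian mixture (EM) and BIC
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(points).bic(points)
        for k in range(2, 7)}
best_k = min(bics, key=bics.get)

# Group the reports with k-means using the selected k
kmeans = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit(points)
print("selected k:", best_k)
print("cluster centers (lat, lon):")
print(np.round(kmeans.cluster_centers_, 4))
```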