Search results for: data acquisition (DAQ)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25433

24683 Seismic Data Scaling: Uncertainties, Potential and Applications in Workstation Interpretation

Authors: Ankur Mundhra, Shubhadeep Chakraborty, Y. R. Singh, Vishal Das

Abstract:

Seismic data scaling affects the dynamic range of the data; with today's lower storage costs and more reliable hard-disk storage, scaling is generally not recommended. However, when dealing with data of different vintages, which may have been processed in 16 bits or even 8 bits and need to be processed together with available 32-bit data, scaling is performed. Scaling also amplifies low-amplitude events in the deeper region that would otherwise disappear because high-amplitude shallow events saturate the amplitude scale. We have focused on the significance of scaling data to aid interpretation. This study elucidates a proper seismic loading procedure in workstations without using the default preset parameters available in most software suites. Differences and distributions of amplitude values at different depths are probed in this exercise. Proper loading parameters are identified, and the associated steps that need to be taken while loading data are explained. Finally, the exercise interprets the uncertainties which might arise when correlating scaled and unscaled versions of seismic data with synthetics. As the seismic well tie correlates seismic reflection events with well markers, it is used in our study to identify regions which are enhanced and/or affected by the scaling parameter(s).
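A minimal sketch of the kind of rescaling discussed above, assuming a synthetic 16-bit trace and an illustrative depth-dependent gain; the clip check shows how high-amplitude shallow events can saturate the amplitude scale (the values and gain function are hypothetical, not the study's data):

```python
import numpy as np

# Hypothetical 16-bit vintage trace: strong shallow reflections, weak deep ones
trace_16bit = np.array([24000, -21000, 15000, -900, 600, -350, 200], dtype=np.int16)

# Convert to 32-bit float so older vintages can be combined with 32-bit data
trace = trace_16bit.astype(np.float32)

# Illustrative depth-dependent gain to lift low-amplitude deep events
gain = np.linspace(1.0, 40.0, trace.size)        # assumption: linear gain with depth
scaled = trace * gain

# Check for clipping against a hypothetical display/storage range
clip_level = np.float32(32767)
clipped = np.abs(scaled) > clip_level
print("scaled amplitudes:", scaled.round(1))
print("samples that would clip:", np.where(clipped)[0])
```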

Keywords: clipping, compression, resolution, seismic scaling

Procedia PDF Downloads 466
24682 Association of Social Data as a Tool to Support Government Decision Making

Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias

Abstract:

Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates, and which properties are important in analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, data on child labor in the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km² comprising 139 counties) were evaluated. This work aims to detect factors that are determinant for the practice of child labor and their relationships with financial, educational, regional, and social indicators, generating information that is not explicit in the government database, thus enabling better monitoring and updating of policies for this purpose.

Keywords: social data, government decision making, association of social data, data mining

Procedia PDF Downloads 366
24681 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation

Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang

Abstract:

Data assimilation is a model- and data-hybrid-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the discrete event system's non-linearity and non-Gaussianity, the traditional Kalman filter, based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become a technical approach for discrete event simulation data assimilation. Hence, we proposed a particle filter-based discrete event simulation data assimilation method and took an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept to conduct simulation experiments. The experimental results showed that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulation.
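As a minimal sketch of the particle-filter cycle described above (predict, weight by the likelihood of the new observation, resample), here is a bootstrap filter for a toy one-dimensional state; the transition noise, observation noise, and particle count are assumptions for illustration, not the UAV maintenance model itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles = 500
particles = rng.normal(0.0, 1.0, n_particles)      # initial state ensemble
weights = np.full(n_particles, 1.0 / n_particles)

def assimilate(particles, weights, observation, obs_std=0.5, proc_std=0.2):
    # Predict: propagate each particle through a (toy) state-transition model
    particles = particles + rng.normal(0.0, proc_std, particles.size)
    # Update: weight particles by the likelihood of the new observation
    weights = weights * np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    weights /= weights.sum()
    # Resample: draw particles proportionally to their weights (bootstrap filter)
    idx = rng.choice(particles.size, particles.size, p=weights)
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

for obs in [0.3, 0.5, 0.9, 1.2]:                    # hypothetical observation stream
    particles, weights = assimilate(particles, weights, obs)
    print("filtered state estimate:", particles.mean().round(3))
```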

Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven

Procedia PDF Downloads 4
24680 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform

Authors: Sadam Alwadi

Abstract:

Outlier values are a problem that frequently occurs in the data observation or recording process; thus, data imputation has become an essential matter. This work makes use of methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey method and the Maximal Overlap Discrete Wavelet Transform (MODWT) are used to detect and impute the outlier values.
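A minimal sketch of the Tukey (box-plot) rule on a closing-price series; the 1.5 x IQR fences are the standard choice, and the prices below are hypothetical rather than ASE data. Flagged values could then be replaced, for example by interpolation, before any wavelet-based (MODWT) processing:

```python
import numpy as np

close = np.array([2.10, 2.12, 2.11, 2.15, 9.80, 2.14, 2.13, 0.20, 2.16])  # hypothetical prices

q1, q3 = np.percentile(close, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr        # Tukey fences

outliers = (close < lower) | (close > upper)
print("fences:", round(lower, 3), round(upper, 3))
print("outlier indices:", np.where(outliers)[0])

# Simple imputation: replace flagged values by linear interpolation of neighbours
clean = close.astype(float)
clean[outliers] = np.interp(np.where(outliers)[0], np.where(~outliers)[0], close[~outliers])
print("imputed series:", clean)
```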

Keywords: outlier values, imputation, stock market data, detecting, estimation

Procedia PDF Downloads 79
24679 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage

Authors: P. Jayashree, S. Rajkumar

Abstract:

With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and may result in either lossy or lossless data reduction. Though lossy reduction produces better compression ratios compared to lossless methods, many applications require data accuracy and miniature details to be preserved. A variety of data compression algorithms exist in the literature for different forms of data, such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is projected as an enhancement over the irrational number storage coding technique to cater to the storage issues of increasing data volumes as a cost-effective solution, which also offers data security as a secondary outcome to some extent. The proposed work reveals cost effectiveness in terms of a better compression ratio with no deterioration in compression time.

Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding

Procedia PDF Downloads 290
24678 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of a framework for the data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, as well as the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, the data is afforded stronger privacy, and that a fog storage gateway has several advantages over traditional cloud storage: our results show that fog storage has reduced latency/delay, lower bandwidth consumption, and lower energy usage compared with cloud storage, and will therefore help to lessen excessive cost. The paper dwells mainly on the system descriptions; the researchers focused on the research design and framework design for the data privacy model, data storage, and real-time analytics. The paper also shows the major system components and their framework specification. Lastly, the overall research system architecture, its structure, and its interrelationships are shown.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 94
24677 Insertion of Photovoltaic Energy at Residential Level at Tegucigalpa and Comayagüela, Honduras

Authors: Tannia Vindel, Angel Matute, Erik Elvir, Kelvin Santos

Abstract:

Currently, Honduras is incentivizing the generation of energy from renewable sources, such as hydroelectricity, wind power, biomass and, more recently and with the strongest growth, photovoltaic energy. By July 2015, 455.2 MW of photovoltaic capacity had been installed, increasing the installed capacity of the national interconnected system existing in 2014 by 24%, according to the National Energy Company (NEC), which made it possible to reduce the thermoelectric dependency of the system. Given the good results of those large-scale photovoltaic plants, the question arises: is the integration of micro-scale photovoltaic systems in urban and rural areas of interest to the distribution utility and to consumers? To answer that question, the insertion of photovoltaic energy in the residential sector in Tegucigalpa and Comayagüela (Central District), Honduras, has been researched to determine its technical and economic viability. Francisco Morazán department had, according to the National Statistics Institute (NSI), more than 180,000 houses with power service in 2001. Tegucigalpa, the departmental capital and capital of Honduras, and Comayagüela together have the highest population density in the region, with 1,300,000 inhabitants in 2014 (NSI). The residential sector in the south-central region of Honduras represents a high percentage of total consumption, 49% according to NEC in 2014, and 90% of this sector consumes in the range of 0 to 300 kWh/month. All this, in addition to the high level of losses in the transmission and distribution systems (31.3% in 2014) and the availability of an annual average solar radiation of 5.20 kWh/(m²∙day) according to NASA, suggests the feasibility of implementing photovoltaic systems as a solution to give a level of independence to the households, which could in addition inject unused energy into the grid. The capability of exchanging energy with the grid could make the acquisition of photovoltaic systems more affordable to consumers through energy compensation programs or other kinds of incentives that could be created. The technical viability of inserting photovoltaic systems has been analyzed, considering the monthly average solar radiation to determine the monthly average energy that would be generated with locally accessible technology, as well as the effects on the grid of injecting the locally generated energy. In addition, the economic viability has been analyzed, considering the high cost of photovoltaic systems, utility costs, location, and the monthly energy consumption requirements of the families. It was found that the inclusion of photovoltaic systems in Tegucigalpa and Comayagüela could decrease the demand of the region by 6 MW if 100% of the households used photovoltaic systems, whose acquisition may become more accessible with the help of government incentives and/or energy exchange programs.
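A back-of-the-envelope sketch of the residential sizing logic implied above, using the reported 5.20 kWh/(m²·day) average radiation and the 0-300 kWh/month consumption band; the system losses (performance ratio) and array rating are assumptions for illustration only:

```python
# Rough monthly-energy estimate for a residential PV system (illustrative values)
peak_sun_hours = 5.20        # kWh/(m^2*day), annual average reported for the region
days_per_month = 30
performance_ratio = 0.75     # assumption: wiring, inverter, temperature losses
monthly_consumption = 300    # kWh/month, upper bound of the 0-300 kWh band

def monthly_generation(kwp):
    """Approximate monthly energy (kWh) from an array rated at kwp kilowatts-peak."""
    return kwp * peak_sun_hours * days_per_month * performance_ratio

# Array size needed to offset the whole monthly consumption
required_kwp = monthly_consumption / (peak_sun_hours * days_per_month * performance_ratio)
print(f"~{required_kwp:.2f} kWp needed to cover {monthly_consumption} kWh/month")
print(f"a 2.0 kWp array would yield ~{monthly_generation(2.0):.0f} kWh/month")
```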

Keywords: grid connected, photovoltaic, residential, technical analysis

Procedia PDF Downloads 257
24676 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data

Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif

Abstract:

Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very few scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is presented together with field data. A wide range of field data was used, consisting of both live-bed and clear-water scour. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method. The best-performing scour equation was analyzed using statistical analysis. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when compared with the large amount of laboratory and field data.
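A small sketch of the two performance measures named above, with the discrepancy ratio taken in its common form (computed scour depth divided by observed scour depth); the depth values are hypothetical, not the field data set used in the paper:

```python
import numpy as np

observed = np.array([1.8, 2.4, 3.1, 0.9, 2.0])   # hypothetical observed scour depths (m)
computed = np.array([2.0, 2.1, 3.6, 1.1, 1.7])   # hypothetical equation predictions (m)

discrepancy_ratio = computed / observed           # ideal value is 1.0
rmse = np.sqrt(np.mean((computed - observed) ** 2))

print("mean discrepancy ratio:", discrepancy_ratio.mean().round(3))
print("RMSE (m):", rmse.round(3))
```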

Keywords: field data, local scour, scour equation, wide piers

Procedia PDF Downloads 402
24675 A Learning Process for Aesthetics of Language in Thai Poetry for High School Teachers

Authors: Jiraporn Adchariyaprasit

Abstract:

The aesthetics of language in Thai poetry emerge from the combination of sounds and meanings. The appreciation of such beauty can be achieved by means of education, acquisition of knowledge, and training. This research aims to study the learning process for the aesthetics of language in Thai poetry among high school teachers in Bangkok and nearby provinces. Ten samples were selected by purposive sampling for in-depth interviews. According to the research, there are four patterns in the learning process of aesthetics of language in Thai poetry: 1) the study of the characteristics and patterns of poetry, 2) training in poetic reading, 3) the study of the social and cultural contexts of a poem's creation, and 4) the study of other disciplines related to poetry, such as linguistics, traditional dance, and so on.

Keywords: aesthetics, poetry, Thai poetry, poetry learning

Procedia PDF Downloads 429
24674 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol

Authors: Inkyu Kim, SangMan Moon

Abstract:

The IEEE 802.11b protocol provides a data rate of up to 11 Mbps, whereas the aerospace industry seeks a higher-data-rate COTS data link system for UAVs. The Total Maximum Throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the number of UAVs in formation flight is bounded by the performance limitations of the data link protocol.
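A textbook-style estimate of the 802.11b maximum throughput for a single saturated sender, in the spirit of the TMT analysis above; the timing constants (long PLCP preamble, DIFS, SIFS, slot time, CWmin, ACK at 2 Mbps) are the usual published values but should be read as assumptions here, and the frame sizes are illustrative:

```python
# Theoretical maximum throughput of IEEE 802.11b for one saturated sender (no collisions)
payload_bytes = 1500
mac_overhead_bytes = 28 + 8          # MAC header + FCS + LLC/SNAP (assumed)
plcp_us = 192.0                      # long preamble + PLCP header at 1 Mbps
difs_us, sifs_us, slot_us = 50.0, 10.0, 20.0
mean_backoff_us = (31 / 2) * slot_us # average backoff with CWmin = 31

t_data = plcp_us + (payload_bytes + mac_overhead_bytes) * 8 / 11.0   # data at 11 Mbps
t_ack = plcp_us + 14 * 8 / 2.0                                       # ACK at 2 Mbps
cycle_us = difs_us + mean_backoff_us + t_data + sifs_us + t_ack

throughput_mbps = payload_bytes * 8 / cycle_us
print(f"per-frame cycle: {cycle_us:.0f} us, max throughput: {throughput_mbps:.2f} Mbps")
# prints roughly 6.2 Mbps, well below the nominal 11 Mbps rate
```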

Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application

Procedia PDF Downloads 388
24673 Methods for Distinction of Cattle Using Supervised Learning

Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl

Abstract:

Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns which are used to classify individuals into groups. The result of the analysis is a pattern which can be used to identify a data set without needing the input data used to create the pattern. Important requirements in this process are careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In case of missing pedigree information, other methods can be used to trace an animal's origin. The genetic diversity recorded in genetic data holds relatively useful information for identifying animals originating from individual countries. We conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and identifying an individual.
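A minimal sketch of the supervised-learning step described above, training a classifier to assign animals to a population of origin from genotype features; the random SNP matrix, labels, and the choice of a random-forest model are purely illustrative stand-ins for the molecular genetic data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_animals, n_snps = 200, 50
genotypes = rng.integers(0, 3, size=(n_animals, n_snps))   # hypothetical SNPs coded 0/1/2
origin = rng.integers(0, 2, size=n_animals)                # hypothetical country-of-origin labels

X_train, X_test, y_train, y_test = train_test_split(
    genotypes, origin, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)                                # learn the identification pattern
print("held-out accuracy:", model.score(X_test, y_test))   # ~0.5 here, since labels are random
```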

Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning

Procedia PDF Downloads 544
24672 Router 1X3 - RTL Design and Verification

Authors: Nidhi Gopal

Abstract:

Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device and its top-level architecture, and how the various sub-modules of the router, i.e., register, FIFO, FSM, and synchronizer, are synthesized, simulated, and finally connected to the top module.

Keywords: data packets, networking, router, routing

Procedia PDF Downloads 806
24671 Numerical Simulation and Laboratory Tests for Rebar Detection in Reinforced Concrete Structures using Ground Penetrating Radar

Authors: Maha Al-Soudani, Gilles Klysz, Jean-Paul Balayssac

Abstract:

The aim of this paper is to use Ground Penetrating Radar (GPR) as a non-destructive testing (NDT) method and to increase its accuracy in recognizing the geometry of reinforced concrete structures and, in particular, the position of steel bars. This information will help managers assess the state of their structures, on the one hand with respect to safety constraints and, on the other, to quantify the need for maintenance and repair. Several configurations of acquisition and processing of the simulated signal were tested to propose and develop an appropriate imaging algorithm in the propagation medium to locate the rebar accurately. A subsequent experimental validation was performed by testing the imaging algorithm on real reinforced concrete structures. The results indicate that this algorithm is capable of estimating the position of the reinforcing steel bars to within 0-1 mm.

Keywords: GPR, NDT, reinforced concrete structures, rebar location

Procedia PDF Downloads 498
24670 Comparison of Iodine Density Quantification through Three Material Decomposition between Philips iQon Dual Layer Spectral CT Scanner and Siemens Somatom Force Dual Source Dual Energy CT Scanner: An in vitro Study

Authors: Jitendra Pratap, Jonathan Sivyer

Abstract:

Introduction: Dual energy/spectral CT scanning permits the simultaneous acquisition of two x-ray spectral datasets and can complement radiological diagnosis by allowing tissue characterisation (e.g., uric acid vs. non-uric acid renal stones), enhancing structures (e.g., boosting the iodine signal to improve contrast resolution), and quantifying substances (e.g., iodine density). However, the latter has shown inconsistent results between the two main modes of dual energy scanning (i.e., dual source vs. dual layer). Therefore, the present study aimed to determine which technology is more accurate in quantifying iodine density. Methods: Twenty vials with known concentrations of iodine solution were made using Optiray 350 contrast media diluted in sterile water. The concentrations of iodine ranged from 0.1 mg/ml to 1.0 mg/ml in 0.1 mg/ml increments and from 1.5 mg/ml to 4.5 mg/ml in 0.5 mg/ml increments, followed by further concentrations of 5.0 mg/ml, 7 mg/ml, 10 mg/ml, and 15 mg/ml. The vials were scanned using the Dual Energy scan mode on a Siemens Somatom Force at 80kV/Sn150kV and 100kV/Sn150kV kilovoltage pairings. The same vials were scanned using the Spectral scan mode on a Philips iQon at 120 kVp and 140 kVp. The images were reconstructed at 5 mm thickness and 5 mm increment using the Br40 kernel on the Siemens Force and the B filter on the Philips iQon. Post-processing of the Dual Energy data was performed on vendor-specific Siemens Syngo VIA (VB40), and the Spectral data were processed on Philips Intellispace Portal (ver. 12). For each vial and scan mode, the iodine concentration was measured by placing an ROI in the coronal plane. Intraclass correlation analysis was performed on both datasets. Results: The iodine concentrations were reproduced with a high degree of accuracy by the dual-layer CT scanner. Although the dual-source images showed a greater degree of deviation in measured iodine density for all vials, the dataset acquired at 80kV/Sn150kV had higher accuracy. Conclusion: Spectral CT scanning by the dual-layer technique has higher accuracy for quantitative measurements of iodine density compared to the dual-source technique.

Keywords: CT, iodine density, spectral, dual-energy

Procedia PDF Downloads 117
24669 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information in relation to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page and propose noise web data reduction tools which mainly focus on eliminating noise in relation to the content and layout of web data. This paper argues that not all data that form part of the main web page are of user interest, and not all noise data are actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction in the noisiness level of a web user profile, but also a decrease in the loss of useful information, hence improving the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented. The results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.

Keywords: web log data, web user profile, user interest, noise web data learning, machine learning

Procedia PDF Downloads 262
24668 Development of Self-Reliant Satellite-Level Propulsion System by Using Hydrogen Peroxide Propellant

Authors: H. J. Liu, Y. A. Chan, C. K. Pai, K. C. Tseng, Y. H. Chen, Y. L. Chan, T. C. Kuo

Abstract:

To satisfy the mission requirements of the FORMOSAT-7 project, NSPO has initiated a self-reliant development of satellite propulsion technology. A trade-off study on different types of on-board propulsion system has been done. A green propellant, high-concentration hydrogen peroxide (H2O2 hereafter), is chosen in this research because it is ITAR-free, nontoxic, and easy to produce. As the components designed for either cold gas or hydrazine propulsion systems are not suitable for an H2O2 propulsion system, the primary objective of the research is to develop components compatible with H2O2. By cooperating with domestic research institutes and manufacturing vendors, several prototype components, including a diaphragm-type tank, pressure transducer, ball latching valve, and one-Newton thruster with catalyst bed, were manufactured, and the functional tests were performed successfully according to the mission requirements. The requisite environmental tests, including the hot firing test, thermal vacuum test, vibration test, and compatibility test, are being prepared and will be completed in the near future. To demonstrate the subsystem function, an Air-Bearing Thrust Stand (ABTS) and a real-time Data Acquisition & Control System (DACS) were implemented to assess the performance of the proposed H2O2 propulsion system. By measuring the distance that the thrust stand has traveled in a given time, the thrust force can be derived from the kinematics equation. To validate the feasibility of the approach, it is scheduled to assess the performance of a cold gas (N2) propulsion system prior to the H2O2 propulsion system.
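A short worked example of the kinematics step mentioned above: for a thrust stand starting from rest under a constant force, d = 1/2·a·t², so F = 2·m·d/t². The mass, travel distance, and time below are hypothetical, not measurements from the ABTS:

```python
# Thrust from air-bearing thrust-stand travel, assuming constant force from rest
mass_kg = 12.0        # hypothetical moving mass of the stand plus thruster
distance_m = 0.050    # hypothetical distance travelled
time_s = 1.1          # hypothetical elapsed time

acceleration = 2.0 * distance_m / time_s**2     # from d = 0.5 * a * t^2
thrust_n = mass_kg * acceleration               # F = m * a
print(f"estimated thrust: {thrust_n:.2f} N")    # ~1 N for these example numbers
```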

Keywords: FORMOSAT-7, green propellant, Hydrogen peroxide, thruster

Procedia PDF Downloads 425
24667 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study

Authors: Zeba Mahmood

Abstract:

Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology systems have changed the way of conducting business today. Business operations today rely more on the data they obtain, and this data is continuously increasing in volume. The data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for their decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketers and customer relationship departments of firms use data mining techniques to make relevant decisions. This paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues in the execution of these techniques are also discussed and critically analyzed in this paper.

Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining

Procedia PDF Downloads 533
24666 An Analysis of Laboratory Management Practices and Laid down Standard in Some Colleges of Education in Kano State, Nigeria

Authors: Joseph Abiodun Ayo

Abstract:

The main purpose of this study was to investigate the science laboratory management practices employed in some colleges of education in Kano State, Nigeria. Four specific objectives were stated to guide the study, four research questions were investigated, and four null hypotheses were tested at the 0.05 level of significance. A survey design was used, with science laboratory management questionnaires soliciting the responses used in answering the research questions and testing the hypotheses. These questionnaires were distributed to the respective respondents in the sampled colleges. The respondents comprised biology, chemistry, physics, and integrated science teacher trainers and the paraprofessionals. Data were analyzed using the mean and standard deviation to answer the questions, and the chi-square statistical technique was used to test the hypotheses. The findings of the study revealed that all procedures for the control of laboratory activities were rarely observed, while safety procedures were only occasionally practiced. On the provision and procurement of laboratory equipment and materials, it was observed that both the academic staff and the paraprofessionals were not fully involved. Maintenance measures were only occasionally observed, and science laboratory management procedures were not frequently practiced, making the acquisition of science process skills by students difficult. To arrest these anomalies, it is recommended that direct labor by paraprofessionals in the maintenance of laboratory equipment and other apparatus be emphasized, that training of academic staff and paraprofessionals through workshops be instituted so that they acquire technical skills in the maintenance of science laboratory equipment and increase professionalism, and that periodic supervision of activities in the science laboratories be done promptly.

Keywords: laboratory, management, standard, facility

Procedia PDF Downloads 433
24665 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data is a collection of data sets so large and complex that they become difficult to process using database management tools. Operations like search, analysis, and visualization are performed on big data by using data mining, which is the process of extracting patterns or knowledge from large data sets. In recent years it has been observed that the results of data mining applications become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To optimize the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
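A toy sketch of the key-value-pair-level idea described above, applied to a word count: the preserved per-key state is reused and only the keys touched by the delta input are recomputed, instead of re-running the whole job from scratch. This illustrates the general principle rather than the i2MapReduce implementation itself:

```python
from collections import Counter

# Preserved fine-grain state from the previous run: word -> count
saved_state = Counter({"data": 10, "mining": 4, "graph": 2})

def incremental_update(state, added_docs, removed_docs):
    """Refresh counts using only the changed (delta) documents."""
    delta = Counter()
    for doc in added_docs:
        delta.update(doc.split())
    for doc in removed_docs:
        delta.subtract(doc.split())
    # Only keys present in the delta are touched; all other state is reused as-is
    for key, change in delta.items():
        state[key] += change
    return state

new_state = incremental_update(saved_state,
                               added_docs=["big data mining", "data streams"],
                               removed_docs=["graph mining"])
print(new_state)   # counts refreshed without re-scanning the full corpus
```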

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 345
24664 Potentials for Learning History through Role-Playing in Virtual Reality: An Exploratory Study on Role-Playing on a Virtual Heritage Site

Authors: Danzhao Cheng, Eugene Ch'ng

Abstract:

Virtual Reality technologies can reconstruct cultural heritage objects and sites to a high level of realism. Concentrating mostly on documenting authentic data and accurate representations of tangible content, current virtual heritage is limited to accumulating visually presented objects. Such constructions, however, are fragmentary and may not convey the inherent significance of heritage in a meaningful way. One strategy for contextualising fragmentary historical content so that history can be told is to create a guided narrative via role-playing. Such an approach can strengthen the logical connections between cultural elements and facilitate creative synthesis within the virtual world. This project reconstructed the Ningbo Sanjiangkou site in VR as it was in the Yuan Dynasty, combining VR technology with a role-playing game approach. The results with 80 pairs of participants suggest that VR role-playing can be beneficial in a number of ways. Firstly, it creates thematic interactivity which encourages users to explore the virtual heritage in a more entertaining way with task-oriented goals. Secondly, the experience becomes highly engaging, since users can interpret a historical context through the perspective of specific roles that existed in past societies. Thirdly, personalisation allows open-ended sequences of the expedition, reinforcing the user’s acquisition of procedural knowledge relative to the cultural domain. To sum up, role-playing in VR poses great potential for experiential learning, as it allows users to interpret a historical context in a more entertaining way.

Keywords: experiential learning, maritime silk road, role-playing, virtual heritage, virtual reality

Procedia PDF Downloads 162
24663 Adult and Non-Formal Education for the Attainment of Entrepreneurial Skills in Nigeria

Authors: Zulaiha Maluma Ahmad

Abstract:

This paper examines adult and non-formal education for the attainment of entrepreneurial skills, empowering citizens for Nigeria’s socioeconomic development. It highlights the meaning of education in the context of skill acquisition, entrepreneurial education, and adult and non-formal education. It also examines the objectives, issues, and challenges, as well as the prospects, of this type of education, and further discusses the role of adult and non-formal education in the attainment of socioeconomic development for a growing nation like Nigeria. The paper equally proffers some recommendations and concludes that adult and non-formal education can indeed make self-reliance, personal satisfaction, and the attainment of entrepreneurial education for the socioeconomic development of any nation possible.

Keywords: entrepreneurial education, adult education, non formal education skills, Nigeria

Procedia PDF Downloads 589
24662 L2 Exposure Environment, Teaching Skills, and Beliefs about Learners’ Out-of-Class Learning: A Survey on Teachers of English as a Foreign Language

Authors: Susilo Susilo

Abstract:

In the process of foreign language acquisition, L2 exposure is widely assumed to help learners increase their proficiency. However, getting enough L2 exposure in the context of learning English as a foreign language is not as easy as in the first-language learning context. Therefore, beyond-the-classroom L2 exposure is helpful for EFL learners in achieving language tasks. Alongside the rapid development of technology and media, English as a foreign language is used virtually in the social media of almost all regions, affecting the face of Teaching English as a Foreign Language (TEFL). This different face of TEFL unavoidably prompts teachers to treat their students differently in the classroom so that they can put more effort into maximizing beyond-the-class learning to help improve in-class achievements. The study aims to investigate: 1) EFL teachers’ teaching skills and beliefs about students’ out-of-class activities in different L2 exposure environments, and 2) the effect of different L2 exposure environments on EFL teachers’ teaching skills and beliefs about students’ out-of-class activities. This is a survey of 80 EFL teachers from senior high schools in three regions of two provinces in Indonesia. A questionnaire using a four-point Likert scale was distributed to the respondents to elicit data. The questionnaires were developed by referring to the constructs of teaching skills (i.e., teaching preparation, teaching action, and teaching evaluation) and beliefs about out-of-class learning (i.e., setting, process, and atmosphere), which were taken from expert definitions. The internal consistencies of these constructs were examined using Cronbach’s alpha. The data were analyzed using the SPSS program, i.e., descriptive statistics and independent-samples t-tests, with significance set at p < .05. The results revealed that: 1) the teaching skills performed by the teachers of English as a foreign language in different exposure environments showed varying focuses; 2) the teachers held varying beliefs about students’ out-of-class activities in different exposure environments; 3) there was a significant difference in the scores for NNESTs’ teaching skills between urban schools (M=34.5500, SD=4.24838) and rural schools (M=24.9500, SD=2.42794); t(78)=12.408, p=0.000; and 4) there was a significant difference in the scores for NNESTs’ beliefs about students’ out-of-class activities between urban schools (M=36.9250, SD=6.17434) and rural schools (M=29.4250, SD=4.56793); t(78)=6.176, p=0.000. These results suggest that different L2 exposure environments really do have effects on teachers’ teaching skills and beliefs about their students’ out-of-class learning.
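For readers who want to check result 3) above, the reported t value can be reproduced from the summary statistics with a pooled-variance independent-samples t-test; the only added assumption is that the 80 teachers split evenly into 40 urban and 40 rural respondents (consistent with df = 78):

```python
import math

m1, s1, n1 = 34.5500, 4.24838, 40   # urban teaching-skill scores (reported)
m2, s2, n2 = 24.9500, 2.42794, 40   # rural teaching-skill scores (reported)

# Pooled-variance independent-samples t-test from summary statistics
sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
print(f"t({n1 + n2 - 2}) = {t:.3f}")   # ~12.41, matching the reported t(78) = 12.408
```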

Keywords: belief about EFL out-of-class learning, L2 exposure environment, teachers of English as a foreign language, teaching skills

Procedia PDF Downloads 339
24661 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Currently, in analyzing large-scale recurrent event data, there are many challenges such as memory limitations, unscalable computing time, etc. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators as the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of this proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.
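A simplified sketch of the divide-and-conquer scheme described above, using an exponential event-rate model in place of the parametric frailty model: the data are split into random subsets, a maximum likelihood estimate is computed per subset, and the subset estimators are combined with inverse-variance weights. The data and the simple rate model are illustrative assumptions, not the paper's frailty analysis:

```python
import numpy as np

rng = np.random.default_rng(2)
gap_times = rng.exponential(scale=2.0, size=100_000)   # hypothetical recurrent-event gap times

def subset_mle(x):
    """MLE of an exponential event rate and its approximate variance (rate^2 / n)."""
    rate = 1.0 / x.mean()
    return rate, rate**2 / x.size

# Divide: random subsets.  Conquer: combine subset MLEs with inverse-variance weights.
subsets = np.array_split(rng.permutation(gap_times), 20)
estimates, variances = zip(*(subset_mle(s) for s in subsets))
weights = 1.0 / np.array(variances)
combined = np.sum(weights * np.array(estimates)) / weights.sum()

print("combined rate estimate:", round(combined, 4))
print("full-data rate estimate:", round(1.0 / gap_times.mean(), 4))   # should be very close
```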

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 161
24660 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data

Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau

Abstract:

Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium, visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within each contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consisted of three Python scripts that could all be easily accessed through a Python Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We then compared our automated pipeline outputs with the outputs of manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that our automated pipeline efficiently pinpoints neuronal cell body locations and contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using binary thresholding and grayscale image conversion to allow computer vision to better distinguish between cells and non-cells. Its results were also comparable to manually analyzed results, but with significantly reduced result acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline’s cell body and contour detection in order to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer vision-based analysis of calcium imaging recordings from neuronal cell bodies in neuronal cell cultures. Our next goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
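A condensed sketch of the computer-vision steps named above (binary thresholding, contour detection, and masked mean fluorescence), using standard OpenCV calls; the synthetic frame and the threshold/area values are assumptions for illustration, not the pipeline's actual parameters:

```python
import cv2            # assumes OpenCV >= 4 (findContours returns two values)
import numpy as np

# Synthetic 8-bit grayscale frame standing in for one calcium-imaging image
frame = np.zeros((128, 128), dtype=np.uint8)
cv2.circle(frame, (40, 40), 8, 180, -1)     # bright "cell body"
cv2.circle(frame, (90, 70), 6, 140, -1)

# 1) Binary thresholding so cell bodies stand out from background
_, binary = cv2.threshold(frame, 100, 255, cv2.THRESH_BINARY)

# 2) Detect neuron contours
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# 3) Mean fluorescence inside each contour (a time series when applied frame by frame)
for cnt in contours:
    if cv2.contourArea(cnt) < 20:           # assumed minimum cell-body area
        continue
    mask = np.zeros_like(frame)
    cv2.drawContours(mask, [cnt], -1, 255, thickness=-1)
    mean_f = cv2.mean(frame, mask=mask)[0]
    print("contour area:", cv2.contourArea(cnt), "mean fluorescence:", round(mean_f, 1))
```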

Keywords: calcium imaging, computer vision, neural activity, neural networks

Procedia PDF Downloads 79
24659 Development of a Smart System for Measuring Strain Levels of Natural Gas and Petroleum Pipelines on Earthquake Fault Lines in Turkiye

Authors: Ahmet Yetik, Seyit Ali Kara, Cevat Özarpa

Abstract:

Load changes occur on natural gas and oil pipelines due to natural disasters. The displacement of the soil around natural gas and oil pipes in situations that cause erosion, such as earthquakes, landslides, and floods, is the source of this load change. The exposure of natural gas and oil pipes to variable loads causes deformation, cracks, and breaks in these pipes, and cracks and breaks cause harm to people and the environment through events such as explosions. Examinations made after natural disasters show clearly which of the pipes in the monitored regions have sustained more damage, and it has been determined that the earthquakes in Turkey caused permanent damage to the pipelines. This project was designed and realized because cracks and gas leaks were found in the insulation gaskets placed in the pipelines, especially at the junction points. In this study, a new SCADA (Supervisory Control and Data Acquisition) application has been developed to monitor load changes caused by natural disasters. The newly developed SCADA application monitors the changes in the x, y, and z axes of the stresses occurring in the pipes with the help of strain gauge sensors placed on the pipes. For the developed SCADA system, test setups in accordance with the standards were created during the fieldwork; these test setups were integrated into the SCADA system, and the system was followed up. Thanks to the SCADA system developed with the field application, the load changes that occur on the natural gas and oil pipes are monitored instantly, accumulations that may create a load on the pipes and their surroundings are addressed immediately, and new risks that may arise are prevented. The system has contributed to energy supply security, asset management, holistic pipeline management, and sustainability.
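A minimal sketch of the per-axis conversion such a monitoring system needs: a quarter-bridge strain-gauge voltage ratio is turned into strain via the gauge factor (small-signal approximation), then into stress with Hooke's law, and checked against an alert threshold. The gauge factor, excitation voltage, pipe-steel modulus, threshold, and readings are assumptions for illustration:

```python
GAUGE_FACTOR = 2.0          # typical metallic foil gauge (assumed)
EXCITATION_V = 5.0          # bridge excitation voltage (assumed)
E_STEEL_PA = 200e9          # Young's modulus of pipe steel (assumed)
ALERT_STRESS_PA = 250e6     # hypothetical alert threshold

def axis_stress(bridge_out_v: float) -> float:
    """Quarter-bridge small-signal approximation: strain = 4*(Vout/Vex)/GF."""
    strain = 4.0 * (bridge_out_v / EXCITATION_V) / GAUGE_FACTOR
    return E_STEEL_PA * strain          # Hooke's law: sigma = E * epsilon

for axis, v_out in {"x": 0.0012, "y": 0.0035, "z": 0.0004}.items():   # hypothetical readings (V)
    sigma = axis_stress(v_out)
    status = "ALERT" if abs(sigma) > ALERT_STRESS_PA else "ok"
    print(f"axis {axis}: stress = {sigma / 1e6:.1f} MPa [{status}]")
```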

Keywords: earthquake, natural gas pipes, oil pipes, strain measurement, stress measurement, landslide

Procedia PDF Downloads 68
24658 Adoption of Big Data by Global Chemical Industries

Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta

Abstract:

The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the importance of professional competencies and data science in the adoption of BD in chemical industries, to help the industry move towards intelligent manufacturing quickly and reliably. The article utilizes a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.

Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science

Procedia PDF Downloads 81
24657 Secure Multiparty Computations for Privacy Preserving Classifiers

Authors: M. Sumana, K. S. Hareesha

Abstract:

Secure computations are essential when performing privacy-preserving data mining. Distributed privacy-preserving data mining involves two or more sites that cannot pool their data with a third party because doing so would violate laws protecting the individual. Hence, in order to model the private data without compromising privacy or losing information, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value.
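A toy illustration of the additive idea behind such secure computations: with additive secret sharing, each site splits its private value into random shares, no single party ever sees another site's value, yet the shares reconstruct the exact sum (and hence the mean). This stands in for, rather than reproduces, the homomorphic-encryption protocols discussed in the paper:

```python
import secrets

PRIME = 2**61 - 1   # all share arithmetic is done modulo a large prime

def share(value: int, n_parties: int):
    """Split a private value into n additive shares that sum to it modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three sites, each holding a private value they will not reveal
private_values = [42, 17, 99]
all_shares = [share(v, 3) for v in private_values]

# Each party sums the shares it received (one from every site); the totals are then combined
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]
secure_total = sum(partial_sums) % PRIME

print("secure sum:", secure_total, "== plain sum:", sum(private_values))
print("secure mean:", secure_total / len(private_values))
```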

Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data

Procedia PDF Downloads 408
24656 The Politics of Land Grabbing in Ethiopia

Authors: Esayas Geleta

Abstract:

Within the last two decades, in many sub-Saharan African countries, large-scale acquisition (lease, concession, outright purchase) of extensive areas of farmland commonly labeled as 'idle' or 'under-utilized' has resulted in displacement and dispossession without 'compensation.' This paper seeks to critically illustrate the processes and the consequences of the 'land grabbing project' in Ethiopia. Drawing on the theory of participatory development and empirical studies undertaken in Ethiopia, the paper elucidates the power dynamics that influence how and why dislocation and dispossession occur. The paper then demonstrates why the land-grabbing project, which was hugely supported by many international organizations, has largely failed in Ethiopia. Through a critical analysis of the process of 'land grabbing' in Ethiopia, the paper contributes to a more adequate and critical understanding of contemporary land deals and their social and environmental consequences.

Keywords: land grabbing, human rights, dispossession, resistance, governance

Procedia PDF Downloads 79
24655 Dual Set Point Governor Control Structure with Common Optimum Temporary Droop Settings for both Islanded and Grid Connected Modes

Authors: Deepen Sharma, Eugene F. Hill

Abstract:

For nearly 100 years, hydro-turbine governors have operated with only a frequency set point. This natural governor action means that the governor responds to disturbances in system frequency with changing megawatt output. More and more, power system managers are demanding that governors operate with constant megawatt output. One way of doing this is to introduce a second set point into the control structure, called a power set point. The control structure investigated and analyzed in this paper is unique in that it utilizes a power reference set point in addition to the conventional frequency reference set point. An optimum set of temporary droop parameters, derived from the turbine-generator inertia constant and the penstock water start time for stable islanded operation, is shown to be equally applicable for a satisfactory rate of generator loading in grid-connected mode, and a theoretical development shows why this is the case. The performance of the control structure has been investigated and established based on a simulation study in MATLAB/Simulink as well as by testing the real-time controller performance on a 15 MW Kaplan turbine and generator. Recordings were made using the LabVIEW data acquisition platform. The hydro-turbine governor control structure investigated in this paper thus eliminates the need for separate sets of temporary droop parameters, one valid for islanded mode and the other for interconnected operation.

Keywords: frequency set point, hydro governor, interconnected operation, isolated operation, power set point

Procedia PDF Downloads 365
24654 Sfard’s Commognitive Framework as a Method of Discourse Analysis in Mathematics

Authors: Dong-Joong Kim, Sangho Choi, Woong Lim

Abstract:

This paper discusses Sfard’s commognitive approach and provides an empirical study as an example to illustrate the theory as method. Traditionally, research in mathematics education focused on the acquisition of mathematical knowledge and the didactic process of knowledge transfer. Through attending to a distinctive form of language in mathematics, as well as mathematics as a discursive subject, alternative views of making meaning in mathematics have emerged; these views are therefore “critical,” as in critical discourse analysis. The commognitive discourse analysis method has the potential to bring more clarity to our understanding of students’ mathematical thinking and the process through which students are socialized into school mathematics.

Keywords: commognitive framework, discourse analysis, mathematical discourse, mathematics education

Procedia PDF Downloads 327