Search results for: data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24969

24399 Acceptance of Big Data Technologies and Its Influence towards Employee’s Perception on Job Performance

Authors: Jia Yi Yap, Angela S. H. Lee

Abstract:

With big data technologies, organizations can obtain the results they are interested in: such technologies load all the data that is useful to the organization and provide a better way of analysing it. The purpose of this research is to gather employees' opinions from firms in Malaysia on the use of big data technologies in their organizations, in order to examine how such use may affect employees' perception of job performance. To identify whether accepting big data technologies in the organization affects this perception, a questionnaire will be distributed to employees from different small and medium-sized enterprises (SMEs) in Malaysia. The proposed conceptual model will be tested against other variables to examine the relationships between them.

Keywords: big data technologies, employee, job performance, questionnaire

Procedia PDF Downloads 279
24398 Data Poisoning Attacks on Federated Learning and Preventive Measures

Authors: Beulah Rani Inbanathan

Abstract:

In the present era, it is evident from numerous incidents that data privacy is being compromised in various ways. Conventional machine learning uses a centralized server: data is sent as input, analysed by the algorithms on that server, and outputs are predicted. Because the user must transmit the data each time for the algorithm to analyse it, this arrangement is prone to threats. Federated learning overcomes this issue: only the models are updated, while the data resides on the local machine and is never exchanged with the other local models. Nevertheless, even these local models are vulnerable to data poisoning, as experiments by many researchers have made clear. This paper surveys the ways in which data poisoning occurs and the evidence that it remains prevalent, covering poisoning attacks on IoT devices, edge devices, autoregressive models, and industrial IoT systems, and offers some points on how such attacks could be evaded in order to protect data that is personal, sensitive, or harmful when exposed.
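
One widely studied preventive measure against poisoned client updates (offered here as a generic illustration, not necessarily the defence the paper proposes) is robust aggregation on the server: plain federated averaging can be dragged arbitrarily far by a single malicious update, while a coordinate-wise median stays near the honest consensus. A minimal sketch with synthetic updates:

```python
import numpy as np

def fedavg(updates):
    """Plain federated averaging: the mean of the client updates."""
    return np.mean(updates, axis=0)

def coordinate_median(updates):
    """Robust aggregation: a coordinate-wise median tolerates a
    minority of arbitrarily poisoned client updates."""
    return np.median(updates, axis=0)

# Nine honest clients report updates near the true value [1, -2];
# one poisoned client reports a wildly shifted update.
rng = np.random.default_rng(0)
honest = rng.normal(loc=[1.0, -2.0], scale=0.1, size=(9, 2))
poisoned = np.array([[100.0, 100.0]])
updates = np.vstack([honest, poisoned])

mean_agg = fedavg(updates)               # dragged toward the attacker
median_agg = coordinate_median(updates)  # stays near the honest consensus
```

With these numbers the mean lands far from the honest value while the median remains close to [1, -2]; production defences (trimmed means, Krum, norm clipping) refine the same idea.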

Keywords: data poisoning, federated learning, Internet of Things, edge computing

Procedia PDF Downloads 73
24397 Using Satellite Images Datasets for Road Intersection Detection in Route Planning

Authors: Fatma El-Zahraa El-Taher, Ayman Taha, Jane Courtney, Susan Mckeever

Abstract:

Understanding road networks plays an important role in navigation applications such as self-driving vehicles and route planning for individual journeys. Intersections are essential components of road networks: understanding the features of an intersection, from a simple T-junction to larger multi-road junctions, is critical to decisions such as crossing roads or selecting the safest routes. Identifying and profiling intersections from satellite images is a challenging task. While deep learning approaches offer the state of the art in image classification and detection, the availability of training datasets is a bottleneck in this approach. In this paper, a labelled satellite image dataset for the intersection recognition problem is presented, consisting of 14,692 satellite images of Washington DC, USA. To support other users of the dataset, an automated download and labelling script is provided for dataset replication. The challenges of constructing and fine-grained feature labelling of a satellite image dataset are examined, including the issue of how to address features that are spread across multiple images. Finally, the accuracy of intersection detection in satellite images is evaluated.

Keywords: satellite images, remote sensing images, data acquisition, autonomous vehicles

Procedia PDF Downloads 123
24396 Roadway Infrastructure and Bus Safety

Authors: Richard J. Hanowski, Rebecca L. Hammond

Abstract:

Very few studies have been conducted to investigate safety issues associated with motorcoach/bus operations. The current study investigates the impact that roadway infrastructure, including locality, roadway grade, traffic flow, and traffic density, has on bus safety. A naturalistic driving study involving 43 motorcoaches was conducted in the U.S.A. Two fleets participated in the study, and over 600,000 miles of naturalistic driving data were collected. Sixty-five bus drivers (48 male, 17 female) participated; their average age was 49 years. A sophisticated data acquisition system (DAS) was installed on each of the 43 motorcoaches, and a variety of kinematic and video data were continuously recorded. The data were analyzed by identifying safety-critical events (SCEs), which included crashes, near-crashes, crash-relevant conflicts, and unintentional lane deviations. Additionally, baseline (normative driving) segments were identified and analyzed for comparison to the SCEs. This presentation highlights the need for bus safety research and the methods used in this data collection effort. With respect to roadway infrastructure, the study highlights the methods used to assess locality, roadway grade, traffic flow, and traffic density. Locality was determined by manual review of the recorded video for each event and baseline and was characterized as open country, residential, business/industrial, church, playground, school, urban, airport, interstate, or other. Roadway grade was similarly determined through video review and characterized as level, grade up, grade down, hillcrest, or dip. The video was also used to determine the traffic flow and traffic density at the time of the event or baseline segment.
For traffic flow, video was used to assess which of the following best characterized the event or baseline: not divided (2-way traffic), not divided (center 2-way left-turn lane), divided (median or barrier), one-way traffic, or no lanes. For traffic density, level-of-service categories were used: A1, A2, B, C, D, E, and F. Only a few of the many roadway elements coded in this study are highlighted in this abstract; others included lighting levels, weather conditions, roadway surface conditions, relation to junction, and roadway alignment. Note that a key component of this study was to assess the impact that driver distraction and fatigue have on bus operations. In this regard, once the roadway elements had been coded, the primary research questions addressed were (i) "Which environmental conditions are associated with drivers' choice of engagement in tasks?" and (ii) "What are the odds of being in an SCE while engaging in tasks under these conditions?". The study may be of interest to researchers and traffic engineers who are interested in the relationship between roadway infrastructure elements and safety events in motorcoach bus operations.
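
The second research question, the odds of a safety-critical event while engaging in tasks, reduces to an odds ratio over a 2x2 table of SCE versus baseline segments split by task engagement. A minimal sketch with purely hypothetical counts (not the study's data):

```python
import math

def odds_ratio(sce_task, sce_no_task, base_task, base_no_task):
    """Odds ratio from a 2x2 table: safety-critical events (SCEs)
    vs. baseline segments, split by task engagement."""
    return (sce_task / sce_no_task) / (base_task / base_no_task)

# Hypothetical counts, for illustration only:
or_ = odds_ratio(sce_task=40, sce_no_task=60, base_task=20, base_no_task=80)

# Approximate 95% confidence interval via the log-odds standard error
se = math.sqrt(1/40 + 1/60 + 1/20 + 1/80)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
```

An odds ratio above 1 with a confidence interval excluding 1 would indicate elevated SCE risk during task engagement under the coded condition.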

Keywords: bus safety, motorcoach, naturalistic driving, roadway infrastructure

Procedia PDF Downloads 170
24395 Energy System Analysis Using Data-Driven Modelling and Bayesian Methods

Authors: Paul Rowley, Adam Thirkill, Nick Doylend, Philip Leicester, Becky Gough

Abstract:

The dynamic performance of all energy generation technologies is impacted to varying degrees by the stochastic properties of the wider system within which the generation technology is located. This stochasticity can include the varying nature of ambient renewable energy resources such as wind or solar radiation, or unpredicted changes in energy demand which impact the operational behaviour of thermal generation technologies. An understanding of these stochastic impacts is especially important in contexts such as highly distributed (or embedded) generation, where issues affecting the individual or aggregated performance of large numbers of relatively small generators must be understood, such as in ESCO projects. Probabilistic evaluation of monitored or simulated performance data is one technique which can provide insight into the dynamic performance characteristics of generating systems, both in a prognostic sense (such as the prediction of future performance at the project's design stage) and in a diagnostic sense (such as the real-time analysis of underperforming systems). In this work, we describe the development, application, and outcomes of a new approach to the acquisition of datasets suitable for use in subsequent performance and impact analysis (including the use of Bayesian approaches) for a number of distributed generation technologies. The application of the approach is illustrated using case studies involving domestic and small commercial-scale photovoltaic, solar thermal, and natural gas boiler installations, and the results show that the methodology offers significant advantages in plant efficiency prediction and diagnosis, along with allied environmental and social impacts such as greenhouse gas emission reduction and fuel affordability.
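
As one deliberately simplified illustration of the Bayesian flavour of such performance diagnosis, the sketch below performs a conjugate Beta-binomial update of the rate at which a monitored plant meets its predicted output; the prior and the counts are assumptions for illustration, not values from the case studies:

```python
# Daily monitoring records whether a plant met its predicted output;
# the posterior over the underlying success rate sharpens as data
# accumulate, supporting both prognosis and diagnosis.
a, b = 2.0, 2.0              # weakly informative Beta(2, 2) prior
met, missed = 45, 15         # hypothetical monitored days
a_post, b_post = a + met, b + missed
posterior_mean = a_post / (a_post + b_post)
```

A persistently low posterior mean relative to the design-stage prediction would flag an underperforming installation for inspection.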

Keywords: renewable energy, dynamic performance simulation, Bayesian analysis, distributed generation

Procedia PDF Downloads 479
24394 Simulation and Hardware Implementation of Data Communication Between CAN Controllers for Automotive Applications

Authors: R. M. Kalayappan, N. Kathiravan

Abstract:

In the automotive industry, the Controller Area Network (CAN) is widely used to reduce system complexity and handle inter-task communication. This paper therefore presents a hardware implementation of data-frame communication from one controller to another. The CAN data frames and protocols are explained in detail. The data frames are transferred without collision or corruption. A simulation built in the Keil µVision software displays the data transfer between the CAN transmitter and receiver, and an ARM7 microcontroller is used to transfer data between the controllers in real time. The transfer is verified using a cathode-ray oscilloscope (CRO).
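
For readers unfamiliar with the frame format discussed here, the sketch below lays out the bits of a standard CAN 2.0A data frame and computes its CRC-15. It is a simplified illustration (no bit stuffing, and the ACK/EOF fields are abbreviated), not the paper's ARM7 implementation:

```python
def can_crc15(bits):
    """CRC-15 used by CAN 2.0 (polynomial 0x4599), computed over the
    frame bits from SOF through the end of the data field."""
    crc = 0
    for bit in bits:
        crc <<= 1
        if ((crc >> 15) & 1) ^ bit:
            crc ^= 0x4599
        crc &= 0x7FFF
    return crc

def build_data_frame(can_id, data):
    """Bit layout of a standard (11-bit identifier) CAN 2.0A data
    frame, before bit stuffing: SOF, ID, RTR, IDE, r0, DLC, data,
    CRC, CRC delimiter, ACK slot (EOF omitted)."""
    bits = [0]                                          # SOF (dominant)
    bits += [(can_id >> i) & 1 for i in range(10, -1, -1)]
    bits += [0, 0, 0]                                   # RTR, IDE, r0
    dlc = len(data)
    bits += [(dlc >> i) & 1 for i in range(3, -1, -1)]
    for byte in data:
        bits += [(byte >> i) & 1 for i in range(7, -1, -1)]
    crc = can_crc15(bits)
    bits += [(crc >> i) & 1 for i in range(14, -1, -1)]
    bits += [1, 1]   # CRC delimiter; ACK slot (recessive as transmitted)
    return bits

frame = build_data_frame(0x123, [0xDE, 0xAD])
```

The receiver recomputes the same CRC over the incoming bits, which is how corruption-free transfer is detected on the bus.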

Keywords: controller area network (CAN), automotive electronic control unit, CAN 2.0, industry

Procedia PDF Downloads 386
24393 Improving the Statistics Nature in Research Information System

Authors: Rajbir Cheema

Abstract:

Introducing an integrated research information system provides scientific institutions with the necessary information on research activities and research results in assured quality. Because the collection of research data across different research information systems can suffer from duplication, missing values, incorrect formatting, inconsistencies, and similar problems, all of which have a wide range of negative effects on data quality, the subject of data quality deserves careful treatment. This paper examines the data quality problems in research information systems and presents new techniques that enable organizations to improve the quality of their research information.
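
One concrete cleansing technique of the kind discussed here is duplicate detection: normalise record titles, then flag near-identical pairs for review. A minimal standard-library sketch; the similarity threshold and the sample records are illustrative assumptions:

```python
from difflib import SequenceMatcher

def normalize(title):
    """Canonicalise a record title: lowercase, strip punctuation,
    collapse whitespace."""
    kept = "".join(c for c in title.lower() if c.isalnum() or c.isspace())
    return " ".join(kept.split())

def likely_duplicates(records, threshold=0.9):
    """Flag index pairs of research-output records whose normalised
    titles are near-identical."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a, b = normalize(records[i]), normalize(records[j])
            if SequenceMatcher(None, a, b).ratio() >= threshold:
                pairs.append((i, j))
    return pairs

dups = likely_duplicates([
    "Improving Data Quality in Research Information Systems",
    "Improving data quality in research information systems.",
    "Open Data for e-Governance",
])
```

Real RIS pipelines extend this with author/year blocking and persistent identifiers (DOI, ORCID) to keep the pairwise comparison tractable.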

Keywords: research information systems (RIS), research information, heterogeneous sources, data quality, data cleansing, science system, standardization

Procedia PDF Downloads 141
24392 Data Mining Meets Educational Analysis: Opportunities and Challenges for Research

Authors: Carla Silva

Abstract:

Recent developments in information and communication technology enable us to acquire, collect, and analyse data in various fields of socioeconomic and technological systems. Along with increasing economic globalization and the evolution of information technology, data mining has become an important approach to economic data analysis. As a result, there is a critical need for automated approaches to effective and efficient use of massive amounts of educational data, in order to support institutions in strategic planning and investment decision-making. In this article, we address data from several different perspectives and consider how such data apply to the sciences. Many believe that 'big data' will transform business, government, and other aspects of the economy. We discuss how new data may impact educational policy and educational research. Large-scale administrative data sets and proprietary private-sector data can greatly improve the way we measure, track, and describe educational activity and educational impact. We also consider whether the big data predictive modeling tools that have emerged in statistics and computer science may prove useful in educational research and, furthermore, in economics. Finally, we highlight a number of challenges and opportunities for future research.

Keywords: data mining, research analysis, investment decision-making, educational research

Procedia PDF Downloads 339
24391 A Method of Detecting the Difference in Two States of Brain Using Statistical Analysis of EEG Raw Data

Authors: Digvijaysingh S. Bana, Kiran R. Trivedi

Abstract:

This paper introduces methods that use the alpha wave to detect the difference between two states of the brain. One healthy subject participated in the experiment. EEG was measured on the forehead above the eye (position FP1), with the reference and ground electrodes on the ear clip. The samples were obtained as EEG raw data, each reading lasting one minute. Readings were taken at different times throughout the day, and various statistical tests were carried out on the alpha-band EEG raw data.
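
The alpha-band extraction step described above can be sketched with an FFT mask, a minimal stand-in for a proper band-pass filter; the sampling rate and the synthetic signal are assumptions for illustration, not the paper's recording parameters:

```python
import numpy as np

def alpha_band(eeg, fs):
    """Isolate the alpha band (8-12 Hz) by zeroing all other FFT bins."""
    spec = np.fft.rfft(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1/fs)
    spec[(freqs < 8) | (freqs > 12)] = 0
    return np.fft.irfft(spec, n=len(eeg))

fs = 256                                 # assumed sampling rate, Hz
t = np.arange(0, 4, 1/fs)                # a 4-second excerpt of a reading
raw = np.sin(2*np.pi*10*t) + 0.8*np.sin(2*np.pi*30*t)  # alpha + beta mix
alpha = alpha_band(raw, fs)

# Simple per-reading statistics of the kind compared across sessions
mean_amp, std_amp = alpha.mean(), alpha.std()
```

Per-session statistics such as these (mean, variance, band power) are what a two-state comparison would then test for a significant difference.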

Keywords: electroencephalogram (EEG), biometrics, authentication, EEG raw data

Procedia PDF Downloads 450
24390 The First Language of Humanity is Body Language, Neither Mother nor Native Language

Authors: Badriah Khaleel

Abstract:

Language acquisition is one of the most striking aspects of human development. It is a startling feat that has engrossed the attention of linguists for generations. The present study explores the hidden identities and attributes of nonverbal gestures. The current research reflects the significant role of body language, not as mere body gestures or facial expressions, but as the first language of humanity.

Keywords: a startling feat, a new horizon for linguists to rethink, explore the hidden identities and attributes of non-verbal gestures, English as a third language, the first language of humanity

Procedia PDF Downloads 485
24389 A Study on Big Data Analytics, Applications and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight the existing development in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data, which is hard to organise and analyse and can be dealt with using the framework and model in this field of study. An organization's decision-making strategy can be enhanced using big data analytics and applying different machine learning techniques and statistical tools on such complex data sets, which will consequently benefit society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks in the process of analysis using different machine-learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 64
24388 A Study on Big Data Analytics, Applications, and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight the existing development in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data which is hard to organise and analyse and can be dealt with using the framework and model in this field of study. An organisation's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently benefit society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates various frameworks in the process of analysis using different machine learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 79
24387 Improved K-Means Clustering Algorithm Using RHadoop with Combiner

Authors: Ji Eun Shin, Dong Hoon Lim

Abstract:

Data clustering is a common technique used in data analysis, with applications in artificial intelligence, pattern recognition, economics, ecology, psychiatry, and marketing. K-means is a well-known clustering algorithm that partitions a set of data points into a predefined number of clusters. In this paper, we implement the K-means algorithm on the MapReduce framework with RHadoop to make the clustering method applicable to large-scale data. RHadoop is a collection of R packages that allow users to manage and analyze data with Hadoop. The main idea is to introduce a combiner as a function of the map output to decrease the amount of data that must be processed by the reducers. The experimental results demonstrate that the K-means algorithm using RHadoop scales well and efficiently processes large data sets on commodity hardware. We also show that our K-means implementation using RHadoop with a combiner is faster than the regular algorithm without a combiner as the size of the data set increases.
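
The combiner idea is small enough to sketch outside Hadoop: the map step assigns points to the nearest centroid, the combiner collapses each mapper's points into per-cluster partial sums so far less data crosses the network, and the reduce step merges the partials into new centroids. A single-iteration Python sketch (RHadoop itself expresses the same steps in R):

```python
import numpy as np

def nearest(point, centroids):
    return int(np.argmin(((centroids - point) ** 2).sum(axis=1)))

def map_assign(points, centroids):
    """Map step: emit (cluster_id, (point, 1)) for each point."""
    return [(nearest(p, centroids), (p, 1)) for p in points]

def combine(pairs):
    """Combiner: pre-aggregate partial sums on each mapper node so only
    one (sum, count) per cluster crosses the network, not every point."""
    acc = {}
    for k, (p, n) in pairs:
        s, c = acc.get(k, (0.0, 0))
        acc[k] = (s + p, c + n)
    return list(acc.items())

def reduce_update(combined):
    """Reduce step: merge partials from all nodes into new centroids."""
    acc = {}
    for k, (s, c) in combined:
        ts, tc = acc.get(k, (0.0, 0))
        acc[k] = (ts + s, tc + c)
    return {k: s / c for k, (s, c) in acc.items()}

points = np.array([[0.0], [0.2], [9.8], [10.0]])
centroids = np.array([[0.0], [10.0]])
split_a, split_b = points[:2], points[2:]     # two mapper partitions
combined = (combine(map_assign(split_a, centroids))
            + combine(map_assign(split_b, centroids)))
new_centroids = reduce_update(combined)       # cluster means: 0.1 and 9.9
```

The reducers see one partial sum per cluster per mapper instead of every point, which is exactly the traffic reduction the paper measures.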

Keywords: big data, combiner, K-means clustering, RHadoop

Procedia PDF Downloads 415
24386 Open Circuit MPPT Control Implemented for PV Water Pumping System

Authors: Rabiaa Gammoudi, Najet Rebei, Othman Hasnaoui

Abstract:

Photovoltaic systems use different techniques for tracking the maximum power point (MPP) so as to provide the highest possible power to the load regardless of variations in climatic conditions. In this paper, the proposed method is the open-circuit (OC) method under sudden and random variations of insolation. The simulation results of the water pumping system controlled by the OC method are validated experimentally in real time on a test bench composed of a centrifugal pump powered by a PV generator (PVG) through a boost chopper, which matches the source to the load. The output of the DC/DC converter supplies a LOWARA motor pump through a DC/AC inverter. The control part is provided by a computer with a DS1104 card running the Matlab/Simulink environment for visualization and data acquisition. The results clearly show the effectiveness of the proposed control, with very good performance, and demonstrate the usefulness of the developed algorithm in countering the degradation of PVG performance under varying climatic factors, with a very good yield.
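
The open-circuit method referred to above is commonly the fractional open-circuit voltage technique: periodically open-circuit the array, sample V_oc, and regulate the operating point at V_mpp close to k times V_oc. A minimal sketch; the factor k and the sampled voltages below are illustrative assumptions, not the paper's measurements:

```python
def oc_mppt_step(v_oc_measured, k=0.76):
    """Fractional open-circuit voltage MPPT: the array is briefly
    open-circuited, V_oc is sampled, and the operating voltage is
    regulated to V_mpp ~ k * V_oc (k is typically quoted as
    0.71-0.78 for silicon; the value here is an assumption)."""
    return k * v_oc_measured

# Illustrative loop: V_oc is re-sampled as insolation changes and the
# result becomes the voltage reference for the boost converter's
# duty-cycle control loop. Voltages are hypothetical.
for v_oc in (21.0, 19.5, 20.2):
    v_ref = oc_mppt_step(v_oc)
```

The method's appeal is simplicity; its cost is the brief loss of output power each time the array is open-circuited for sampling.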

Keywords: PVWPS (PV Water Pumping System), maximum power point tracking (MPPT), open circuit method (OC), boost converter, DC/AC inverter

Procedia PDF Downloads 435
24385 Improving Learning Abilities and Inclusion through Movement: The Movi-Mente© Method

Authors: Ivan Traina, Luigi Sangalli, Fabio Tognon, Angelo Lascioli

Abstract:

Currently, challenges regarding preschool children mainly concern a sedentary lifestyle. Moreover, motor activity in infancy is commonly seen as a tool for the separate acquisition of cognitive and socio-emotional skills, rather than neuromotor development being considered a tool for improving learning abilities. This paper uses an observational research method to shed light on the results of practising neuromotor exercises with preschool children with disability, and provides implications for practice.

Keywords: children with disability, learning abilities, inclusion, neuromotor development

Procedia PDF Downloads 136
24384 Framework for Integrating Big Data and Thick Data: Understanding Customers Better

Authors: Nikita Valluri, Vatcharaporn Esichaikul

Abstract:

With the popularity of data-driven decision making on the rise, this study provides an alternative outlook on the process of decision-making. Combining quantitative and qualitative methods rooted in the social sciences, an integrated framework is presented that aims to deliver a more robust and efficient approach to data-driven decision-making with respect to not only big data but also 'thick data', a form of qualitative data. In support of this, an example from the retail sector illustrates the framework in action, yielding insights and leveraging business intelligence. An interpretive approach is used to glean insights from both the quantitative and the qualitative data: using traditional point-of-sale data as well as an understanding of customer psychographics and preferences, data mining techniques are applied alongside qualitative methods (such as grounded theory and ethnomethodology). The study's final goal is to establish the framework as a basis for a holistic solution encompassing both the big and thick aspects of any business need. The proposed framework is an enhancement of the traditional data-driven decision-making approach, which depends mainly on quantitative data.

Keywords: big data, customer behavior, customer experience, data mining, qualitative methods, quantitative methods, thick data

Procedia PDF Downloads 141
24383 Diversity and Intensity of International Technology Transfer and their Impacts on Organizational Performance

Authors: Seongryong Kang, Woonjin Kim, Sungjoo Lee

Abstract:

Under an environment of fierce competition and a globalized economy, international technology collaboration has gained increasing attention as a way to improve innovation efficiency. While international technology transfer helps a firm acquire necessary technology in a short period of time, it also carries a risk: embedding external technology from overseas partners may incur transaction costs due to regional, cultural, and language barriers, which tend to offset the benefits of such transfer. Though a number of previous studies have examined the effects of technology in-transfer on firm performance, few have been conducted in the context of international technology transfer. To fill this gap, this study investigates the impact of international technology in-transfer on firm performance, both innovation and financial performance, with a particular emphasis on the diversity and intensity of such transfer. To do this, we adopted technology balance of payments (TBP) data of Korean firms from 2010 to 2011, using a mediation regression analysis to identify the mediating effects of absorptive capacity. The analysis results indicate that i) the diversity and intensity of international technology transfer influence innovation performance positively by improving R&D capability; and ii) diversity has a positive impact but intensity has a negative impact on financial performance through the mediation of R&D intensity. The findings are expected to provide meaningful implications for establishing global technology strategy and developing policy programs to facilitate technology transfer.

Keywords: diversity, intensity, international technology acquisition, performance, technology transfer

Procedia PDF Downloads 349
24382 Effect of CSL Tube Type on the Drilled Shaft Axial Load Carrying Capacity

Authors: Ali Motevalli, Shahin Nayyeri Amiri

Abstract:

Cross-hole sonic logging (CSL) is a common non-destructive testing (NDT) method currently used to check the integrity of placed drilled shafts. CSL evaluates the integrity of the concrete inside the cage and between the access tubes based on the propagation of ultrasonic waves between two or more access tubes. A number of access tubes are installed inside the reinforcing cage prior to concrete placement as guides for the sensors. The access tubes can be PVC or galvanized steel, per ASTM D6760. The type of CSL tube can affect the axial strength of the drilled shaft. The objective of this study is to compare the axial load capacity of drilled shafts built with different types of CSL tubes inside the cage. To achieve this, three (3) large-scale drilled shaft samples were built and tested using a hydraulic actuator at Florida International University's (FIU) Titan America Structures and Construction Testing (TASCT) laboratory. During the static load test, load-displacement curves were recorded by the data acquisition system (MegaDAC). Three drilled shaft samples were built to evaluate the effect of the type of CSL tube on the axial load capacity of drilled shaft foundations.

Keywords: drilled shaft foundations, axial load capacity, cage, PVC, galvanized tube, CSL tube

Procedia PDF Downloads 394
24381 Passive Seismic in Hydrogeological Prospecting: The Case Study from Hard Rock and Alluvium Plain

Authors: Prarabdh Tiwari, M. Vidya Sagar, K. Bhima Raju, Joy Choudhury, Subash Chandra, E. Nagaiah, Shakeel Ahmed

Abstract:

Passive seismic, a wavefield interferometric imaging method, is a low-cost and rapid tool for subsurface investigation used for various purposes such as hydrocarbon exploration and seismic microzonation. With recent advances, its application has been extended to groundwater exploration by way of finding the bedrock depth. The Council of Scientific & Industrial Research (CSIR)-National Geophysical Research Institute (NGRI) has carried out passive seismic studies alongside electrical resistivity tomography (ERT) for groundwater in hard rock (Choutuppal, Hyderabad). Passive seismic combined with ERT can give a clearer 2-D subsurface image for groundwater exploration in hard rock areas. Passive seismic data were collected using a Tromino, a three-component broadband seismometer, to measure background ambient noise, and were processed using the GRILLA software; the results corroborate the ERT results. For data acquisition, the Tromino was deployed at 30 locations, with a 20-minute recording at each station. These locations show strong resonance-frequency peaks, suggesting good impedance contrast between different subsurface layers (e.g., mica-rich laminated layer, weathered layer, granite). This paper presents the passive seismic signature of hard rock terrain. Passive seismic is found to have potential for formation characterization and can serve as an alternative tool for delineating litho-stratification in urban conditions, where electrical and electromagnetic tools cannot be applied due to high cultural noise; moreover, used in combination with electrical and electromagnetic methods, it can improve the interpreted subsurface model.
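
The resonance-frequency pick at each station can be caricatured in a few lines: window the ambient-noise record, take its amplitude spectrum, and report the dominant peak. This is a single-component stand-in for the H/V (Nakamura-type) spectral ratio that the Tromino/GRILLA workflow actually computes, and the synthetic record below is an assumption:

```python
import numpy as np

def resonant_frequency(record, fs):
    """Dominant peak of the windowed amplitude spectrum; a crude
    single-component stand-in for the H/V spectral-ratio peak."""
    spec = np.abs(np.fft.rfft(record * np.hanning(len(record))))
    freqs = np.fft.rfftfreq(len(record), d=1/fs)
    return freqs[int(np.argmax(spec[1:])) + 1]   # skip the DC bin

fs = 128.0                                 # assumed sampling rate, Hz
t = np.arange(0, 60, 1/fs)                 # a one-minute ambient record
rng = np.random.default_rng(1)
tremor = rng.normal(0, 0.3, t.size) + np.sin(2*np.pi*2.5*t)  # 2.5 Hz peak
f0 = resonant_frequency(tremor, fs)
```

A strong, stable peak like this across repeated windows is what signals the impedance contrast between cover and bedrock at a station.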

Keywords: passive seismic, resonant frequency, Tromino, GRILLA

Procedia PDF Downloads 170
24380 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing number of documents, a setting that has become common since the spread of the Internet. ITA extracts independent topics from document data using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing collection because ITA must process all the document data at once, so its temporal and spatial costs are very high. We therefore present Incremental ITA, which extracts independent topics from a growing document collection: when new documents arrive, it updates the topics extracted from the data seen so far rather than recomputing them from scratch. We show results of applying Incremental ITA to benchmark datasets.
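
The core incremental trick, updating sufficient statistics instead of reprocessing the whole corpus, can be shown for the whitening step that precedes ICA: a running mean and covariance absorb new document vectors in one pass while old documents stay untouched. This is a sketch of the idea, not the full ITA algorithm:

```python
import numpy as np

class IncrementalStats:
    """Running mean and covariance of term vectors, updated per batch,
    so the whitening step of ICA need not revisit old documents."""
    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)
        self.m2 = np.zeros((dim, dim))  # outer-product sum of deviations

    def update(self, batch):
        for x in batch:
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += np.outer(delta, x - self.mean)

    def cov(self):
        return self.m2 / (self.n - 1)

rng = np.random.default_rng(0)
docs = rng.normal(size=(500, 3))   # stand-ins for document term vectors
inc = IncrementalStats(3)
inc.update(docs[:300])             # initial corpus
inc.update(docs[300:])             # newly added documents only
```

After each batch the statistics equal what a full recomputation over all documents would give, which is exactly the property an incremental topic update needs.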

Keywords: text mining, topic extraction, independent, incremental, independent component analysis

Procedia PDF Downloads 291
24379 Open Data for e-Governance: Case Study of Bangladesh

Authors: Sami Kabir, Sadek Hossain Khoka

Abstract:

Open Government Data (OGD) refers to all data produced by government that is accessible in a reusable way to anyone with Internet access, free of cost. In line with the "Digital Bangladesh" vision of the Bangladesh government, the concept of open data has been gaining momentum in the country. Opening all government data in a digital and customizable format from a single platform can enhance e-governance and make government more transparent to the people. This paper presents an in-progress case study of the OGD portal by the Bangladesh Government, an initiative intended to link decentralized data and to facilitate e-services to citizens through a one-stop web portal. The paper further discusses ways of collecting data in digital format from the relevant agencies with a view to making it publicly available through this single point of access, and presents a possible layout of the web portal.

Keywords: e-governance, one-stop web portal, open government data, reusable data, web of data

Procedia PDF Downloads 332
24378 The Significance of Translating Folklore in Teaching and Learning Open Distance e-Learning

Authors: M. A. Mabasa, O. Ramokolo, M. Z. Mnikathi, D. Mathabatha, T. Manyapelo

Abstract:

The study examines the importance of translating South African folklore from Oral into Written Literature in a Multilingual Education. Therefore, the study postulates that translation can be regarded as a valuable tool when oral and written literature is transmitted from one generation to another. The study entails that translation does not take place in a haphazard fashion; for that reason, skills such as translation principles are required to translate folklore significantly and effectively. The purpose of the study is to indicate the significance of using translation relating to folklore in teaching and learning. The study also observed that Modernism in literature should be shared amongst varieties of cultures because folklore is interactive in narrating stories, folktales and myths to sharpen the reader’s knowledge and intellect because they are informative and educative in nature. As a technological tool, the study points out that translation is of paramount importance in the sense that the meanings of different data can be made available in all South African official languages using oral and written forms of folklore. The study opines that tradition and customary beliefs and practices in the institution of higher learning. The study envisages the way in which literature of folklore can be juxtaposed to ensure that translated folklore is of quality assured standards. The study alludes that well-translated folklore can serve as oral and written literature, which may contribute to the child’s learning and acquisition of knowledge and insights during cognitive development toward maturity. Methodologically, the study selects a qualitative research approach and selects content analysis as an instrument for data gathering, which will be analyzed qualitatively in consideration of the significance of translating folklore as written and spoken literature in a documented way. 
The study reveals that the translation of folktales promotes functional multilingualism in high-function formal contexts such as a university. It emphasizes that translated and preserved literary folklore may serve as a language repository from one generation to another through the archival storage of information in the form of a term bank.

Keywords: translation, editing, teaching, learning, folklore

Procedia PDF Downloads 6
24377 Application of Compressed Sensing and Different Sampling Trajectories for Data Reduction of Small Animal Magnetic Resonance Image

Authors: Matheus Madureira Matos, Alexandre Rodrigues Farias

Abstract:

Magnetic Resonance Imaging (MRI) is a vital imaging technique used in both clinical and pre-clinical settings to obtain detailed anatomical and functional information. However, MRI scans are expensive and time-consuming, and animals must usually be anesthetized to keep them still during imaging. Prolonged or repeated exposure to anesthetics can have adverse effects on animals, including physiological alterations and potential toxicity, so minimizing the duration and frequency of anesthesia is crucial for the well-being of research animals. In recent years, various sampling trajectories have been investigated to reduce the number of MRI measurements, shortening scan time and the animals' exposure to anesthetics. Compressed sensing (CS) combined with sampling trajectories such as Cartesian, spiral, and radial has emerged as a powerful tool to reduce MRI data while preserving diagnostic quality. This work applies CS with Cartesian, spiral, and radial sampling trajectories to reconstruct abdominal MRI of mice sub-sampled below the rate defined by the Nyquist criterion. The methodology uses a fully sampled reference MRI of a female C57BL/6 mouse acquired experimentally in a 4.7 Tesla small-animal MRI scanner using spin-echo pulse sequences. The image is down-sampled along Cartesian, radial, and spiral sampling paths and then reconstructed by CS. The quality of the reconstructed images is objectively assessed with three metrics: RMSE (root mean square error), PSNR (peak signal-to-noise ratio), and SSIM (structural similarity index measure).
The combination of optimized sampling trajectories and the CS technique demonstrated the potential to reduce acquired image data by up to 70%. This translates into shorter scan times, minimizing the duration and frequency of anesthesia administration and reducing the associated risks.
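As a rough illustration of the evaluation step (not the authors' pipeline), the sketch below undersamples the k-space of a synthetic phantom with a Cartesian mask, reconstructs by simple zero-filling, and scores the result with RMSE and PSNR. A full CS solver would add a sparsity-regularized optimization; all names and parameters here are illustrative assumptions.

```python
import numpy as np

def cartesian_mask(shape, keep_fraction=0.3, seed=0):
    """Randomly keep a fraction of k-space lines (phase-encode rows),
    always retaining the low-frequency centre."""
    rng = np.random.default_rng(seed)
    rows, cols = shape
    mask = np.zeros(shape, dtype=bool)
    centre = rows // 8
    mask[rows // 2 - centre : rows // 2 + centre, :] = True  # keep k-space centre
    extra = rng.random(rows) < keep_fraction
    mask[extra, :] = True
    return mask

def zero_filled_recon(image, mask):
    """Undersample k-space with the mask and reconstruct by zero-filling
    (a baseline; CS would instead solve a sparsity-regularized problem)."""
    kspace = np.fft.fftshift(np.fft.fft2(image))
    recon = np.fft.ifft2(np.fft.ifftshift(kspace * mask))
    return np.abs(recon)

def rmse(ref, img):
    return np.sqrt(np.mean((ref - img) ** 2))

def psnr(ref, img):
    return 20 * np.log10(ref.max() / rmse(ref, img))

# Synthetic 'phantom': a bright square on a dark background
phantom = np.zeros((128, 128))
phantom[40:90, 40:90] = 1.0
mask = cartesian_mask(phantom.shape, keep_fraction=0.3)
recon = zero_filled_recon(phantom, mask)
print(f"sampled fraction: {mask.mean():.2f}")
print(f"RMSE: {rmse(phantom, recon):.4f}  PSNR: {psnr(phantom, recon):.1f} dB")
```

SSIM is omitted here only to keep the sketch dependency-free; in practice it is available as `structural_similarity` in scikit-image.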

Keywords: compressed sensing, magnetic resonance, sampling trajectories, small animals

Procedia PDF Downloads 54
24376 Resource Framework Descriptors for Interestingness in Data

Authors: C. B. Abhilash, Kavi Mahesh

Abstract:

Human beings are the most advanced species on Earth, largely because of the ability to communicate and share information via human language. Today, a huge amount of data is available on the web in text format, which has also led to the generation of big data in structured and unstructured forms. In general, this textual data is highly unstructured; to extract insights and actionable content from it, we need to incorporate the concepts of text mining and natural language processing. Our study focuses on interesting data, from which interesting facts are generated for the knowledge base. The approach is to derive analytics from the text by applying natural language processing. Using the Semantic Web's Resource Description Framework (RDF), we generate triples from the given data and derive interesting patterns. The methodology also illustrates data integration using RDF to obtain reliable, interesting patterns.
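A minimal, hypothetical sketch of the triple-generation step described above: simple subject-predicate-object facts are mapped to URIs and serialized as N-Triples. The `EX` namespace and all facts are illustrative assumptions, not taken from the paper, and a real pipeline would use an RDF library such as rdflib.

```python
EX = "http://example.org/"  # illustrative namespace, not from the paper

def to_uri(term):
    """Map a plain-text term to a URI in the example namespace."""
    return f"<{EX}{term.strip().replace(' ', '_')}>"

def make_triple(subject, predicate, obj):
    """Build one (subject, predicate, object) RDF triple."""
    return (to_uri(subject), to_uri(predicate), to_uri(obj))

def to_ntriples(triples):
    """Serialise triples in N-Triples syntax: one 'S P O .' statement per line."""
    return "\n".join(f"{s} {p} {o} ." for s, p, o in triples)

# Toy facts that a text-mining stage might have extracted
facts = [
    ("Bangalore", "is_capital_of", "Karnataka"),
    ("Karnataka", "is_state_of", "India"),
]
triples = [make_triple(*f) for f in facts]
print(to_ntriples(triples))
```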

Keywords: RDF, interestingness, knowledge base, semantic data

Procedia PDF Downloads 142
24375 Handy EKG: Low-Cost ECG For Primary Care Screening In Developing Countries

Authors: Jhiamluka Zservando Solano Velasquez, Raul Palma, Alejandro Calderon, Servio Paguada, Erick Marin, Kellyn Funes, Hana Sandoval, Oscar Hernandez

Abstract:

Background: Screening for cardiac conditions in primary care in developing countries can be challenging, and Honduras is no exception. One of the main limitations is the general underfunding of the healthcare system, which makes conventional ECG acquisition a secondary priority. Objective: To develop a low-cost ECG that improves screening for arrhythmias in primary care and communication with specialists in secondary and tertiary care. Methods: We designed a portable, pocket-size, low-cost three-lead ECG (Handy EKG). The device is autonomous and offers Wi-Fi/Bluetooth connectivity. A companion mobile app accesses online servers running machine learning, a subset of artificial intelligence that learns from the data and aids clinicians in interpreting the readings. The device also uses the online servers to transfer patients' data and readings to specialists in secondary and tertiary care. Fifty randomized volunteers with no previous cardiac conditions participated in testing the device. One reading was taken with a conventional ECG and three readings with the Handy EKG using different lead positions. This project was made possible by funding from the National Autonomous University of Honduras. Results: Preliminary results show that the Handy EKG records cardiac activity in leads I, II, and III similar to that of a conventional electrocardiograph, depending on lead position, at a lower cost. The wave and segment duration, amplitude, and morphology of the readings were similar to those of the conventional ECG, and the readings could be interpreted to conclude whether an arrhythmia was present. Two cases of prolonged PR interval were found in the readings of both devices.
Conclusion: A frugal-innovation approach can allow lower-income countries to develop innovative medical devices such as the Handy EKG to meet unmet needs at lower prices without compromising effectiveness, safety, or quality. The Handy EKG enables primary care screening at a much lower cost and allows convenient storage of readings in online servers, where patients' clinical data can be accessed remotely by cardiology specialists.
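As a hedged illustration of the kind of automated interpretation mentioned above (not the Handy EKG's actual algorithm), the sketch below detects R-peaks in a synthetic trace with a naive threshold detector and estimates heart rate. Production systems use established detectors such as Pan-Tompkins; all parameters here are illustrative.

```python
import numpy as np

def detect_r_peaks(signal, fs, threshold=0.5, refractory_s=0.2):
    """Naive R-peak detector: local maxima above a threshold, separated by
    at least a refractory period (in seconds)."""
    min_gap = int(refractory_s * fs)
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return np.array(peaks)

def heart_rate_bpm(peaks, fs):
    """Mean heart rate from R-R intervals."""
    rr = np.diff(peaks) / fs  # R-R intervals in seconds
    return 60.0 / rr.mean()

# Synthetic trace: one R wave per second (60 bpm) plus mild noise
fs = 250                                   # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.random.default_rng(0).standard_normal(t.size)
ecg[::fs] += 1.0                           # spike train standing in for R waves
peaks = detect_r_peaks(ecg, fs)
print(f"beats detected: {peaks.size}, heart rate ~ {heart_rate_bpm(peaks, fs):.0f} bpm")
```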

Keywords: low-cost hardware, portable electrocardiograph, prototype, remote healthcare

Procedia PDF Downloads 165
24374 PLC Based Automatic Railway Crossing System for India

Authors: Tapan Upadhyay, Aqib Siddiqui, Sameer Khan

Abstract:

Railway crossings in India are manually operated level crossings, either manned or unmanned. Their main purpose is to protect pedestrians and vehicles from colliding with trains, which pass at regular intervals on one of the largest and busiest railway networks in the world. Because of human error and negligence, thousands of lives are lost every year in accidents at railway crossings. To avoid this, we propose a Programmable Logic Controller (PLC) based automatic system that controls the barrier as well as roadblocks to stop people from crossing while a security warning is given. People often ignore the warning and ride two-wheelers beneath the barrier while the train is still some distance away. This paper aims to reduce the fatality and accident rate by controlling the barrier and roadblocks using sensors that detect the incoming train and vehicles and send a signal to the PLC; the PLC in turn signals the barrier and roadblocks. Once the train passes, the barrier and roadblocks retract, and the passage is clear for vehicles and pedestrians to cross. PLCs are used because they are flexible, cost-effective, and space-efficient, and they reduce complexity and minimize errors. Supervisory Control and Data Acquisition (SCADA) is used to monitor the system's operation.
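The control logic described above can be sketched as a small state machine. The following is an illustrative Python simulation of one PLC scan-cycle function, not actual ladder logic; all signal and state names are assumptions.

```python
from enum import Enum

class Crossing(Enum):
    OPEN = "open"          # barrier up, roadblocks retracted
    CLOSING = "closing"    # warning active, barrier lowering
    CLOSED = "closed"      # barrier down, roadblocks raised
    OPENING = "opening"    # train passed, barrier rising

def next_state(state, approach_sensor, departure_sensor, barrier_down, barrier_up):
    """One scan cycle of the simplified crossing logic: sensor inputs in,
    next state out. A real PLC program would express this in ladder logic."""
    if state is Crossing.OPEN and approach_sensor:
        return Crossing.CLOSING      # train detected: lower barrier, raise roadblocks
    if state is Crossing.CLOSING and barrier_down:
        return Crossing.CLOSED
    if state is Crossing.CLOSED and departure_sensor:
        return Crossing.OPENING      # train has passed: retract barrier and roadblocks
    if state is Crossing.OPENING and barrier_up:
        return Crossing.OPEN
    return state                     # otherwise hold the current state

# Simulated scan cycles for one train passing the crossing
s = Crossing.OPEN
s = next_state(s, True, False, False, False)   # approach sensor fires
s = next_state(s, False, False, True, False)   # barrier fully down
s = next_state(s, False, True, False, False)   # departure sensor fires
s = next_state(s, False, False, False, True)   # barrier fully up
print(s)  # → Crossing.OPEN
```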

Keywords: level crossing, PLC, sensors, SCADA

Procedia PDF Downloads 408
24373 Data Mining Practices: Practical Studies on the Telecommunication Companies in Jordan

Authors: Dina Ahmad Alkhodary

Abstract:

This study aimed to investigate data mining practices in telecommunication companies in Jordan from the viewpoint of the respondents. To achieve the goal of the study and test the validity of the hypotheses, the researcher designed a questionnaire to collect data from managers and staff members of the main departments in the researched companies. The results show the progressive stages the telecommunication companies have reached in adopting data mining.

Keywords: data, mining, development, business

Procedia PDF Downloads 476
24372 The Impact of System and Data Quality on Organizational Success in the Kingdom of Bahrain

Authors: Amal M. Alrayes

Abstract:

Data and system quality play a central role in organizational success, and the quality of any existing information system has a major influence on overall system performance. Given this importance, it is relevant to examine the impact of system and data quality on organizational performance in the Kingdom of Bahrain. This research aims to discover whether system quality and data quality are related and to study their impact on organizational success. A theoretical model based on previous research is used to show the relationship between data quality, system quality, and organizational impact. We hypothesize, first, that system quality is positively associated with organizational impact; second, that system quality is positively associated with data quality; and finally, that data quality is positively associated with organizational impact. A questionnaire was administered to public and private organizations in the Kingdom of Bahrain. The results show a strong association between data and system quality, which affects organizational success.

Keywords: data quality, performance, system quality, Kingdom of Bahrain

Procedia PDF Downloads 474
24371 Cloud Computing in Data Mining: A Technical Survey

Authors: Ghaemi Reza, Abdollahi Hamid, Dashti Elham

Abstract:

Cloud computing poses a diversity of challenges for data mining, arising from the dynamic structure of data distribution as opposed to the typical database scenarios of conventional architectures. Because an immense number of users seek data daily, there are serious security concerns for cloud providers as well as for the data providers who place their data in the cloud environment. Big data analytics relies on compute-intensive data mining algorithms (hidden Markov models, MapReduce parallel programming, the Apache Mahout project, the Hadoop Distributed File System, K-Means and K-Medoids, Apriori) that require efficient, high-performance processors to produce timely results as they solve for or optimize model parameters. The operational challenges include establishing successful transactions with the existing virtual-machine environment and keeping the databases under control. Several factors have led to the shift from normal, centralized mining to distributed data mining. One approach is to offer mining as a SaaS that uses multi-agent systems to implement the different tasks of the system. Open problems remain in data mining based on cloud computing, including the design and selection of data mining algorithms.
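As a concrete instance of one of the algorithms named above, here is a minimal K-Means sketch in NumPy. It is illustrative only: a cloud deployment would distribute the assignment and update steps, e.g. via MapReduce or Mahout, and all data here is synthetic.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal K-Means: assign each point to its nearest centroid, then
    recompute centroids, until the centroids stop moving."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]  # init from data points
    for _ in range(iters):
        # distance of every point to every centroid, shape (n, k)
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Two well-separated synthetic blobs of 50 points each
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
labels, centroids = kmeans(X, k=2)
print(sorted(np.bincount(labels).tolist()))  # → [50, 50] for these blobs
```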

Keywords: cloud computing, data mining, computing models, cloud services

Procedia PDF Downloads 459
24370 Effects of Cerium Oxide Nanoparticle Addition in Diesel and Diesel-Biodiesel Blends on the Performance Characteristics of a CI Engine

Authors: Abbas Ali Taghipoor Bafghi, Hosein Bakhoda, Fateme Khodaei Chegeni

Abstract:

An experimental investigation is carried out to establish the performance characteristics of a compression ignition engine using cerium oxide nanoparticles as an additive in neat diesel and diesel-biodiesel blends. In the first phase of the experiments, the stability of neat diesel and diesel-biodiesel fuel blends with the addition of cerium oxide nanoparticles is analyzed. After a series of experiments, it is found that high-speed blending followed by ultrasonic-bath stabilization improves the stability of the blends. In the second phase, performance characteristics are studied using the stable fuel blends in a single-cylinder four-stroke engine coupled with an electrical dynamometer and a data acquisition system. The cerium oxide acts as an oxygen-donating catalyst and provides oxygen for combustion. Its activation energy acts to burn off carbon deposits within the engine cylinder at the wall temperature and prevents the deposition of non-polar compounds on the cylinder wall, resulting in a reduction in HC emissions. The tests revealed that cerium oxide nanoparticles can be used as an additive in diesel and diesel-biodiesel blends to significantly improve complete combustion of the fuel.

Keywords: engine, cerium oxide, biodiesel, deposit

Procedia PDF Downloads 323