Search results for: remote sensing data
25114 Evidence-Based in Telemonitoring of Users with Pacemakers at Five Years after Implant: The Poniente Study
Authors: Antonio Lopez-Villegas, Daniel Catalan-Matamoros, Remedios Lopez-Liria
Abstract:
Objectives: The purpose of this study was to analyze clinical data, health-related quality of life (HRQoL) and functional capacity of patients using a telemonitoring follow-up system (TM) compared to patients followed up through standard outpatient visits (HM) five years after the implantation of a pacemaker. Methods: This is a controlled, non-randomised, non-blinded clinical trial, with data collection carried out five years after pacemaker implantation. The study was developed at Hospital de Poniente (Almeria, Spain) between October 2012 and November 2013. The same clinical outcomes were analyzed in both follow-up groups. HRQoL and functional capacity were assessed through the EuroQol-5D (EQ-5D) questionnaire and the Duke Activity Status Index (DASI), respectively. Sociodemographic characteristics and clinical data were also analyzed. Results: Five years after pacemaker implantation, 55 of the 82 initial patients finished the study. Users with pacemakers had been assigned to either a conventional hospital follow-up group (HM=34, of 50 initially) or a telemonitoring system group (TM=21, of 32 initially). No significant differences were found between the groups in sociodemographic characteristics, clinical data, HRQoL or functional capacity according to medical records and the EQ-5D and DASI questionnaires. In addition, conventional follow-up visits to the hospital were reduced by 44.84% (p < 0.001) in the telemonitoring group relative to the hospital monitoring group. Conclusion: The results obtained in this study suggest that telemonitoring of users with pacemakers is an option equivalent to conventional hospital follow-up in terms of HRQoL and functional capacity. Furthermore, it allows early detection of cardiovascular and pacemaker-related events and significantly reduces the number of in-hospital visits. Trial registration: ClinicalTrials.gov NCT02234245. 
The PONIENTE study has been funded by the General Secretariat for Research, Development and Innovation, Regional Government of Andalusia (Spain), project reference number PI/0256/2017, under the research call 'Development and Innovation Projects in the Field of Biomedicine and Health Sciences', 2017.
Keywords: cardiovascular diseases, health-related quality of life, pacemakers follow-up, remote monitoring, telemedicine
Procedia PDF Downloads 129
25113 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques
Authors: Tosin Ige
Abstract:
Ethics must be a condition of the world, like logic. (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since data mining makes it difficult for an individual or consumer (in the case of a company) to control the accessibility and usage of their data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It also discusses some advances in those techniques, while highlighting and providing a new technique as a solution to an existing privacy-preserving data mining method. This paper also bridges the wide gap between data mining and the Web Application Programming Interface (web API), where research is urgently needed for an added layer of security in data mining, while at the same time introducing a seamless and more efficient way of data mining.
Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique
Procedia PDF Downloads 174
25112 Big Data: Concepts, Technologies and Applications in the Public Sector
Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora
Abstract:
Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that broaden the availability of network access. The volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.
Keywords: big data, big data analytics, Hadoop, cloud
Procedia PDF Downloads 312
25111 The Follower Robots Tested in Different Lighting Condition and Improved Capabilities
Authors: Sultan Muhammed Fatih Apaydin
Abstract:
In this study, two types of robot, a pioneer robot and a follower robot, were examined in order to improve the capabilities of tracking robots. The robots continuously track each other, and measurement of the follow-up distance between them is very important for the improvements to be applied. The follower robot was made to follow the pioneer robot in line with the intended goals. The tests were applied to the robots on various grounds and in various environments in terms of performance, and the necessary improvements were implemented based on the results of these tests.
Keywords: mobile robot, remote and autonomous control, infra-red sensors, arduino
Procedia PDF Downloads 566
25110 Semantic Data Schema Recognition
Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia
Abstract:
The subject covered in this paper aims at assisting the user in their data quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source. This enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning it to a category and possibly a sub-category, and secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns
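The categorization step described above can be sketched very simply: assign each column a semantic category by testing its values against value patterns. This is a minimal, hypothetical illustration of the idea, not the authors' method; the pattern set, column names and data are invented.

```python
import re

# Hypothetical value patterns; a real profiler would use a much richer set.
PATTERNS = {
    "email":   re.compile(r"^[^@\s]+@[^@\s]+\.[a-zA-Z]{2,}$"),
    "date":    re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "integer": re.compile(r"^-?\d+$"),
}

def categorize_column(values):
    """Return the first category whose pattern matches every value,
    falling back to 'string' when none does."""
    for category, pattern in PATTERNS.items():
        if all(pattern.match(v) for v in values):
            return category
    return "string"

# Invented sample source: column name -> sampled values.
source = {
    "contact": ["ana@example.com", "bob@example.org"],
    "joined":  ["2021-04-01", "2022-11-15"],
    "visits":  ["3", "12"],
    "city":    ["Paris", "Tunis"],
}
schema = {col: categorize_column(vals) for col, vals in source.items()}
print(schema)
```

The second step of the paper, discovering inter-column dependencies, would then operate on these inferred categories.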
Procedia PDF Downloads 418
25109 Design and Implementation of Generative Models for Odor Classification Using Electronic Nose
Authors: Kumar Shashvat, Amol P. Bhondekar
Abstract:
Among the five senses, smell is the most evocative and the least understood. Odor testing has long been opaque, and odor data unfamiliar to most practitioners. The problem of recognizing and classifying odors is therefore important to address: the ability to smell a product and predict whether it is still of use or has become undesirable for consumption is worth capturing in a model. The general industrial standard for this classification is color-based; however, odor can be a better discriminator than color, and incorporating it into a machine would be highly useful. For cataloguing the odors of peas, trees and cashews, various discriminative approaches have been used. Discriminative approaches offer good predictive performance and have been widely applied, but they are unable to make effective use of unlabeled information. In such scenarios, generative approaches have better applicability, as they can handle problems such as settings where the variability in the range of possible input vectors is enormous. Generative models are used in machine learning either to model data directly or as an intermediate step in forming a probability density function. Here, Linear Discriminant Analysis and the Naive Bayes classifier have been used for classification of the odor of cashews. Linear Discriminant Analysis is a method used in data classification, pattern recognition, and machine learning to discover a linear combination of features that characterizes or separates two or more classes of objects or events. The Naive Bayes algorithm is a classification approach based on Bayes' rule and a set of conditional independence assumptions. Naive Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem. 
The main advantage of generative models is that they make stronger assumptions about the data, specifically about the distribution of the predictors given the response variables. The electronic instrument used for artificial odor sensing and classification is an electronic nose, a device designed to imitate the human sense of smell by providing an analysis of individual chemicals or chemical mixtures. The experimental results have been evaluated in terms of performance measures, i.e., accuracy, precision and recall. The results show that the overall performance of Linear Discriminant Analysis was better than that of the Naive Bayes classifier on the cashew dataset.
Keywords: odor classification, generative models, naive bayes, linear discriminant analysis
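The comparison above can be sketched with off-the-shelf implementations of both classifiers. Since the paper's cashew dataset is not available here, synthetic Gaussian clusters stand in for the electronic-nose sensor channels; class locations, scales and the 6-channel layout are all assumptions for illustration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
# Synthetic stand-in for e-nose readings: three odor classes, six sensor
# channels, one Gaussian cluster per class (hypothetical data).
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(50, 6))
               for c in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 50)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

accs = {}
for name, model in [("LDA", LinearDiscriminantAnalysis()),
                    ("NaiveBayes", GaussianNB())]:
    accs[name] = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(name, round(accs[name], 2))
```

On well-separated clusters like these both models do well; the paper's finding is that LDA edged out Naive Bayes on the real cashew data.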
Procedia PDF Downloads 390
25108 Jagiellonian-PET: A Novel TOF-PET Detector Based on Plastic Scintillators
Authors: P. Moskal, T. Bednarski, P. Bialas, E. Czerwinski, A. Gajos, A. Gruntowski, D. Kaminska, L. Kaplon, G. Korcyl, P. Kowalski, T. Kozik, W. Krzemien, E. Kubicz, Sz. Niedzwiecki, M. Palka, L. Raczynski, Z. Rudy, P. Salabura, N. G. Sharma, M. Silarski, A. Slomski, J. Smyrski, A. Strzelecki, A. Wieczorek, W. Wislicki, M. Zielinski, N. Zon
Abstract:
A new concept and results of the performance tests of the TOF-PET detection system developed at the Jagiellonian University will be presented. The novelty of the concept lies in employing long strips of polymer scintillators instead of crystals as detectors of annihilation quanta, and in using predominantly the timing of signals instead of their amplitudes for the reconstruction of lines of response. The diagnostic chamber consists of plastic scintillator strips read out by pairs of photomultipliers arranged axially around a cylindrical surface. To take advantage of the superior timing properties of plastic scintillators, the signals are probed in the voltage domain with an accuracy of 20 ps by newly developed electronics, and the data are collected by a novel trigger-less and reconfigurable data acquisition system. The hit position and hit time are reconstructed by dedicated reconstruction methods based on compressed sensing theory and a library of synchronized model signals. The solutions are the subject of twelve patent applications. So far, a time-of-flight resolution of ~120 ps (sigma) was achieved for a double-strip prototype with a 30 cm field of view (FOV). This is better by more than a factor of two than the TOF resolution achievable in current TOF-PET modalities, and at the same time the FOV of the 30 cm long prototype is significantly larger than that of typical commercial PET devices. The Jagiellonian PET (J-PET) detector with axially arranged plastic scintillators possesses another advantage: its diagnostic chamber is free of any electronic devices and magnetic materials, giving unique possibilities of combining J-PET with CT and J-PET with MRI for scanning the same part of a patient at the same time with both methods.
Keywords: PET-CT, PET-MRI, TOF-PET, scintillator
Procedia PDF Downloads 498
25107 Access Control System for Big Data Application
Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud
Abstract:
Access control systems (ACs) are among the most important components in safety-critical areas. Inaccuracies in regulatory frameworks make personalized policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software for controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the dissemination of Big Data domains, since it is crucial to secure the data provided to data consumers (DC). We present a general access control circulation strategy for the Big Data domain by describing the benefit of using designated access control for BD units and their performance, taking into consideration the needs of BD and AC systems. We then present a generic Big Data access control system to improve the dissemination of Big Data.
Keywords: access control, security, Big Data, domain
Procedia PDF Downloads 136
25106 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip-control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Efficient hardware alternatives are now used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile sensing systems except the force reconstruction process, a stage in which they have been applied less. This work presents a hardware implementation of a model-driven approach reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. Starting from the analysis of a software implementation of that model, this implementation proposes the parallelization of tasks that facilitate the execution of matrix operations and of a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate algorithm-parallelization techniques, guided by the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters. 
The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulation by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform that allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48×48 taxels that use various transduction technologies. The proposed implementation demonstrates a reduction in estimation time to 1/180 of that of the software implementations. Despite the relatively high estimation errors, the information this implementation provides on the tangential and normal tractions and on the triaxial reconstruction of forces makes it possible to adequately reconstruct the tactile properties of the touched object, which are similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic-skin applications in robotic and biomedical contexts.
Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
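The matrix form of the reconstruction problem can be illustrated with a generic linear sketch. The paper's model-driven approach and calibration are not reproduced here; this stand-in only shows recovering a stacked force vector f from stress readings s under an assumed linear forward model s = A f, solved by least squares. The array size and all matrices are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_taxels = 16                 # a hypothetical 4x4 patch, far smaller than the 48x48 arrays cited
# Assumed calibration matrix mapping stacked (fx, fy, fz) taxel forces
# to normal-stress readings. With fewer readings than unknowns the system
# is underdetermined, so lstsq returns the minimum-norm solution.
A = rng.normal(size=(n_taxels, 3 * n_taxels))
f_true = rng.normal(size=3 * n_taxels)
s = A @ f_true                # simulated noise-free sensor readings

f_est, *_ = np.linalg.lstsq(A, s, rcond=None)
residual = float(np.linalg.norm(A @ f_est - s))
print("residual:", residual)
```

In the paper's FPGA design, the matrix products and the per-taxel two-dimensional optimization are what get parallelized; the sketch above only fixes the algebraic shape of the problem.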
Procedia PDF Downloads 197
25105 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment
Authors: Michael Gidey Gebru
Abstract:
Most Data Envelopment Analysis models operate in a static environment with input and output parameters given as deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to extend crisp Data Envelopment Analysis to Data Envelopment Analysis in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The Data Envelopment Analysis model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed model is illustrated with an application on real data from 50 educational institutions.
Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output
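The crisp model that the fuzzy extension builds on can be sketched directly. Below is the input-oriented CCR model in multiplier form, solved as a linear program per DMU; the single-input, single-output data are illustrative, not the paper's 50 institutions, and the fuzzy multi-objective layer is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: one input x_j and one output y_j per DMU.
inputs = np.array([2.0, 4.0, 8.0])
outputs = np.array([2.0, 3.0, 4.0])

def ccr_efficiency(j):
    """CCR multiplier form for DMU j: maximize u*y_j subject to
    v*x_j = 1 and u*y_k - v*x_k <= 0 for every DMU k, with u, v >= 0.
    linprog minimizes, so the objective is negated."""
    c = [-outputs[j], 0.0]                         # variables [u, v]
    A_ub = np.column_stack([outputs, -inputs])     # u*y_k - v*x_k <= 0
    b_ub = np.zeros(len(inputs))
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  A_eq=[[0.0, inputs[j]]], b_eq=[1.0],
                  bounds=[(0, None), (0, None)])
    return -res.fun                                # efficiency score u*y_j

scores = [ccr_efficiency(j) for j in range(len(inputs))]
print(scores)  # DMU 0 is efficient (score 1); the others are not
```

In the fuzzy variant, each crisp value above would be replaced by a triangular fuzzy number and the resulting objectives handled with the paper's multi-objective method.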
Procedia PDF Downloads 62
25104 Information and Communication Technology Learning between Parents and High School Students
Authors: Yu-Mei Tseng, Chih-Chun Wu
Abstract:
As information and communication technology (ICT) has become a part of people's lives, most teenagers born after the 1980s, who grew up in the internet generation, are called digital natives, while their parents are called digital immigrants and need to keep learning new ICT skills. This study investigated how high school students helped their parents set up social network services (SNS) and taught them how to use ICT. The study applied anonymous paper-and-pencil questionnaires that asked about the ICT learning and ICT product use of high school students' parents. The sample comprised 2,621 high school students, including 1,360 (51.9%) males and 1,261 (48.1%) females, drawn from 12 high schools and vocational high schools in central Taiwan. Results from paired-sample t-tests demonstrated that, regardless of gender, high school students helped mothers set up Facebook and LINE more often than fathers. In addition, both male and female high school students taught mothers to use ICT, and SNS in particular, more often than fathers. These results show that intergenerational ICT teaching occurred more often between mothers and their children than between fathers and their children. This could imply that mothers play a more important role in family ICT learning than fathers, or that mothers need more help regarding ICT than fathers. As for gender differences, results from independent t-tests showed that female high school students were more likely than male ones to help their parents set up Facebook and LINE, and to teach their parents to use smartphones, Facebook and LINE. However, no gender differences were detected in teaching mothers. These results suggest that female teenagers offer more help to their parents regarding ICT learning than their male counterparts. 
As for area differences, results from independent t-tests showed that students at high schools in remote areas were more likely than their metropolitan counterparts to teach parents to use computers, search engines, and audio and video file downloads. This might indicate that remote-area students are more likely to teach their parents how to use ICT. The results of this study encourage children to help and teach their parents with ICT products.
Keywords: adult ICT learning, family ICT learning, ICT learning, urban-rural gap
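The paired-sample t statistic used for the mother-vs-father comparisons can be computed from scratch. The frequency scores below are hypothetical stand-ins, not the study's 2,621 responses; only the formula t = mean(d) / (sd(d) / sqrt(n)) over the per-student differences d is shown.

```python
import math
import statistics

def paired_t(x, y):
    """Paired-sample t statistic for two measurements on the same subjects,
    e.g. how often each student taught their mother vs. their father."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical per-student frequency scores (illustrative only).
taught_mother = [5, 4, 6, 5, 5]
taught_father = [3, 3, 3, 3, 3]
t = paired_t(taught_mother, taught_father)
print(f"t = {t:.2f}")
```

A t value this far from zero (with n-1 degrees of freedom) would indicate a significant mother-father difference, which is the pattern the study reports.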
Procedia PDF Downloads 178
25103 Challenges and Opportunities for Facilitating Telemedicine Services Through Information and Communication Technologies (ICT) in Ethiopia
Authors: Wegene Demeke
Abstract:
Background: The demand for healthcare services is growing in developing and developed countries, and information and communication technology is used to facilitate them. In developing countries, implementing telemedicine is aimed at providing healthcare for people living in remote areas where health services are not accessible. Implementations of telemedicine in developing countries are, however, often unsuccessful: a recent study indicates that 90% of telemedicine projects in developing countries are abandoned or fail. Several researchers report technological challenges as the main factor for the non-adoption of telemedicine. This research, in contrast, reports the health professionals' perspectives arising from the technical, social and organizational factors that are considered key elements for setting up and running telemedicine in Ethiopia. The importance and significance of telemedicine for healthcare is growing; for example, in the current pandemic situation telemedicine has become an essential strategic element in providing healthcare services in developed countries. Method: Qualitative and quantitative exploratory research methods were used to collect data on the factors affecting the adoption of information and communication technologies for telemedicine use. The survey was distributed using emails and Google Forms, with email addresses collected from personal contacts and publicly available websites in Ethiopia. A survey questionnaire with open and closed questions was used to collect data from 175 health professionals, and thematic analysis was used to identify the barrier and facilitator factors for establishing telemedicine services. Outcome: The results of this research will contribute to identifying the key barrier and facilitator factors of telemedicine from the health professionals' perspective in developing countries. 
The thematic analysis provides barrier and facilitator factors arising from technical, organizational, and social sources.
Keywords: telemedicine, ICT, developing country, Ethiopia, health service
Procedia PDF Downloads 110
25102 Two-Photon Fluorescence in N-Doped Graphene Quantum Dots
Authors: Chi Man Luk, Ming Kiu Tsang, Chi Fan Chan, Shu Ping Lau
Abstract:
Nitrogen-doped graphene quantum dots (N-GQDs) were fabricated by a microwave-assisted hydrothermal technique and their optical properties were studied. The luminescence of the N-GQDs can be tuned by varying the excitation wavelength. Furthermore, two-photon luminescence of the N-GQDs can be obtained under near-infrared laser excitation. It is shown that N-doping plays a key role in the two-photon luminescence. The N-GQDs are expected to find use in biological applications, including bioimaging and sensing.
Keywords: graphene quantum dots, nitrogen doping, photoluminescence, two-photon fluorescence
Procedia PDF Downloads 634
25101 Key Transfer Protocol Based on Non-invertible Numbers
Authors: Luis A. Lizama-Perez, Manuel J. Linares, Mauricio Lopez
Abstract:
We introduce a method to perform remote user authentication based on what we call non-invertible cryptography. It exploits the fact that the multiplication of an invertible integer and a non-invertible integer in a ring Zn produces a non-invertible integer, making it infeasible to compute the factorization. The protocol requires the smallest key size when compared with the main public-key algorithms such as Diffie-Hellman, Rivest-Shamir-Adleman or Elliptic Curve Cryptography. Since we found that the only opportunity for an eavesdropper is to mount an exhaustive search on the keys, the protocol appears to be post-quantum.
Keywords: invertible, non-invertible, ring, key transfer
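The algebraic fact the protocol rests on is easy to check: in Zn an element a is invertible iff gcd(a, n) = 1, and multiplying an invertible element by a non-invertible one yields a non-invertible product. The toy modulus below is for illustration only; an actual deployment would use a large n, and this sketch says nothing about the protocol's message flow.

```python
from math import gcd

n = 35
a = 12            # gcd(12, 35) = 1  -> invertible in Z_35
b = 10            # gcd(10, 35) = 5  -> non-invertible in Z_35
product = (a * b) % n

assert gcd(a, n) == 1          # a has a multiplicative inverse mod n
assert gcd(b, n) > 1           # b does not
print(product, gcd(product, n))  # the product stays non-invertible
```

Because the common factor of b and n survives the multiplication, recovering b from the product requires factoring-like work, which is the hardness assumption the abstract describes.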
Procedia PDF Downloads 181
25100 Fluorescence Gold Nanoparticles: Sensing Properties and Cytotoxicity Studies in MCF-7 Human Breast Cancer Cells
Authors: Cristina Núñez, Rufina Bastida, Elena Labisbal, Alejandro Macías, María T. Pereira, José M. Vila
Abstract:
A highly selective quinoline-based fluorescent sensor L was designed in order to functionalize gold nanoparticles (GNPs@L). The cytotoxicity of compound L and GNPs@L in MCF-7 breast cancer cells was explored, and it was observed that both L and GNPs@L induced apoptosis in MCF-7 cancer cells. The cellular uptake of the hybrid system GNPs@L was studied using confocal laser scanning microscopy (CLSM).
Keywords: cytotoxicity, fluorescent probes, nanoparticles, quinoline
Procedia PDF Downloads 386
25099 A Portable Cognitive Tool for Engagement Level and Activity Identification
Authors: Terry Teo, Sun Woh Lye, Yufei Li, Zainuddin Zakaria
Abstract:
Wearable devices such as electroencephalography (EEG) headsets hold immense potential for monitoring and assessing a person's task engagement, especially at remote or online sites, so research into their use in measuring an individual's cognitive state while performing task activities is expected to increase. Despite the growing body of EEG research into a person's brain functioning, key challenges remain in adopting EEG for real-time operations, including limited portability, long preparation time, high channel dimensionality, intrusiveness, and the level of accuracy in acquiring neurological data. This paper proposes an approach using 4-6 EEG channels to determine the cognitive states of a subject undertaking a set of passive and active monitoring tasks, with air traffic controller (ATC) dynamic tasks used as a proxy. The work found that, when using the channel-reduction and identifier algorithm, good trend adherence of 89.1% can be obtained between a commercially available 14-channel Emotiv EPOC+ BCI EEG headset and a carefully selected reduced set of 4-6 channels. The approach can also identify different levels of engagement, ranging from general ad hoc monitoring to repeated active monitoring activities involving information search, extraction, and memory.
Keywords: assessment, neurophysiology, monitoring, EEG
Procedia PDF Downloads 77
25098 Development of a Sprayable Piezoelectric Material for E-Textile Applications
Authors: K. Yang, Y. Wei, M. Zhang, S. Yong, R. Torah, J. Tudor, S. Beeby
Abstract:
E-textiles are traditional textiles with integrated electronic functionality. They are an emerging innovation with numerous applications in fashion, wearable computing, health and safety monitoring, and the military and medical sectors. The piezoelectric effect is a widespread and versatile transduction mechanism used in sensor and actuator applications: piezoelectric materials produce electric charge when stressed and, conversely, deform mechanically when an electric field is applied across the material. Lead Zirconate Titanate (PZT) is a widely used piezoceramic material which has been used to fabricate e-textiles through screen printing, electrospinning and hydrothermal synthesis. This paper explores an alternative fabrication process: spray coating. Spray coating is a straightforward and cost-effective fabrication method applicable to both flat and curved surfaces. It can also be applied selectively by spraying through a stencil, which enables the required design to be realised on the substrate. This work developed a sprayable PZT-based piezoelectric ink consisting of a binder (Fabink-Binder-01), PZT powder (80% 2 µm and 20% 0.8 µm) and acetone as a thinner, with an optimised PZT/binder weight ratio of 10:1. The components were mixed using a SpeedMixer DAC 150. The fabrication process is as follows: 1) Screen print a UV-curable polyurethane interface layer on the textile to create a smooth textile surface. 2) Spray one layer of a conductive silver polymer ink through a pre-designed stencil and dry at 90 °C for 10 minutes to form the bottom electrode. 3) Spray three layers of the PZT ink through a pre-designed stencil, drying at 90 °C for 10 minutes per layer, to form a PZT layer with a total thickness of ~250 µm. 4) Spray one layer of the silver ink through a pre-designed stencil on top of the PZT layer and dry at 90 °C for 10 minutes to form the top electrode. 
The domains of the PZT elements were aligned by polarising the material at an elevated temperature under a strong electric field. A d33 of 37 pC/N was achieved after polarising at 90 °C for 6 minutes with an electric field of 3 MV/m. The application of the piezoelectric textile was demonstrated by fabricating a pressure sensor to switch an LED on and off. Other potential e-textile applications include motion sensing, energy harvesting, force sensing and buzzers.
Keywords: piezoelectric, PZT, spray coating, pressure sensor, e-textile
Procedia PDF Downloads 467
25097 The Economic Limitations of Defining Data Ownership Rights
Authors: Kacper Tomasz Kröber-Mulawa
Abstract:
This paper addresses the topic of data ownership from an economic perspective, providing examples of economic limitations of data property rights identified using the methods and approaches of the economic analysis of law. To build a background for the economic focus, a short overview of data and data ownership in the EU's legal system is provided first, including a brief introduction to its political and social importance and the relevant viewpoints. This stresses the importance of a single market for data, but also the far-reaching regulations of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of the paper builds upon this legal basis as well as the methods and approaches of the economic analysis of law.
Keywords: antitrust, data, data ownership, digital economy, property rights
Procedia PDF Downloads 84
25096 Protecting the Cloud Computing Data Through the Data Backups
Authors: Abdullah Alsaeed
Abstract:
Virtualized computing and cloud computing infrastructures are no longer buzzwords or marketing terms; they are a core reality in today's corporate Information Technology (IT) organizations. Hence, developing effective and efficient methodologies for data backup and data recovery is required more than ever. The purpose of data backup and recovery techniques is to assist organizations in strategizing their business continuity and disaster recovery approaches. To accomplish this strategic objective, a variety of mechanisms have been proposed in recent years. This research paper explores and examines the latest techniques and solutions for providing data backup and restoration for cloud computing platforms.
Keywords: data backup, data recovery, cloud computing, business continuity, disaster recovery, cost-effective, data encryption
Procedia PDF Downloads 89
25095 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area
Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim
Abstract:
In intelligent transportation systems (ITS), information on link characteristics is an essential factor for planning and operation. In practice, however, not every link has sensors installed on it; a link without data is called a "missing link". The purpose of this study is to impute the data of these missing links by applying machine learning: with a deep learning process in particular, missing link data can be estimated from the data of present links. For the deep learning process, this study uses a recurrent neural network to handle the time-series data of the road. As input data, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The neural network structure takes 17 links with present data as input, has 2 hidden layers, and estimates the data of 1 missing link. As a result, the forecasted data for the target link show about 94% accuracy compared with the actual data.
Keywords: data estimation, link data, machine learning, road network
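The 17-links-in, 1-link-out data flow can be sketched with a minimal Elman-style recurrent forward pass. The weights here are random and untrained, the hidden size and sequence length are invented, and the study's actual architecture and DSRC training data are not reproduced; the sketch only shows how neighboring-link speeds at each time step update a recurrent state that yields the missing-link estimate.

```python
import numpy as np

rng = np.random.default_rng(42)
n_links, hidden, T = 17, 8, 12   # 17 observed links; hidden size and T assumed

# Untrained, randomly initialized weights (training is the study's job).
Wx = rng.normal(scale=0.1, size=(hidden, n_links))   # input -> hidden
Wh = rng.normal(scale=0.1, size=(hidden, hidden))    # hidden -> hidden (recurrence)
Wy = rng.normal(scale=0.1, size=(1, hidden))         # hidden -> missing-link output

speeds = rng.uniform(20, 60, size=(T, n_links))      # synthetic km/h on present links
h = np.zeros(hidden)
estimates = []
for x_t in speeds:
    h = np.tanh(Wx @ x_t + Wh @ h)     # recurrent hidden state carries history
    estimates.append(float(Wy @ h))    # one speed estimate per time step
print(len(estimates))
```

Training these weights against the sensor-equipped links' ground truth, then reading the output for the instrumented-as-missing link, is what produces the ~94% accuracy reported above.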
Procedia PDF Downloads 51125094 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies
Authors: Monica Lia
Abstract:
This article presents a customer data analysis model using business intelligence tools for data modelling, transformation, data visualization, and dynamic report building. The analysis of an economic organization’s customers is based on information from the organization’s transactional systems. The paper presents how to develop the data model starting from the data that companies already hold in their own operational systems. This owned data can be transformed into useful information about customers using business intelligence tools. In a mature market, knowing the information inside the data and producing forecasts for strategic decisions becomes more important. Business intelligence tools are used in business organizations as support for decision-making. Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes
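As a minimal illustration of the star-schema data mart named in the keywords, the sketch below builds a toy fact table with two dimension tables in SQLite and runs an OLAP-style roll-up. The table names and telecom-flavored columns are invented for the example, not taken from the paper.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Minimal star schema: one fact table referencing two dimension tables.
cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, segment TEXT);
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_usage   (customer_id INTEGER, date_id INTEGER, minutes REAL,
                           FOREIGN KEY(customer_id) REFERENCES dim_customer,
                           FOREIGN KEY(date_id) REFERENCES dim_date);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                [(1, "prepaid"), (2, "postpaid"), (3, "prepaid")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)", [(1, "Jan"), (2, "Feb")])
cur.executemany("INSERT INTO fact_usage VALUES (?, ?, ?)",
                [(1, 1, 120.0), (2, 1, 300.0), (3, 2, 45.0), (1, 2, 80.0)])

# OLAP-style roll-up: total call minutes per customer segment.
rows = cur.execute("""
    SELECT c.segment, SUM(f.minutes)
    FROM fact_usage f JOIN dim_customer c USING (customer_id)
    GROUP BY c.segment ORDER BY c.segment
""").fetchall()
```

In a BI stack the ETL layer would populate such a schema from the operational systems, and dashboards would issue exactly this kind of grouped aggregate against it.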
Procedia PDF Downloads 43525093 Health Monitoring of Composite Pile Construction Using Fiber Bragg Gratings Sensor Arrays
Authors: B. Atli-Veltin, A. Vosteen, D. Megan, A. Jedynska, L. K. Cheng
Abstract:
Composite materials combine the advantages of being lightweight and possessing high strength. This is of particular interest for the development of large constructions, e.g., aircraft, space applications, wind turbines, etc. One of the shortcomings of using composite materials is the complex nature of their failure mechanisms, which makes it difficult to predict the remaining lifetime. Therefore, condition and health monitoring are essential when using composite materials for critical parts of a construction. Different types of sensors are used or being developed to monitor composite structures, including ultrasonic, thermography, shearography, and fiber optic sensors. The first three technologies are complex and mostly used for measurement in the laboratory or during maintenance of the construction. Optical fiber sensors can be surface mounted or embedded in the composite construction, providing the unique advantage of in-operation measurement of mechanical strain and other parameters of interest. This is identified as a promising technology for Structural Health Monitoring (SHM) or Prognostic Health Monitoring (PHM) of composite constructions. Among the different fiber optic sensing technologies, the Fiber Bragg Grating (FBG) sensor is the most mature and widely used. FBG sensors can be realized in an array configuration, with many FBGs in a single optical fiber. In the current project, different aspects of using embedded FBGs for composite wind turbine monitoring are investigated. The activities are divided into two parts. Firstly, an FBG-embedded carbon composite laminate is subjected to tensile and bending loading to investigate the response of FBGs placed in different orientations with respect to the fiber. Secondly, the use of an FBG sensor array for temperature and strain sensing and for monitoring a 5 m long scale model of a glass fiber mono-pile is demonstrated. Two different FBG types are used: special in-house fibers and off-the-shelf ones.
The results from the first part of the study show that the FBG sensors survive the conditions during the production of the laminate. The test results from the tensile and bending experiments indicate that the sensors successfully respond to changes in strain. The measurements from the sensors will be correlated with the strain gauges placed on the surface of the laminates. Keywords: Fiber Bragg Gratings, embedded sensors, health monitoring, wind turbine towers
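For context, the standard first-order FBG strain readout can be sketched as below: the measured Bragg-wavelength shift is converted to strain via Δλ/λ_B = (1 − p_e)·ε. This is a generic textbook relation, not necessarily the calibration used in this project; it ignores temperature cross-sensitivity (which an FBG also exhibits) and assumes a typical silica photoelastic coefficient and a 1550 nm grating.

```python
# First-order FBG strain conversion (temperature held constant):
#   delta_lambda / lambda_B = (1 - p_e) * strain
P_E = 0.22            # effective photoelastic coefficient, typical for silica fiber
LAMBDA_B = 1550e-9    # Bragg wavelength in metres (assumed, not from the paper)

def shift_to_strain(delta_lambda_m: float) -> float:
    """Return strain (dimensionless) from a measured wavelength shift in metres."""
    return delta_lambda_m / ((1.0 - P_E) * LAMBDA_B)

# With these constants, a ~1.2 pm shift corresponds to about 1 microstrain.
strain = shift_to_strain(1.209e-12)
```

This sensitivity (roughly 1.2 pm per microstrain at 1550 nm) is why interrogators with picometre wavelength resolution suffice for structural strain monitoring.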
Procedia PDF Downloads 24425092 An Open-Source Guidance System for an Autonomous Planter Robot in Precision Agriculture
Authors: Nardjes Hamini, Mohamed Bachir Yagoubi
Abstract:
Precision agriculture has revolutionized farming by enabling farmers to monitor their crops remotely in real time. By utilizing technologies such as sensors, farmers can detect the state of growth, hydration levels, and nutritional status, and even identify diseases affecting their crops. With this information, farmers can make informed decisions regarding irrigation, fertilization, and pesticide application. Automated agricultural tasks, such as plowing, seeding, planting, and harvesting, are carried out by autonomous robots and have helped reduce costs and increase production. Despite the advantages of precision agriculture, its high cost makes it inaccessible to small and medium-sized farms. To address this issue, this paper presents an open-source guidance system for an autonomous planter robot. The system is composed of a Raspberry Pi-type nanocomputer equipped with Wi-Fi, a GPS module, a gyroscope, and a power supply module. The accompanying application allows users to enter and calibrate maps with at least four coordinates, enabling the localized contour of the parcel to be captured. The application comprises several modules, such as the mission entry module, which traces the planting trajectory and points, and the action plan entry module, which creates an ordered list of pre-established tasks such as loading, following the plan, returning to the garage, and entering sleep mode. A remote control module enables users to control the robot manually, visualize its location on the map, and use a real-time camera. Wi-Fi coverage is provided by an outdoor access point covering a circle of 2 km. This open-source system offers a low-cost alternative for small and medium-sized farms, enabling them to benefit from the advantages of precision agriculture. Keywords: autonomous robot, guidance system, low-cost, medium farms, open-source system, planter robot, precision agriculture, real-time monitoring, remote control, small farms
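One building block a guidance system like this plausibly needs is a check that the robot's GPS fix lies inside the parcel contour captured from the four-plus calibration coordinates. The paper does not specify its geofencing method; below is a generic ray-casting point-in-polygon sketch with invented coordinates, treating latitude/longitude as planar, which is adequate at field scale.

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the parcel contour?

    `polygon` is a list of (lat, lon) vertices (at least four for this
    application), treated as planar coordinates.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        y1, x1 = polygon[i]
        y2, x2 = polygon[(i + 1) % n]
        # Does a horizontal ray from the point cross the edge (i, i+1)?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular parcel from four calibration points.
parcel = [(36.10, 3.50), (36.10, 3.52), (36.12, 3.52), (36.12, 3.50)]
```

On each navigation cycle the robot would run this test on its current fix and stop (or return to the garage task) when the result goes false.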
Procedia PDF Downloads 11125091 Real Time Monitoring and Control of Proton Exchange Membrane Fuel Cell in Cognitive Radio Environment
Authors: Prakash Thapa, Gye Choon Park, Sung Gi Kwon, Jin Lee
Abstract:
The electric power generated by a proton exchange membrane (PEM) fuel cell is influenced by temperature, pressure, humidity, the flow rates of the reactant gases, and partial flooding of the membrane electrode assembly (MEA). Among these factors, temperature and cathode flooding are the parameters that most affect fuel cell performance. This paper describes the detailed design and the effect of these parameters on a PEM fuel cell. All parameters were monitored, analyzed, and controlled using a 5 kW PEM fuel cell. For real-time data communication in remote monitoring and control of the PEM fuel cell, a normalized least mean square algorithm in a cognitive radio environment is used. With this method, the probability of energy signal detection is maximized, which solves the frequency shortage problem, so the monitoring system's hang-ups and slow speeds are resolved. From the control unit, all parameters are also controlled according to the system requirements. As a result, the PEM fuel cell generates maximum electricity with better performance. Keywords: proton exchange membrane (PEM) fuel cell, pressure, temperature and humidity sensor (PTH), efficiency curve, cognitive radio network (CRN)
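The normalized least mean square (NLMS) update mentioned in the abstract can be sketched generically. The toy task below identifies a known FIR channel from input/output pairs rather than performing spectrum sensing; the filter order, step size, and channel coefficients are illustrative assumptions, not the paper's parameters.

```python
import random

def nlms_filter(x, d, order=4, mu=0.5, eps=1e-8):
    """Normalized least-mean-squares (NLMS) adaptive filter.

    x: input samples, d: desired samples.
    Weight update: w += mu / (eps + ||u||^2) * e * u, where u holds the
    `order` most recent input samples (newest first).
    Returns (a-priori errors, final weights).
    """
    w = [0.0] * order
    errors = []
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]          # newest sample first
        y = sum(wi * ui for wi, ui in zip(w, u))  # filter output
        e = d[n] - y                              # a-priori estimation error
        norm = eps + sum(ui * ui for ui in u)
        w = [wi + (mu / norm) * e * ui for wi, ui in zip(w, u)]
        errors.append(e)
    return errors, w

# Toy system identification: recover a known FIR channel h = [0.5, -0.3].
random.seed(1)
x = [random.uniform(-1, 1) for _ in range(2000)]
d = [0.0] + [0.5 * x[n] - 0.3 * x[n - 1] for n in range(1, 2000)]
errors, w = nlms_filter(x, d)
```

The normalization by ||u||² is what makes the step size robust to input power, which matters when the input statistics vary, as they do for signals received over a cognitive radio channel.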
Procedia PDF Downloads 46025090 Signal Estimation and Closed Loop System Performance in Atrial Fibrillation Monitoring with Communication Channels
Authors: Mohammad Obeidat, Ayman Mansour
Abstract:
In this paper, a unique issue arising from feedback control of an atrial fibrillation monitoring system with embedded communication channels is investigated. One of the important factors in measuring the performance of a closed-loop feedback system is the disturbance and noise attenuation factor: it is important that the feedback system can attenuate such disturbances on the atrial fibrillation heart rate signals. Communication channels depend on network traffic conditions and deliver different throughput, implying that the sampling intervals may change. Since the signal estimate is updated on the arrival of new data, its dynamics actually change with the sampling interval. Consequently, the interaction among sampling, signal estimation, and the controller introduces new issues in a remotely controlled atrial fibrillation system. This paper treats a remotely controlled atrial fibrillation system with one communication channel, which connects the heart rate and rhythm measurements to the remote controller. A typical, near-optimal signal estimation scheme is represented by a signal averaging filter with its time constant derived from the step size of the signal estimation algorithm. Keywords: atrial fibrillation, communication channels, closed loop, estimation
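The closing idea, a signal averaging filter whose dynamics follow the channel-dependent sampling interval, can be sketched as a first-order filter whose gain is recomputed from each actual inter-sample gap. The exponential parameterization below is an assumption for illustration, not the paper's exact scheme.

```python
import math

def averaging_filter(samples, tau):
    """First-order signal averaging filter with time constant `tau` (seconds).

    `samples` is a list of (t, value) pairs with possibly irregular spacing,
    as delivered over a variable-throughput communication channel. The
    smoothing gain is recomputed from each actual sampling interval, so the
    filter dynamics track channel-induced changes in sampling rate.
    """
    est = None
    t_prev = None
    out = []
    for t, v in samples:
        if est is None:
            est = v                                       # initialize on first sample
        else:
            alpha = 1.0 - math.exp(-(t - t_prev) / tau)   # interval-dependent gain
            est += alpha * (v - est)
        t_prev = t
        out.append(est)
    return out

# Hypothetical heart-rate samples (bpm) arriving after a 1 s channel gap.
smoothed = averaging_filter([(0.0, 70.0), (1.0, 100.0)], tau=1.0)
```

A longer gap yields a larger gain, so a late-arriving measurement is weighted more heavily; with uniform sampling this reduces to an ordinary exponential moving average.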
Procedia PDF Downloads 38025089 Optics Meets Microfluidics for Highly Sensitive Force Sensing
Authors: Iliya Dimitrov Stoev, Benjamin Seelbinder, Elena Erben, Nicola Maghelli, Moritz Kreysing
Abstract:
Despite the revolutionizing impact of optical tweezers in materials science and cell biology up to the present date, trapping has so far relied extensively on specific material properties of the probe, and local heating has limited applications related to investigating dynamic processes within living systems. To overcome these limitations while maintaining high sensitivity, here we present a new optofluidic approach that can be used to gently trap microscopic particles and measure femtonewton forces in a contact-free manner and with thermally limited precision. Keywords: optofluidics, force measurements, microrheology, FLUCS, thermoviscous flows
Procedia PDF Downloads 17225088 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically used the rigorous methodologies of empirical studies by research institutes, as well as less reliable immediate surveys and polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people behaved a certain way over the last 10 years, but why they are behaving that way now and, if the behaviour is undesirable, how we can promote change immediately. Big data analytics relies heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome. Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 37225087 Gauging Floral Resources for Pollinators Using High Resolution Drone Imagery
Authors: Nicholas Anderson, Steven Petersen, Tom Bates, Val Anderson
Abstract:
Under the multiple-use management regime established in the United States for federally owned lands, government agencies have come under pressure from commercial apiaries to grant permits for the summer pasturing of honeybees on government lands. Federal agencies have struggled to integrate honeybees into their management plans and have little information on which to base regulations that resolve how many colonies should be allowed in a single location and at what distance sets of hives should be placed. Many conservation groups have voiced their concerns regarding the introduction of honeybees to these natural lands, as they may outcompete and displace native pollinating species. Assessing the quality of an area in regard to its floral resources, pollen, and nectar can be important when attempting to create regulations for the integration of commercial honeybee operations into a native ecosystem. Areas with greater floral resources may be able to support larger numbers of honeybee colonies, while poorer resource areas may be less resilient to introduced disturbances. Attempts are made in this study to determine flower cover using high-resolution drone imagery to help assess the floral resource availability to pollinators in high-elevation, tall forb communities. This knowledge will help in determining the potential that different areas may have for honeybee pasturing and honey production. Roughly 700 images were captured at 23 m above ground level using a drone equipped with a Sony QX1 RGB 20-megapixel camera. These images were stitched together using Pix4D, resulting in a 60 m diameter high-resolution mosaic of a tall forb meadow. Using the program ENVI, a supervised maximum likelihood classification was conducted to calculate the percentage of total flower cover and flower cover by color (blue, white, and yellow). A complete vegetation inventory was taken on site, and the major flowers contributing to each color class were noted.
An accuracy assessment was performed on the classification, yielding an 89% overall accuracy and a kappa statistic of 0.855. With this level of accuracy, drones provide an affordable and time-efficient method for the assessment of floral cover over large areas. The next step of this project will be to determine the average pollen and nectar loads carried by each flower species. The addition of this knowledge will result in a quantifiable method of measuring the pollen and nectar resources of entire landscapes. This information will not only help land managers determine stocking rates for honeybees on public lands but also has applications in the agricultural setting, aiding producers in determining the number of honeybee colonies necessary for proper pollination of fruit and nut crops. Keywords: honeybee, flower, pollinator, remote sensing
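The reported figures (89% overall accuracy, kappa of 0.855) come from a standard confusion-matrix computation, which can be sketched as follows. The 2-class matrix below is invented for illustration and does not reproduce the study's numbers.

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix.

    confusion[i][j] = number of pixels of reference class i assigned to
    classified class j.
    """
    n = sum(sum(row) for row in confusion)
    diag = sum(confusion[i][i] for i in range(len(confusion)))
    po = diag / n                                  # observed (overall) accuracy
    pe = sum(sum(row) * sum(col)                   # chance agreement from marginals
             for row, col in zip(confusion, zip(*confusion))) / (n * n)
    return po, (po - pe) / (1 - pe)

# Illustrative 2-class matrix (reference rows vs. classified columns).
po, kappa = accuracy_and_kappa([[50, 10], [5, 35]])
```

Kappa discounts the agreement expected by chance from the class marginals, which is why it is reported alongside overall accuracy in classification assessments like this one.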
Procedia PDF Downloads 14325086 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets is termed “big data”. Big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. The study presents recently developed data mining, data analysis, and knowledge discovery techniques together with practical application systems. The conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses management's main big data and data mining challenges. Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 11125085 A Systematic Review on Challenges in Big Data Environment
Authors: Rimmy Yadav, Anmol Preet Kaur
Abstract:
Big Data has demonstrated vast potential for streamlining operations, supporting decision-making, and spotting business trends in different fields, for example, manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its procedures, instruments, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and information representation. Beyond this, the challenges and opportunities available in the Big Data platform are also outlined. Keywords: big data, privacy, data management, network and energy consumption
Procedia PDF Downloads 313