Search results for: time-lapse imaging data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26012

23882 Developing Logistics Indices for Turkey as an Indicator of Economic Activity

Authors: Gizem İntepe, Eti Mizrahi

Abstract:

Investment and financing decisions are influenced by various economic features, and detailed analysis should be conducted before decisions are made, not only by companies but also by governments. Such analysis can be conducted either at the company level or on a sectoral basis to reduce risks and maximize profits. Sectoral disaggregation, driven by seasonality effects, subventions, and data advantages or disadvantages, may appear in sectors that behave in parallel with the BIST (Borsa Istanbul stock exchange) index. The proposed logistics indices could serve market needs as a sectoral decision parameter and help in forecasting changes in import and export volumes. They are also indicators of logistics activity, which is in turn a sign of economic mobility at the national level. Publicly available data from the Ministry of Transport, Maritime Affairs and Communications and the Turkish Statistical Institute are utilized to obtain five logistics indices, namely the exLogistic, imLogistic, fLogistic, dLogistic, and cLogistic indices. The efficiency and reliability of these indices are then tested.

Keywords: economic activity, export trade data, import trade data, logistics indices

Procedia PDF Downloads 335
23881 Using Non-Negative Matrix Factorization Based on Satellite Imagery for the Collection of Agricultural Statistics

Authors: Benyelles Zakaria, Yousfi Djaafar, Karoui Moussa Sofiane

Abstract:

Agriculture is fundamental and remains an important sector of the Algerian economy; based on traditional techniques and structures, it generally serves domestic consumption. The collection of agricultural statistics in Algeria relies on traditional methods, which consist of investigating land use through surveys and field visits. These statistics suffer from problems such as poor data quality, long delays between collection and final availability, and high cost compared to their limited use. The objective of this work is to develop a processing chain for a reliable inventory of agricultural land by developing and implementing a new method of extracting information. This methodology combines remote sensing data with field data to collect statistics on different land areas. The contribution of remote sensing to the improvement of agricultural statistics, in terms of area, was studied in the wilaya of Sidi Bel Abbes. In this context, we applied a method for extracting information from satellite images called non-negative matrix factorization (NMF), which does not treat the pixel as a single entity but searches for the components within the pixel itself. The results obtained with NMF were compared with field data and with the results of the maximum likelihood method, and showed close agreement between the main NMF results and the field data. We believe that this method of extracting information from satellite data yields interesting results for different types of land use.
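
For illustration, a minimal sketch of NMF-based spectral unmixing with scikit-learn follows; the cube dimensions, number of endmembers, and per-pixel ground area are assumptions for the example, not values from the paper.

```python
# Minimal sketch of NMF-based spectral unmixing with scikit-learn.
import numpy as np
from sklearn.decomposition import NMF

# A hyperspectral cube: rows x cols pixels, each with `bands` reflectance values.
rows, cols, bands = 100, 100, 64
cube = np.random.rand(rows, cols, bands)          # stand-in for real imagery

# NMF treats each pixel as a non-negative mix of endmember spectra:
# X (pixels x bands) ~ A (pixels x k) @ S (k x bands), with A, S >= 0.
X = cube.reshape(-1, bands)
k = 4                                             # assumed number of land-cover classes
model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
A = model.fit_transform(X)                        # per-pixel abundances
S = model.components_                             # endmember spectra

# Assign each pixel to its dominant endmember and estimate class areas.
labels = A.argmax(axis=1).reshape(rows, cols)
pixel_area_ha = 0.09                              # assumed ground area per pixel
for c in range(k):
    print(f"class {c}: {(labels == c).sum() * pixel_area_ha:.1f} ha")
```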

Keywords: blind source separation, hyper-spectral image, non-negative matrix factorization, remote sensing

Procedia PDF Downloads 420
23880 Estimation of Coefficient of Discharge of Side Trapezoidal Labyrinth Weir Using Group Method of Data Handling Technique

Authors: M. A. Ansari, A. Hussain, A. Uddin

Abstract:

A side weir is a flow diversion structure provided in the side wall of a channel to divert water from the main channel to a branch channel. The trapezoidal labyrinth weir is a special type of weir in which the crest length is increased to pass a higher discharge. Experimental and numerical studies of the coefficient of discharge of a trapezoidal labyrinth side weir in an open channel are presented here. The Group Method of Data Handling (GMDH), with a quadratic polynomial transfer function, has been used to predict the coefficient of discharge for the side trapezoidal labyrinth weir. A new model for the coefficient of discharge of the labyrinth weir was also developed by regression, and generalized models for predicting the coefficient of discharge using the GMDH network were developed as well. Predictions based on the GMDH model are more satisfactory than those given by traditional regression equations.
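
To make the GMDH idea concrete, here is a minimal sketch of one GMDH layer with quadratic transfer functions: every pair of inputs is fitted by least squares with y ≈ a0 + a1·x1 + a2·x2 + a3·x1² + a4·x2² + a5·x1·x2, and the best pairs by validation error survive to the next layer. The inputs, split, and `keep` count are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from itertools import combinations

def quad_features(x1, x2):
    # Design matrix of the quadratic polynomial transfer function.
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def gmdh_layer(X_tr, y_tr, X_va, y_va, keep=3):
    candidates = []
    for i, j in combinations(range(X_tr.shape[1]), 2):
        F_tr = quad_features(X_tr[:, i], X_tr[:, j])
        coef, *_ = np.linalg.lstsq(F_tr, y_tr, rcond=None)
        pred_va = quad_features(X_va[:, i], X_va[:, j]) @ coef
        rmse = np.sqrt(np.mean((pred_va - y_va) ** 2))
        candidates.append((rmse, (i, j), coef))
    candidates.sort(key=lambda c: c[0])
    return candidates[:keep]   # surviving neurons: (error, input pair, coefficients)

# Hypothetical inputs, e.g. Froude number, crest length ratio, head ratio.
rng = np.random.default_rng(0)
X, y = rng.random((80, 3)), rng.random(80)
best = gmdh_layer(X[:60], y[:60], X[60:], y[60:])
print("best input pair:", best[0][1], "validation RMSE:", round(best[0][0], 3))
```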

Keywords: discharge coefficient, group method of data handling, open channel, side labyrinth weir

Procedia PDF Downloads 159
23879 Integration of Resistivity and Seismic Refraction Using Combined Inversion for Ancient River Findings at Sungai Batu, Lembah Bujang, Malaysia

Authors: Rais Yusoh, Rosli Saad, Mokhtar Saidin, Fauzi Andika, Sabiu Bala Muhammad

Abstract:

Resistivity and seismic refraction profiling have become common methods in pre-investigations for visualizing subsurface structure, and integrating the two can reduce interpretation ambiguity. Both methods have their own software packages for data inversion, but the potential to combine certain geophysical methods is restricted; however, research algorithms with this functionality exist and were evaluated in this study. Subsurface interpretation was improved by combining the inversions so that each model influences the other through closure coupling, with the two methods supporting one another. The approach was applied to a field dataset from an archaeological pre-investigation searching for an ancient river. Combining the inversions produced no major changes in the inverted model at this site, probably due to the complex geology. The combined analysis nevertheless provides an additional interpretation aid, for example in delineating alluvium, which can have a strong influence on the ancient river findings.

Keywords: ancient river, combined inversion, resistivity, seismic refraction

Procedia PDF Downloads 330
23878 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have dynamically evolved, producing a rich tapestry of different, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By utilizing data mining techniques inside this vast library, a variety of prospects for precision medicine, predictive analytics, and insight production become visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluations are important points of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatments are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data pieces, providing enhanced insights into the cause of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can get practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract explores the problems and ethical issues that come with using data mining techniques in the healthcare industry. In order to properly use these approaches, it is essential to find a balance between data privacy, security issues, and the interpretability of complex models. Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 59
23877 Localization of Frontal and Temporal Speech Areas in Brain Tumor Patients by Their Structural Connections with Probabilistic Tractography

Authors: B. Shukir, H. Woo, P. Barzo, D. Kis

Abstract:

Preoperative brain mapping in tumors involving the speech areas plays an important role in reducing surgical risks. Functional magnetic resonance imaging (fMRI) is the gold standard for localizing cortical speech areas preoperatively, but it is often unavailable in clinical routine. Diffusion-MRI-based probabilistic tractography, by contrast, is available with standard head MRI and can be used to segment cortical subregions by their structural connectivity. In our study, we used probabilistic tractography to localize the frontal and temporal cortical speech areas. Fifteen patients with left frontal tumors were enrolled, and speech fMRI and diffusion MRI were acquired preoperatively. The standard automated anatomical labelling atlas 3 (AAL3) was used to define 76 left frontal and 118 left temporal potential speech areas. Four types of tractography were run according to the structural connection of these regions to the left arcuate fascicle (FA), to localize the cortical areas with speech function: (1) frontal through the FA; (2) frontal with the FA; (3) temporal to the FA; and (4) temporal with the FA. Thresholds of 1%, 5%, 10%, and 15% were applied. At each level, the number of frontal and temporal regions identified by fMRI and by tractography was determined, and sensitivity and specificity were calculated. The 1% threshold showed the best results: sensitivity was 61.6 ± 31.4% and 67.15 ± 23.12%, and specificity was 87.2 ± 10.4% and 75.6 ± 11.37%, for frontal and temporal regions, respectively. We conclude that probabilistic tractography is a reliable preoperative technique for localizing cortical speech areas; however, its results are not yet dependable enough for the neurosurgeon to rely on during the operation.

Keywords: brain mapping, brain tumor, fMRI, probabilistic tractography

Procedia PDF Downloads 164
23876 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning

Authors: Arun Sanjel, Greg Speegle

Abstract:

Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors. Automated conversion of a sequential program to a DISC program would consequently improve productivity significantly. However, synthesizing a user’s intended program from an input specification is a complex problem with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them to equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.

Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC

Procedia PDF Downloads 103
23875 Evaluation of 18F Fluorodeoxyglucose Positron Emission Tomography, MRI, and Ultrasound in the Assessment of Axillary Lymph Node Metastases in Patients with Early Stage Breast Cancer

Authors: Wooseok Byon, Eunyoung Kim, Junseong Kwon, Byung Joo Song, Chan Heun Park

Abstract:

Purpose: 18F Fluorodeoxyglucose Positron Emission Tomography (FDG-PET) is a noninvasive imaging modality that can identify nodal metastases in women with primary breast cancer. The aim of this study was to compare the accuracy of FDG-PET with MRI and ultrasound in determining axillary lymph node status in patients with breast cancer undergoing sentinel lymph node biopsy or axillary lymph node dissection. Patients and Methods: Between January and December 2012, ninety-nine patients with breast cancer and clinically negative axillary nodes were evaluated. All patients underwent FDG-PET, MRI, and ultrasound followed by sentinel lymph node biopsy (SLNB) or axillary lymph node dissection (ALND). Results: Using axillary lymph node assessment as the gold standard, the sensitivity and specificity of FDG-PET were 51.4% (95% CI, 41.3% to 65.6%) and 92.2% (95% CI, 82.7% to 97.4%), respectively. For MRI, sensitivity and specificity were 57.1% (95% CI, 39.4% to 73.7%) and 67.2% (95% CI, 54.3% to 78.4%); for ultrasound, they were 42.9% (95% CI, 26.3% to 60.7%) and 92.2% (95% CI, 82.7% to 97.4%). Stratification according to hormone receptor status showed an increase in specificity when the status was negative (FDG-PET: 42.3% to 77.8%; MRI: 50% to 77.8%; ultrasound: 34.6% to 66.7%). Positive HER2 status was also associated with an increase in specificity (FDG-PET: 42.9% to 85.7%; MRI: 50% to 85.7%; ultrasound: 35.7% to 71.4%). Conclusions: Compared with MRI and ultrasound, FDG-PET showed high specificity; however, it is not sufficiently accurate to identify lymph node metastases reliably. This study suggests that FDG-PET scanning cannot replace histologic staging in early-stage breast cancer, but might have a role in evaluating axillary lymph node status in hormone receptor negative or HER2-overexpressing subtypes.
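
As a reminder of how figures of this kind are derived, here is a small sketch computing sensitivity, specificity, and normal-approximation 95% confidence intervals from a confusion table; the counts below are hypothetical, not the study's data (the study's CIs may also use a different interval method).

```python
import math

def rate_with_ci(successes, total, z=1.96):
    # Proportion with a Wald (normal-approximation) 95% confidence interval.
    p = successes / total
    half = z * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half), min(1.0, p + half)

tp, fn = 20, 15     # node-positive patients: detected / missed (hypothetical)
tn, fp = 55, 9      # node-negative patients: correctly ruled out / false alarms

sens = rate_with_ci(tp, tp + fn)
spec = rate_with_ci(tn, tn + fp)
print("sensitivity %.1f%% (95%% CI %.1f%%-%.1f%%)" % tuple(100 * v for v in sens))
print("specificity %.1f%% (95%% CI %.1f%%-%.1f%%)" % tuple(100 * v for v in spec))
```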

Keywords: axillary lymph node metastasis, FDG-PET, MRI, ultrasound

Procedia PDF Downloads 374
23874 A Novel Technological Approach to Maintaining the Cold Chain during Transportation

Authors: Philip J. Purnell

Abstract:

Innovators propose to use the Internet of Things to solve the problem of maintaining the cold chain during the transport of biopharmaceutical products. Sending a data logger with refrigerated goods only informs the recipient, on arrival, either that the goods breached the cold chain and are therefore potentially spoiled or that they did not and can be assumed to be in good condition. Connecting the data logger to the Internet of Things means that the supply chain manager is informed in real time of the exact location and precise temperature of the material at any point on earth. Through a simple online interface, the supply chain manager can watch the progress of the material on a Google map together with accurate and, crucially, real-time temperature readings. The data logger also sends alarms to the supply chain manager if a cold chain breach becomes imminent, allowing time to contact the transporter and restore the cold chain before the material is affected. This development is expected to save billions of dollars in wasted biologics that currently arrive spoiled or in an unreliable condition.
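
An illustrative sketch of the alarm logic such a connected logger could run server-side follows: flag cold-chain breaches and warn when the current warming trend makes one imminent. The storage band (a typical 2-8 °C biologics range), warning horizon, and readings are assumptions, not taken from any product.

```python
from datetime import datetime, timedelta

LOW, HIGH = 2.0, 8.0          # assumed allowed storage band, degrees C
WARN_MINUTES = 30             # warn if a breach is projected this soon

def check(readings):
    """readings: list of (timestamp, temp_celsius), oldest first."""
    t_last, temp_last = readings[-1]
    if not LOW <= temp_last <= HIGH:
        return "BREACH: cold chain broken at %.1f C" % temp_last
    if len(readings) >= 2:
        t_prev, temp_prev = readings[-2]
        rate = (temp_last - temp_prev) / ((t_last - t_prev).total_seconds() / 60)
        if rate > 0:  # warming: project minutes until the high limit is hit
            minutes_left = (HIGH - temp_last) / rate
            if minutes_left <= WARN_MINUTES:
                return "ALARM: breach projected in %.0f min" % minutes_left
    return "OK"

now = datetime.now()
log = [(now - timedelta(minutes=10), 6.0), (now, 7.4)]  # warming fast
print(check(log))   # -> ALARM: breach projected in ~4 min
```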

Keywords: internet of things, cold chain, data logger, transportation

Procedia PDF Downloads 439
23873 Comparison of Central Light Reflex Width-to-Retinal Vessel Diameter Ratio between Glaucoma and Normal Eyes by Using Edge Detection Technique

Authors: P. Siriarchawatana, K. Leungchavaphongse, N. Covavisaruch, K. Rojananuangnit, P. Boondaeng, N. Panyayingyong

Abstract:

Glaucoma is a disease that causes visual loss in adults by damaging the optic nerve; its overall pathophysiology is still not fully understood, and vasculopathy may be one of the possible causes of nerve damage. Photographic imaging of retinal vessels by fundus camera during eye examination may complement clinical management. This paper presents an innovation for measuring the central light reflex width-to-retinal vessel diameter ratio (CRR) from digital retinal photographs. Using our edge detection technique, CRRs from glaucoma and normal eyes were compared to examine differences and associations. CRRs were evaluated on fundus photographs of participants from Mettapracharak (Wat Raikhing) Hospital in Nakhon Pathom, Thailand. Fifty-five photographs from normal eyes and twenty-one photographs from glaucoma eyes were included; participants with hypertension were excluded. In each photograph, CRRs from four retinal vessels, including arteries and veins in the inferotemporal and superotemporal regions, were quantified using the edge detection technique. Mean CRRs of all four retinal arteries and veins were significantly higher in persons with glaucoma than in those without (0.34 vs. 0.32, p < 0.05 for the inferotemporal vein; 0.33 vs. 0.30, p < 0.01 for the inferotemporal artery; 0.34 vs. 0.31, p < 0.01 for the superotemporal vein; and 0.33 vs. 0.30, p < 0.05 for the superotemporal artery). These results indicate that an increase in the CRRs of retinal vessels, as quantitatively measured from fundus photographs, could be associated with glaucoma.
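
A minimal sketch of the underlying measurement on a one-dimensional intensity profile sampled perpendicular to a vessel: the vessel walls are the strongest gradients, and the central light reflex is the bright ridge between them. The synthetic profile and edge definitions below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Synthetic profile: dark vessel (wide Gaussian dip) on a bright background,
# with the narrow bright central light reflex at its centre.
x = np.linspace(-1, 1, 201)
profile = 1 - 0.6 * np.exp(-(x / 0.45) ** 2) + 0.12 * np.exp(-(x / 0.12) ** 2)

grad = np.gradient(profile)
left_wall = int(np.argmin(grad))        # steepest darkening: left vessel edge
right_wall = int(np.argmax(grad))       # steepest brightening: right vessel edge

center = len(x) // 2
reflex_l = left_wall + int(np.argmax(grad[left_wall:center]))   # reflex rising edge
reflex_r = center + int(np.argmin(grad[center:right_wall]))     # reflex falling edge

crr = (reflex_r - reflex_l) / (right_wall - left_wall)
print("vessel width %d px, reflex width %d px, CRR = %.2f"
      % (right_wall - left_wall, reflex_r - reflex_l, crr))
```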

Keywords: glaucoma, retinal vessel, central light reflex, image processing, fundus photograph, edge detection

Procedia PDF Downloads 323
23872 Characterization of the Pore System and Gas Storage Potential in Unconventional Reservoirs: A Case Study of the Cretaceous La Luna Formation, Middle Magdalena Valley Basin, Colombia

Authors: Carlos Alberto Ríos-Reyes, Efraín Casadiego-Quintero

Abstract:

We propose a generalized workflow for the mineralogical investigation of unconventional reservoirs using multi-scale imaging and pore-scale analyses, which can be used for the integral evaluation of these resources. The Cretaceous La Luna Formation's mudstones in the Middle Magdalena Valley Basin (Colombia) show an inherently heterogeneous pore system with organic and inorganic pores. For this reason, high-resolution 2D mapping by conventional petrography, scanning electron microscopy, and quantitative evaluation of minerals by scanning electron microscopy must be integrated to describe the organic and inorganic porosity and to understand the transport mechanism through the pores. The analyzed rocks show several pore types, including interparticle pores, organoporosity, intraparticle pores, and microchannels and/or microfractures. The interconnected pores in the pore system of these rocks provide effective pathways for primary gas migration and storage space for residual hydrocarbons in the mudstones, which is very useful in this type of gas reservoir. It is crucial to understand not only the pore system of these rocks and their mineralogy but also to project the gas flow in order to design appropriate strategies for the stimulation of unconventional reservoirs.

Keywords: mudstones, La Luna Formation, gas storage, migration, hydrocarbon

Procedia PDF Downloads 74
23871 Liquid Biopsy and Screening Biomarkers in Glioma Grading

Authors: Abdullah Abdu Qaseem Shamsan

Abstract:

Background: Gliomas represent the most frequent and heterogeneous group of tumors arising from glial cells, characterized by difficult monitoring, poor prognosis, and fatality. Tissue biopsy is an established procedure for tumor cell sampling that aids diagnosis, tumor grading, and prediction of prognosis. We studied and compared the levels of liquid biopsy markers in patients with different grades of glioma and also tried to establish a potential association between glioma and specific blood group antigens. Results: 78 patients were identified, among whom the largest proportion with glioblastoma had blood group O+ (53.8%). The next most frequent was blood group A+ (20.4%), followed by B+ (9.0%) and A- (5.1%), with O- the least frequent. The liquid biopsy biomarkers comprised ALT, LDH, lymphocytes, urea, alkaline phosphatase, AST, neutrophils, and CRP. The levels of all the components increased significantly with the severity of glioma, with maximum levels seen in glioblastoma (grade IV), followed by grade III and grade II, respectively. Conclusion: Gliomas pose significant clinical challenges due to their heterogeneous nature and aggressive behavior. Liquid biopsy is a non-invasive approach that helps establish the status of the patient and determine the tumor grade, and may therefore show diagnostic and prognostic utility. Additionally, our study provides evidence for a role of ABO blood group antigens in the development of glioma. Future clinical research on liquid biopsy should improve the sensitivity and specificity of these tests and validate their clinical usefulness to guide treatment approaches.

Keywords: GBM: glioblastoma multiforme, CT: computed tomography, MRI: magnetic resonance imaging, ctRNA: circulating tumor RNA

Procedia PDF Downloads 50
23870 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana

Authors: Lidon Lashley

Abstract:

This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but is still firmly held in the tenets of the medical model of disability, which shapes the experiences of children with Special Education Needs and/or Disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study, with qualitative data gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's postmodern approach to grounded theory analysis, called situational analysis. The data suggest that it is possible, though challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of a social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms.

Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booth and Ainscow's index

Procedia PDF Downloads 160
23869 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed automatically via sensor APIs or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system allows trend analysis of selected health metrics over custom time intervals. Available in a GitHub repository, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
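
A minimal sketch of the back-end pattern described above: health data stored as JSON documents in MongoDB and queried with PyMongo to build a daily snapshot against target goals. The collection and field names are assumptions for illustration, not the project's actual schema.

```python
from datetime import datetime, timedelta
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["health_dashboard"]

# One document per reading; metrics and goals are free-form per user.
db.readings.insert_one({"user": "alice", "metric": "steps",
                        "value": 4200, "ts": datetime.now()})
db.goals.insert_one({"user": "alice", "metric": "steps", "target": 8000})

def daily_snapshot(user):
    # Sum the last 24 hours of each tracked metric against its goal.
    since = datetime.now() - timedelta(days=1)
    for goal in db.goals.find({"user": user}):
        total = sum(r["value"] for r in db.readings.find(
            {"user": user, "metric": goal["metric"], "ts": {"$gte": since}}))
        print(f'{goal["metric"]}: {total} / {goal["target"]}')

daily_snapshot("alice")   # -> steps: 4200 / 8000
```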

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 388
23868 Classifier for Liver Ultrasound Images

Authors: Soumya Sajjan

Abstract:

Liver cancer is one of the most common cancers worldwide in men and women, and one of the few still on the rise; liver disease is the fourth leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues and organs without any harmful effects. This paper provides an overview of the underlying concepts, along with algorithms for processing liver ultrasound images. Ultrasound liver lesion images naturally contain considerable speckle noise, which makes developing a classifier for them a challenging task; we approach it with a fully automatic machine learning system. First, we segment the liver image by calculating textural features from the co-occurrence matrix and the run-length method. For classification, a Support Vector Machine is used, based on the risk bounds of statistical learning theory. The textural features from the different methods are given as input to the SVM individually, and performance analysis is carried out separately on training and test datasets using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, its features are calculated and the lesion is classified as normal or diseased. We hope the results will help physicians identify liver cancer non-invasively.
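
A minimal sketch of this texture-feature pipeline follows: gray-level co-occurrence matrix (GLCM) statistics per image fed to an SVM, using scikit-image and scikit-learn. The random images stand in for segmented liver regions, and the labels are hypothetical; the paper's exact feature set and parameters may differ.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(img):
    # Co-occurrence matrix at distance 1 in two directions, then four
    # standard Haralick-style statistics as the feature vector.
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 2, size=40)            # 0 = normal, 1 = diseased (hypothetical)

X = np.array([glcm_features(im) for im in images])
clf = SVC(kernel="rbf").fit(X[:30], labels[:30])
print("test accuracy:", (clf.predict(X[30:]) == labels[30:]).mean())
```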

Keywords: segmentation, Support Vector Machine, ultrasound liver lesion, co-occurrence matrix

Procedia PDF Downloads 408
23867 Building an Integrated Relational Database from Swiss Nutrition National Survey and Swiss Health Datasets for Data Mining Purposes

Authors: Ilona Mewes, Helena Jenzer, Farshideh Einsele

Abstract:

Objective: The objective of the study was to integrate two large databases, from the Swiss nutrition national survey (menuCH) and the Swiss health national survey 2012, for data mining purposes. Each database includes demographic base data. An integrated Swiss database was built to later discover critical food consumption patterns linked with lifestyle diseases known to be strongly tied to food consumption. Design: The Swiss nutrition national survey (menuCH), with approximately 2000 respondents from two different surveys, one by phone and the other by questionnaire, along with the Swiss health national survey 2012, with 21,500 respondents, were pre-processed, cleaned, and finally integrated into a single relational database. Results: The result of this study is an integrated relational database from the Swiss nutritional and health databases.

Keywords: health informatics, data mining, nutritional and health databases, nutritional and chronic databases

Procedia PDF Downloads 112
23866 Clustering Performance Analysis Using New Correlation-Based Cluster Validity Indices

Authors: Nathakhun Wiroonsri

Abstract:

There are various cluster validity measures used for evaluating clustering results, and one of the main objectives of using them is to seek the optimal, unknown number of clusters. Some measures work well for clusters with different densities, sizes, and shapes. Yet one weakness these validity measures share is that they often provide only a single clear optimal number of clusters; that number is actually unknown, and there might be more than one potential sub-optimal option that a user may wish to choose depending on the application. We develop two new cluster validity indices based on the correlation between the actual distance between a pair of data points and the distance between the centroids of the clusters in which the two points are located. Our proposed indices consistently yield several peaks at different numbers of clusters, which overcomes the weakness stated above. Furthermore, the introduced correlation can also be used to evaluate the quality of a selected clustering result. Several experiments in different scenarios, including the well-known iris data set and a real-world marketing application, have been conducted to compare the proposed validity indices with several well-known ones.
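
A small sketch of the core quantity behind such an index follows: the Pearson correlation between actual pairwise distances and the distances between the centroids of the clusters the two points belong to (zero when they share a cluster). This is an illustrative reading of the idea, not the authors' exact formula.

```python
import numpy as np
from itertools import combinations
from scipy.stats import pearsonr
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X = load_iris().data
for k in range(2, 7):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    labels, cents = km.labels_, km.cluster_centers_
    d_actual, d_centroid = [], []
    for i, j in combinations(range(len(X)), 2):
        d_actual.append(np.linalg.norm(X[i] - X[j]))
        # Distance between the two points' cluster centroids
        # (zero whenever the points fall in the same cluster).
        d_centroid.append(np.linalg.norm(cents[labels[i]] - cents[labels[j]]))
    r, _ = pearsonr(d_actual, d_centroid)
    print(f"k={k}: correlation index = {r:.3f}")
```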

Keywords: clustering algorithm, cluster validity measure, correlation, data partitions, iris data set, marketing, pattern recognition

Procedia PDF Downloads 102
23865 Analyzing Competitive Advantage of Internet of Things and Data Analytics in Smart City Context

Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue

Abstract:

The Covid-19 pandemic forced people to isolate and become physically less connected; it has not only reshaped people's behaviours and needs but also accelerated digital transformation (DT). The DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives into the normal design, construction, and operation of cities provides a unique opportunity to improve connections between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT; it has disrupted many industries by introducing new services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked, IoT solutions can only create value if the data generated by IoT devices are analysed properly. By extracting relevant conclusions and actionable insights with established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today's marketplace. As there are many IoT solutions available today, the amount of data is tremendous, and the challenge for companies is to understand which solutions to focus on, how to prioritise them, and which data differentiate them from the competition. This paper explains how IoT and data analytics can affect competitive advantage and how companies should approach them to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research, and a case study is provided to validate the preservation of a company's competitive advantage through smart city solutions. The results provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context, and the paper proposes a framework that merges these factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of them, can create a competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors to consider for competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.

Keywords: internet of things, data analytics, smart cities, competitive advantage

Procedia PDF Downloads 94
23864 Social Data-Based User Profiles' Enrichment

Authors: Amel Hannech, Mehdi Adda, Hamid Mcheick

Abstract:

In this paper, we propose a generic model of the user profile integrating several elements that may positively impact the search process. We exploit the classical behavior of users and delimit their search activities into several search sessions enriched with contextual and temporal information, which reflects the current interests of these users in each period of time and allows data freshness to be inferred. We argue that the annotation of resources makes users' needs more transparent; it also strengthens social links among resources and users and can thereby increase the scope of the user profile. Based on this idea, we integrate the social tagging practice in order to exploit social user behavior to enrich the profiles. These profiles are then fed into a recommendation system that predicts personalized items of interest, assisting users in their searches and further enriching their profiles. Through this recommendation, we provide users with new search experiences.

Keywords: user profiles, topical ontology, contextual information, folksonomies, tags' clusters, data freshness, association rules, data recommendation

Procedia PDF Downloads 263
23863 A Comparative Study on Deep Learning Models for Pneumonia Detection

Authors: Hichem Sassi

Abstract:

Pneumonia, being a respiratory infection, has garnered global attention due to its rapid transmission and relatively high mortality rates. Timely detection and treatment play a crucial role in significantly reducing mortality associated with pneumonia. Presently, X-ray diagnosis stands out as a reasonably effective method. However, the manual scrutiny of a patient's X-ray chest radiograph by a proficient practitioner usually requires 5 to 15 minutes. In situations where cases are concentrated, this places immense pressure on clinicians for timely diagnosis. Relying solely on the visual acumen of imaging doctors proves to be inefficient, particularly given the low speed of manual analysis. Therefore, the integration of artificial intelligence into the clinical image diagnosis of pneumonia becomes imperative. Additionally, AI recognition is notably rapid, with convolutional neural networks (CNNs) demonstrating superior performance compared to human counterparts in image identification tasks. To conduct our study, we utilized a dataset comprising chest X-ray images obtained from Kaggle, encompassing a total of 5216 training images and 624 test images, categorized into two classes: normal and pneumonia. Employing five mainstream network algorithms, we undertook a comprehensive analysis to classify these diseases within the dataset, subsequently comparing the results. The integration of artificial intelligence, particularly through improved network architectures, stands as a transformative step towards more efficient and accurate clinical diagnoses across various medical domains.
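
For orientation, here is a minimal Keras sketch of one baseline CNN for the two-class chest X-ray task described above. The architecture and input size are illustrative assumptions; the paper compares five mainstream networks rather than this toy model.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    # Grayscale chest X-ray resized to 150x150.
    layers.Conv2D(32, 3, activation="relu", input_shape=(150, 150, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),          # normal vs. pneumonia
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()

# Training would stream the Kaggle folders, e.g.:
# train = tf.keras.utils.image_dataset_from_directory(
#     "chest_xray/train", image_size=(150, 150), color_mode="grayscale")
# model.fit(train, epochs=10)
```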

Keywords: deep learning, computer vision, pneumonia, models, comparative study

Procedia PDF Downloads 64
23862 Spatial Interpolation of Aerosol Optical Depth Pollution: Comparison of Methods for the Development of Aerosol Distribution

Authors: Sahabeh Safarpour, Khiruddin Abdullah, Hwee San Lim, Mohsen Dadras

Abstract:

Air pollution is a growing problem arising from domestic heating, high-density vehicle traffic, electricity production, and expanding commercial and industrial activities, all increasing in parallel with urban populations. Monitoring and forecasting of air quality parameters are important because of their health impact. One widely available metric of aerosol abundance is the aerosol optical depth (AOD): the integrated light extinction coefficient over a vertical atmospheric column of unit cross section, which represents the extent to which the aerosols in that vertical profile prevent the transmission of light by absorption or scattering. Seasonal AOD values at 550 nm derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor onboard NASA's Terra satellite for the 10-year period 2000-2010 were used to test seven different spatial interpolation methods in the present study. The accuracy of the estimations was assessed through visual analysis as well as independent validation based on basic statistics, such as the root mean square error (RMSE) and the correlation coefficient. Based on the RMSE and R values of predictions made using measured values from 2000 to 2010, radial basis functions (RBFs) yielded the best results for spring, summer, and winter, and ordinary kriging yielded the best results for fall.
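
A brief sketch of the comparison methodology: fit a radial basis function interpolator to scattered AOD samples and score it with RMSE and the correlation coefficient on held-out points. It uses SciPy's RBFInterpolator; the synthetic AOD field and kernel choice are assumptions, not the study's configuration.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(200, 2))                  # lon/lat of samples
# Synthetic smooth AOD field with a little observational noise.
aod = np.sin(pts[:, 0]) * np.cos(pts[:, 1]) + 0.05 * rng.normal(size=200)

train, test = pts[:150], pts[150:]
rbf = RBFInterpolator(train, aod[:150], kernel="thin_plate_spline")
pred = rbf(test)

rmse = np.sqrt(np.mean((pred - aod[150:]) ** 2))
r, _ = pearsonr(pred, aod[150:])
print(f"RMSE = {rmse:.3f}, R = {r:.3f}")
```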

Keywords: aerosol optical depth, MODIS, spatial interpolation techniques, Radial Basis Functions

Procedia PDF Downloads 406
23861 Impact of External Temperature on the Speleothem Growth in the Moravian Karst

Authors: Frantisek Odvarka

Abstract:

Based on data from the Moravian Karst, the influence of selected meteorological factors on calcite speleothem growth was evaluated. External temperature was identified as one of the main factors influencing speleothem growth there: it significantly influences the CO₂ concentration in the soil/epikarst and in the cave atmosphere, and thus contributes to changes in the CO₂ partial pressure difference between the soil/epikarst and the cave atmosphere, which determines the drip water supersaturation with respect to calcite and the quantity of calcite precipitated in the cave environment. External and cave air temperatures were measured using a COMET S3120 data logger, which can measure temperatures in the range from -30 to +80 °C with an accuracy of ± 0.4 °C. CO₂ concentrations in the cave and soils were measured with an FT A600 CO₂H Ahlborn probe (range 0 to 10,000 ppmv, accuracy 1 ppmv) connected to an ALMEMO 2290-4 V5 Ahlborn data logger. The soil temperature was measured with an FHA646E1 Ahlborn probe (range -20 to 70 °C, accuracy ± 0.4 °C) connected to an ALMEMO 2290-4 V5 Ahlborn data logger. The airflow velocities into and out of the cave were monitored by an FVA395 TH4 thermo-anemometer (range 0.05 to 2 m s⁻¹, accuracy ± 0.04 m s⁻¹) connected to an ALMEMO 2590-4 V5 Ahlborn data logger; flow was measured at the lower and upper entrances of the Imperial Cave. The data were analyzed in MS Office Excel 2019 and PHREEQC.
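
A back-of-envelope sketch of the driving quantity discussed above: the CO₂ partial-pressure difference between soil/epikarst air and cave air, computed from ppmv concentrations. The example values are illustrative, not measurements from the study.

```python
ATM = 101.325  # total pressure, kPa (assumed near sea-level value)

def pco2_kpa(ppmv, total_kpa=ATM):
    # Partial pressure = mole fraction times total pressure.
    return total_kpa * ppmv / 1e6

soil_ppmv, cave_ppmv = 12000, 1500  # hypothetical summer readings
dp = pco2_kpa(soil_ppmv) - pco2_kpa(cave_ppmv)
print(f"pCO2 soil = {pco2_kpa(soil_ppmv):.3f} kPa, "
      f"cave = {pco2_kpa(cave_ppmv):.3f} kPa, difference = {dp:.3f} kPa")
# A larger positive difference means drip water degasses more CO2 in the
# cave, raising calcite supersaturation and hence speleothem growth.
```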

Keywords: speleothem growth, carbon dioxide partial pressure, Moravian Karst, external temperature

Procedia PDF Downloads 144
23860 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupant in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance, and sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming to reduce the number of tests.

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

Procedia PDF Downloads 381
23859 iCount: An Automated Swine Detection and Production Monitoring System Based on Sobel Filter and Ellipse Fitting Model

Authors: Jocelyn B. Barbosa, Angeli L. Magbaril, Mariel T. Sabanal, John Paul T. Galario, Mikka P. Baldovino

Abstract:

The use of technology has become ubiquitous in different areas of business today. With the advent of digital imaging and database technology, business owners have been motivated to integrate technology into operations ranging from small and medium to large enterprises, and technology has been found to bring many benefits that can make a business grow. Hog or swine raising, for example, is a very popular enterprise in the Philippines, whose challenges in production monitoring can be addressed through technology integration. Swine production monitoring can become a tedious task as the enterprise grows larger; in particular, problems like delayed and inconsistent reports are likely if the swine in each pen of each building are counted manually. In this study, we present iCount, which aims to ensure efficient swine detection and counting and thus hasten the production monitoring task. We developed a system that automatically detects and counts swine based on the Sobel filter and an ellipse fitting model, given still photos of a group of swine captured in a pen. We improve the Sobel filter detection result through an 8-neighborhood rule implementation, and the ellipse fitting technique is then employed for proper swine detection. Furthermore, the system can generate periodic production reports and can identify the specific consumables to be served to the swine according to schedules. Experiments reveal that our algorithm provides an efficient way of detecting swine, thereby providing a significant amount of accuracy in production monitoring.
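
A condensed OpenCV sketch of the detection pipeline named above: Sobel gradients to find body outlines, then ellipse fitting on the resulting contours, counting fits whose size is plausible for a pig. The thresholds, size limits, and synthetic test image are assumptions for illustration, not the paper's tuned values.

```python
import cv2
import numpy as np

# Synthetic stand-in for a top-down pen photo: one bright pig-shaped
# ellipse on a dark floor (replace with cv2.imread("pen.jpg", 0) on real data).
img = np.zeros((480, 640), np.uint8)
cv2.ellipse(img, (320, 240), (60, 140), 30, 0, 360, 180, -1)

blur = cv2.GaussianBlur(img, (5, 5), 0)
gx = cv2.Sobel(blur, cv2.CV_32F, 1, 0, ksize=3)   # horizontal gradients
gy = cv2.Sobel(blur, cv2.CV_32F, 0, 1, ksize=3)   # vertical gradients
edges = (cv2.magnitude(gx, gy) > 60).astype(np.uint8) * 255  # assumed threshold

contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
count = 0
for c in contours:
    if len(c) < 5:                 # fitEllipse needs at least five points
        continue
    (cx, cy), (w, h), angle = cv2.fitEllipse(c)
    if 40 < min(w, h) < 200 and 80 < max(w, h) < 400:  # assumed pig size in px
        count += 1
print("swine detected:", count)    # -> 1 for the synthetic image
```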

Keywords: automatic swine counting, swine detection, swine production monitoring, ellipse fitting model, Sobel filter

Procedia PDF Downloads 311
23858 Comparative Study of Greenhouse Locations through Satellite Images and Geographic Information System: Methodological Evaluation in Venezuela

Authors: Maria A. Castillo H., Andrés R. Leandro C.

Abstract:

During the last decades, agricultural productivity in Latin America has increased with precision agriculture and more efficient agricultural technologies. The use of automated systems, satellite images, geographic information systems, tools for data analysis, and artificial intelligence has contributed to more effective strategic decisions. Twenty years ago, the state of Mérida, located in the Venezuelan Andes, reported the largest area covered by greenhouses in the country, where certified seeds of potatoes, vegetables, ornamentals, and flowers were produced for export and for consumption in the central region of the country. In recent years, it is estimated that production under greenhouses has changed and the area covered has decreased due to different factors, but there are few historical statistical data of sufficient quantity and quality to support this estimate or to be used for analysis and decision making. The objective of this study is to compare data collected in 2007 about the geoposition, use, and covered area of the greenhouses with data available in 2021, as support for the analysis of the current situation of horticultural production in the main municipalities of the state of Mérida. The document presents the development of the work through its diagnosis, GIS coordinate integration, and data analysis phases. As a result, the process is evaluated, a dashboard is presented with the most relevant data and the geographical coordinates integrated into the GIS, and the information obtained is analyzed. Finally, recommendations for action are added, and further work to expand the information obtained and its geographical traceability over time is proposed. This study contributes greater certainty to the data supporting the evaluation of social, environmental, and economic sustainability indicators and to better decisions aligned with the sustainable development goals in the area under review. At the same time, the methodology provides improvements to the agricultural data collection process that can be extended to other study areas and crops.

Keywords: greenhouses, geographic information system, protected agriculture, data analysis, Venezuela

Procedia PDF Downloads 91
23857 Modelling Consistency and Change of Social Attitudes in 7 Years of Longitudinal Data

Authors: Paul Campbell, Nicholas Biddle

Abstract:

There is a complex, endogenous relationship between individual circumstances, attitudes, and behaviour. This study uses longitudinal panel data to assess changes in social and political attitudes over a 7-year period. Attitudes are captured with the question 'what is the most important issue facing Australia today', collected at multiple time points in a longitudinal survey of 2200 Australians. Consistency of attitudes, and factors predicting change over time, are assessed. The consistency of responses has methodological implications for data collection, specifically how often such questions ought to be asked of a population. When change in attitude is observed, this study assesses the extent to which individual demographic characteristics, personality traits, and broader societal events predict change.

Keywords: attitudes, longitudinal survey analysis, personality, social values

Procedia PDF Downloads 130
23855 Data Protection and Regulation Compliance on Handling Physical Child Abuse Scenarios: A Scoping Review

Authors: Ana Mafalda Silva, Rebeca Fontes, Ana Paula Vaz, Carla Carreira, Ana Corte-Real

Abstract:

Decades of research on the topic of interpersonal violence against minors highlight five main conclusions: 1) it causes harmful effects on children's development and health; 2) it is prevalent; 3) it violates children's rights; 4) it can be prevented; and 5) parents are the main aggressors. Child abuse scenarios are identified through clinical observation, administrative data, and self-reports. The most used instruments are self-reports; however, there are no valid and reliable self-report instruments for minors, and existing ones consist of a retrospective interpretation of the situation by the victim, already in adulthood, and/or by the parents. Clinical observation and collection of information, namely from the orofacial region, are essential in the early identification of these situations. The management of medical data, as personal data, must comply with the General Data Protection Regulation (GDPR) in Europe and with the General Law of Data Protection (LGPD) in Brazil. This review aims to answer the question: in a situation of medical assistance to minors where interpersonal violence through mistreatment is suspected, is guardian consent necessary for recording and sharing personal data, namely medical data? A scoping review was carried out based on searches of the Web of Science and PubMed. Four papers and two documents from the grey literature were selected. The review found that the identification and reporting of child abuse by health professionals, and the necessary early intervention in defense of the minor as a victim of abuse, comply with the guidelines expressed in the GDPR and LGPD. Notification by health professionals in maltreatment scenarios should therefore be a priority, and fear or anxiety about legal repercussions should not stand in the way of collecting and processing the data necessary for the reporting procedure that safeguards and promotes the welfare of children living with abuse.

Keywords: child abuse, disease notifications, ethics, healthcare assistance

Procedia PDF Downloads 93
23855 A Generic Middleware to Instantly Sync Intensive Writes of Heterogeneous Massive Data via Internet

Authors: Haitao Yang, Zhenjiang Ruan, Fei Xu, Lanting Xia

Abstract:

Industry data centers often need to sync data changes reliably and instantly across a large number of heterogeneous, autonomous relational databases accessed via the not-so-reliable Internet, for which a practical universal sync middleware with low maintenance and operation costs is much wanted; developing such a product and adapting it to various scenarios, however, is a sophisticated and continuous practice. The authors have been devising, applying, and optimizing a generic sync middleware system named GSMS since 2006, holding to the principles that the middleware must be SyncML-compliant and transparent to the data application layer logic, must not depend on the implementation details of the databases being synced, must not rely on the host operating systems deployed, and must be lightweight and hence of low cost. A series of stress experiments on GSMS sync performance was conducted, a persuasive example being a source relational database that underwent a broad range of write loads, from one thousand to one million intensive writes within a few minutes. The tests showed that GSMS achieves instant sync at well below a millisecond per record, and its smooth performance under extreme write loads demonstrated that it is feasible and competent.

Keywords: heterogeneous massive data, instantly sync intensive writes, Internet generic middleware design, optimization

Procedia PDF Downloads 119
23854 Building Transparent Supply Chains through Digital Tracing

Authors: Penina Orenstein

Abstract:

In today's world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before in order to find areas for improvement and greater efficiency, reduce the chances of disruption, and stay competitive. The concept of supply chain mapping is one where every process and route is mapped in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data, including news and financial information concerning relationships between suppliers; an additional layer of information can be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it does not allow for much transparency beyond the first supplier tier and may generate irrelevant data (noise) that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains using a layered approach; the secondary goal is to address whether the supply chain can be re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers, superimposed on a geographical map. The driving force behind this idea is the ability to trace individual parts to the exact site where they are manufactured; in this way, companies can ensure sustainability practices from the production of raw materials through to the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain, unlocks predictive analytics capabilities, and enables them to act proactively. The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner.
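
An illustrative sketch of the layered mapping idea with networkx: suppliers as nodes annotated with tier and site location, directed edges for "supplies", a trace from a part's origin up to the final manufacturer, and a simple vulnerability screen. All names and the graph itself are hypothetical, not data from the study.

```python
import networkx as nx

G = nx.DiGraph()
G.add_node("OEM", tier=0, site="Detroit")
G.add_node("ModuleCo", tier=1, site="Monterrey")
G.add_node("CellWorks", tier=2, site="Osaka")
G.add_node("OreRefiner", tier=3, site="Perth")
G.add_edges_from([("OreRefiner", "CellWorks"),
                  ("CellWorks", "ModuleCo"),
                  ("ModuleCo", "OEM")])

# Trace every production site a part passes through on its way to the OEM.
path = nx.shortest_path(G, "OreRefiner", "OEM")
for node in path:
    print(f'tier {G.nodes[node]["tier"]}: {node} ({G.nodes[node]["site"]})')

# Vulnerability screening: nodes whose removal disconnects the chain.
print("cut nodes:", list(nx.articulation_points(G.to_undirected())))
```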

Keywords: data mining, supply chain, empirical research, data mapping

Procedia PDF Downloads 174
23853 Synoptic Analysis of a Heavy Flood in the Province of Sistan-Va-Balouchestan: Iran January 2020

Authors: N. Pegahfar, P. Ghafarian

Abstract:

In this research, the synoptic weather conditions during the heavy flood of 10-12 January 2020 in the Sistan-va-Balouchestan Province of Iran are analyzed. To this aim, reanalysis data from the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR), NCEP Global Forecasting System (GFS) analysis data, and measured data from a surface station, together with satellite images from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), were used for 9 to 12 January 2020. Atmospheric parameters from both the lower and upper troposphere were examined, including absolute vorticity, wind velocity, temperature, geopotential height, relative humidity, and precipitation. Results indicated that both the lower-level and upper-level currents were strong. In addition, the transport of a large amount of humidity from the Oman Sea and the Red Sea to the south and southeast of Iran (Sistan-va-Balouchestan Province) led to vast and unexpected precipitation and then a heavy flood.

Keywords: Sistan-va-Balouchestan Province, heavy flood, synoptic analysis, data

Procedia PDF Downloads 101