Search results for: secure data aggregation
24182 Design and Implementation a Virtualization Platform for Providing Smart Tourism Services
Authors: Nam Don Kim, Jungho Moon, Tae Yun Chung
Abstract:
This paper proposes an Internet of Things (IoT) based virtualization platform for providing smart tourism services. The virtualization platform provides a consistent access interface to various types of data by naming IoT devices and legacy information systems as pathnames in a virtual file system. In other words, the IoT virtualization platform functions as middleware that uses metadata to describe the underlying collected data. The proposed platform makes it easy to provide customized tourism information by using tourist locations collected by IoT devices and additionally enables the creation of new interactive smart tourism services focused on those locations. The proposed platform is efficient in that the provided tourism services are isolated from changes in raw data, so the services can be modified or expanded without changing the underlying data structure.
Keywords: internet of things (IoT), IoT platform, service platform, virtual file system (VFS)
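As a hedged illustration of the abstract's central idea, the sketch below mounts data sources at virtual pathnames so services read them through one uniform interface. All names (`register_node`, `resolve`, the `/tourism/...` paths) are illustrative assumptions, not the paper's API.

```python
# Hypothetical sketch: exposing IoT devices and legacy systems as
# pathnames in a virtual file system, as the abstract describes.

class VirtualFileSystem:
    """Maps virtual pathnames to data-source metadata."""

    def __init__(self):
        self._tree = {}  # pathname -> metadata dict

    def register_node(self, pathname, source_type, metadata):
        # A sensor or legacy system is "mounted" at a pathname, so
        # services address it like a file instead of a device API.
        self._tree[pathname] = {"source": source_type, **metadata}

    def resolve(self, pathname):
        # Services read through the uniform path interface; the
        # underlying source can change without touching the service.
        return self._tree.get(pathname)

    def list_dir(self, prefix):
        # Enumerate everything mounted under a virtual directory.
        return sorted(p for p in self._tree if p.startswith(prefix))

vfs = VirtualFileSystem()
vfs.register_node("/tourism/beacons/gate-3", "iot-device",
                  {"kind": "BLE beacon", "updates": "push"})
vfs.register_node("/tourism/legacy/ticketing", "legacy-system",
                  {"kind": "SQL gateway", "updates": "poll"})
```

Because services only see pathnames and metadata, swapping the ticketing back end for a different system would change the registration call but not the consuming service.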
Procedia PDF Downloads 506
24181 A Review on 3D Smart City Platforms Using Remotely Sensed Data to Aid Simulation and Urban Analysis
Authors: Slim Namouchi, Bruno Vallet, Imed Riadh Farah
Abstract:
3D urban models provide powerful tools for decision making, urban planning, and smart city services. The accuracy of these 3D-based systems is directly related to the quality of the models. Since manual large-scale modeling, such as that of cities or countries, is a highly time-intensive and expensive process, fully automatic 3D building generation is needed. However, the result of the 3D modeling process depends on the input data, the properties of the captured objects, and the required characteristics of the reconstructed 3D model. Nowadays, producing a 3D real-world model is no longer a problem. Remotely sensed data have increased remarkably in recent years, especially data acquired using unmanned aerial vehicles (UAV). As scanning techniques develop, the amount of captured data grows and its resolution becomes more precise. This paper presents a literature review that aims to identify different methods of automatic 3D building extraction, either from LiDAR alone or from the combination of LiDAR with satellite or aerial images. We then present open source technologies and data models (e.g., CityGML, PostGIS, CesiumJS) used to integrate these models into geospatial base layers for smart city services.
Keywords: CityGML, LiDAR, remote sensing, GIS, smart city, 3D urban modeling
Procedia PDF Downloads 137
24180 Structural Damage Detection via Incomplete Model Data Using Output Data Only
Authors: Ahmed Noor Al-qayyim, Barlas Özden Çağlayan
Abstract:
Structural failure is caused mainly by damage that often occurs in structures. Many researchers focus on obtaining efficient tools to detect damage in structures at an early state. In the past decades, a subject that has received considerable attention in the literature is damage detection as determined by variations in the dynamic characteristics or response of structures. This study presents a new damage identification technique. The technique detects the damage location for an incomplete structural system using output data only. The method indicates the damage based on free vibration test data by using the 'Two Points Condensation' (TPC) technique. This method creates a set of matrices by reducing the structural system to two-degree-of-freedom systems. The current stiffness matrices are obtained by optimizing the equation of motion using the measured test data, and they are compared with the original (undamaged) stiffness matrices. High percentage changes in the matrices' coefficients indicate the location of the damage. The TPC technique is applied to the experimental data of a simply supported steel beam model structure after inducing a thickness change in one element. Two cases are considered; the method detects the damage and determines its location accurately in both. In addition, the results illustrate that these changes in the stiffness matrix can be a useful tool for continuous monitoring of structural safety using ambient vibration data. Furthermore, its efficiency proves that this technique can also be used for large structures.
Keywords: damage detection, optimization, signal processing, structural health monitoring, two points condensation
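The comparison step of the TPC idea can be sketched as follows: for each condensed two-DOF system, compute the percentage change of every stiffness coefficient against the undamaged baseline and flag the element where the change is large. The matrices and the 10% threshold below are assumed values for illustration, not the paper's.

```python
# Illustrative sketch of the stiffness-matrix comparison in a
# TPC-style damage check (2x2 condensed stiffness matrices).

def percent_change(k_ref, k_cur):
    """Element-wise percentage change between two 2x2 stiffness matrices."""
    return [[abs(k_cur[i][j] - k_ref[i][j]) / abs(k_ref[i][j]) * 100.0
             for j in range(2)] for i in range(2)]

def damaged_elements(pairs, threshold=10.0):
    """pairs: {element_id: (K_undamaged, K_current)}; returns flagged ids."""
    flagged = []
    for elem, (k_ref, k_cur) in pairs.items():
        change = percent_change(k_ref, k_cur)
        if max(max(row) for row in change) > threshold:
            flagged.append(elem)
    return flagged

# Element 2 has a ~20% stiffness drop (e.g., a thickness reduction);
# element 1 changes only ~2%, within normal scatter.
pairs = {
    1: ([[5.0, -2.0], [-2.0, 5.0]], [[4.9, -2.0], [-2.0, 5.0]]),
    2: ([[5.0, -2.0], [-2.0, 5.0]], [[4.0, -1.6], [-1.6, 4.0]]),
}
```

In practice the "current" matrices would come from the optimization of the equation of motion against measured vibration data, as the abstract describes.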
Procedia PDF Downloads 366
24179 Spontaneous Message Detection of Annoying Situation in Community Networks Using Mining Algorithm
Authors: P. Senthil Kumari
Abstract:
The main concerns in data mining research are the social controls of data mining for handling ambiguity, noise, or incompleteness in text data. We describe an innovative approach for detecting spontaneous text messages in community networks, achieved by a classification mechanism. A tangible domain application, with the modest privacy settings provided by the community network for avoiding annoying content, is presented on partitions of consumer messages. The mining methodology provides the capability to directly filter such messages and likewise improve the quality of their ordering. We adopt learning-centered mining approaches with a pre-processing technique to accomplish this. Our work deals with rule-based personalization for automatic text categorization, which is applicable in many different frameworks and offers a tolerance value that permits labeling comments according to a variety of conditions associated with the policy or rule arrangements processed by the learning algorithm. Remarkably, we find that the choice of classifier strongly affects how well class labels are predicted for controlling inadequate documents on the community network.
Keywords: text mining, data classification, community network, learning algorithm
Procedia PDF Downloads 510
24178 Sovereign Debt Restructuring: A Study of the Inadequacies of the Contractual Approach
Authors: Salamah Ansari
Abstract:
In the absence of a comprehensive international legal regime for sovereign debt restructuring, the majority of complications arising from sovereign debt restructuring are frequently left to uncertain market forces. The resort to market forces for sovereign debt restructuring has led to a phenomenal increase in litigation targeting the assets of defaulting sovereign nations across jurisdictions, with the first major wave of lawsuits against sovereigns in the 1980s during the Latin American crisis. Recent experience substantiates that the majority of obstacles faced during the sovereign debt restructuring process are caused by inefficient creditor coordination and collective action problems. Collective action problems manifest as the grab race, the rush to exits, holdouts, the free rider problem, and the rush to the courthouse. On default, for a nation to successfully restructure its debt, all the creditors involved must accept some reduction in the value of their claims. As a single holdout creditor has the potential to undermine the restructuring process, holdout creditors are multiplying, given the increasing probability of earning high returns through litigation. This necessitates a mechanism to avoid holdout litigation and reinforce collective action on the part of creditors. This can be done either through statutory reform or through a market-based contractual approach. In the absence of an international sovereign bankruptcy regime, the impetus is mostly on the inclusion of collective action clauses in debt contracts. The preference for contractual mechanisms vis-à-vis a statutory approach can be explained by numerous reasons, but that is only part of the puzzle in trying to understand the economics of the underlying system. The contractual approach proposals advocate the inclusion of certain clauses in the debt contract for orderly debt restructuring.
These include majority voting clauses, sharing clauses, non-acceleration clauses, initiation clauses, aggregation clauses, temporary stays on litigation, priority financing clauses, and complete revelation of relevant information. However, a voluntary market-based contractual approach to debt workouts has its own complexities. It is a herculean task to enshrine clauses in debt contracts that are detailed enough to create an orderly debt restructuring mechanism while remaining attractive enough for creditors. The introduction of collective action clauses into debt contracts can reduce the barriers to efficient debt restructuring and also has the potential to improve the terms on which sovereigns are able to borrow. However, it should be borne in mind that such clauses are not a panacea for the huge institutional inadequacy that persists, and they may lead to worse restructuring outcomes.
Keywords: sovereign debt restructuring, collective action clauses, holdout creditors, litigation
Procedia PDF Downloads 158
24177 Expanding the Evaluation Criteria for a Wind Turbine Performance
Authors: Ivan Balachin, Geanette Polanco, Jiang Xingliang, Hu Qin
Abstract:
The problem of global warming has raised interest in renewable energy sources. Reducing the cost of wind energy is a challenge. Before building a wind park, conditions such as average wind speed, wind direction, the duration of each wind, and the probability of icing must be considered in the design phase. The operating values used in the setting of control systems also depend on these variables. Here, a procedure is proposed for inclusion in the evaluation of wind turbine performance, based on the amplitude of wind changes, the number of changes, and their duration. A generic study case based on actual data is presented. Data analysis techniques were applied to model the power required by the yaw system based on the amplitude and amount of wind-change data. A theoretical model relating time, the amplitude of wind changes, and the angular speed of nacelle rotation was identified.
Keywords: field data processing, regression determination, wind turbine performance, wind turbine placing, yaw system losses
Procedia PDF Downloads 392
24176 An Exhaustive All-Subsets Examination of Trade Theory on WTO Data
Authors: Masoud Charkhabi
Abstract:
We examine trade theory with this motivation. The full set of World Trade Organization data is organized into country-year pairs, each treated as a separate entity. Topological Data Analysis reveals that among the 16 regions and 240 region-year pairs there exists a distinguishable group of region-period pairs. The generally accepted periods of shifts from dissimilar-dissimilar to similar-similar trade in goods among regions are examined from this new perspective. The period breaks are treated as cumulative and are flexible. This type of all-subsets analysis is motivated by computer science and is made possible with lossy compression and graph theory. The results question many patterns in similar-similar to dissimilar-dissimilar trade. They also show indications of economic shifts that only later become evident in other economic metrics.
Keywords: econometrics, globalization, network science, topological data analysis, trade theory, visualization, world trade
Procedia PDF Downloads 375
24175 Using Probe Person Data for Travel Mode Detection
Authors: Muhammad Awais Shafique, Eiji Hato, Hideki Yaginuma
Abstract:
Recently, GPS data have been used in many studies to automatically reconstruct travel patterns for trip surveys. The aim is to minimize the use of questionnaire surveys and travel diaries so as to reduce their negative effects. In this paper, data acquired from the GPS and accelerometer embedded in smartphones are utilized to predict the mode of transportation used by the phone carrier. For prediction, Support Vector Machine (SVM) and Adaptive Boosting (AdaBoost) are employed. Moreover, a unique method to improve the prediction results of these algorithms is also proposed. Results suggest that the prediction accuracy of AdaBoost after improvement is relatively better than the rest.
Keywords: accelerometer, AdaBoost, GPS, mode prediction, support vector machine
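To make the boosting step concrete, here is a minimal from-scratch AdaBoost with one-dimensional decision stumps, classifying walking versus motorised travel from a single speed feature. This is a didactic sketch under stated assumptions: real input would be GPS- and accelerometer-derived features, and the speed values below are synthetic.

```python
import math

# Minimal AdaBoost with decision stumps on one feature (mean speed).

def stump_predict(x, thresh, sign):
    # A stump votes +sign above the threshold and -sign below it.
    return sign if x > thresh else -sign

def train_adaboost(xs, ys, n_rounds=5):
    """ys in {-1, +1}. Returns a list of (alpha, threshold, sign) stumps."""
    n = len(xs)
    w = [1.0 / n] * n
    model = []
    for _ in range(n_rounds):
        # Pick the stump (threshold, sign) with the lowest weighted error.
        best = None
        for thresh in xs:
            for sign in (+1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(x, thresh, sign) != y)
                if best is None or err < best[0]:
                    best = (err, thresh, sign)
        err, thresh, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # stump weight
        model.append((alpha, thresh, sign))
        # Re-weight samples: misclassified ones gain weight.
        w = [wi * math.exp(-alpha * y * stump_predict(x, thresh, sign))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
    return model

def predict(model, x):
    score = sum(alpha * stump_predict(x, thresh, sign)
                for alpha, thresh, sign in model)
    return 1 if score > 0 else -1

# Synthetic mean speeds (m/s): walking (-1) vs. motorised transport (+1).
speeds = [0.8, 1.1, 1.4, 1.6, 7.0, 9.5, 12.0, 15.0]
labels = [-1, -1, -1, -1, 1, 1, 1, 1]
model = train_adaboost(speeds, labels)
```

A production pipeline would instead use a library implementation (e.g., a gradient-boosted or AdaBoost classifier) over many features, but the weighted-error and re-weighting loop above is the mechanism the abstract relies on.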
Procedia PDF Downloads 362
24174 Building Energy Modeling for Networks of Data Centers
Authors: Eric Kumar, Erica Cochran, Zhiang Zhang, Wei Liang, Ronak Mody
Abstract:
The objective of this article is to create a modeling framework that exposes the marginal costs of shifting workloads across geographically distributed data centers. Geographical distribution of internet services helps to optimize their performance for localized end users, with lower communication times and increased availability. However, due to geographical and temporal effects, the physical embodiments of a service's data center infrastructure can vary greatly. In this work, we first identify that the variances in the physical infrastructure primarily stem from local weather conditions, specific user traffic profiles, energy sources, and the types of IT hardware available at the time of deployment. Second, we create a traffic simulator that indicates the IT load at each data center in the set as an approximation of user traffic profiles. Third, we implement a framework that quantifies global energy demands using building energy models and the traffic profiles. The model produces a time series of energy demands that can be used for further life cycle analysis of internet services.
Keywords: data centers, energy, life cycle, network simulation
Procedia PDF Downloads 148
24173 Predicting National Football League (NFL) Match with Score-Based System
Authors: Marcho Setiawan Handok, Samuel S. Lemma, Abdoulaye Fofana, Naseef Mansoor
Abstract:
This paper proposes a method to predict the outcome of National Football League matches using data from 2019 to 2022 and compares it with other popular models. The model uses open-source statistical data for each team, such as passing yards, rushing yards, fumbles lost, and scoring. Each statistic has offensive and defensive components. For instance, a set of anticipated values for a specific matchup is created by comparing the offensive passing yards obtained by one team to the defensive passing yards given up by the opposition. We evaluated the model's performance by contrasting its results with those of established prediction algorithms. This research uses a neural network to predict the score of a National Football League match and, from the score, the winner of the game.
Keywords: game prediction, NFL, football, artificial neural network
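The matchup-feature construction described in the abstract can be sketched as pairing one team's offensive statistic with the opponent's corresponding defensive statistic. Averaging the two is an assumption made here for illustration; the abstract only says the values are compared. The team numbers are hypothetical, not real NFL data.

```python
# Building anticipated per-stat matchup values from season averages.

# Per-game season averages (hypothetical numbers).
teams = {
    "A": {"off_pass_yds": 260.0, "off_rush_yds": 130.0,
          "def_pass_yds": 210.0, "def_rush_yds": 100.0},
    "B": {"off_pass_yds": 230.0, "off_rush_yds": 110.0,
          "def_pass_yds": 250.0, "def_rush_yds": 140.0},
}

def matchup_features(home, away):
    """Anticipated per-stat values for the home team in this matchup."""
    h, a = teams[home], teams[away]
    return {
        # Home offense vs. away defense, one feature per statistic.
        "exp_pass_yds": (h["off_pass_yds"] + a["def_pass_yds"]) / 2,
        "exp_rush_yds": (h["off_rush_yds"] + a["def_rush_yds"]) / 2,
    }

features = matchup_features("A", "B")
```

Vectors of such anticipated values (over all tracked statistics) would then be the input to the score-predicting neural network.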
Procedia PDF Downloads 87
24172 Assimilating Multi-Mission Satellites Data into a Hydrological Model
Authors: Mehdi Khaki, Ehsan Forootan, Joseph Awange, Michael Kuhn
Abstract:
Terrestrial water storage, as a source of freshwater, plays an important role in human lives. Hydrological models offer important tools for simulating and predicting water storage at global and regional scales. However, their comparisons with 'reality' are imperfect, mainly due to a high level of uncertainty in input data, limitations in accounting for all complex water cycle processes, uncertainties in (unknown) empirical model parameters, and the absence of high-resolution (both spatial and temporal) data. Data assimilation can mitigate this drawback by incorporating new sets of observations into models. In this effort, we use multi-mission satellite-derived remotely sensed observations to improve the performance of the World-Wide Water Resources Assessment system (W3RA) hydrological model for estimating terrestrial water storage. For this purpose, we assimilate total water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) and surface soil moisture data from the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) into W3RA. This is done to (i) improve model estimates of water stored in the ground and soil moisture, and (ii) assess the impacts of each satellite data set (from GRACE and AMSR-E), and their combination, on the final terrestrial water storage estimates. These data are assimilated into W3RA using the Ensemble Square-Root Filter (EnSRF) technique over the Mississippi Basin (United States) and the Murray-Darling Basin (Australia) between 2002 and 2013. To evaluate the results, independent ground-based groundwater and soil moisture measurements within each basin are used.
Keywords: data assimilation, GRACE, AMSR-E, hydrological model, EnSRF
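The EnSRF update at the heart of the assimilation can be illustrated with a toy scalar case: one state variable, a direct observation (H = 1), and a small ensemble. The deterministic square-root form (as in Whitaker and Hamill's formulation) updates the mean with the full Kalman gain and the perturbations with a reduced gain, so no perturbed observations are needed. The numbers are synthetic, not W3RA/GRACE values.

```python
# Toy scalar Ensemble Square-Root Filter (EnSRF) update.

def ensrf_update(ensemble, obs, obs_var):
    n = len(ensemble)
    mean_f = sum(ensemble) / n
    # Forecast error variance estimated from the ensemble spread.
    p_f = sum((x - mean_f) ** 2 for x in ensemble) / (n - 1)
    k = p_f / (p_f + obs_var)                 # Kalman gain (H = 1)
    mean_a = mean_f + k * (obs - mean_f)      # mean update
    # Reduced gain for the perturbations; this keeps the analysis
    # variance at exactly (1 - K) * P_f without perturbing the obs.
    k_tilde = 1.0 - (obs_var / (p_f + obs_var)) ** 0.5
    return [mean_a + (1.0 - k_tilde) * (x - mean_f) for x in ensemble]

forecast = [1.0, 2.0, 3.0, 4.0, 5.0]   # ensemble of TWS-like states
analysis = ensrf_update(forecast, obs=4.0, obs_var=1.0)
```

For this forecast ensemble (mean 3, variance 2.5) and observation variance 1, the gain is K = 5/7, so the analysis mean is 3 + 5/7 and the analysis variance is (2/7) x 2.5 = 5/7, which the deterministic perturbation scaling reproduces exactly.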
Procedia PDF Downloads 291
24171 Save Lives: The Application of Geolocation-Awareness Service in Iranian Pre-hospital EMS Information Management System
Authors: Somayeh Abedian, Pirhossein Kolivand, Hamid Reza Lornejad, Amin Karampour, Ebrahim Keshavarz Safari
Abstract:
For emergency and relief service providers, such as pre-hospital emergency services, quick arrival at the scene of an accident or any EMS mission is one of the most important requirements of effective service delivery. Response time (the interval between the time of the call and the time of arrival on scene) is a critical factor in determining the quality of pre-hospital Emergency Medical Services (EMS). This is especially important for heart attack, stroke, or accident patients. Location-based e-services can be broadly defined as any service that provides information pertinent to the current location of an active mobile handset, or the precise address of a landline phone call, within a specific time window, regardless of the underlying delivery technology used to convey the information. According to research, one of the effective methods of meeting this goal is determining the location of the caller via the cooperation of landline and mobile phone operators in the country. Follow-up by the Communications Regulatory Authority (CRA) has resulted in the receipt of two separate secured electronic web services. Thus, to ensure privacy, a secure technical architecture was required for launching the services in the pre-hospital EMS information management system. In addition, to quicken medics' arrival at the patient's bedside, rescue vehicles should make use of an intelligent transportation system to estimate road traffic using a GPS-based mobile navigation system that is independent of the Internet. This paper illustrates the architecture of the practical national model used by the Iranian EMS organization.
Keywords: response time, geographic location inquiry service (GLIS), location-based service (LBS), emergency medical services information system (EMSIS)
Procedia PDF Downloads 172
24170 Development and Power Characterization of an IoT Network for Agricultural Imaging Applications
Authors: Jacob Wahl, Jane Zhang
Abstract:
This paper describes the development and characterization of a prototype IoT network for use with agricultural imaging and monitoring applications. The sensor and gateway nodes are designed using the ESP32 SoC with integrated Bluetooth Low Energy 4.2 and Wi-Fi. A development board, the Arducam IoTai ESP32, is used for prototyping, testing, and power measurements. Google's Firebase is used as the cloud storage site for image data collected by the sensor. The sensor node captures images using the OV2640 2MP camera module and transmits the image data to the gateway via Bluetooth Low Energy. The gateway then uploads the collected images to Firebase via a known nearby Wi-Fi network connection. This image data can then be processed and analyzed by computer vision and machine learning pipelines to assess crop growth or other needs. The sensor node achieves a wireless transmission data throughput of 220 kbps while consuming 150 mA of current; the sensor sleeps at 162 µA. The sensor node device lifetime is estimated to be 682 days on a 6600 mAh LiPo battery while acquiring five images per day, based on the development board power measurements. This network can be utilized by any application that requires high data rates, low power consumption, short-range communication, and large amounts of data to be transmitted at low-frequency intervals.
Keywords: Bluetooth low energy, ESP32, firebase cloud, IoT, smart farming
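The quoted lifetime follows from a duty-cycle calculation that is easy to reproduce. The sleep and active currents come from the abstract; the active time per image (about 28 s for capture plus BLE transfer, assumed here for illustration) is not stated in the abstract, so the result is a ballpark check rather than the paper's exact figure.

```python
# Back-of-the-envelope battery lifetime from a simple duty cycle.

def lifetime_days(battery_mah, sleep_ma, active_ma,
                  images_per_day, active_s_per_image):
    active_h = images_per_day * active_s_per_image / 3600.0
    sleep_h = 24.0 - active_h
    # Daily charge draw in mAh, then days until the battery is empty.
    daily_mah = active_ma * active_h + sleep_ma * sleep_h
    return battery_mah / daily_mah

days = lifetime_days(battery_mah=6600, sleep_ma=0.162, active_ma=150,
                     images_per_day=5, active_s_per_image=28)
```

With these assumptions the estimate lands near 680 days, consistent with the 682-day figure reported above; the sleep current alone accounts for roughly 40% of the daily draw.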
Procedia PDF Downloads 142
24169 Hidden Hot Spots: Identifying and Understanding the Spatial Distribution of Crime
Authors: Lauren C. Porter, Andrew Curtis, Eric Jefferis, Susanne Mitchell
Abstract:
A wealth of research has been generated examining the variation in crime across neighborhoods. However, there is also a striking degree of crime concentration within neighborhoods. A number of studies show that a small percentage of street segments, intersections, or addresses account for a large portion of crime. Not surprisingly, a focus on these crime hot spots can be an effective strategy for reducing community-level crime and related ills, such as health problems. However, this research is also limited in an important respect. Studies tend to use official data to identify hot spots, such as 911 calls or calls for service. While call data may be more representative of the actual level and distribution of crime than some other official measures (e.g., arrest data), call data still suffer from the 'dark figure of crime': there is most certainly a degree of error between crimes that occur and crimes that are reported to the police. In this study, we present an alternative method of identifying crime hot spots that does not rely on official data. In doing so, we highlight the potential utility of neighborhood insiders for identifying and understanding crime dynamics within geographic spaces. Specifically, we use spatial video and geo-narratives to record the crime insights of 36 police officers, ex-offenders, and residents of a high-crime neighborhood in northeast Ohio. Spatial mentions of crime are mapped to identify participant-identified hot spots, and these are juxtaposed with calls-for-service (CFS) data. While there are bound to be differences between these two sources of data, we find that one location in particular, a corner store, emerges as a hot spot for all three groups of participants, yet it does not emerge when we examine the CFS data.
A closer examination of the space around this corner store and a qualitative analysis of the narrative data reveal important clues as to why this store may indeed be a hot spot but not generate disproportionate calls to the police. In short, our results suggest that researchers who rely solely on official data to study crime hot spots may risk missing some of the most dangerous places.
Keywords: crime, narrative, video, neighborhood
Procedia PDF Downloads 241
24168 Advancing Urban Sustainability through Data-Driven Machine Learning Solutions
Authors: Nasim Eslamirad, Mahdi Rasoulinezhad, Francesco De Luca, Sadok Ben Yahia, Kimmo Sakari Lylykangas, Francesco Pilla
Abstract:
With ongoing urbanization, cities face increasing environmental challenges that impact human well-being. To tackle these issues, data-driven approaches in urban analysis have gained prominence, leveraging urban data to promote sustainability. Integrating machine learning techniques enables researchers to analyze and predict complex environmental phenomena, such as Urban Heat Island (UHI) occurrences, in urban areas. This paper demonstrates the implementation of a data-driven approach and interpretable machine learning algorithms, together with interpretability techniques, to conduct comprehensive data analyses for sustainable urban design. The developed framework and algorithms are demonstrated for Tallinn, Estonia, to develop sustainable urban strategies for mitigating urban heat waves. Geospatial data, preprocessed and labeled with UHI levels, are used to train various ML models, with logistic regression emerging as the best-performing model based on evaluation metrics. From this model, a mathematical equation is derived that separates areas with and without UHI effects, providing insights into UHI occurrences based on buildings and urban features. The derived formula highlights the importance of building volume, height, area, and shape length in creating an urban environment with UHI impact. The data-driven approach and derived equation inform mitigation strategies and sustainable urban development in Tallinn and offer valuable guidance for other locations with varying climates.
Keywords: data-driven approach, machine learning transparent models, interpretable machine learning models, urban heat island effect
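The way a fitted logistic-regression equation turns building features into a UHI / no-UHI label can be sketched compactly. The coefficients, intercept, and feature values below are hypothetical placeholders, not the Tallinn model's values; only the sigmoid-plus-threshold mechanism is the point.

```python
import math

# Hypothetical fitted logistic-regression equation over standardised
# building descriptors (placeholder coefficients, for illustration).
COEFFS = {"volume": 0.8, "height": 0.5, "area": 0.3, "shape_length": 0.2}
INTERCEPT = -1.0

def uhi_probability(features):
    """features: dict of standardised building descriptors."""
    z = INTERCEPT + sum(COEFFS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))      # logistic (sigmoid) link

def classify(features, threshold=0.5):
    return "UHI" if uhi_probability(features) >= threshold else "no UHI"

dense_block = {"volume": 2.0, "height": 1.5, "area": 1.0, "shape_length": 1.0}
open_park = {"volume": -1.0, "height": -1.0, "area": -0.5, "shape_length": 0.0}
```

Because the model is linear in the features before the sigmoid, the derived equation is directly interpretable: each coefficient states how much a unit change in that building descriptor shifts the log-odds of UHI occurrence.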
Procedia PDF Downloads 42
24167 Cross-Comparison between Land Surface Temperature from Polar and Geostationary Satellite over Heterogenous Landscape: A Case Study in Hong Kong
Authors: Ibrahim A. Adeniran, Rui F. Zhu, Man S. Wong
Abstract:
Owing to the insufficiency in the spatial representativeness and continuity of in situ temperature measurements from weather stations (WS), the use of WS temperature measurements for large-range diurnal analysis in heterogeneous landscapes has been limited. This has made the accurate estimation of land surface temperature (LST) from remotely sensed data more crucial. Moreover, the study of the dynamic interaction between the atmosphere and the physical surface of the Earth can be enhanced at both annual and diurnal scales by using optimal LST data derived from satellite sensors. The tradeoff between the spatial and temporal resolution of LST from satellite thermal infrared sensors (TIRS) has, however, been a major challenge, especially when high-spatiotemporal-resolution LST data are required. It is well known from the existing literature that polar satellites have the advantage of high spatial resolution, while geostationary satellites have high temporal resolution. Hence, this study aims to design a framework for the cross-comparison of LST data from polar and geostationary satellites in a heterogeneous landscape. This could help in understanding the relationship between the LST estimates from the two satellites and, consequently, their integration in diurnal LST analysis. Landsat-8 data will be used to represent polar satellites due to the availability of its long-term series, while Himawari-8 will be used as the data source for geostationary satellites because of its improved TIRS. The Hong Kong Special Administrative Region (HK SAR) is selected as the study area due to the heterogeneity of its landscape. LST data will be retrieved from both satellites using the split window algorithm (SWA), and the resulting data will be validated by comparing the satellite-derived LST with temperature data from automatic WS in HK SAR.
The LST data from the satellites will then be separated based on land use classification in HK SAR using the Global Land Cover by National Mapping Organization version 3 (GLCNMO 2013) data. The relationship between LST data from Landsat-8 and Himawari-8 will then be investigated by land-use class and over different seasons of the year in order to account for seasonal variation in the relationship. The resulting relationship will be spatially and statistically analyzed and graphically visualized for detailed interpretation. Findings from this study will reveal the relationship between the two satellite data sets by land use classification within the study area and across the seasons of the year. While the information provided by this study will help in the optimal combination of LST data from polar (Landsat-8) and geostationary (Himawari-8) satellites, it will also serve as a roadmap for annual and diurnal urban heat island (UHI) analysis in Hong Kong SAR.
Keywords: automatic weather station, Himawari-8, Landsat-8, land surface temperature, land use classification, split window algorithm, urban heat island
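A generic split-window algorithm has the form LST = T_i + c1(T_i - T_j) + c2(T_i - T_j)^2 + c0: the brightness-temperature difference between two adjacent thermal bands corrects for atmospheric absorption. The coefficients below are placeholders for illustration only; operational values depend on the sensor, surface emissivity, and atmospheric water vapour, and the study's actual coefficients are not given in the abstract.

```python
# Generic split-window LST estimate from two TIR brightness
# temperatures (placeholder coefficients, illustration only).

def split_window_lst(t_i, t_j, c0=1.0, c1=2.0, c2=0.5):
    """t_i, t_j: brightness temperatures (K) of the two TIR bands."""
    dt = t_i - t_j
    return t_i + c1 * dt + c2 * dt ** 2 + c0

# e.g. bands near 11 um and 12 um on a clear day
lst = split_window_lst(t_i=295.0, t_j=293.5)
```

The same functional form can be applied to both Landsat-8 and Himawari-8 band pairs, which is what makes a like-for-like cross-comparison of the two LST products feasible.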
Procedia PDF Downloads 77
24166 Microarray Data Visualization and Preprocessing Using R and Bioconductor
Authors: Ruchi Yadav, Shivani Pandey, Prachi Srivastava
Abstract:
Microarrays provide a rich source of data on the molecular workings of cells. Each microarray reports on the abundance of tens of thousands of mRNAs. Virtually every human disease is being studied using microarrays, with the hope of finding the molecular mechanisms of disease. Bioinformatics analysis plays an important part in processing the information embedded in large-scale expression profiling studies and in laying the foundation for biological interpretation. A basic, yet challenging, task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. One of the most popular platforms for microarray analysis is Bioconductor, an open source and open development software project based on the R programming language. This paper describes specific procedures for conducting quality assessment, visualization, and preprocessing of Affymetrix GeneChip data, details the different Bioconductor packages used to analyze Affymetrix microarray data, and describes the analysis and outcome of each plot.
Keywords: microarray analysis, R language, Affymetrix, visualization, bioconductor
Procedia PDF Downloads 480
24165 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution
Authors: Najrullah Khan, Athar Ali Khan
Abstract:
The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone Generalized Exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS codes are provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modeling of data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among various optimization algorithms, the trust region method is found to be the best. In this paper, the TPGE model is also used to analyze the lifetime data in the Bayesian paradigm. Results are evaluated on the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.
Keywords: Bayesian inference, JAGS, Laplace approximation, LaplacesDemon, posterior, R software, simulation
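To make the simulation side of this workflow concrete, here is a minimal random-walk Metropolis sampler. It is a stand-alone sketch under stated assumptions: the paper works in R/JAGS with the TPGE model and an independent Metropolis variant, whereas this example targets a much simpler posterior (an exponential rate with a flat prior, summarised by its sufficient statistics) purely to show the accept/reject mechanics.

```python
import math
import random

# Random-walk Metropolis for the rate of exponential data with a
# flat prior: log posterior = n*log(lam) - lam*sum(x), lam > 0.

N_OBS, SUM_OBS = 50, 25.0   # sufficient statistics (MLE rate = 2.0)

def log_post(lam):
    if lam <= 0:
        return float("-inf")
    return N_OBS * math.log(lam) - lam * SUM_OBS

def metropolis(n_iter=5000, step=0.3, start=1.0, seed=0):
    rng = random.Random(seed)
    lam, lp = start, log_post(start)
    draws = []
    for _ in range(n_iter):
        prop = lam + rng.gauss(0.0, step)     # symmetric proposal
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            lam, lp = prop, lp_prop
        draws.append(lam)
    return draws[n_iter // 2:]                # drop burn-in half

draws = metropolis()
post_mean = sum(draws) / len(draws)
```

With a flat prior the exact posterior here is Gamma(51, 25), with mean about 2.04, so the sampled mean gives a quick correctness check; the same accept/reject loop, with a different `log_post`, is what a TPGE survival model would run.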
Procedia PDF Downloads 537
24164 Machine Learning Application in Shovel Maintenance
Authors: Amir Taghizadeh Vahed, Adithya Thaduri
Abstract:
Shovels are the main components of the mining transportation system. The productivity of mines depends on the availability of shovels due to their high capital and operating costs. The unplanned failures/shutdowns of a shovel result in higher repair costs, increased downtime, and increased indirect costs (i.e., loss of production and damage to the company's reputation). In order to mitigate these failures, predictive maintenance based on failure prediction can be a useful approach. Modern mining machinery such as shovels collects huge datasets automatically, consisting of reliability and maintenance data. However, the gathered datasets are useless until the information and knowledge in the data are extracted. Machine learning, as well as data mining, which has played a major role in recent studies, has been used for the knowledge discovery process. In this study, data mining and machine learning approaches are implemented to detect not only anomalies but also patterns in a dataset, and further to detect failures.
Keywords: maintenance, machine learning, shovel, condition-based monitoring
Procedia PDF Downloads 223

24163 Standard Languages for Creating a Database to Display Financial Statements on a Web Application
Authors: Vladimir Simovic, Matija Varga, Predrag Oreski
Abstract:
XHTML and XBRL are the standard languages for creating a database for the purpose of displaying financial statements in web applications. Today, XBRL is one of the most popular languages for business reporting. A large number of countries around the world recognize the role of the XBRL language in financial reporting and the benefits the reporting format provides in the collection, analysis, preparation, publication and exchange of data (information), which is the positive side of this language. Here we present all the advantages and opportunities that a company may gain by using the XBRL format for business reporting. This paper also presents XBRL alongside other languages used for creating the database, such as XML and XHTML. The role of the AJAX model and technology in the exchange of financial data between the web client and the web server will be explained in detail. The basic layers of the network used for data exchange via the web will also be described.
Keywords: XHTML, XBRL, XML, JavaScript, AJAX technology, data exchange
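Because XBRL is XML-based, each reported figure is a tagged "fact" bound to a concept, a reporting context, and a unit. The fragment below is a heavily simplified, hypothetical instance (real XBRL taxonomies, namespaces, and context elements are far richer), parsed with Python's standard-library XML parser to show the idea of extracting facts for storage in a database:

```python
import xml.etree.ElementTree as ET

# A hypothetical, heavily simplified XBRL instance fragment.
XBRL = """<xbrl xmlns:ifrs="http://example.com/ifrs">
  <ifrs:Revenue contextRef="FY2023" unitRef="EUR" decimals="0">500000</ifrs:Revenue>
  <ifrs:ProfitLoss contextRef="FY2023" unitRef="EUR" decimals="0">75000</ifrs:ProfitLoss>
</xbrl>"""

def extract_facts(document):
    """Collect reported facts as (concept, context, value) triples,
    ready to be inserted into a reporting database."""
    root = ET.fromstring(document)
    ns = "{http://example.com/ifrs}"
    return [(el.tag.replace(ns, ""), el.get("contextRef"), float(el.text))
            for el in root]

print(extract_facts(XBRL))
# → [('Revenue', 'FY2023', 500000.0), ('ProfitLoss', 'FY2023', 75000.0)]
```

In an AJAX setting, the server would expose such extracted facts (typically re-serialized as JSON or XHTML fragments) to the web client, which updates the statement view without a full page reload.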
Procedia PDF Downloads 397

24162 Analyze and Visualize Eye-Tracking Data
Authors: Aymen Sekhri, Emmanuel Kwabena Frimpong, Bolaji Mubarak Ayeyemi, Aleksi Hirvonen, Matias Hirvonen, Tedros Tesfay Andemichael
Abstract:
Fixation identification, which involves isolating and identifying fixations and saccades in eye-tracking protocols, is an important aspect of eye-movement data processing that can have a big impact on higher-level analyses. However, fixation identification techniques are frequently discussed informally and rarely compared in any meaningful way. In this work, we implement fixation detection and analysis with two state-of-the-art algorithms. The first is the velocity-threshold fixation algorithm, which identifies fixations based on a threshold value. The second approach is U'n'Eye, a deep neural network algorithm for eye-movement detection. The goal of this project is to analyze and visualize eye-tracking data from an eye-gaze dataset that has been provided. The data was collected in a scenario in which individuals were shown photos and asked whether or not they recognized them. The results of the two fixation detection approaches are contrasted and visualized in this paper.
Keywords: human-computer interaction, eye-tracking, CNN, fixations, saccades
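The velocity-threshold algorithm (commonly called I-VT) classifies each point-to-point movement by its speed: below the threshold it belongs to a fixation, above it to a saccade. A minimal sketch with synthetic gaze samples (the threshold value and the sample data are illustrative, not taken from the paper's dataset):

```python
import math

def ivt_fixations(samples, velocity_threshold):
    """Velocity-threshold (I-VT) fixation identification: consecutive
    samples whose point-to-point velocity stays below the threshold are
    grouped into one fixation; faster movements are treated as saccades.

    `samples` is a list of (t, x, y) gaze points; the threshold is in
    distance units per time unit (e.g. pixels per second)."""
    fixations, current = [], []
    for prev, cur in zip(samples, samples[1:]):
        (t0, x0, y0), (t1, x1, y1) = prev, cur
        velocity = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
        if velocity < velocity_threshold:
            if not current:
                current.append(prev)  # start a new fixation group
            current.append(cur)
        elif current:
            fixations.append(current)  # saccade ends the current fixation
            current = []
    if current:
        fixations.append(current)
    return fixations

# Synthetic protocol: a fixation, a saccade, then a second fixation.
gaze = [(0.00, 100, 100), (0.01, 101, 100), (0.02, 100, 101),  # fixation 1
        (0.03, 400, 300),                                      # saccade
        (0.04, 401, 300), (0.05, 400, 301)]                    # fixation 2
print(len(ivt_fixations(gaze, velocity_threshold=1000)))  # → 2
```

Each returned group can then be summarized by its centroid and duration, which is what downstream analyses (heatmaps, dwell times) typically consume.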
Procedia PDF Downloads 139

24161 Privacy Rights of Children in the Social Media Sphere: The Benefits and Challenges Under the EU and US Legislative Framework
Authors: Anna Citterbergova
Abstract:
This study explores the safeguards and guarantees for children's personal data protection under the current EU and US legislative frameworks, namely the GDPR (2018) and COPPA (2000). Considering that children are online for the majority of their free time, one cannot overlook the negative side effects that may be associated with online participation, which may put children's wellbeing and their fundamental rights at risk. Whether the current legislative framework governing the responsibilities of internet service providers (ISPs) provides adequate safeguards and guarantees for children's personal data protection has been an evolving debate both in the US and in the EU. From a children's rights perspective, processors of personal data have certain obligations that must meet international human rights principles (e.g., the CRC, the ECHR), which require taking into account the best interests of the child. Accordingly, the need to protect children's privacy online remains strong and relevant with the expansion of the number and importance of social media platforms in human life. At the same time, the landscape of the internet is rapidly evolving, and commercial interests are taking a more targeted approach to seeking children's data. Therefore, it is essential to constantly evaluate the evolving market policies of ISPs, which may misuse gaps in the current letter of the law. Previous studies in the field have already pointed out that both the GDPR and COPPA may, in theory, be insufficient to protect children's personal data. With a focus on social media platforms, this study uses the doctrinal-descriptive method to identify the mechanisms enshrined in the GDPR and COPPA designed to protect children's personal data.
In its second part, the study includes a data-gathering phase involving the national data protection authorities responsible for monitoring and supervising the GDPR in relation to children's personal data protection, which monitor the enforcement of data protection rules throughout the European Union and contribute to their consistent application. This gathered primary-source data will later be used to outline the benefits of and challenges to children's personal data protection faced by these institutions, together with an analysis that aims to suggest if and/or how to hold ISPs accountable while striking a fair balance between commercial rights and children's right to the protection of their personal data. The preliminary results can be divided into two categories: first, conclusions from the doctrinal-descriptive part of the study; second, specific cases and situations from the practice of national data protection authorities. While concrete conclusions can already be presented for the first part, the second part is currently still in the data-gathering phase. The result of this research is a comprehensive analysis of the safeguards and guarantees for children's personal data protection under the current EU and US legislative frameworks, based on a doctrinal-descriptive approach and original empirical data.
Keywords: personal data of children, personal data protection, GDPR, COPPA, ISPs, social media
Procedia PDF Downloads 98

24160 Modelling the Education Supply Chain with Network Data Envelopment Analysis
Authors: Sourour Ramzi, Claudia Sarrico
Abstract:
Little has been done on network DEA in education, and nobody has attempted to model the whole education supply chain using network DEA. As such, the contribution of the present paper is to propose a model for measuring the efficiency of education supply chains using network DEA. First, we use a general survey of data envelopment analysis (DEA) to establish the emergent themes for research in DEA and focus on the theme of network DEA. Second, we use a survey of two-stage DEA models and network DEA to write a state of the art on network DEA, particularly as applied to supply chain management. Third, we use a survey of DEA applications to establish the most influential papers on DEA applications in education, in order to establish the state of the art on applications of DEA in education in general, and applications of DEA to education using network DEA in particular. Finally, we propose a model for measuring the performance of the education supply chains of different education systems (countries, or states within a country, for instance). We then use this model on some empirical data.
Keywords: supply chain, education, data envelopment analysis, network DEA
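DEA scores each decision-making unit by comparing its output/input performance to the best performers in the sample. In general this requires solving one linear program per unit, but in the special single-input, single-output case the input-oriented CCR efficiency reduces to each unit's ratio divided by the best observed ratio, which can be shown without a solver. A sketch with hypothetical education units (network DEA, as in the paper, would chain several such stages together):

```python
def ccr_efficiency(inputs, outputs):
    """Single-input, single-output CCR efficiency: each unit's
    output/input ratio relative to the best ratio in the sample.
    (The general multi-input/output model needs linear programming.)"""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical education units: input = spending, output = graduates.
spending = [100.0, 80.0, 120.0]
graduates = [50.0, 48.0, 45.0]
print([round(e, 3) for e in ccr_efficiency(spending, graduates)])
# → [0.833, 1.0, 0.625]
```

A unit with efficiency 1.0 lies on the efficient frontier; the others' scores say how much input they could proportionally shed while keeping output, relative to best practice.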
Procedia PDF Downloads 371

24159 Magnetic Navigation of Nanoparticles inside a 3D Carotid Model
Authors: E. G. Karvelas, C. Liosis, A. Theodorakakos, T. E. Karakasidis
Abstract:
Magnetic navigation of a drug inside the human vessels is a very important concept, since the drug is delivered to the desired area. Consequently, the quantity of the drug required to reach therapeutic levels is reduced, while the drug concentration at targeted sites is increased. Magnetic navigation of drug agents can be achieved with the use of magnetic nanoparticles, where anti-tumor agents are loaded on the surface of the nanoparticles. The magnetic field that is required to navigate the particles inside the human arteries is produced by a magnetic resonance imaging (MRI) device. The main factors that influence the efficiency of magnetic nanoparticles for biomedical applications in magnetic driving are the size and the magnetization of the biocompatible nanoparticles. In this study, a computational platform for the simulation of the optimal gradient magnetic fields for the navigation of magnetic nanoparticles inside a carotid artery is presented. For the propulsion model of the particles, seven major forces are considered, i.e., the magnetic force from the MRI's main static magnetic field as well as the magnetic field gradient force from the special propulsion gradient coils. The static field is responsible for the aggregation of nanoparticles, while the magnetic gradient contributes to the navigation of the agglomerates that are formed. Moreover, the contact forces among the aggregated nanoparticles and with the wall, and the Stokes drag force on each particle, are considered, while only spherical particles are used in this study. In addition, the gravitational force and the buoyancy force are included. Finally, the van der Waals force and Brownian motion are taken into account in the simulation. The OpenFOAM platform is used for the calculation of the flow field and the uncoupled equations of the particles' motion.
To find the optimal gradient magnetic fields, a covariance matrix adaptation evolution strategy (CMA-ES) is used to navigate the particles into the desired area. A desired trajectory, along which the particles are to be navigated, is inserted into the computational geometry. Initially, the CMA-ES optimization strategy provides the OpenFOAM program with random values of the gradient magnetic field. At the end of each simulation, the computational platform evaluates the distance between the particles and the desired trajectory. The present model can simulate the motion of particles when they are navigated by the magnetic field produced by the MRI device. Under the influence of fluid flow, the model investigates the effect of different gradient magnetic fields in order to minimize the distance of the particles from the desired trajectory. The platform can navigate the particles along the desired trajectory with an efficiency of 80-90%. On the other hand, a small number of particles stick to the walls and remain there for the rest of the simulation.
Keywords: artery, drug, nanoparticles, navigation
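The optimization loop described above — propose field parameters, run the simulation, score the distance to the desired trajectory, and adapt — can be illustrated with a much-simplified evolution strategy. The sketch below deliberately replaces both CMA-ES (no covariance adaptation, just a decaying step size) and the OpenFOAM flow simulation (a toy quadratic objective) to show the propose-evaluate-adapt structure only:

```python
import random

def distance_to_trajectory(params, desired):
    """Stand-in objective: squared distance between the outcome of a
    gradient-field setting and the desired trajectory. A toy stand-in
    for the OpenFOAM particle simulation."""
    return sum((p - d) ** 2 for p, d in zip(params, desired))

def evolution_strategy(fitness, dim, generations=200, lam=10, sigma=0.5, seed=1):
    """Bare-bones (1+lambda) evolution strategy with a decaying step
    size -- a much-simplified stand-in for CMA-ES."""
    rng = random.Random(seed)
    best = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
    best_f = fitness(best)
    for _ in range(generations):
        for _ in range(lam):
            child = [x + rng.gauss(0.0, sigma) for x in best]
            f = fitness(child)
            if f < best_f:  # greedy selection: keep the better candidate
                best, best_f = child, f
        sigma *= 0.97  # shrink the mutation step each generation
    return best, best_f

# Three hypothetical gradient-field parameters to recover.
desired = [0.3, -0.7, 1.2]
best, best_f = evolution_strategy(lambda p: distance_to_trajectory(p, desired), dim=3)
print([round(x, 2) for x in best])  # ≈ [0.3, -0.7, 1.2]
```

CMA-ES improves on this by learning a full covariance matrix over the search distribution, which matters when the field parameters are correlated; the outer loop, however, is the same.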
Procedia PDF Downloads 108

24158 Online Shopping vs Privacy – Results of an Experimental Study
Authors: Andrzej Poszewiecki
Abstract:
The presented paper contributes to the experimental strand of research on privacy. The question of privacy is currently being discussed at length, primarily among lawyers and politicians. However, the matter of privacy has been of interest to economists for some time as well. How people value their privacy is now of great importance, and this article examines that valuation. An experimental method was utilised in the research: a survey was carried out among customers of an online store, and the studied issue was whether their willingness to accept payment for their data (WTA) differed from their willingness to pay to buy the data back (WTP). The basic aim of this article is to analyse whether people shopping on the Internet value their privacy differently depending on whether they are protecting or selling it. The results indicate major differences in this respect, which do not always match the original expectations. The obtained results support the hypothesis that people are more willing to sell their data than to repurchase them. However, the hypothesis that the value of the proposed remuneration affects the willingness to sell/buy back personal data (one's privacy) has not been supported.
Keywords: privacy, experimental economics, behavioural economics, internet
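A WTA/WTP gap of the kind reported here is typically tested by comparing the two groups' mean valuations. The paper does not state its test, but a distribution-free way to do this is a one-sided permutation test, sketched below with hypothetical valuations (the numbers are illustrative, not the study's data):

```python
import random
import statistics

def permutation_test(wta, wtp, n_perm=5000, seed=0):
    """One-sided permutation test for the WTA/WTP gap: how often does a
    random relabelling of the pooled valuations produce a mean
    difference at least as large as the observed one?"""
    rng = random.Random(seed)
    observed = statistics.fmean(wta) - statistics.fmean(wtp)
    pooled = wta + wtp
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = statistics.fmean(pooled[:len(wta)]) - statistics.fmean(pooled[len(wta):])
        if diff >= observed:
            count += 1
    return observed, count / n_perm

# Hypothetical valuations (EUR): price asked to sell vs offer to buy back.
wta = [50, 60, 55, 70, 65, 58, 62, 54]
wtp = [20, 25, 30, 22, 28, 35, 24, 26]
gap, p_value = permutation_test(wta, wtp)
print(gap)             # → 33.0
print(p_value < 0.05)  # → True
```

The permutation approach makes no normality assumption, which suits the skewed valuation distributions that privacy experiments usually produce.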
Procedia PDF Downloads 295

24157 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights
Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan
Abstract:
The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. By integrating several datasets, this research makes possible the analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic, while cutting down query processing time and creating predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Spreading the datasets across a sharded database and indexing the individual shards makes efficient data retrieval and analysis possible. The key goal is the analysis of connections between governmental activities, poverty levels, and post-pandemic well-being. We want to evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being
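The sharding idea the authors recommend can be illustrated in miniature: hash a shard-key value and route each document to one of N partitions, so all documents (and queries) for a given key land on a single shard. The sketch below mimics the concept behind MongoDB's hashed shard keys in plain Python; it is not MongoDB's actual implementation, and the records are hypothetical:

```python
import hashlib

def shard_for(document, shard_key, n_shards):
    """Hashed sharding in miniature: hash the shard-key value and map it
    to one of `n_shards` partitions. Deterministic, so queries on the
    key can be routed to exactly one shard."""
    digest = hashlib.sha256(str(document[shard_key]).encode()).hexdigest()
    return int(digest, 16) % n_shards

records = [
    {"region": "Ontario", "poverty_rate": 10.9},
    {"region": "Quebec", "poverty_rate": 8.7},
    {"region": "Alberta", "poverty_rate": 7.8},
]
shards = {i: [] for i in range(4)}
for rec in records:
    shards[shard_for(rec, "region", 4)].append(rec)

# Every document with the same region always lands on the same shard.
print(shard_for({"region": "Ontario"}, "region", 4) ==
      shard_for(records[0], "region", 4))  # → True
```

Per-shard indexes then work exactly as on a single node: each shard maintains an index over its own subset, so an indexed query on the shard key touches one shard and one index.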
Procedia PDF Downloads 74

24156 Use Cloud-Based Watson Deep Learning Platform to Train Models Faster and More Accurate
Authors: Susan Diamond
Abstract:
Machine learning workloads have traditionally been run in high-performance computing (HPC) environments, where users log in to dedicated machines and utilize the attached GPUs to run training jobs on huge datasets. Training large neural network models is very resource intensive, and even after exploiting parallelism and accelerators such as GPUs, a single training job can still take days. Consequently, the cost of hardware is a barrier to entry. Even when upfront cost is not a concern, the lead time to set up such an HPC environment takes months, from acquiring hardware to configuring it with the right set of firmware and software. Furthermore, scalability is hard to achieve in a rigid traditional lab environment, which is therefore slow to react to dynamic change in the artificial intelligence industry. Watson Deep Learning as a Service is a cloud-based deep learning platform that mitigates the long lead time and high upfront investment in hardware. It enables robust and scalable sharing of resources among the teams in an organization and is designed for on-demand cloud environments. Providing a similar user experience in a multi-tenant cloud environment comes with its own unique challenges regarding fault tolerance, performance, and security. Watson Deep Learning as a Service tackles these challenges and presents a deep learning stack for cloud environments in a secure, scalable and fault-tolerant manner. It supports a wide range of deep learning frameworks, such as TensorFlow, PyTorch, Caffe, Torch, Theano, and MXNet. These frameworks reduce the effort and skill set required to design, train, and use deep learning models. Deep Learning as a Service is used at IBM by AI researchers in areas including machine translation, computer vision, and healthcare.
Keywords: deep learning, machine learning, cognitive computing, model training
Procedia PDF Downloads 211

24155 A Cross-Sectional Study Assessing Communication Practices among Doctors at a University Hospital in Pakistan
Authors: Muhammad Waqas Baqai, Noman Shahzad, Rehman Alvi
Abstract:
Communication among healthcare givers is the essence of quality patient care, and any compromise results in errors and inefficiency, leading to cumbersome outcomes. The use of smartphones among health professionals has increased tremendously. Almost every health professional carries one, and the majority of them use a third-party communication application called WhatsApp for work-related communication. It gives instant access to the person responsible for any particular query and therefore helps in efficient and timely decision making. It is also an easy way of sharing medical documents and multimedia, and it provides a platform for consensual decision making through group discussions. However, clinical communication through WhatsApp has some demerits too, including a reduction in verbal communication, worsening professional relations, unprofessional behavior, risk of confidentiality breach, and threats from cyber-attacks. On the other hand, the traditional pager device used in many healthcare systems provides unidirectional communication and lacks the ability to convey any information other than the number to which the receiver has to respond. Our study focused on these two widely used modalities of communication among doctors of the largest tertiary care center of Pakistan, i.e., the Aga Khan University Hospital. Our aim was to note which modality is considered better and poses fewer threats to medical data. Approval from the ethical review committee of the institute was taken prior to the conduct of this study. We submitted an online survey form to all the interns and residents working at our institute and collected their responses over a month. 162 submissions were recorded and analyzed using descriptive statistics. Only 20% of respondents were comfortable with using pagers exclusively, 52% with WhatsApp, and 28% with both. 65% think that WhatsApp is time-saving and quicker than the pager.
54% of them considered WhatsApp a nuisance because of work-related notifications in their off-work hours. 60% think that they are more likely to miss information through the pager system because of its unidirectional nature. Almost all (96%) of the residents and interns found WhatsApp useful for saving information for future reference. For urgent issues, the majority (70%) preferred the pager over WhatsApp, and the pager was also considered more valid in terms of hospital policies and legal issues. Among the major advantages of WhatsApp that they listed were easy mass communication, sharing of clinical pictures, universal access, and no need to carry an additional device. However, the major drawback of using WhatsApp for clinical communication that everyone shared was the threat to patients' confidentiality, as clinicians usually share pictures of wounds, clinical documents, etc. Lastly, we asked them whether they think there is a need for a separate instant-communication application dedicated to clinical communication only, and 90% responded positively. We therefore concluded that both modalities have their merits and demerits, but the greatest drawbacks of WhatsApp are the risk of breaches of patients' confidentiality and off-work disturbance. Hence, we recommend a more secure, institute-run application for all intra-hospital communication, where staff can share documents, pictures, etc. easily in a controlled environment.
Keywords: WhatsApp, pager, clinical communication, confidentiality
Procedia PDF Downloads 149

24154 Prediction of Anticancer Potential of Curcumin Nanoparticles by Means of Quasi-Qsar Analysis Using Monte Carlo Method
Authors: Ruchika Goyal, Ashwani Kumar, Sandeep Jain
Abstract:
The experimental data for the anticancer potential of curcumin nanoparticles were compiled from eclectic sources. The optimal descriptors were examined using the Monte Carlo method-based CORAL SEA software. The statistical quality of the model is as follows: n = 14, R² = 0.6809, Q² = 0.5943, s = 0.175, MAE = 0.114, F = 26 (sub-training set); n = 5, R² = 0.9529, Q² = 0.7982, s = 0.086, MAE = 0.068, F = 61, Av R²m = 0.7601, ΔR²m = 0.0840, k = 0.9856 and kk = 1.0146 (test set); and n = 5, R² = 0.6075 (validation set). These data can be used to build predictive QSAR models for anticancer activity.
Keywords: anticancer potential, curcumin, model, nanoparticles, optimal descriptors, QSAR
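The R² and Q² figures quoted above are the standard QSAR quality metrics: R² measures fit on the data the model was trained on, while Q² is the cross-validated analogue, here computed leave-one-out. A stdlib sketch for a single-descriptor linear model (the descriptor and activity values are hypothetical, not the paper's):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def r2_and_q2(xs, ys):
    """R^2 on the full fit and leave-one-out cross-validated Q^2
    (1 - PRESS / total sum of squares)."""
    a, b = fit_line(xs, ys)
    my = sum(ys) / len(ys)
    ss_tot = sum((y - my) ** 2 for y in ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    press = 0.0
    for i in range(len(xs)):  # refit with observation i held out
        ai, bi = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (ai + bi * xs[i])) ** 2
    return 1 - ss_res / ss_tot, 1 - press / ss_tot

# Hypothetical descriptor values vs observed activity (pIC50-like).
descriptor = [1.0, 1.5, 2.1, 2.8, 3.2, 4.0, 4.5]
activity = [2.1, 2.6, 3.0, 3.9, 4.1, 5.2, 5.4]
r2, q2 = r2_and_q2(descriptor, activity)
print(r2 > q2)  # → True: the cross-validated figure is always more conservative
```

For linear least squares the leave-one-out residuals are never smaller in magnitude than the ordinary residuals, which is why Q² ≤ R² in the reported statistics as well.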
Procedia PDF Downloads 322

24153 Static vs. Stream Mining Trajectories Similarity Measures
Authors: Musaab Riyadh, Norwati Mustapha, Dina Riyadh
Abstract:
Trajectory similarity can be defined as the cost of transforming one trajectory into another according to a certain similarity method. It is at the core of numerous mining tasks such as clustering, classification, and indexing. Various approaches have been suggested to measure similarity based on the geometric and dynamic properties of trajectories, the overlap between trajectory segments, and the area confined between entire trajectories. In this article, these approaches are evaluated in terms of computational cost, memory usage, accuracy, and the amount of data that is needed in advance, to determine their suitability for stream mining applications. The evaluation results show that stream mining applications favour similarity methods that have low computational cost and memory usage, require only a single scan of the data, and are free of mathematical complexity, owing to the high-speed generation of the data.
Keywords: global distance measure, local distance measure, semantic trajectory, spatial dimension, stream data mining
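A representative "cost of transforming one trajectory into another" measure is dynamic time warping (DTW), which aligns points non-linearly and so tolerates differing sampling rates; its quadratic time and memory, and its need for the whole trajectory in advance, are exactly the properties the evaluation above weighs against stream settings. A minimal stdlib sketch on toy 2-D trajectories:

```python
import math

def dtw_distance(traj_a, traj_b):
    """Dynamic time warping between two 2-D trajectories: the minimum
    cumulative point-to-point distance over all monotone alignments.
    O(n*m) time and memory, and requires both trajectories up front."""
    n, m = len(traj_a), len(traj_b)
    inf = float("inf")
    dtw = [[inf] * (m + 1) for _ in range(n + 1)]
    dtw[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(traj_a[i - 1], traj_b[j - 1])
            # extend the cheapest of: advance a, advance b, advance both
            dtw[i][j] = cost + min(dtw[i - 1][j], dtw[i][j - 1], dtw[i - 1][j - 1])
    return dtw[n][m]

# The second trajectory follows the same path sampled at a higher rate.
a = [(0, 0), (1, 0), (2, 0)]
b = [(0, 0), (0.5, 0), (1, 0), (1.5, 0), (2, 0)]
print(dtw_distance(a, b))  # → 1.0
```

Stream-oriented alternatives trade this alignment flexibility for single-scan, constant-memory updates, which is the trade-off the article's evaluation makes explicit.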
Procedia PDF Downloads 397