Search results for: missing data estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26530

23770 "Revolutionizing Geographic Data: CADmapper's Automated Precision in CAD Drawing Transformation"

Authors: Toleen Alaqqad, Kadi Alshabramiy, Suad Zaafarany, Basma Musallam

Abstract:

CADmapper is a significant software tool for transforming geographic data into realistic CAD drawings. It speeds up and simplifies the conversion process by automating it, allowing architects, urban planners, engineers, and geographic information system (GIS) experts to concentrate on the creative and technical parts of their projects. While the future incorporation of AI has the potential for further improvements, CADmapper's current capabilities already make it an indispensable asset in the industry. It covers a combination of 2D and 3D city and urban area models. The user selects a specific square section of the map to view, and the fee is based on the dimensions of the selected area. The procedure is straightforward: choose the area you want, pick whether to include topography and 3D architectural data (if available), and then select the design program or CAD format in which to export the document. The service also offers more than 200 free broad town plans in DXF format, and a bespoke area is free up to 1 km².

Keywords: CADmapper, geographic data, 2D and 3D data conversion, automated CAD drawing, urban planning software

Procedia PDF Downloads 69
23769 Estimation of Implicit Colebrook White Equation by Preferable Explicit Approximations in the Practical Turbulent Pipe Flow

Authors: Itissam Abuiziah

Abstract:

In several hydraulic systems, it is necessary to calculate head losses, which depend on the flow resistance friction factor in the Darcy equation. The friction factor is computed from the implicit Colebrook-White equation, which is considered the standard for friction calculation but carries a high computational cost; therefore, several explicit approximation methods are used to solve the implicit equation and overcome this issue. The relative error is then used to determine the most accurate among the approximation methods. Steel, cast iron, and polyethylene pipe materials were investigated, with practical diameters ranging from 0.1 m to 2.5 m and velocities between 0.6 m/s and 3 m/s. In short, the results show that a method suitable for some cases may not be accurate for others. For example, for steel pipe materials, the Zigrang and Sylvester method proved the most precise at low velocities (0.6 m/s to 1.3 m/s), while the Haaland method showed a smaller relative error as velocity gradually increased. Accordingly, hydraulic engineers can use the simulation results of this study to decide which method is most applicable to their practical pipe system.
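The comparison the abstract describes can be sketched in a few lines: the implicit Colebrook-White equation is solved by fixed-point iteration, and an explicit approximation (here the Haaland formula) is checked against it via relative error. The Reynolds number and roughness values below are illustrative, not the paper's test cases.

```python
import math

def colebrook_iterative(re, rel_rough, tol=1e-12):
    """Solve the implicit Colebrook-White equation for the Darcy friction factor
    by fixed-point iteration on x = 1/sqrt(f):
    x = -2 log10(rel_rough/3.7 + 2.51 x / Re)."""
    x = 0.02 ** -0.5  # initial guess corresponding to f = 0.02
    for _ in range(100):
        x_new = -2.0 * math.log10(rel_rough / 3.7 + 2.51 * x / re)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x_new ** -2

def haaland(re, rel_rough):
    """Explicit Haaland approximation to the Colebrook-White equation."""
    return (-1.8 * math.log10((rel_rough / 3.7) ** 1.11 + 6.9 / re)) ** -2

# Illustrative case: turbulent flow in a 0.5 m steel pipe (assumed e = 4.5e-5 m)
re, rel_rough = 5e5, 4.5e-5 / 0.5
f_exact = colebrook_iterative(re, rel_rough)
f_haaland = haaland(re, rel_rough)
rel_error = abs(f_haaland - f_exact) / f_exact  # the accuracy measure of the study
```

The same relative-error check can be repeated for each explicit formula, diameter, and velocity to reproduce the kind of comparison the paper reports.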

Keywords: Colebrook–White, explicit equation, friction factor, hydraulic resistance, implicit equation, Reynolds numbers

Procedia PDF Downloads 189
23768 An IoT-Enabled Crop Recommendation System Utilizing Message Queuing Telemetry Transport (MQTT) for Efficient Data Transmission to AI/ML Models

Authors: Prashansa Singh, Rohit Bajaj, Manjot Kaur

Abstract:

In the modern agricultural landscape, precision farming has emerged as a pivotal strategy for enhancing crop yield and optimizing resource utilization. This paper introduces an innovative Crop Recommendation System (CRS) that leverages Internet of Things (IoT) technology and the Message Queuing Telemetry Transport (MQTT) protocol to collect critical environmental and soil data via sensors deployed across agricultural fields. The system is designed to address the challenges of real-time data acquisition, efficient data transmission, and dynamic crop recommendation through the application of advanced Artificial Intelligence (AI) and Machine Learning (ML) models. The CRS architecture encompasses a network of sensors that continuously monitor environmental parameters such as temperature, humidity, soil moisture, and nutrient levels. This sensor data is then transmitted to a central MQTT server, ensuring reliable and low-latency communication even in the bandwidth-constrained scenarios typical of rural agricultural settings. Upon reaching the server, the data is processed and analyzed by AI/ML models trained to correlate specific environmental conditions with optimal crop choices and cultivation practices. These models consider historical crop performance data, current agricultural research, and real-time field conditions to generate tailored crop recommendations. The implementation achieves 99% recommendation accuracy.
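As a rough illustration of the recommendation step (not the paper's actual AI/ML models), the sketch below decodes a JSON payload as it might arrive on an MQTT topic and picks the crop whose environmental profile is nearest to the reading. The crop profiles, topic name, and field names are invented for the example; a real deployment would subscribe via an MQTT client such as paho-mqtt.

```python
import json

# Hypothetical crop profiles: (temperature degC, humidity %, soil moisture %)
CROP_PROFILES = {
    "rice":  (27.0, 80.0, 60.0),
    "wheat": (18.0, 55.0, 35.0),
    "maize": (24.0, 65.0, 45.0),
}

def recommend(payload_json):
    """Pick the crop whose profile is nearest (squared Euclidean) to the reading."""
    reading = json.loads(payload_json)
    point = (reading["temperature"], reading["humidity"], reading["soil_moisture"])

    def dist(profile):
        return sum((a - b) ** 2 for a, b in zip(point, profile))

    return min(CROP_PROFILES, key=lambda crop: dist(CROP_PROFILES[crop]))

# A payload as it might arrive on a topic such as "field/3/sensors"
payload = json.dumps({"temperature": 26.5, "humidity": 78.0, "soil_moisture": 58.0})
```

Calling `recommend(payload)` on this reading selects the nearest profile; the trained ML models described in the abstract would replace this nearest-profile stand-in.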

Keywords: IoT, MQTT protocol, machine learning, sensor, publish/subscribe, agriculture, humidity

Procedia PDF Downloads 70
23767 A Conversational Chatbot for Cricket Analytics

Authors: Kishan Bharadwaj Shridhar

Abstract:

Cricket is a data-rich sport, generating vast amounts of information, much of which is captured as textual commentary. Leading cricket data providers, such as ESPN Cricinfo, include valuable Decision Review System (DRS) statistics within these commentaries, often as footnotes. Despite the significance of this data, accessing and analyzing it efficiently remains a challenge. This paper presents the development of a sophisticated chatbot designed to answer queries specifically about DRS in cricket. It supports up to seven distinct query types, including individual player statistics, umpire performance, player-versus-umpire dynamics, comparisons between batter and bowler, a player's record at specific venues, and more. Additionally, it enables stateful conversations, allowing a user to seamlessly build upon previous queries for a fluid and interactive experience. Leveraging advanced text-to-SQL methodologies and open-source frameworks such as LangGraph, it ensures low latency and robust performance. A distinct prompt-engineering module enables the system to accurately interpret query intent, dynamically transitioning to an assisted text-to-SQL approach or a rule-based engine as needed. This solution is one of its kind in cricket analytics, offering unparalleled insights into cricket through an intuitive interface. It can be extended to other facets of cricket data and, beyond that, to other sports that generate textual data.
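The rule-based branch the abstract mentions can be illustrated with a toy example: an interpreted query intent is routed to a parameterised SQL template. The schema, rows, and template below are hypothetical, not the system's actual database.

```python
import sqlite3

# Tiny in-memory stand-in for a DRS table; rows are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE drs (player TEXT, outcome TEXT)")
conn.executemany("INSERT INTO drs VALUES (?, ?)",
                 [("Kohli", "overturned"), ("Kohli", "upheld"), ("Smith", "upheld")])

# One hypothetical template per query type; real systems would hold seven.
TEMPLATES = {
    "player_success": """SELECT player,
                                AVG(outcome = 'overturned') AS success_rate
                         FROM drs WHERE player = ? GROUP BY player""",
}

def answer(query_type, entity):
    """Route an interpreted intent to its SQL template (the rule-based engine)."""
    return conn.execute(TEMPLATES[query_type], (entity,)).fetchone()

row = answer("player_success", "Kohli")  # ("Kohli", 0.5)
```

The assisted text-to-SQL branch would instead generate the query with an LLM; the routing decision between the two is what the prompt-engineering module makes.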

Keywords: conversational AI, cricket data analytics, text to SQL, large language models, stateful conversations

Procedia PDF Downloads 6
23766 Integration of Microarray Data into a Genome-Scale Metabolic Model to Study Flux Distribution after Gene Knockout

Authors: Mona Heydari, Ehsan Motamedian, Seyed Abbas Shojaosadati

Abstract:

Prediction of perturbations after genetic manipulation (especially gene knockout) is one of the important challenges in systems biology. In this paper, a new algorithm is introduced that integrates microarray data into the metabolic model. The algorithm was used to study the change in cell phenotype after knockout of the Gss gene in Escherichia coli BW25113. Algorithm implementation indicated that the gene deletion resulted in greater activation of the metabolic network. Growth yield was higher, and fewer regulatory genes were identified for the mutant in comparison with the wild-type strain.

Keywords: metabolic network, gene knockout, flux balance analysis, microarray data, integration

Procedia PDF Downloads 579
23765 Towards Reliable Mobile Cloud Computing

Authors: Khaled Darwish, Islam El Madahh, Hoda Mohamed, Hadia El Hennawy

Abstract:

Cloud computing has been one of the fastest growing areas of the IT industry, mainly in the context of the future of the web, where computing, communication, and storage are the main services provided to Internet users. Mobile Cloud Computing (MCC) is gaining momentum; it extends cloud computing functions, services, and results to the world of future mobile applications and enables delivery of a large variety of cloud applications to billions of smartphones and wearable devices. This paper addresses reliability for MCC by determining the ability of a system or component to function correctly under stated conditions for a specified period of time, so as to deal with the estimation and management of high levels of lifetime engineering uncertainty and risk of failure. The assessment procedure consists of determining the Mean Time Between Failures (MTBF), Mean Time To Failure (MTTF), and availability percentages for the main components in both cloud computing and MCC structures, applied to a single-node OpenStack installation to analyze its performance under different settings governing the behavior of participants. Additionally, we present several factors that have a significant impact on overall cloud system reliability and should be taken into account in order to deliver highly available cloud computing services to mobile consumers.
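The availability figures such an assessment produces follow from MTBF and repair time in the standard way; the sketch below shows the arithmetic, with illustrative (not measured) figures for a single-node deployment.

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(component_avails):
    """A chain of components is up only when every component is up,
    so availabilities multiply."""
    a = 1.0
    for ai in component_avails:
        a *= ai
    return a

# Illustrative figures for two components of a single-node OpenStack
# deployment; the MTBF/MTTR values are assumed, not measured.
compute = availability(mtbf_hours=2000.0, mttr_hours=4.0)   # ~0.998
storage = availability(mtbf_hours=5000.0, mttr_hours=8.0)   # ~0.9984
overall = series_availability([compute, storage])
```

Replacing the assumed figures with MTBF/MTTR values measured on the cluster yields the availability percentages the paper reports for each component.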

Keywords: cloud computing, mobile cloud computing, reliability, availability, OpenStack

Procedia PDF Downloads 399
23764 Extracting Opinions from Big Data of Indonesian Customer Reviews Using Hadoop MapReduce

Authors: Veronica S. Moertini, Vinsensius Kevin, Gede Karya

Abstract:

Customer reviews are collected by many kinds of e-commerce websites selling products, services, hotel rooms, tickets, and so on. Each website collects its own customer reviews. The reviews can be crawled, collected from those websites, and stored as big data. Text analysis techniques can then be used to analyze that data to produce summarized information, such as customer opinions. These opinions can be published by independent service-provider websites and used to help customers choose the most suitable products or services. As the opinions are analyzed from big data of reviews originating from many websites, the results are expected to be more trusted and accurate. Indonesian customers write reviews in the Indonesian language, which comes with its own structures and uniqueness. We found that most of the reviews are expressed in "daily language", which is informal, does not follow correct grammar, and contains many abbreviations, slang, and non-formal words. Hadoop is an emerging platform for storing and analyzing big data in distributed systems. A Hadoop cluster consists of master and slave nodes/computers operated in a network. Hadoop comes with a distributed file system (HDFS) and the MapReduce framework for supporting parallel computation. However, MapReduce is inefficient for iterative computations; specifically, the cost of reading/writing data (I/O cost) is high. Given this fact, we conclude that MapReduce is best suited to "one-pass" computation. In this research, we develop an efficient technique for extracting or mining opinions from big data of Indonesian reviews, based on MapReduce with one-pass computation. In designing the algorithm, we avoid iterative computation and instead adopt a "look-up table" technique. The stages of the proposed technique are: (1) crawling the review data from websites; (2) cleaning the raw reviews and finding root words; (3) computing the frequency of the meaningful opinion words; (4) analyzing customer sentiment towards defined objects. The experiments for evaluating the performance of the technique were conducted on a Hadoop cluster with 14 slave nodes. The results show that the proposed technique (stages 2 to 4) discovers useful opinions, processes big data efficiently, and is scalable.
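A minimal sketch of the one-pass idea, with tiny invented lookup tables standing in for the root-word and sentiment dictionaries that the real pipeline builds in stages 2 and 3:

```python
from collections import Counter
from itertools import chain

# Hypothetical lookup tables: slang/abbreviations map to root words,
# and root words carry a sentiment polarity.
ROOT_WORDS = {"bagus": "bagus", "bgs": "bagus", "jelek": "jelek", "mantap": "mantap"}
SENTIMENT = {"bagus": +1, "mantap": +1, "jelek": -1}

def map_phase(review):
    """Map: emit (root_word, 1) pairs, normalising slang via the lookup table
    instead of any iterative processing."""
    return [(ROOT_WORDS[w], 1) for w in review.lower().split() if w in ROOT_WORDS]

def reduce_phase(pairs):
    """Reduce: sum counts per root word in a single pass over the pairs."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

# Toy Indonesian review snippets (invented for the example)
reviews = ["barang bagus mantap", "pengiriman jelek", "bgs banget"]
counts = reduce_phase(chain.from_iterable(map_phase(r) for r in reviews))
sentiment_score = sum(SENTIMENT[w] * n for w, n in counts.items())
```

Because every lookup is O(1) against a table shipped to the mappers, the whole job needs exactly one MapReduce pass over the review data, which is the point the abstract makes about avoiding MapReduce's iterative I/O cost.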

Keywords: big data analysis, Hadoop MapReduce, analyzing text data, mining Indonesian reviews

Procedia PDF Downloads 201
23763 Global City Typologies: 300 Cities and Over 100 Datasets

Authors: M. Novak, E. Munoz, A. Jana, M. Nelemans

Abstract:

Cities and local governments the world over are interested in employing circular strategies as a means to bring about food security, create employment, and increase resilience. The selection and implementation of circular strategies is facilitated by modeling the effects of strategies locally and by understanding the impacts such strategies have had in other (comparable) cities and how those would translate locally. Urban areas are heterogeneous in their geographic, economic, and social characteristics, governance, and culture. In order to better understand the effect of circular strategies on urban systems, we create a dataset for over 300 cities around the world designed to facilitate circular-strategy scenario modeling. This new dataset integrates data from over 20 prominent global, national, and urban data sources, such as the Global Human Settlements Layer and the International Labour Organisation, and incorporates employment data from over 150 cities collected bottom-up from local departments and data providers. The dataset is built to be reproducible. Various clustering techniques are explored in the paper. The result is sets of clusters of cities, which can be used for further research and analysis and to support comparative, regional, and national policy making on circular cities.

Keywords: data integration, urban innovation, cluster analysis, circular economy, city profiles, scenario modelling

Procedia PDF Downloads 183
23762 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithm. In this context, two algorithms are commonly used: k-means for clustering numeric datasets and k-modes for categorical datasets. A frequently encountered problem in data mining applications is clustering categorical datasets. One way to cluster categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of k-modes. In this paper, we propose and experiment with an approach that transforms the categorical values into numeric ones using the relative frequency of each modality in the attributes. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
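The core transformation can be sketched directly: each categorical value is replaced by its relative frequency within its attribute, producing numeric vectors that a standard k-means implementation can then cluster. The toy rows below are invented for illustration.

```python
from collections import Counter

def relative_frequency_encode(rows):
    """Replace each categorical value with its relative frequency
    within its own attribute (column)."""
    n = len(rows)
    cols = list(zip(*rows))
    freqs = [Counter(col) for col in cols]
    return [[freqs[j][v] / n for j, v in enumerate(row)] for row in rows]

# Toy categorical dataset: two attributes, four objects
rows = [
    ["red",  "small"],
    ["red",  "large"],
    ["blue", "small"],
    ["red",  "small"],
]
encoded = relative_frequency_encode(rows)
# "red" appears 3/4 times -> 0.75; "blue" 1/4 -> 0.25; "small" 3/4 -> 0.75
```

The resulting numeric matrix can be fed to any k-means implementation (e.g. scikit-learn's), whereas the binary method compared against in the paper would instead one-hot encode each modality.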

Keywords: clustering, unsupervised learning, pattern recognition, categorical datasets, knowledge discovery, k-means

Procedia PDF Downloads 262
23761 Wear Measuring and Wear Modelling Based On Archard, ASTM, and Neural Network Models

Authors: A. Shebani, C. Pislaru

Abstract:

Wear of materials is an everyday experience and has been observed and studied for a long time. The prediction of wear is a fundamental problem in industry, mainly related to the planning of maintenance interventions and to economy. The pin-on-disc test is the most common test used to study wear behaviour. In this paper, a pin-on-disc rig (AEROTECH UNIDEX 11) is used to investigate the effects of normal load and material hardness on wear under dry sliding conditions. In the pin-on-disc rig, two specimens were used: a steel pin with a tip, positioned perpendicular to the disc, and an aluminium disc. Pin wear and disc wear were measured using the following instruments: a Talysurf profilometer, a digital microscope, and an Alicona instrument; the Talysurf profilometer was used to measure the pin/disc wear scar depth, and the Alicona was used to measure the volume loss for pin and disc. After that, the Archard model, the American Society for Testing and Materials (ASTM) model, and a neural network model were used for pin/disc wear modelling, and the simulations were implemented in MATLAB. This paper focuses on how the Alicona can be considered a powerful tool for wear measurement and how the neural network is an effective algorithm for wear estimation.
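The Archard model used for the wear modelling relates worn volume to load, sliding distance, and hardness as V = K * F * s / H; a minimal sketch with assumed, illustrative parameter values (the paper's simulations are in MATLAB, but the arithmetic is the same):

```python
def archard_wear_volume(k, load_n, sliding_distance_m, hardness_pa):
    """Archard wear model: V = K * F * s / H.

    k: dimensionless wear coefficient (material pair dependent)
    load_n: normal load F in newtons
    sliding_distance_m: total sliding distance s in metres
    hardness_pa: hardness H of the softer surface in pascals
    Returns worn volume V in cubic metres.
    """
    return k * load_n * sliding_distance_m / hardness_pa

# Illustrative pin-on-disc case; K and H below are assumed, not measured values.
volume = archard_wear_volume(k=1e-4, load_n=10.0,
                             sliding_distance_m=100.0, hardness_pa=1e9)
# 1e-10 m^3, i.e. 0.1 mm^3 of material removed
```

A neural network model, by contrast, would learn the load-hardness-wear mapping directly from measured data rather than from this closed-form relation.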

Keywords: wear modelling, Archard model, ASTM model, neural network model, pin-on-disc test, Talysurf, digital microscope, Alicona

Procedia PDF Downloads 461
23760 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

SEM analysis is a complex multivariate analysis because it involves a number of exogenous and endogenous variables that are interconnected to form a model. The measurement model is divided into two types: the reflective model and the formative model. Before carrying out further tests on SEM, certain assumptions must be met, namely the linearity assumption, to determine the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric, and semiparametric. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data that serve as the basis for generating simulation data. Simulation data were generated with sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the forms of relationship studied were linear and quadratic, with one and two knot points and various levels of error variance (EV = 0.5; 1; 5). Three levels of closeness of relationship were used in the analysis of the measurement model: low (0.1-0.3), medium (0.4-0.6), and high (0.7-0.9). The best model is obtained with a linear relationship between X1 and Y1. In the measurement model, a characteristic of the reflective model is obtained, namely that the higher the closeness of the relationship, the better the resulting model. The originality of this research is the development of semiparametric SEM, which has not been widely studied.
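A truncated power spline of the kind used here augments a polynomial basis with terms (x - k)_+^d at each knot. The sketch below fits a one-knot linear truncated spline by least squares to simulated data with a slope change, loosely mirroring the paper's simulation setup; the data-generating values (knot at 0.5, noise level, sample size) are invented for the example.

```python
import numpy as np

def truncated_spline_basis(x, knots, degree=1):
    """Design matrix [1, x, ..., x^d, (x-k1)_+^d, ...] for a truncated
    power spline of the given degree."""
    cols = [x ** p for p in range(degree + 1)]
    cols += [np.clip(x - k, 0.0, None) ** degree for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# Piecewise-linear truth: slope 2 before the knot at 0.5, slope -1 after
y_true = np.where(x < 0.5, 2 * x, 1.0 - (x - 0.5))
y = y_true + rng.normal(0.0, 0.05, x.size)  # additive error, akin to the EV levels

X = truncated_spline_basis(x, knots=[0.5], degree=1)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # [intercept, slope, slope change]
fitted = X @ coef
```

The fitted knot coefficient recovers the slope change of -3 (from +2 to -1); in the semiparametric SEM, bases like this replace the linear inner relations between latent variables.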

Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model

Procedia PDF Downloads 43
23759 Quality Assurance for the Climate Data Store

Authors: Judith Klostermann, Miguel Segura, Wilma Jans, Dragana Bojovic, Isadora Christel Jimenez, Francisco Doblas-Reyes, Judit Snethlage

Abstract:

The Climate Data Store (CDS), developed by the Copernicus Climate Change Service (C3S) implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF) on behalf of the European Union, is intended to become a key instrument for exploring climate data. The CDS contains both raw and processed data to provide information to users about the past, present, and future climate of the earth. It allows easy and free access to climate data and indicators, presenting an important asset for scientists and stakeholders on the path to achieving a more sustainable future. The C3S Evaluation and Quality Control (EQC) function assesses the quality of the CDS by undertaking a comprehensive user requirement assessment to measure user satisfaction. Recommendations will be developed for the improvement and expansion of the CDS datasets and products. User requirements will be identified regarding the fitness of the datasets, the toolbox, and the overall CDS service. The EQC function will help C3S make the service more robust: integrating validated data that follows high-quality standards while remaining user-friendly. This function will be developed in close collaboration with the users of the service. Through their feedback, suggestions, and contributions, the CDS can become more accessible and meet the requirements of a diverse range of users. Stakeholders and their active engagement are thus an important aspect of CDS development. This will be achieved through direct interactions with users, such as meetings, interviews, or workshops, as well as through feedback mechanisms like surveys or helpdesk services at the CDS. The results provided by the users will be categorized by CDS product so that their specific interests can be monitored and linked to the right product. Through this procedure, we will identify the requirements and criteria for data and products in order to build the corresponding recommendations for the improvement and expansion of the CDS datasets and products.

Keywords: climate data store, Copernicus, quality, user engagement

Procedia PDF Downloads 149
23758 Quantifying the Methods of Monitoring Timers in Electric Water Heater for Grid Balancing on Demand-Side Management: A Systematic Mapping Review

Authors: Yamamah Abdulrazaq, Lahieb A. Abrahim, Samuel E. Davies, Iain Shewring

Abstract:

An electric water heater (EWH) is a high-power electrical appliance used in residential, commercial, and industrial settings, and the ability to control it properly can result in cost savings and the prevention of blackouts on the national grid. This article discusses the use of timers in EWH control strategies for demand-side management (DSM). To the authors' knowledge, no systematic mapping review focusing on the utilisation of EWH control strategies in DSM has yet been conducted. Consequently, the purpose of this research is to identify and examine the main papers exploring EWH procedures in DSM by quantifying and categorising information with regard to publication year and source, kind of method, and source of data for monitoring control techniques. In order to answer the research questions, a total of 31 publications published between 1999 and 2023 were selected based on specific inclusion and exclusion criteria. The data indicate that direct load control (DLC) has been somewhat more prevalent than indirect load control (ILC). Additionally, mixed methods are much less common than the other techniques, and the proportion of real-time data (RTD) to non-real-time data (NRTD) is approximately equal.

Keywords: demand side management, direct load control, electric water heater, indirect load control, non real-time data, real-time data

Procedia PDF Downloads 83
23757 Nafion Multiwalled Carbon Nanotubes Composite Film Modified Glassy Carbon Sensor for the Voltammetric Estimation of Dianabol Steroid in Pharmaceuticals and Biological Fluids

Authors: Nouf M. Al-Ourfi, A. S. Bashammakh, M. S. El-Shahawi

Abstract:

The redox behavior of the dianabol steroid (DS) on a Nafion multiwalled carbon nanotube (MWCNT) composite film modified glassy carbon electrode (GCE) in various buffer solutions was studied using cyclic voltammetry (CV) and differential pulse adsorptive cathodic stripping voltammetry (DP-CSV), and successfully compared with the results at a non-modified bare GCE. The Nafion-MWCNT composite film modified GCE exhibited the better electrochemical response of the two electrodes for the electroreduction of DS, as inferred from the EIS, CV, and DP-CSV data. The modified sensor showed a sensitive, stable, and linear response in the concentration range of 5-100 nM, with a detection limit of 0.08 nM. The selectivity of the proposed sensor was assessed in the presence of high concentrations of major interfering species. The analytical application of the sensor for the quantification of DS in pharmaceutical formulations and biological fluids (urine) was demonstrated, and the results showed acceptable recovery with an RSD of 5%. Statistical treatment of the results of the proposed method revealed no significant differences in accuracy and precision. The relative standard deviations for five measurements of 50 and 300 ng mL−1 of DS were 3.9% and 1.0%, respectively.

Keywords: dianabol steroid, determination, modified GCE, urine

Procedia PDF Downloads 284
23756 Implications of Circular Economy on Users Data Privacy: A Case Study on Android Smartphones Second-Hand Market

Authors: Mariia Khramova, Sergio Martinez, Duc Nguyen

Abstract:

Modern electronic devices, particularly smartphones, are characterised by an extremely high environmental footprint and a short product lifecycle. Every year manufacturers release new models with even more superior performance, which pushes customers towards new purchases. As a result, millions of devices accumulate in the urban mine. To tackle these challenges, the concept of circular economy has been introduced to promote the repair, reuse, and recycling of electronics. In this case, electronic devices that previously ended up in landfills or households get a second life, thereby reducing the demand for new raw materials. Smartphone reuse is gradually gaining wider adoption, partly due to the price increase of flagship models, consequently boosting circular economy implementation. However, along with reuse of the communication device, the circular economy approach needs to ensure that the data of the previous user have not been 'reused' together with the device. This is especially important since modern smartphones are comparable with computers in terms of performance and the amount of data stored. These data vary from pictures, videos, and call logs to social security numbers, passport and credit card details, and from personal information to corporate confidential data. To assess how well data privacy requirements are followed on the second-hand smartphone market, a sample of 100 Android smartphones was purchased from IT Asset Disposition (ITAD) facilities responsible for data erasure and resale. Although devices should not have stored any user data by the time they leave an ITAD, it was possible to retrieve data from 19% of the sample. The applied techniques varied from manual device inspection to sophisticated equipment and tools. These findings indicate a significant barrier to the implementation of the circular economy and a limitation on smartphone reuse. Therefore, in order to motivate users to donate or sell their old devices and make electronics use more sustainable, data privacy on the second-hand smartphone market should be significantly improved. The presented research was carried out in the framework of the sustainablySMART project, which is part of the Horizon 2020 EU Framework Programme for Research and Innovation.

Keywords: android, circular economy, data privacy, second-hand phones

Procedia PDF Downloads 129
23755 Development of Muay Thai Competition Management for Promoting Sport Tourism in the next Decade (2015-2024)

Authors: Supasak Ngaoprasertwong

Abstract:

The purpose of this research was to develop a model of Muay Thai competition management for promoting sport tourism in the next decade, and to make the model appropriate for practical use. The study combined several methodologies, both quantitative and qualitative, to cover all aspects of the data, especially the tourists' satisfaction with Muay Thai competition. Data were collected from 400 tourists watching Muay Thai competitions in 4 stadiums in order to create the model for Muay Thai competition to support sport tourism in the next decade. In addition, Ethnographic Delphi Futures Research (EDFR) was applied to gather data from experts in the boxing industry or with a significant role in Muay Thai competition, in both the public and private sectors. The first step of data collection was an in-depth interview with 27 experts associated with Muay Thai competition, Muay Thai management, and tourism. The second and third steps were conducted to confirm the experts' opinions on various elements. When the 3 steps of data collection were complete, all data were assembled to draft the model, which was then proposed to 8 experts in a brainstorming session for affirmation. The quantitative results show that the tourists were satisfied with competition personnel at a high level (x̄ = 3.87), followed by facilities, services, and safety at a high level (x̄ = 3.67). Furthermore, they were satisfied with operations in the competition field at a high level (x̄ = 3.62). Regarding the qualitative methodology, including the literature review, theories, concepts, and analysis supporting the development of the model for Muay Thai competition to promote sport tourism in the next decade, the findings indicated two data sets: the first related to Muay Thai competition to encourage sport tourism, and the second associated with Muay Thai stadium management to support sport tourism. After the brainstorming, the "EE Muay Thai Model" was finally developed for promoting sport tourism in the next decade (2015-2024).

Keywords: Muay Thai competition management, Muay Thai sport tourism, Muay Thai, Muay Thai for sport tourism management

Procedia PDF Downloads 319
23754 Estimation of Cytokines IL-2, IL-4, IL-8 in Serum and Nasal Secretions of Patients with Various Forms of Chronic Polypoid Rhinosinusitis

Authors: U. N. Vokhidov, U. S. Khasanov, A. A. Ismailova

Abstract:

Background: Cytokines currently play a major role in research on the development of chronic polypoid rhinosinusitis. The aim of this study was to compare levels of IL-2, IL-4, and IL-8 in the peripheral blood and nasal secretions of patients with various forms of chronic polypoid rhinosinusitis. Material and methods: We studied 50 patients with chronic polypoid rhinosinusitis receiving hospital treatment in the ENT department of the 3rd clinic of the Tashkent Medical Academy. A comprehensive study was carried out, including morphological examination and immunological study of blood and nasal secretions for IL-2, IL-4, and IL-8. Results: The immunological studies of peripheral blood showed that in patients with 'eosinophilic' polyps, IL-2 and IL-4 were increased, while in patients with 'neutrophilic' polyps, IL-2 and IL-8 were increased. Immunological investigation of nasal secretions taken from patients with nasal polyposis rhinosinusitis showed the same pattern: in patients with 'eosinophilic' polyps, IL-2 and IL-4 were increased, and in patients with 'neutrophilic' polyps, IL-2 and IL-8 were increased. Conclusion: Patients with 'eosinophilic' polyps showed an immune profile consistent with allergy, while patients with 'neutrophilic' polyps showed a profile consistent with inflammation; the otolaryngologist should take this into account when choosing a treatment strategy to prevent recurrence of the disease.

Keywords: chronic polypoid rhinosinusitis, immunology, cytokines, nasal secretion

Procedia PDF Downloads 222
23753 Interpretation and Clustering Framework for Analyzing ECG Survey Data

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has suffered from heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and this issue needs special attention. A framework is proposed for performing detailed analysis of ECG survey data conducted to measure the prevalence of heart disease in Pakistan. The ECG survey data are evaluated and filtered using automated Minnesota codes, and only those ECGs fulfilling the standardized conditions in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
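One simple way to read the discernibility-matrix idea from rough set theory: for every pair of records belonging to different decision classes, note which attributes discern (differ between) them, and rank attributes by how often they do. This is a sketch of the general technique, not the paper's specific algorithm, and the ECG-style records below are invented for illustration.

```python
from collections import Counter
from itertools import combinations

def discernibility_counts(rows, labels):
    """Count, per attribute index, how often that attribute discerns a pair
    of objects belonging to different decision classes."""
    counts = Counter()
    for (r1, l1), (r2, l2) in combinations(zip(rows, labels), 2):
        if l1 == l2:
            continue  # only pairs from different classes matter
        for j, (a, b) in enumerate(zip(r1, r2)):
            if a != b:
                counts[j] += 1
    return counts

# Toy ECG-style records: [heart_rate_band, qrs_width_band, axis]
rows = [["low",  "narrow", "normal"],
        ["high", "wide",   "normal"],
        ["low",  "narrow", "left"],
        ["high", "wide",   "left"]]
labels = ["healthy", "at_risk", "healthy", "at_risk"]

counts = discernibility_counts(rows, labels)
ranked = [j for j, _ in counts.most_common()]  # most-discerning attributes first
```

Attributes that rarely discern cross-class pairs (here the axis column) are candidates for removal before the clustering stage.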

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 472
23752 Dynamic Analysis of Commodity Price Fluctuation and Fiscal Management in Sub-Saharan Africa

Authors: Abidemi C. Adegboye, Nosakhare Ikponmwosa, Rogers A. Akinsokeji

Abstract:

For many resource-rich developing countries, fiscal policy has become a key tool for short-run fiscal management, since it is considered to play a critical role in injecting part of resource rents into the economy. However, given its instability, reliance on revenue from commodity exports renders fiscal management, budgetary planning and the efficient use of public resources difficult. In this study, the linkage between commodity prices and fiscal operations is investigated for a sample of commodity-exporting countries in sub-Saharan Africa (SSA). The main question is whether commodity price fluctuations affect the effectiveness of fiscal policy as a macroeconomic stabilization tool in these countries. Fiscal management effectiveness is considered as the ability of fiscal policy to react countercyclically to output gaps in the economy. Fiscal policy is measured as the ratio of the fiscal deficit to GDP and the ratio of government spending to GDP, the output gap is measured with a Hodrick-Prescott filter of output growth for each country, and commodity prices are associated with each country based on its main export commodity. Given the dynamic nature of fiscal policy effects on the economy over time, a dynamic framework is devised for the empirical analysis. The panel cointegration and error correction methodology is used to explain the relationships. In particular, the study employs the panel ECM technique to trace short-term effects of commodity prices on fiscal management and uses the fully modified OLS (FMOLS) technique to determine the long-run relationships. These procedures provide sufficient estimation of the dynamic effects of commodity prices on fiscal policy. The data cover the period 1992 to 2016 for 11 SSA countries. The study finds that the elasticity of the fiscal policy measures with respect to the output gap is significant and positive, suggesting that fiscal policy is actually procyclical among the countries in the sample. This implies that fiscal management in these countries follows the trend of economic performance. Moreover, it is found that fiscal policy has not performed well in delivering macroeconomic stabilization for these countries. The difficulty in applying fiscal stabilization measures is attributable to unstable revenue inflows caused by the highly volatile nature of commodity prices on the international market. For commodity-exporting countries in SSA to improve fiscal management, therefore, fiscal planning should be largely decoupled from commodity revenues, domestic revenue bases must be improved, and a longer-term perspective should be taken in fiscal policy management.

Keywords: commodity prices, ECM, fiscal policy, fiscal procyclicality, fully modified OLS, sub-saharan africa

Procedia PDF Downloads 166
23751 LiDAR Based Real Time Multiple Vehicle Detection and Tracking

Authors: Zhongzhen Luo, Saeid Habibi, Martin v. Mohrenschildt

Abstract:

Self-driving vehicles require a high level of situational awareness in order to maneuver safely in real-world conditions. This paper presents a LiDAR-based real-time perception system that is able to process raw sensor data for multiple-target detection and tracking in dynamic environments. The proposed algorithm is nonparametric and deterministic: no assumptions or a priori knowledge about the input data are needed, and no initialization is required. Additionally, the proposed method works directly on the three-dimensional data generated by the LiDAR, without sacrificing the rich information contained in the 3D domain. A fast and efficient real-time clustering algorithm based on a radially bounded nearest neighbor (RBNN) search is applied, and the Hungarian algorithm and adaptive Kalman filtering are used for data association and tracking. The proposed algorithm is able to run in real time with an average run time of 70 ms per frame.
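The RBNN clustering mentioned above can be sketched as a breadth-first merge of points that lie within a fixed radius of each other (a brute-force illustration; a real-time version would use spatial indexing such as a k-d tree rather than full distance scans):

```python
import numpy as np
from collections import deque

def rbnn_cluster(points, radius):
    """Radially bounded nearest neighbor clustering (brute-force sketch).
    Points reachable through chains of within-radius neighbors share a label."""
    points = np.asarray(points, float)
    n = len(points)
    labels = -np.ones(n, dtype=int)
    cluster = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        labels[i] = cluster
        queue = deque([i])
        while queue:
            j = queue.popleft()
            d = np.linalg.norm(points - points[j], axis=1)
            for k in np.where((d <= radius) & (labels == -1))[0]:
                labels[k] = cluster
                queue.append(k)
        cluster += 1
    return labels

# Two synthetic 3D point groups standing in for LiDAR returns from two vehicles
pts = np.array([[0.0, 0, 0], [0.3, 0, 0], [0.6, 0, 0],
                [10.0, 0, 0], [10.3, 0, 0]])
labels = rbnn_cluster(pts, radius=0.5)
```

Each connected group of returns becomes one object hypothesis, which downstream association (Hungarian algorithm) and tracking (Kalman filtering) would then consume.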

Keywords: lidar, segmentation, clustering, tracking

Procedia PDF Downloads 426
23750 Burden of Communicable and Non-Communicable Disease in India: A Regional Analysis

Authors: Ajit Kumar Yadav, Priyanka Yadav, F. Ram

Abstract:

The present study is an effort to analyse the burden of disease in India at the regional level. Disability-Adjusted Life Years (DALYs) are estimated for communicable and non-communicable diseases. Multiple rounds (52nd, 60th and 71st) of the National Sample Survey (NSSO), conducted in 1995-96, 2004 and 2014 respectively, and the Million Deaths Study (MDS) datasets of 2001-03, 2006 and 2013-14 are used. Descriptive and multivariate analyses are carried out to identify the determinants of different types of self-reported morbidity and DALYs. Prevalence was higher for the population aged 60 and above, among females, illiterates, and the rich, across the time period and for all the selected morbidities. The results were significant at P<0.001. The estimation of DALYs revealed that the burden of communicable diseases was higher during infancy, noticeably more among males than females in 2002; however, females aged 1-5 years were more likely to report communicable diseases than the corresponding males. The age distribution of DALYs indicates that individuals aged below 5 years and above 60 years were more susceptible to ill health. The growing incidence of non-communicable diseases, especially among the older generations, puts an additional burden on the health system. The country has to grapple with unresolved preventable infectious diseases on the one hand and growing non-communicable diseases on the other.
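DALYs, as used above, combine years of life lost (YLL) and years lived with disability (YLD). A simplified computation, with purely hypothetical figures and without the discounting or age weighting used in full burden-of-disease studies, looks like this:

```python
def daly(deaths, life_expectancy_at_death, cases, disability_weight, avg_duration):
    """Simplified DALY = YLL + YLD.
    YLL = deaths x standard life expectancy at age of death (years).
    YLD = cases x disability weight x average duration (years)."""
    yll = deaths * life_expectancy_at_death
    yld = cases * disability_weight * avg_duration
    return yll + yld

# Hypothetical figures for one disease in one region, for illustration only
total = daly(deaths=100, life_expectancy_at_death=30,
             cases=2000, disability_weight=0.2, avg_duration=0.5)
```

Summing such figures across diseases and age groups yields the age distribution of burden discussed in the abstract.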

Keywords: disease burden, non-communicable, communicable, India and region

Procedia PDF Downloads 251
23749 On-line Control of the Natural and Anthropogenic Safety in Krasnoyarsk Region

Authors: T. Penkova, A. Korobko, V. Nicheporchuk, L. Nozhenkova, A. Metus

Abstract:

This paper presents an approach to on-line control of the state of technosphere and environment objects based on the integration of Data Warehouse, OLAP and Expert System technologies. It describes the structure and content of a data warehouse that provides consolidation and storage of monitoring data. OLAP models that provide multidimensional analysis of monitoring data and dynamic analysis of the principal parameters of controlled objects are presented. The authors suggest criteria for emergency risk assessment using expert knowledge about danger levels. It is demonstrated how some of the proposed solutions could be adopted in territorial decision-making support systems. Operational control allows authorities to detect threats, prevent natural and anthropogenic emergencies, and ensure the comprehensive safety of a territory.

Keywords: decision making support systems, emergency risk assessment, natural and anthropogenic safety, on-line control, territory

Procedia PDF Downloads 408
23748 Geomagnetic Jerks Observed in Geomagnetic Observatory Data Over Southern Africa Between 2017 and 2023

Authors: Sanele Lionel Khanyile, Emmanuel Nahayo

Abstract:

Geomagnetic jerks are abrupt changes observed in the second derivative of the Earth's main magnetic field that occur on annual to decadal timescales. Understanding these jerks is crucial, as they provide valuable insights into the complex dynamics of the Earth's liquid outer core. In this study, we investigate the occurrence of geomagnetic jerks in data collected at the southern African magnetic observatories Hermanus (HER), Tsumeb (TSU), Hartebeesthoek (HBK) and Keetmanshoop (KMH) between 2017 and 2023. The observatory data were processed and analyzed by retaining night-time data recorded during quiet geomagnetic conditions, identified with the help of the Kp, Dst, and ring current (RC) indices. The results confirm the occurrence of the 2019-2020 geomagnetic jerk in the region and identify the recent 2021 jerk, detected as V-shaped secular variation changes in the X and Z components at all four observatories. The highest estimated secular acceleration amplitudes for the 2021 jerk in the X and Z components were found at HBK: 12.7 nT/year² and 19.1 nT/year², respectively. Notably, the global CHAOS-7 model also identifies this 2021 jerk in the Z component at all magnetic observatories in the region.
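Jerk amplitudes of the kind quoted above are commonly estimated by fitting straight lines to the secular variation before and after the jerk epoch; the change in slope is the secular acceleration jump. A sketch on synthetic V-shaped data (not the observatory series itself):

```python
import numpy as np

def secular_acceleration_jump(t, sv, t_jerk):
    """Fit straight lines to secular variation (nT/yr) before and after a
    candidate jerk epoch; the slope difference approximates the jerk
    amplitude in nT/yr^2."""
    before = t < t_jerk
    slope_before = np.polyfit(t[before], sv[before], 1)[0]
    slope_after = np.polyfit(t[~before], sv[~before], 1)[0]
    return slope_after - slope_before

# Synthetic V-shaped secular variation with a slope change of +20 nT/yr^2 at 2021.0
t = np.linspace(2018.0, 2024.0, 73)                      # monthly epochs
sv = np.where(t < 2021.0, -10.0 * (t - 2021.0), 10.0 * (t - 2021.0))
amp = secular_acceleration_jump(t, sv, 2021.0)
```

On real observatory data, the secular variation series would first be formed from differences of quiet-time monthly means, as described in the abstract.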

Keywords: geomagnetic jerks, secular variation, magnetic observatory data, South Atlantic Anomaly

Procedia PDF Downloads 81
23747 The Determinants of Country Corruption: Unobserved Heterogeneity and Individual Choice - An Empirical Application with Finite Mixture Models

Authors: Alessandra Marcelletti, Giovanni Trovato

Abstract:

Corruption in public office is found to be a reflection of country-specific features; however, the exact magnitude and the statistical significance of the effects of its determinants have not yet been identified. This paper proposes an estimation method to measure the impact of country fundamentals on corruption, showing that covariates can affect the extent of corruption differently across countries. We exploit a model able to take into account the different factors affecting the incentive to ask for, or to be asked for, a bribe, coherently with the use of the Corruption Perception Index. We assume that the discordant results found in the literature may be explained by omitted hidden factors affecting the agents' decision process. Moreover, assuming a homogeneous effect of the covariates may lead to unreliable conclusions, since the country-specific environment is not accounted for. We apply a finite mixture model with concomitant variables to 129 countries from 1995 to 2006, accounting for the impact of initial conditions in the socio-economic structure on corruption patterns. Our findings confirm the hypothesis that the decision process of accepting or asking for a bribe varies with specific country fundamentals.
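The finite-mixture machinery behind this approach can be illustrated in miniature with a two-component Gaussian mixture fitted by EM (univariate, with no concomitant variables — a deliberately stripped-down sketch, not the model estimated in the paper):

```python
import numpy as np

def em_gmm_1d(x, iters=200):
    """EM for a two-component univariate Gaussian mixture.
    Returns mixing weights, component means, and standard deviations."""
    x = np.asarray(x, float)
    mu = np.array([x.min(), x.max()])          # spread-out initialization
    sig = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) \
               / (sig * np.sqrt(2 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted parameter updates
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
        pi = nk / len(x)
    return pi, mu, sig

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(8, 1, 300)])
pi, mu, sig = em_gmm_1d(x)
```

In the paper's full specification, the mixing weights would additionally depend on concomitant (country-level) variables rather than being constants.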

Keywords: corruption, finite mixture models, concomitant variables, countries classification

Procedia PDF Downloads 265
23746 Stability-Indicating Method Development and Validation for Estimation of Antiasthmatic Drugs in Combined Dosage Form by RP-HPLC

Authors: Laxman H. Surwase, Lalit V. Sonawane, Bhagwat N. Poul

Abstract:

A simple stability-indicating high performance liquid chromatographic method has been developed for the simultaneous determination of Levosalbutamol Sulphate and Ipratropium Bromide in bulk and pharmaceutical dosage form using a reverse-phase Zorbax Eclipse Plus C8 column (250 mm × 4.6 mm), with a mobile phase of phosphate buffer (0.05 M KH2PO4): acetonitrile (55:45 v/v), pH 3.5 adjusted with ortho-phosphoric acid; the flow rate was 1.0 mL/min and detection was carried out at 212 nm. The retention times of Levosalbutamol Sulphate and Ipratropium Bromide were 2.2007 and 2.6611 min, respectively. The correlation coefficients of Levosalbutamol Sulphate and Ipratropium Bromide were found to be 0.997 and 0.998. Calibration plots were linear over the concentration range 10-100 µg/mL for both Levosalbutamol Sulphate and Ipratropium Bromide. The LOD and LOQ of Levosalbutamol Sulphate were 2.520 µg/mL and 7.638 µg/mL, while those of Ipratropium Bromide were 1.201 µg/mL and 3.640 µg/mL. The accuracy of the proposed method was determined by recovery studies and found to be 100.15% for Levosalbutamol Sulphate and 100.19% for Ipratropium Bromide. The method was validated for accuracy, linearity, sensitivity, precision, robustness, and system suitability. The proposed method could be utilized for routine analysis of Levosalbutamol Sulphate and Ipratropium Bromide in bulk and in the pharmaceutical capsule dosage form.
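LOD and LOQ figures of the kind reported above are commonly obtained from the calibration line as 3.3σ/S and 10σ/S (σ = residual standard deviation of the regression, S = slope), following the ICH approach. A sketch with hypothetical calibration data (not the paper's measurements):

```python
import numpy as np

def lod_loq(conc, response):
    """ICH-style detection and quantitation limits from a calibration line:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S, with sigma the residual standard
    deviation and S the slope of the fitted line."""
    slope, intercept = np.polyfit(conc, response, 1)
    resid = response - (slope * conc + intercept)
    sigma = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))  # 2 fitted params
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration points over a 10-100 ug/mL range
conc = np.array([10.0, 25.0, 50.0, 75.0, 100.0])
resp = np.array([102.0, 249.0, 503.0, 748.0, 1001.0])
lod, loq = lod_loq(conc, resp)
```

By construction LOQ/LOD = 10/3.3, so the two limits always keep the same ratio regardless of the data.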

Keywords: levosalbutamol sulphate, ipratropium bromide, RP-HPLC, phosphate buffer, acetonitrile

Procedia PDF Downloads 351
23745 Comparison Analysis on the Safety Culture between the Executives and the Operators: Case Study in the Aircraft Manufacturer in Taiwan

Authors: Wen-Chen Hwang, Yu-Hsi Yuan

Abstract:

According to estimates made by safety and hygiene researchers, 80% to 90% of workplace accidents in enterprises can be attributed to human factors. Nevertheless, human factors are not the only cause of accidents; their occurrence is also closely associated with the safety culture of the organization. Therefore, the most effective way of reducing the accident rate is to improve the social and organizational factors that influence an organization's safety performance. The purpose of the present study is to understand the current level of safety culture in manufacturing enterprises. A tool for evaluating safety culture, matched to the needs and characteristics of manufacturing enterprises, was developed by reviewing the safety culture literature and taking the particular backgrounds of the case enterprises into consideration. Expert validity was also applied in developing the questionnaire. Safety culture assessment was then conducted through a practical investigation of the case enterprises. In total, 505 respondents were involved: 53 executives and 452 operators. The comparison of safety culture levels between executives and operators reached significance in 8 dimensions: Safety Commitment, Safety System, Safety Training, Safety Involvement, Reward and Motivation, Communication and Reporting, Leadership and Supervision, and Learning and Changing. Overall, the safety culture level of executives was higher than that of operators (M: 74.98 > 69.08; t = 2.87; p < 0.01).
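The executive-operator comparison above rests on an independent-samples t test. From summary statistics, a pooled-variance t statistic can be computed as follows (the standard deviations used here are hypothetical, since the abstract reports only the group means, group sizes, and t value):

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Independent-samples t statistic with pooled variance."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Means and group sizes from the abstract; SDs are assumed for illustration
t = pooled_t(74.98, 14.0, 53, 69.08, 14.5, 452)
```

With plausible spread values this reproduces a t statistic in the neighborhood of the reported 2.87; a Welch (unequal-variance) variant would be the cautious alternative when group variances differ.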

Keywords: questionnaire survey, safety culture, t-test, media studies

Procedia PDF Downloads 318
23744 Social Work Advocacy Regarding Equitable Hiring of Latinos

Authors: Roberto Lorenzo

Abstract:

Much has been said about the dynamics of the Latin American experience in the United States; however, there seems to be very little data regarding the perception of career identity. Although there are some Latinos within the professional ranks, there are not nearly enough to claim that sufficient cultural competence has been practiced to create equity in the professional sphere in the United States. In this thesis, labor force statistics will be provided highlighting the industries in which Latin Americans are concentrated. Data will also be cited suggesting the further necessity of cultural competence in the professional realm regarding Latin Americans. In addition, methods discussed over the course of our social work education will be examined in order to connect them to possible solutions to this issue.

Keywords: hiring, Latinos, professional equity, cultural competence

Procedia PDF Downloads 26
23743 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator at the SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping, saving an X-ray spectrum for each pixel or for each segment. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 112
23742 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ device is designed around a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research combined knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device, using an LM35 sensor to measure weather parameters, together with an artificial intelligence approach (Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed next to a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions to collect data (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), coefficient of determination (R2), and Mean Percentage Error (MPE) were used as standardized metrics to evaluate the performance of the models in predicting precipitation. The results show that the developed device has an efficiency of 96% and is compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) had an error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
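The evaluation metrics named above (RMSE, MAE, R2, MPE) are straightforward to compute from observed and predicted series; a sketch with hypothetical rainfall values (not the study's data):

```python
import numpy as np

def metrics(obs, pred):
    """RMSE, MAE, coefficient of determination R^2, and mean percentage
    error for comparing predicted against observed values."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = obs - pred
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    mpe = 100.0 * np.mean(err / obs)          # assumes all observations nonzero
    return rmse, mae, r2, mpe

obs = [120.0, 80.0, 60.0, 150.0]     # hypothetical monthly rainfall (mm)
pred = [118.0, 82.0, 59.0, 149.0]
rmse, mae, r2, mpe = metrics(obs, pred)
```

Unlike MAE, MPE lets positive and negative errors cancel, so it indicates bias rather than overall accuracy.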

Keywords: data acquisition system, device design, weather data, precipitation prediction, FUTA standard device

Procedia PDF Downloads 94
23741 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome

Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco

Abstract:

Today, a considerable share of the world’s population lives in urban areas, and this proportion will increase greatly in the next decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will expand at an exponential rate over the following years. The analysis of various types of data sets and their derived applications has incredible potential across crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogue techniques of data collection. This paper investigates the prospects of the data science field, which appears to be a formidable resource for assisting city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of new layers of information would greatly enhance planners' capability to comprehend urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues in more depth. Specifically, the research correlates economic, commercial, demographic, and housing data with the purpose of defining a youth economic discomfort index. The statistical composite index provides insights into the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly show that central urban zones are more disadvantaged than peripheral ones. The experimental set-up selected the city of Rome as the testing ground for the whole investigation. The methodology applies statistical and spatial analysis to construct a composite index supporting informed data-driven decisions for urban planning.
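A composite index of the kind described can be built by z-score standardizing each indicator across zones and taking a weighted sum; a sketch with hypothetical per-zone indicators (not the study's actual variables, data, or weights):

```python
import numpy as np

def composite_index(indicators, weights=None):
    """Z-score-standardize each indicator (rows = zones, cols = indicators)
    and combine them into a weighted composite index, one score per zone."""
    X = np.asarray(indicators, float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    w = np.ones(X.shape[1]) / X.shape[1] if weights is None \
        else np.asarray(weights, float)
    return Z @ w

# Hypothetical per-zone indicators: unemployment rate (%), rent burden (%),
# share of 18-29-year-olds not in employment, education or training (%)
X = [[12.0, 45.0, 20.0],
     [ 8.0, 30.0, 12.0],
     [15.0, 50.0, 25.0]]
idx = composite_index(X)
```

Standardizing first keeps an indicator measured on a large scale from dominating the index; the weights encode how much each dimension of discomfort should count.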

Keywords: data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index

Procedia PDF Downloads 136