Search results for: secure data aggregation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25924

24094 A Conversational Chatbot for Cricket Analytics

Authors: Kishan Bharadwaj Shridhar

Abstract:

Cricket is a data-rich sport, generating vast amounts of information, much of which is captured as textual commentary. Leading cricket data providers, such as ESPN Cricinfo, include valuable Decision Review System (DRS) statistics within these commentaries, often as footnotes. Despite the significance of this data, accessing and analyzing it efficiently remains a challenge. This paper presents the development of a sophisticated chatbot designed to answer queries specifically about DRS in cricket. It supports up to seven distinct query types, including individual player statistics, umpire performance, player vs umpire dynamics, comparisons between batter and bowler, a player’s record at specific venues, and more. Additionally, it enables stateful conversations, allowing a user to seamlessly build upon previous queries for a fluid and interactive experience. Leveraging advanced text-to-SQL methodologies and open-source frameworks such as LangGraph, it ensures low latency and robust performance. A distinct prompt engineering module enables the system to accurately interpret query intent, dynamically routing it to an assisted text-to-SQL approach or a rule-based engine, as needed. This solution is one of a kind in cricket analytics, offering unparalleled insights through an intuitive interface. It can be extended to other facets of cricket data and, beyond that, to other sports that generate textual data.
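
The abstract does not include implementation details, so the following is only a minimal sketch of the intent-routing idea it describes: a query is classified by intent and, where a template applies, answered by a rule-based SQL engine, otherwise handed off to assisted text-to-SQL. The SQLite schema, templates, and keyword classifier are hypothetical stand-ins for the authors' LangGraph pipeline and prompt-engineering module.

```python
import sqlite3

# Hypothetical schema for illustration only -- not the authors' ESPN Cricinfo-derived schema.
SCHEMA = """
CREATE TABLE IF NOT EXISTS drs_reviews (
    player TEXT, umpire TEXT, venue TEXT, role TEXT,   -- 'batter' or 'bowler'
    outcome TEXT                                       -- 'struck_down', 'upheld', 'umpires_call'
);
"""

# Rule-based templates for intents that map cleanly onto SQL.
TEMPLATES = {
    "player_stats": "SELECT outcome, COUNT(*) FROM drs_reviews WHERE player = ? GROUP BY outcome",
    "umpire_stats": "SELECT outcome, COUNT(*) FROM drs_reviews WHERE umpire = ? GROUP BY outcome",
    "player_at_venue": "SELECT outcome, COUNT(*) FROM drs_reviews WHERE player = ? AND venue = ? GROUP BY outcome",
}

def classify_intent(question: str) -> str:
    """Toy keyword classifier; the paper uses a prompt-engineering module instead."""
    q = question.lower()
    if "umpire" in q:
        return "umpire_stats"
    if "venue" in q or " at " in q:
        return "player_at_venue"
    return "player_stats"

def answer(question: str, params: tuple, conn: sqlite3.Connection):
    intent = classify_intent(question)
    if intent in TEMPLATES:                      # rule-based engine
        return conn.execute(TEMPLATES[intent], params).fetchall()
    # Otherwise an assisted text-to-SQL call (e.g. via an LLM) would be made here.
    raise NotImplementedError("fall back to LLM-assisted text-to-SQL")

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    conn.execute("INSERT INTO drs_reviews VALUES ('V. Kohli', 'N. Menon', 'Eden Gardens', 'batter', 'upheld')")
    print(answer("How has V. Kohli fared in DRS reviews?", ("V. Kohli",), conn))
```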

Keywords: conversational AI, cricket data analytics, text to SQL, large language models, stateful conversations.

Procedia PDF Downloads 11
24093 Integration of Microarray Data into a Genome-Scale Metabolic Model to Study Flux Distribution after Gene Knockout

Authors: Mona Heydari, Ehsan Motamedian, Seyed Abbas Shojaosadati

Abstract:

Prediction of perturbations after genetic manipulation (especially gene knockout) is one of the important challenges in systems biology. In this paper, a new algorithm is introduced that integrates microarray data into a genome-scale metabolic model. The algorithm was used to study the change in cell phenotype after knockout of the Gss gene in Escherichia coli BW25113. Algorithm implementation indicated that the gene deletion resulted in greater activation of the metabolic network. Growth yield was higher, and more and less regulated genes were identified for the mutant in comparison with the wild-type strain.
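
For context, flux balance analysis, on which such integration algorithms build, can be posed as a linear program: maximize the biomass objective subject to steady-state mass balance S·v = 0 and flux bounds. Below is a minimal sketch on a toy three-reaction network (not the E. coli BW25113 model used in the paper), where a knockout is approximated by closing the affected reaction's flux bounds.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix: metabolites A, B x reactions R1 (->A), R2 (A->B), R3 (B-> biomass).
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])
c = np.array([0, 0, -1])            # linprog minimizes, so minimize -v_biomass

def fba(knockout=None):
    bounds = [(0, 10), (0, 1000), (0, 1000)]
    if knockout is not None:        # gene deletion approximated by closing its reaction
        bounds[knockout] = (0, 0)
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
    return res.x, -res.fun          # fluxes and biomass yield

print("wild type :", fba())
print("knockout  :", fba(knockout=1))
```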

Keywords: metabolic network, gene knockout, flux balance analysis, microarray data, integration

Procedia PDF Downloads 580
24092 Extracting Opinions from Big Data of Indonesian Customer Reviews Using Hadoop MapReduce

Authors: Veronica S. Moertini, Vinsensius Kevin, Gede Karya

Abstract:

Customer reviews have been collected by many kinds of e-commerce websites selling products, services, hotel rooms, tickets and so on. Each website collects its own customer reviews. The reviews can be crawled, collected from those websites and stored as big data. Text analysis techniques can be used to analyze that data to produce summarized information, such as customer opinions. These opinions can then be published by independent service provider websites and used to help customers choose the most suitable products or services. As the opinions are analyzed from big data of reviews originating from many websites, the results are expected to be more trusted and accurate. Indonesian customers write reviews in the Indonesian language, which comes with its own structures and uniqueness. We found that most of the reviews are expressed in “daily language”, which is informal, does not follow correct grammar, and contains many abbreviations, slang, and non-formal words. Hadoop is an emerging platform aimed at storing and analyzing big data in distributed systems. A Hadoop cluster consists of master and slave nodes/computers operated in a network. Hadoop comes with a distributed file system (HDFS) and the MapReduce framework for supporting parallel computation. However, MapReduce is inefficient for iterative computations; specifically, the cost of reading/writing data (I/O cost) is high. Given this fact, we conclude that MapReduce is best suited to “one-pass” computation. In this research, we develop an efficient technique for extracting or mining opinions from big data of Indonesian reviews, based on MapReduce with one-pass computation. In designing the algorithm, we avoid iterative computation and instead adopt a “look-up table” technique. The stages of the proposed technique are: (1) crawling the review data from websites; (2) cleaning and finding root words from the raw reviews; (3) computing the frequency of the meaningful opinion words; (4) analyzing customers’ sentiments towards defined objects. The experiments for evaluating the performance of the technique were conducted on a Hadoop cluster with 14 slave nodes. The results show that the proposed technique (stages 2 to 4) discovers useful opinions, is capable of processing big data efficiently, and is scalable.
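
A minimal sketch of the one-pass idea in stages 2-3 is shown below: the mapper normalizes tokens through an in-memory look-up table of root words (so no iterative stemming passes are needed) and emits counts, and the reducer sums them, as Hadoop Streaming would do across nodes. The look-up entries and opinion words are hypothetical examples, not the authors' dictionaries.

```python
from itertools import groupby

# Hypothetical look-up table: informal/abbreviated Indonesian tokens -> root words.
ROOTS = {"bgs": "bagus", "bagusss": "bagus", "mantab": "mantap", "gk": "tidak"}
OPINION_WORDS = {"bagus", "mantap", "jelek", "lambat"}

def mapper(review_lines):
    """Emit (root_word, 1) for meaningful opinion tokens in a single pass (no iteration)."""
    for line in review_lines:
        for token in line.lower().split():
            word = token.strip(".,!?")
            root = ROOTS.get(word, word)          # look-up table instead of iterative stemming
            if root in OPINION_WORDS:
                yield root, 1

def reducer(pairs):
    """Sum counts per word; in Hadoop, the shuffle phase sorts and groups keys between map and reduce."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

reviews = ["barangnya bgs dan pengiriman mantab", "pelayanan lambat, produk bagusss"]
print(dict(reducer(mapper(reviews))))   # e.g. {'bagus': 2, 'lambat': 1, 'mantap': 1}
```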

Keywords: big data analysis, Hadoop MapReduce, analyzing text data, mining Indonesian reviews

Procedia PDF Downloads 201
24091 Global City Typologies: 300 Cities and Over 100 Datasets

Authors: M. Novak, E. Munoz, A. Jana, M. Nelemans

Abstract:

Cities and local governments the world over are interested in employing circular strategies as a means to bring about food security, create employment and increase resilience. The selection and implementation of circular strategies is facilitated by modeling the effects of strategies locally and by understanding the impacts such strategies have had in other (comparable) cities and how that would translate locally. Urban areas are heterogeneous in their geographic, economic, and social characteristics, governance, and culture. In order to better understand the effect of circular strategies on urban systems, we create a dataset for over 300 cities around the world designed to facilitate circular strategy scenario modeling. This new dataset integrates data from over 20 prominent global, national, and urban data sources, such as the Global Human Settlements layer and the International Labour Organisation, as well as incorporating employment data from over 150 cities collected bottom-up from local departments and data providers. The dataset is made to be reproducible. Various clustering techniques are explored in the paper. The result is a set of clusters of cities, which can be used for further research and analysis and to support comparative, regional, and national policy making on circular cities.
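
The paper explores several clustering techniques; as one illustrative possibility (not necessarily the authors' choice), the sketch below applies k-means to standardized city indicators. The indicator values are made up.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Illustrative city indicators (values are made up): population (millions),
# employment share in manufacturing (%), GDP per capita (kUSD).
cities = ["Nairobi", "Rotterdam", "Lima", "Pune"]
X = np.array([[4.40, 12.0,  2.1],
              [0.65,  9.0, 55.0],
              [9.70, 14.0,  7.0],
              [3.10, 25.0,  2.5]])

X_std = StandardScaler().fit_transform(X)   # indicators are on very different scales
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_std)
for city, label in zip(cities, labels):
    print(city, "-> cluster", label)
```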

Keywords: data integration, urban innovation, cluster analysis, circular economy, city profiles, scenario modelling

Procedia PDF Downloads 184
24090 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithm: k-means for clustering numeric datasets and k-modes for categorical datasets. A major problem encountered in data mining applications is the clustering of categorical datasets, which are highly prevalent in practice. One way to achieve clustering on categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of the k-modes. In this paper, we experiment with an approach that transforms the categorical values into numeric ones using the relative frequency of each modality in the attribute. The proposed approach is compared with a previous method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
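
A minimal sketch of the proposed transformation is shown below: each categorical value is replaced by the relative frequency of its modality within the attribute, and standard k-means is then applied to the resulting numeric matrix. The toy data are illustrative.

```python
import pandas as pd
from sklearn.cluster import KMeans

# Illustrative categorical dataset.
df = pd.DataFrame({"colour": ["red", "red", "blue", "green", "blue", "red"],
                   "size":   ["S", "M", "M", "L", "S", "M"]})

# Replace each category by the relative frequency of its modality within the attribute.
encoded = df.copy()
for col in df.columns:
    freq = df[col].value_counts(normalize=True)   # modality -> relative frequency
    encoded[col] = df[col].map(freq)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(encoded.values)
print(encoded.assign(cluster=labels))
```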

Keywords: clustering, unsupervised learning, pattern recognition, categorical datasets, knowledge discovery, k-means

Procedia PDF Downloads 264
24089 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

SEM analysis is a complex multivariate analysis because it involves a number of exogenous and endogenous variables that are interconnected to form a model. The measurement model is divided into two types, namely the reflective model and the formative model. Before carrying out further tests on SEM, there are assumptions that must be met, namely the linearity assumption, to determine the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric, and semiparametric. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data serving as the basis for generating simulation data. Simulation data were generated with sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the forms of relationship studied were linear and quadratic, with one and two knot points and various levels of error variance (EV = 0.5; 1; 5). Three levels of closeness of relationship were used in the measurement model: low (0.1-0.3), medium (0.4-0.6), and high (0.7-0.9). The best model was obtained with a linear relationship between X1 and Y1. In the measurement model, a characteristic of the reflective model is obtained, namely that the higher the closeness of the relationship, the better the model obtained. The originality of this research is the development of semiparametric SEM, which has not been widely studied by researchers.
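
For reference, the truncated-spline form of the relationship studied here can be written, for a linear spline with one knot k and a quadratic spline with two knots k₁ < k₂, as:

```latex
f(x) = \beta_0 + \beta_1 x + \beta_2 (x - k)_+ ,
\qquad (x - k)_+ =
\begin{cases}
x - k, & x \ge k \\
0,     & x < k
\end{cases}

f(x) = \beta_0 + \beta_1 x + \beta_2 x^2
     + \beta_3 (x - k_1)_+^2 + \beta_4 (x - k_2)_+^2
```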

Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model

Procedia PDF Downloads 44
24088 Quality Assurance for the Climate Data Store

Authors: Judith Klostermann, Miguel Segura, Wilma Jans, Dragana Bojovic, Isadora Christel Jimenez, Francisco Doblas-Reyes, Judit Snethlage

Abstract:

The Climate Data Store (CDS), developed by the Copernicus Climate Change Service (C3S) implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF) on behalf of the European Union, is intended to become a key instrument for exploring climate data. The CDS contains both raw and processed data to provide users with information about the past, present and future climate of the Earth. It allows easy and free access to climate data and indicators, presenting an important asset for scientists and stakeholders on the path toward achieving a more sustainable future. The C3S Evaluation and Quality Control (EQC) function is assessing the quality of the CDS by undertaking a comprehensive user requirement assessment to measure users’ satisfaction. Recommendations will be developed for the improvement and expansion of the CDS datasets and products. User requirements will be identified regarding the fitness of the datasets, the toolbox, and the overall CDS service. The EQC function of the CDS will help C3S to make the service more robust: integrating validated data that follows high-quality standards while remaining user-friendly. This function will be developed in close collaboration with the users of the service. Through their feedback, suggestions, and contributions, the CDS can become more accessible and meet the requirements of a diverse range of users. Stakeholders and their active engagement are thus an important aspect of CDS development. This will be achieved through direct interactions with users, such as meetings, interviews or workshops, as well as different feedback mechanisms like surveys or helpdesk services at the CDS. The results provided by the users will be categorized as a function of CDS products so that their specific interests can be monitored and linked to the right product. Through this procedure, we will identify the requirements and criteria for data and products in order to build the corresponding recommendations for the improvement and expansion of the CDS datasets and products.

Keywords: climate data store, Copernicus, quality, user engagement

Procedia PDF Downloads 149
24087 Quantifying the Methods of Monitoring Timers in Electric Water Heater for Grid Balancing on Demand-Side Management: A Systematic Mapping Review

Authors: Yamamah Abdulrazaq, Lahieb A. Abrahim, Samuel E. Davies, Iain Shewring

Abstract:

An electric water heater (EWH) is a power-intensive appliance used in residential, commercial, and industrial settings, and the ability to control it properly will result in cost savings and the prevention of blackouts on the national grid. This article discusses the usage of timers in EWH control strategies for demand-side management (DSM). To the authors' knowledge, no systematic mapping review focusing on the utilisation of EWH control strategies in DSM has yet been conducted. Consequently, the purpose of this research is to identify and examine the main papers exploring EWH procedures in DSM by quantifying and categorising information with regard to publication year and source, kind of method, and source of data for monitoring control techniques. In order to answer the research questions, a total of 31 publications published between 1999 and 2023 were selected according to specific inclusion and exclusion criteria. The data indicate that direct load control (DLC) has been somewhat more prevalent than indirect load control (ILC). Additionally, the mixed approach is used much less than the other techniques, and the proportion of real-time data (RTD) to non-real-time data (NRTD) is about equal.

Keywords: demand side management, direct load control, electric water heater, indirect load control, non real-time data, real-time data

Procedia PDF Downloads 84
24086 Implications of Circular Economy on Users Data Privacy: A Case Study on Android Smartphones Second-Hand Market

Authors: Mariia Khramova, Sergio Martinez, Duc Nguyen

Abstract:

Modern electronic devices, particularly smartphones, are characterised by an extremely high environmental footprint and a short product lifecycle. Every year manufacturers release new models with ever superior performance, which pushes customers towards new purchases. As a result, millions of devices are accumulating in the urban mine. To tackle these challenges, the concept of circular economy has been introduced to promote the repair, reuse and recycling of electronics. In this case, electronic devices that previously ended up in landfills or households get a second life, thereby reducing the demand for new raw materials. Smartphone reuse is gradually gaining wider adoption, partly due to the price increase of flagship models, consequently boosting circular economy implementation. However, along with reuse of a communication device, the circular economy approach needs to ensure the data of the previous user have not been 'reused' together with the device. This is especially important since modern smartphones are comparable with computers in terms of performance and the amount of data stored. These data range from pictures, videos and call logs to social security numbers, passport and credit card details, and from personal information to corporate confidential data. To assess how well data privacy requirements are followed on the second-hand smartphone market, a sample of 100 Android smartphones was purchased from IT Asset Disposition (ITAD) facilities responsible for data erasure and resale. Although the devices should not have stored any user data by the time they left the ITAD, it was possible to retrieve data from 19% of the sample. The applied techniques varied from manual device inspection to sophisticated equipment and tools. These findings indicate a significant barrier to the implementation of circular economy and a limitation of smartphone reuse. Therefore, in order to motivate users to donate or sell their old devices and make electronics use more sustainable, data privacy on the second-hand smartphone market should be significantly improved. The presented research has been carried out in the framework of the sustainablySMART project, which is part of the Horizon 2020 EU Framework Programme for Research and Innovation.

Keywords: android, circular economy, data privacy, second-hand phones

Procedia PDF Downloads 130
24085 Development of Muay Thai Competition Management for Promoting Sport Tourism in the next Decade (2015-2024)

Authors: Supasak Ngaoprasertwong

Abstract:

The purpose of this research was to develop a model for Muay Thai competition management to promote sport tourism in the next decade, and to prepare the model for practical use. The study combined several methodologies, both quantitative and qualitative, to cover all aspects of the data, especially tourists’ satisfaction with Muay Thai competition. Data were collected from 400 tourists watching Muay Thai competitions in 4 stadiums to create the model for Muay Thai competition to support sport tourism in the next decade. In addition, Ethnographic Delphi Futures Research (EDFR) was applied to gather data from experts in the boxing industry or with a significant role in Muay Thai competition in both the public and private sectors. The first step of data collection was in-depth interviews with 27 experts associated with Muay Thai competition, Muay Thai management, and tourism. The second and third steps of data collection were conducted to confirm the experts’ opinions toward various elements. When the 3 steps of data collection were completed, all data were assembled to draft the model. The model was then proposed to 8 experts in a brainstorming session to affirm it. The quantitative results showed that the tourists were satisfied with competition personnel at a high level (x = 3.87), followed by facilities, services, and safety at a high level (x = 3.67). Furthermore, they were satisfied with operations in the competition field at a high level (x = 3.62). Regarding the qualitative methodology, which included a literature review, theories, concepts, and analysis of the qualitative research for developing the model for Muay Thai competition to promote sport tourism in the next decade, the findings indicated that there were 2 data sets: the first related to Muay Thai competition to encourage sport tourism and the second associated with Muay Thai stadium management to support sport tourism. After the brainstorming, the “EE Muay Thai Model” was finally developed for promoting sport tourism in the next decade (2015-2024).

Keywords: Muay Thai competition management, Muay Thai sport tourism, Muay Thai, Muay Thai for sport tourism management

Procedia PDF Downloads 319
24084 Application of Nanoparticles on Surface of Commercial Carbon-Based Adsorbent for Removal of Contaminants from Water

Authors: Ahmad Kayvani Fard, Gordon Mckay, Muataz Hussien

Abstract:

Adsorption/sorption is believed to be one of the optimal processes for the removal of heavy metals from water due to its low operational and capital cost as well as its high removal efficiency. Different materials have been reported in the literature as adsorbents for heavy metal removal from wastewater, such as natural sorbents, synthetic organic polymers, and inorganic mineral materials. The selection of adsorbents and the development of new functional materials that can achieve good removal of heavy metals from water is an important practice and depends on many factors, such as the availability, cost, and safety of the material. In this study, we report the synthesis of activated carbon (AC) and carbon nanotubes (CNT) doped with different loadings of metal oxide nanoparticles such as Fe2O3, Fe3O4, Al2O3, TiO2, SiO2 and Ag nanoparticles, and their application in the removal of heavy metals, hydrocarbons, and organics from wastewater. Commercial AC and CNT with different loadings of the mentioned nanoparticles were prepared; the effects of pH, adsorbent dosage, sorption kinetics, and concentration were studied, and the optimum conditions for removal of heavy metals from water are reported. The prepared composite sorbents are characterized using field emission scanning electron microscopy (FE-SEM), high-resolution transmission electron microscopy (HR-TEM), thermogravimetric analysis (TGA), X-ray diffraction (XRD), the Brunauer-Emmett-Teller (BET) nitrogen adsorption technique, and zeta potential measurements. The composite materials showed higher removal efficiency and superior adsorption capacity compared to commercially available carbon-based adsorbents. The specific surface area of the AC increased by 50%, reaching up to 2000 m2/g, while that of the CNT increased by more than 8 times, reaching 890 m2/g. The increased surface area is, along with the surface charge of the material, one of the key parameters determining the removal efficiency. Moreover, the surface charge density of the impregnated CNT and AC was enhanced significantly, which benefits the adsorption process. The nanoparticles also enhance the catalytic activity of the material and reduce its agglomeration and aggregation, which provides more active sites for adsorbing contaminants from water. Some of the results for treating wastewater include 100% removal of BTEX, arsenic, strontium, barium, phenolic compounds, and oil from water. The results obtained are promising for the use of AC and CNT loaded with metal oxide nanoparticles in the treatment and pretreatment of wastewater and produced water before the desalination process. Adsorption can be very efficient with low energy consumption and economic feasibility.

Keywords: carbon nanotube, activated carbon, adsorption, heavy metal, water treatment

Procedia PDF Downloads 236
24083 Interpretation and Clustering Framework for Analyzing ECG Survey Data

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing detailed analysis of ECG survey data collected to measure heart disease prevalence statistics in Pakistan. The ECG survey data are evaluated and filtered using automated Minnesota codes, and only those ECGs that fulfil the standardized conditions mentioned in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed study and many other purposes.
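
The pipeline combines spectral clustering with fuzzy c-means; the sketch below shows only a plain spectral-clustering step on synthetic stand-in features (scikit-learn's implementation), not the authors' combined algorithm or their Minnesota-code-filtered ECG features.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Synthetic stand-in for selected ECG survey features (e.g. interval and amplitude measures);
# the actual features in the paper come from Minnesota-code-filtered ECGs.
rng = np.random.default_rng(0)
features = np.vstack([rng.normal(loc=0.0, scale=0.3, size=(50, 4)),
                      rng.normal(loc=2.0, scale=0.3, size=(50, 4))])

model = SpectralClustering(n_clusters=2, affinity="rbf", random_state=0)
labels = model.fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```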

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 472
24082 LiDAR Based Real Time Multiple Vehicle Detection and Tracking

Authors: Zhongzhen Luo, Saeid Habibi, Martin v. Mohrenschildt

Abstract:

Self-driving vehicles require a high level of situational awareness in order to maneuver safely when driving in real-world conditions. This paper presents a LiDAR-based real-time perception system that is able to process raw sensor data for multiple-target detection and tracking in dynamic environments. The proposed algorithm is nonparametric and deterministic: no assumptions or a priori knowledge about the input data are needed, and no initializations are required. Additionally, the proposed method works directly on the three-dimensional data generated by the LiDAR without sacrificing the rich information contained in the 3D domain. Moreover, a fast and efficient real-time clustering algorithm is applied based on a radially bounded nearest neighbor (RBNN) search. The Hungarian algorithm and adaptive Kalman filtering are used for data association and tracking. The proposed algorithm is able to run in real time with an average run time of 70 ms per frame.
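
A minimal sketch of the data-association step is given below: detections are matched to Kalman-predicted track positions by solving the assignment problem with the Hungarian algorithm, with a simple distance gate. The coordinates and gate value are illustrative; the RBNN clustering and Kalman update steps are omitted.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

# Predicted track positions (from the Kalman filter) and current-frame detections (x, y in metres).
tracks = np.array([[10.0, 2.0], [25.0, -1.5]])
detections = np.array([[24.4, -1.2], [10.3, 2.1], [40.0, 5.0]])

cost = cdist(tracks, detections)          # pairwise Euclidean distances
row, col = linear_sum_assignment(cost)    # Hungarian algorithm
for t, d in zip(row, col):
    if cost[t, d] < 2.0:                  # gate: reject implausible matches
        print(f"track {t} <- detection {d} (dist {cost[t, d]:.2f} m)")
```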

Keywords: lidar, segmentation, clustering, tracking

Procedia PDF Downloads 426
24081 On-line Control of the Natural and Anthropogenic Safety in Krasnoyarsk Region

Authors: T. Penkova, A. Korobko, V. Nicheporchuk, L. Nozhenkova, A. Metus

Abstract:

This paper presents an approach to on-line control of the state of technosphere and environmental objects based on the integration of Data Warehouse, OLAP and Expert System technologies. It looks at the structure and content of the data warehouse that provides consolidation and storage of monitoring data. OLAP models are described that provide multidimensional analysis of monitoring data and dynamic analysis of the principal parameters of controlled objects. The authors suggest criteria for emergency risk assessment using expert knowledge about danger levels. It is demonstrated how some of the proposed solutions could be adopted in territorial decision-making support systems. Operational control allows authorities to detect threats, prevent natural and anthropogenic emergencies, and ensure the comprehensive safety of the territory.

Keywords: decision making support systems, emergency risk assessment, natural and anthropogenic safety, on-line control, territory

Procedia PDF Downloads 408
24080 Surface Defect-Engineered CeO₂₋ₓ by Ultrasound Treatment for Superior Photocatalytic H₂ Production and Water Treatment

Authors: Nabil Al-Zaqri

Abstract:

Semiconductor photocatalysts with surface defects display incredible light absorption bandwidth, and these defects function as highly active sites for oxidation processes by interacting with the surface band structure. Accordingly, engineering the photocatalyst with surface oxygen vacancies will enhance the semiconductor nanostructure's photocatalytic efficiency. Herein, a CeO₂₋ₓ nanostructure is designed under the influence of low-frequency ultrasonic waves to create surface oxygen vacancies. This approach enhances the photocatalytic efficiency compared to many heterostructures while keeping the intrinsic crystal structure intact. Ultrasonic waves induce the acoustic cavitation effect, leading to the dissemination of active elements on the surface, which results in vacancy formation in conjunction with a larger surface area and smaller particle size. The structural analysis of CeO₂₋ₓ revealed higher crystallinity as well as morphological optimization, and the presence of oxygen vacancies is verified through Raman, X-ray photoelectron spectroscopy, temperature-programmed reduction, photoluminescence, and electron spin resonance analyses. Oxygen vacancies accelerate the redox cycle between Ce⁴⁺ and Ce³⁺ by prolonging photogenerated charge recombination. The ultrasound-treated pristine CeO₂ sample achieved excellent hydrogen production, showing a quantum efficiency of 1.125%, and efficient organic degradation. Our promising findings demonstrated that ultrasonic treatment causes the formation of surface oxygen vacancies and improves photocatalytic hydrogen evolution and pollution degradation. Conclusion: Defect engineering of the ceria nanoparticles with oxygen vacancies was achieved for the first time using low-frequency ultrasound treatment. The U-CeO₂₋ₓ sample showed high crystallinity, and morphological changes were observed. Due to the acoustic cavitation effect, a larger surface area and smaller particle size were observed. The ultrasound treatment causes particle aggregation and surface defects leading to oxygen vacancy formation. The XPS, Raman spectroscopy, PL spectroscopy, and ESR results confirm the presence of oxygen vacancies. The ultrasound-treated sample was also examined for pollutant degradation, where ¹O₂ was found to be the major active species. Hence, the ultrasound treatment yields efficient photocatalysts for superior hydrogen evolution and excellent photocatalytic degradation of contaminants. The prepared nanostructure showed excellent stability and recyclability. This work could pave the way for a unique post-synthesis strategy intended for efficient photocatalytic nanostructures.

Keywords: surface defect, CeO₂₋ₓ, photocatalytic, water treatment, H₂ production

Procedia PDF Downloads 143
24079 Geomagnetic Jerks Observed in Geomagnetic Observatory Data Over Southern Africa Between 2017 and 2023

Authors: Sanele Lionel Khanyile, Emmanuel Nahayo

Abstract:

Geomagnetic jerks are jumps observed in the second derivative of the main magnetic field that occur on annual to decadal timescales. Understanding these jerks is crucial as they provide valuable insights into the complex dynamics of the Earth’s outer liquid core. In this study, we investigate the occurrence of geomagnetic jerks in geomagnetic observatory data collected at the southern African magnetic observatories Hermanus (HER), Tsumeb (TSU), Hartebeesthoek (HBK) and Keetmanshoop (KMH) between 2017 and 2023. The observatory data were processed and analyzed by retaining quiet night-time data recorded during quiet geomagnetic activity with the help of the Kp, Dst, and ring current (RC) indices. The results confirm the occurrence of the 2019-2020 geomagnetic jerk in the region and identify the recent 2021 jerk, detected as V-shaped secular variation changes in the X and Z components at all four observatories. The highest estimated 2021 jerk secular acceleration amplitudes in the X and Z components were found at HBK, at 12.7 nT/year² and 19.1 nT/year², respectively. Notably, the global CHAOS-7 model aptly identifies this 2021 jerk in the Z component at all magnetic observatories in the region.

Keywords: geomagnetic jerks, secular variation, magnetic observatory data, South Atlantic Anomaly

Procedia PDF Downloads 82
24078 Social Work Advocacy Regarding Equitable Hiring Of Latinos

Authors: Roberto Lorenzo

Abstract:

Much has been said about the dynamics of the Latin American experience in the United States; however, there seems to be very little data regarding the perception of career identity. Although we do have some Latinos within the professional ranks, there are not nearly enough to claim that we have practiced enough cultural competence to create equity in the professional sphere in the United States. In this thesis, data will be provided on labor force statistics highlighting the industries in which Latin Americans are concentrated. Data will also be cited that suggest a further need for cultural competence within the professional realm regarding Latin Americans. In addition, methods discussed over the course of our social work education will be examined in order to connect them to possible solutions to this issue.

Keywords: hiring, Latinos, professional equity, cultural competence

Procedia PDF Downloads 27
24077 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 113
24076 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research deployed knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, together with an artificial intelligence approach (Artificial Neural Network - ANN) and a statistical approach (Autoregressive Integrated Moving Average - ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) for performance evaluation. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions for data collection (temperature, relative humidity, and pressure). The acquired data were used in the MATLAB R2012b environment to train the ANN and fit the ARIMA model for predicting precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R2), and Mean Percentage Error (MPE) were used as standardized evaluation metrics for the models' precipitation predictions. The results from the developed device show that it has an efficiency of 96% and is compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN's precipitation (rainfall) prediction for two months (May and June 2017) had a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
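
A minimal sketch of the statistical branch of the prediction step is shown below: an ARIMA model is fitted to a rainfall series with statsmodels and scored with RMSE and MAE. The series is synthetic, standing in for the FUTA observations, and the (2, 0, 1) order is illustrative.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Synthetic daily rainfall series standing in for the logged sensor data.
rng = np.random.default_rng(1)
rain = np.clip(rng.gamma(shape=2.0, scale=3.0, size=180) - 2.0, 0.0, None)

train, test = rain[:150], rain[150:]
model = ARIMA(train, order=(2, 0, 1)).fit()
forecast = model.forecast(steps=len(test))

rmse = np.sqrt(mean_squared_error(test, forecast))
mae = mean_absolute_error(test, forecast)
print(f"RMSE={rmse:.2f} mm, MAE={mae:.2f} mm")
```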

Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device

Procedia PDF Downloads 95
24075 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome

Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco

Abstract:

Today, a considerable segment of the world’s population lives in urban areas, and this proportion will increase vastly in the coming decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the amount of digital data produced daily will expand at an exponential rate in the coming years. The analysis of various types of data sets and their derived applications has incredible potential across crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analog techniques of data collection. This paper investigates the potential of the data science field, which appears to be a formidable resource for assisting city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of new layers of information would enhance planners' capability to comprehend in greater depth urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues. Specifically, the research correlates economic, commercial, demographic, and housing data with the purpose of defining the youth economic discomfort index. The statistical composite index provides insights into the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly show that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground for the whole investigation. The methodology applies statistical and spatial analysis to construct a composite index supporting informed, data-driven decisions for urban planning.
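
A minimal sketch of how such a composite index can be assembled is given below: zone-level indicators are standardized as z-scores and combined with weights into a single score. The indicator names, values, and weights are illustrative, not those used for Rome.

```python
import pandas as pd

# Illustrative zone-level indicators (higher = more discomfort after sign alignment).
zones = pd.DataFrame({
    "youth_unemployment_rate": [0.18, 0.31, 0.24],
    "median_rent_to_income":   [0.42, 0.55, 0.38],
    "neet_share":              [0.12, 0.22, 0.17],
}, index=["Zone A", "Zone B", "Zone C"])

weights = {"youth_unemployment_rate": 0.4, "median_rent_to_income": 0.3, "neet_share": 0.3}

z = (zones - zones.mean()) / zones.std(ddof=0)          # z-score standardization per indicator
index = sum(w * z[col] for col, w in weights.items())   # weighted composite score per zone
print(index.sort_values(ascending=False))               # most disadvantaged zones first
```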

Keywords: data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index

Procedia PDF Downloads 137
24074 AI-Based Technologies in International Arbitration: An Exploratory Study on the Practicability of Applying AI Tools in International Arbitration

Authors: Annabelle Onyefulu-Kingston

Abstract:

One of the major purposes of AI today is to evaluate and analyze millions of items of micro and macro data in order to determine what is relevant in a particular case and present it in an adequate manner. Microdata, as far as it relates to AI in international arbitration, comprises the millions of key issues specifically mentioned by one or both parties or by their counsels, arbitrators, or arbitral tribunals in arbitral proceedings. These can include the qualifications of expert witnesses and the admissibility of evidence, amongst others. Macro data, on the other hand, refers to data derived from the resolution of the dispute and, consequently, the final and binding award. Notable examples include the rationale of the award and the specific and general damages awarded, amongst others. This paper aims to critically evaluate and analyze the possibility of technological inclusion in international arbitration. This research employs the qualitative method, evaluating existing literature on the consequences of applying AI to both micro and macro data in international arbitration and how this can assist parties, counsels, and arbitrators.

Keywords: AI-based technologies, algorithms, arbitrators, international arbitration

Procedia PDF Downloads 101
24073 A Virtual Grid Based Energy Efficient Data Gathering Scheme for Heterogeneous Sensor Networks

Authors: Siddhartha Chauhan, Nitin Kumar Kotania

Abstract:

Traditional Wireless Sensor Networks (WSNs) generally use static sinks to collect data from the sensor nodes via multiple forwarding. The network therefore suffers from problems such as long message relay times and bottlenecks, which reduce its performance. Many approaches have been proposed to prevent this problem with the help of a mobile sink that collects the data from the sensor nodes, but these approaches still suffer from the buffer overflow problem due to the limited memory of sensor nodes. This paper proposes an energy-efficient scheme for data gathering which overcomes the buffer overflow problem. The proposed scheme creates a virtual grid structure of heterogeneous nodes and is designed for sensor nodes with variable sensing rates. Every node determines its buffer overflow time, and on this basis cluster heads are elected. A controlled traversal approach is used by the proposed scheme in order to transmit data to the sink. The effectiveness of the proposed scheme is verified by simulation.
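
A minimal sketch of the election idea is shown below: each node computes its buffer overflow time from its free buffer and (variable) sensing rate, and within each virtual grid cell the node that would overflow soonest is elected cluster head. The selection criterion and parameter values are illustrative assumptions, not the paper's exact rule.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Node:
    node_id: int
    cell: tuple          # virtual-grid cell the node belongs to
    buffer_free: float   # bytes of buffer still free
    sensing_rate: float  # bytes generated per second (heterogeneous across nodes)

    def overflow_time(self) -> float:
        return self.buffer_free / self.sensing_rate

nodes = [Node(1, (0, 0), 4000, 5.0), Node(2, (0, 0), 1000, 8.0),
         Node(3, (1, 0), 6000, 2.0), Node(4, (1, 0), 3000, 4.0)]

cells = defaultdict(list)
for n in nodes:
    cells[n.cell].append(n)

# Elect one cluster head per cell: the node that will overflow soonest (assumed criterion).
heads = {cell: min(members, key=Node.overflow_time) for cell, members in cells.items()}
for cell, head in sorted(heads.items()):
    print(f"cell {cell}: head = node {head.node_id} (overflow in {head.overflow_time():.0f} s)")
```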

Keywords: buffer overflow problem, mobile sink, virtual grid, wireless sensor networks

Procedia PDF Downloads 394
24072 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Today the world of research enjoys abundant data, available in virtually any field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supporting an in-depth look at any decision-making process. When and if well used, big data affords its users the opportunity to produce substantially well-supported results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents and their casualties and fatalities, based on multiple factors including age, gender, accident location, etc. Multiple technologies were used in combination to produce an Information Communication Technology (ICT) based solution with embedded technology. Those technologies principally include Geographic Information Systems (GIS), the Orange data mining software, and Bayesian statistics, to name a few. The study uses the Leeds 2016 accident data to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors optimistically believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing population densities, which considerably increase motor-based mobility.

Keywords: accident factors, geographic information system, information communication technology, mobility

Procedia PDF Downloads 209
24071 Analysis of ECGs Survey Data by Applying Clustering Algorithm

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing detailed analysis of ECG survey data collected to measure the prevalence of heart disease statistics in Pakistan. The ECG survey data are evaluated and filtered using automated Minnesota codes, and only those ECGs that fulfil the standardized conditions mentioned in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed study and many other purposes.

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 352
24070 The Impact of Motivation on Employee Performance in South Korea

Authors: Atabong Awung Lekeazem

Abstract:

The purpose of this paper is to identify the impact or role of incentives on employees’ performance, with a particular emphasis on Korean workers. The process involves defining and explaining the different types of motivation; in defining them, we also bring out the difference between the two major types of motivation. The second phase of the paper involves gathering data/information from a sample population and then analyzing the data. In the analysis, we see the broadly similar mentality or value which Koreans attach to motivation, with a slightly different view coming only from top management personnel. The last phase presents the data and comes to a conclusion that offers possible knowledge on how managers and potential managers can bring out the best in their employees.

Keywords: motivation, employee’s performance, Korean workers, business information systems

Procedia PDF Downloads 417
24069 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various data mining applications. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (i.e., the majority class) heavily exceeds the number of observations of the other class (i.e., the minority class). An overlapped dataset is one in which many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is a challenging issue because it degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts (non-overlapping, lightly overlapping, and severely overlapping) and applying the classification algorithm in each part. These three parts were determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
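
The paper determines the three regions from the Hausdorff distance and the margin of a modified SVM; the toy sketch below conveys only the three-region idea by partitioning points according to their distance from an ordinary RBF-SVM decision boundary, with illustrative thresholds.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Imbalanced, overlapping synthetic data (9:1 class ratio).
X, y = make_classification(n_samples=600, n_features=2, n_informative=2, n_redundant=0,
                           weights=[0.9, 0.1], class_sep=0.6, flip_y=0.02, random_state=0)

svm = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
margin = np.abs(svm.decision_function(X))   # distance-like score from the decision boundary

# Partition the space by margin thresholds (values chosen for illustration only).
severe = margin < 0.3                        # severe overlap near the boundary
light = (margin >= 0.3) & (margin < 1.0)     # light overlap
clean = margin >= 1.0                        # essentially non-overlapping region
print("severe/light/clean counts:", severe.sum(), light.sum(), clean.sum())
```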

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 308
24068 Comparison of Water Curing and Carbonation Curing on Mortar Mix Incorporating Cement Kiln Dust

Authors: Devender Sharma, Shweta Goyal

Abstract:

Sustainable development is key to protecting the environment for a secure future. Accelerated carbonation curing is a comparatively new technique for curing concrete, which involves sequestering carbon dioxide gas in the precast concrete, resulting in improvement of the properties of the concrete. This paper presents the results of a study to evaluate the effect of carbonation curing on cement mortars incorporating cement kiln dust (CKD) as a partial replacement of cement. The mortar specimens were prepared by replacing cement with CKD in varying percentages of 0-50% by weight of cement. The specimens were subjected to 12-hour carbonation curing, followed by sealed packing until the testing age. The results were compared with the normal curing procedure, in which the specimens were water cured until the testing age. The compressive strength and microstructure of the mix were studied. It was noted that on increasing the percentage of CKD up to 10% by weight of cement, no considerable change was observed in the compressive strength. But as the percentage of CKD was further increased, there was a decrease in compressive strength, with strength decreasing by up to 40% when 50% of the cement was replaced with CKD. The decrease in strength is due to the lower lime content of CKD as compared to cement. High ettringite formation was observed in mixes with high percentages of CKD, consistent with the decrease in compressive strength. With carbonation curing, an early-age strength gain was observed in the mortars, even with higher percentages of CKD. The early strength of the carbonation-cured mixes was found to be greater than that of the water-cured mixes irrespective of the percentage of CKD. The 7-day and 28-day compressive strengths of the mix were comparable for both the carbonation-cured and water-cured specimens. The increase in compressive strength can be attributed to the conversion of unstable Ca(OH)2 into stable CaCO3, which causes densification of the mix. CaCO3 precipitation and greater CSH gel formation were clearly observed in the SEM images of the carbonation-cured specimens, indicating higher compressive strength. Thus, carbonation curing can be used as an efficient method to enhance the properties of concrete.

Keywords: carbonation, cement kiln dust, compressive strength, microstructure

Procedia PDF Downloads 231
24067 Mapping of Geological Structures Using Aerial Photography

Authors: Ankit Sharma, Mudit Sachan, Anurag Prakash

Abstract:

Rapid growth in data acquisition technologies such as drones has led to advances in, and interest in, collecting high-resolution images of geological fields. While such platforms are advantageous for capturing a high volume of data in short flights, a number of challenges have to be overcome for efficient analysis of these data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images are not used because of limitations in resolution, image age (an available image may be a year old), availability, the difficulty of capturing the exact scene of interest, night-time imaging, and so on. The method combines advanced automated image interpretation technology with human data interaction to model structures. First, geological structures are detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified from a digital elevation model. Dip and dip direction can be calculated from this information. The structural map is generated by a specified methodology starting from choosing the appropriate camera, camera mounting system, and UAV design (based on the area and application), addressing challenges in airborne systems such as errors in image orientation, payload limits, mosaicking, georeferencing, and registration of different images, through to applying the DEM. The paper shows the potential of our method for accurate and efficient modeling of geological structures, particularly those captured at remote, inaccessible, and hazardous sites.
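
As a worked example of the dip computation mentioned above: a plane z = ax + by + c can be least-squares fitted to DEM points on an exposed surface, giving dip = arctan(√(a² + b²)) and dip direction as the azimuth of steepest descent. The coordinates below are illustrative.

```python
import numpy as np

# Illustrative DEM samples on an exposed bedding plane: columns are easting, northing, elevation (m).
pts = np.array([[ 0.0,  0.0, 100.0],
                [10.0,  0.0,  97.0],
                [ 0.0, 10.0,  95.0],
                [10.0, 10.0,  92.0]])

# Least-squares fit of z = a*x + b*y + c.
A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
(a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)

dip = np.degrees(np.arctan(np.hypot(a, b)))              # dip angle from horizontal
dip_direction = np.degrees(np.arctan2(-a, -b)) % 360     # azimuth of steepest descent, from north
print(f"dip = {dip:.1f} deg, dip direction = {dip_direction:.1f} deg")
```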

Keywords: digital elevation model, mapping, photogrammetric data analysis, geological structures

Procedia PDF Downloads 688
24066 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields. These artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management. Furthermore, it provides historical disaster information using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from occurrence through progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc. is mainly managed in the form of structured and unstructured reports. These exist as handouts or hard copies of reports. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data as disaster information. In this paper, the Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images, or reports, printed or generated by scanners, into electronic documents. Following that, the converted disaster data are organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information based on Optical Character Recognition for unstructured data is an important element of smart disaster management. In this paper, the character recognition rate for Korean characters was improved to over 90% by using upgraded OCR. In character recognition, the recognition rate depends on the fonts, size, and special symbols of the characters; we improved it through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system. The disaster code system ensures that the structured information is stored and retrieved across the entire disaster cycle, covering historical disaster progress, damages, response, and recovery. The expected outcome of this research is that it can be applied to smart disaster management and decision making by combining artificial intelligence technologies and historical big data.
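
A minimal sketch of the conversion step is shown below, assuming the pytesseract wrapper around the Tesseract engine with the Korean language pack installed; the keyword-to-code mapping is a hypothetical stand-in for the paper's disaster code system.

```python
from PIL import Image
import pytesseract

def digitize_report(image_path: str) -> dict:
    """OCR a scanned disaster report (Korean + English) and attach disaster codes."""
    text = pytesseract.image_to_string(Image.open(image_path), lang="kor+eng")
    # Hypothetical keyword-to-code mapping standing in for the paper's disaster code system.
    codes = {"호우": "NAT-FLD-01", "지진": "NAT-EQK-01", "화재": "SOC-FIR-01"}
    matched = [code for keyword, code in codes.items() if keyword in text]
    return {"source": image_path, "text": text, "disaster_codes": matched}

if __name__ == "__main__":
    record = digitize_report("scanned_report.png")   # file name is illustrative
    print(record["disaster_codes"])
```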

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 131
24065 Ecological Engineering Through Organic Amendments: Enhancing Pest Regulation, Beneficial Insect Populations, and Rhizosphere Microbial Diversity in Cabbage Ecosystems

Authors: Ravi Prakash Maurya, Munaswamyreddygari Sreedhar

Abstract:

The present studies on ecological engineering through soil amendments in cabbage crops for insect pest regulation were conducted at G. B. Pant University of Agriculture and Technology, Pantnagar, Udham Singh Nagar, Uttarakhand, India. Ten treatments, viz., Farm Yard Manure (FYM), Neem cake (NC), Vermicompost (VC), Poultry manure (PM), PM+FYM, NC+VC, NC+PM, VC+FYM, Urea+SSP+MOP (Standard Check), and Untreated Check, were evaluated to study the effect of these amendments on the populations of insect pests and natural enemies and on the microbial community of the rhizosphere in the cabbage crop ecosystem. The results revealed that most of the cabbage pests, viz., aphids, head borer, gram pod borer, and armyworm, were more prevalent in FYM-treated plots, followed by PM- and NC-treated plots. The best cost-benefit ratio, 1:3.62, was found in the PM+FYM treatment, while the lowest, 1:0.97, was found in the VC plot. The populations of natural enemies such as spiders, coccinellids, syrphids, and other hymenopterans and dipterans were also found to be prominent in the organic plots, namely FYM, followed by the VC and PM plots. Diversity studies on the organic manure-treated plots were also carried out, which revealed a total of nine insect orders (Hymenoptera, Hemiptera, Lepidoptera, Coleoptera, Neuroptera, Diptera, Orthoptera, Dermaptera, Thysanoptera) and one arthropod class (Arachnida) across the different treatments. The Simpson Diversity Index was also studied and found to be highest in the FYM plots. The metagenomic analysis of the rhizosphere microbial community revealed that the highest bacterial count was found in the NC+PM plot compared to the standard check and untreated check. The diverse microbial population contributes to soil aggregation and stability; healthier soil structures can improve water retention, aeration, and root penetration, which are all crucial for crop health. Further analysis also identified a total of 39 bacterial phyla, among which the most abundant were Actinobacteria, Firmicutes, and the SAR324 clade. Actinobacteria and Firmicutes are known for their roles in decomposing organic matter and mineralizing nutrients; their high abundance suggests improved nutrient cycling and availability, which can directly enhance plant growth. Hence, organic amendments in cabbage farming can transform the rhizosphere microbiome, reduce pest pressure, and foster populations of beneficial insects, leading to healthier crops and a more sustainable agricultural ecosystem.
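
For reference, the Simpson Diversity Index reported for the plots is computed as D = 1 − Σ nᵢ(nᵢ − 1) / (N(N − 1)); a short sketch with made-up per-order insect counts is shown below.

```python
def simpson_diversity(counts):
    """Simpson's Diversity Index: 1 - sum(n_i * (n_i - 1)) / (N * (N - 1))."""
    N = sum(counts)
    return 1 - sum(n * (n - 1) for n in counts) / (N * (N - 1))

# Made-up counts of individuals per insect order sampled in one FYM-treated plot.
fym_plot = [34, 21, 18, 9, 5, 3]
print(f"Simpson Diversity Index: {simpson_diversity(fym_plot):.3f}")
```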

Keywords: cabbage ecosystem, organic amendments, rhizosphere microbiome, pest and natural enemy diversity

Procedia PDF Downloads 18