Search results for: link data
22800 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator
Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard
Abstract:
Blade Tip Timing (BTT) is a technology concerned with the estimation of both the frequency and amplitude of rotating blades. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms since they generate blade tip displacement data from simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic, experimentally validated simulator based on a Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the necessary data for the assessment of different BTT algorithms. The FE modelling is validated using both a hammer test and two FireWire cameras for the mode shapes. A number of autoregressive methods, fitting methods, and state-of-the-art inverse methods (i.e., the Russhard method) are compared. All methods are compared with respect to both synchronous and asynchronous excitations with both single and simultaneous frequencies. The study assesses the applicability of each method for different conditions of vibration, amounts of sampling data, and testing facilities, according to its performance and efficiency under these conditions.
Keywords: blade tip timing, blisk, finite element, vibration measurement
Procedia PDF Downloads 314
22799 Blood Glucose Measurement and Analysis: Methodology
Authors: I. M. Abd Rahim, H. Abdul Rahim, R. Ghazali
Abstract:
Numerous non-invasive blood glucose measurement techniques have been developed by researchers, and near-infrared (NIR) spectroscopy is currently the most promising. However, there is some disagreement on the optimal wavelength range to use as the reference for the glucose substance in the blood. This paper focuses on the experimental data collection technique and also the analysis method used to analyze the data gained from the experiment. The selection of a suitable linear or non-linear model structure is essential in a prediction system, as the system developed needs to be acceptably accurate.
Keywords: linear, near-infrared (NIR), non-invasive, non-linear, prediction system
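The choice between a linear and a non-linear model structure can be made concrete with a minimal sketch: fit both to NIR-derived features and compare held-out error. The data below are synthetic and the feature dimensions hypothetical; the abstract does not specify the model families used.

```python
# A minimal sketch (not the authors' code): comparing a linear and a
# non-linear model structure for NIR-based glucose prediction.
# Synthetic data; real work would use measured absorbance spectra.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))   # hypothetical NIR absorbance features
y = X @ rng.normal(size=10) + 0.3 * np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("non-linear", MLPRegressor(hidden_layer_sizes=(32,),
                                                max_iter=2000, random_state=0))]:
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name} model RMSE: {rmse:.3f}")
```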
Procedia PDF Downloads 461
22798 Seasonal Assessment of Snow Cover Dynamics Based on Aerospace Multispectral Data on Livingston Island, South Shetland Islands in Antarctica and on Svalbard in Arctic
Authors: Temenuzhka Spasova, Nadya Yanakieva
Abstract:
Snow modulates the hydrological cycle, influences the functioning of ecosystems, and is a significant resource for many populations whose water is harvested from cold regions. Snow observations are important for validating climate models. The accumulation and rapid melt of snow are two of the most dynamic seasonal environmental changes on the Earth’s surface. The relevance of this research is related to modern tendencies in the application of remote sensing to problems of different natures in the ecological monitoring of the environment. The subject of the study is the dynamics of snow cover during the different seasons on Livingston Island, South Shetland Islands in Antarctica, and on Svalbard in the Arctic. The objects were analyzed and mapped using European Space Agency (ESA) data acquired by the Sentinel-1 SAR (Synthetic Aperture Radar) and Sentinel-2 MSI sensors, together with GIS. Results have been obtained for changes in snow coverage during the summer-winter transition and its dynamics in the two hemispheres. The data used are of high temporal and spatial resolution, which is an advantage when looking at the snow cover. The MSI images have different spatial resolutions at the Earth surface range. The changes in the environmental objects are shown with the SAR images and different processing approaches. The results clearly show that snow and snow melting can best be registered using SAR data via HH (horizontal transmit, horizontal receive) polarization. The use of aerospace data and technology enables us to obtain different digital models, structuring and analyzing results while excluding the subjective factor. Because of the large extent of terrestrial snow coverage and the difficulties in obtaining ground measurements over cold regions, remote sensing and GIS represent an important tool for studying snow areas and properties from regional to global scales.
Keywords: climate changes, GIS, remote sensing, SAR images, snow coverage
Procedia PDF Downloads 221
22797 Disclosure of Financial Risk on Sharia Banks in Indonesia
Authors: Renny Wulandari
Abstract:
This study aims to determine the influence of Non-Performing Financing, Financing-to-Deposit Ratio, Operating Expenses to Operating Revenue, and Net Income Margin on the disclosure of financial risk in Sharia banks. To achieve these objectives, an associative research method was conducted with secondary data in the form of annual report data for the period 2013-2016. The population in this study is the Sharia banking industry in Indonesia that issued annual financial statements. Probability sampling was used as the sampling method. The analysis in this research uses SEM-PLS. The result is that Net Income Margin has a significant effect on financial risk disclosure, while Non-Performing Financing (NPF), Financing-to-Deposit Ratio (FDR), and Operating Expenses to Operating Revenue (OEOR) have no effect on the disclosure of financial risk in Sharia banks.
Keywords: Sharia banks, disclosure of financial risk, non-performing financing, financing-to-deposit ratio, operating expenses and operating revenue, net income margin
Procedia PDF Downloads 235
22796 Facial Behavior Modifications Following the Diffusion of the Use of Protective Masks Due to COVID-19
Authors: Andreas Aceranti, Simonetta Vernocchi, Marco Colorato, Daniel Zaccariello
Abstract:
Our study explores the usefulness of implementing facial expression recognition capabilities and using the Facial Action Coding System (FACS) in contexts where the other person is wearing a mask. In the communication process, subjects use a plurality of distinct and autonomous reporting systems. Among them, the system of mimicking facial movements is worthy of attention. Basic emotion theorists have identified the existence of specific and universal patterns of facial expressions related to seven basic emotions - anger, disgust, contempt, fear, sadness, surprise, and happiness - that distinguish one emotion from another. However, due to the COVID-19 pandemic, we have come up against the problem of having the lower half of the face covered and, therefore, not investigable because of the masks. Facial-emotional behavior is a good starting point for understanding: (1) the affective state (such as emotions), (2) cognitive activity (perplexity, concentration, boredom), (3) temperament and personality traits (hostility, sociability, shyness), (4) psychopathology (such as diagnostic information relevant to depression, mania, schizophrenia, and less severe disorders), and (5) psychopathological processes that occur during social interactions between patient and analyst. There are numerous methods to measure facial movements resulting from the action of muscles: see, for example, the measurement of visible facial actions using coding systems (non-intrusive systems that require the presence of an observer who encodes and categorizes behaviors) and the measurement of electrical discharges of contracting muscles (facial electromyography; EMG). However, the measuring system invented by Ekman and Friesen (2002) - the Facial Action Coding System (FACS) - is the most comprehensive, complete, and versatile. Our study, carried out on about 1,500 subjects over three years of work, allowed us to highlight how the movements of the hands and the upper part of the face change depending on whether the subject wears a mask or not. We have been able to identify specific alterations to the subjects’ hand movement patterns and their upper-face expressions while wearing masks compared to when not wearing them. We believe that finding correlations between how body language changes when our facial expressions are impaired can provide a better understanding of the link between facial and bodily non-verbal language.
Keywords: facial action coding system, COVID-19, masks, facial analysis
Procedia PDF Downloads 80
22795 Model Observability – A Monitoring Solution for Machine Learning Models
Authors: Amreth Chandrasehar
Abstract:
Machine Learning (ML) models are developed and run in production to solve various use cases that help organizations be more efficient and help drive the business. But this comes at a massive development cost and in lost business opportunities. According to a Gartner report, 85% of data science projects fail, and one of the factors contributing to this is not paying attention to Model Observability. Model Observability helps developers and operators pinpoint model performance issues such as data drift and helps identify the root cause of issues. This paper focuses on providing insights into incorporating model observability in model development and operationalizing it in production.
Keywords: model observability, monitoring, drift detection, ML observability platform
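As one concrete example of what a model observability component does, the sketch below flags data drift by comparing a production feature distribution against the training distribution with a two-sample Kolmogorov-Smirnov test. This is a generic illustration, not the monitoring platform described in the paper.

```python
# Minimal data-drift check (illustrative, not the paper's platform):
# compare each feature's production distribution to its training
# distribution with a two-sample Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import ks_2samp

def detect_drift(train: np.ndarray, prod: np.ndarray, alpha: float = 0.01):
    """Return (feature index, statistic, p-value) for drifted features."""
    drifted = []
    for j in range(train.shape[1]):
        stat, p_value = ks_2samp(train[:, j], prod[:, j])
        if p_value < alpha:
            drifted.append((j, stat, p_value))
    return drifted

rng = np.random.default_rng(1)
train = rng.normal(0.0, 1.0, size=(5000, 3))
prod = train.copy()
prod[:, 2] += 0.5                      # simulate drift in feature 2
print(detect_drift(train, prod))       # feature 2 should be reported
```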
Procedia PDF Downloads 114
22794 The Contribution of Sanitation Practices to Marine Pollution and the Prevalence of Water-Borne Diseases in Prampram Coastal Area, Greater Accra-Ghana
Authors: Precious Roselyn Obuobi
Abstract:
Background: In Ghana, water-borne diseases remain a public health concern because of their impact. While marine pollution has been linked to outbreaks of disease, especially in communities along the coast, associated risks such as oil spillage, marine debris, erosion, and improper waste disposal and management practices persist. Objective: The study seeks to investigate the sanitation practices that contribute to marine pollution in Prampram and the prevalence of selected water-borne diseases (diarrhea and typhoid fever). Method: This study used a descriptive cross-sectional design, employing a mixed-method (qualitative and quantitative) approach. Twenty-two (22) participants were selected, and a semi-structured questionnaire was administered to them. Additionally, interviews were conducted to collect more information. Further, an observation checklist was used to aid the data collection process. Secondary data comprising information on water-borne diseases in the district was acquired from the district health directorate to determine the prevalence of the selected water-borne diseases in the community. Data Analysis: The qualitative data were analyzed using NVIVO® software, adapting the six-step thematic analysis of Braun and Clarke, while STATA® version 16 was used to analyze the secondary data collected from the district health directorate. Descriptive statistics employing means, standard deviations, frequencies, and proportions were used to summarize the results. Results: The results showed that open defecation and indiscriminate waste disposal were the main practices contributing to marine pollution in Prampram, with consequent effects on public health. Conclusion: These findings have implications for public health and the environment; thus, efforts need to be stepped up in educating the community on best sanitation practices.
Keywords: environment, sanitation, marine pollution, water-borne diseases
Procedia PDF Downloads 77
22793 A Study on Vulnerability of Alahsa Governorate to Generate Urban Heat Islands
Authors: Ilham S. M. Elsayed
Abstract:
The purpose of this study is to investigate the status of Alahsa Governorate and its vulnerability to generating urban heat islands. Alahsa Governorate is a famous oasis in the Arabian Peninsula containing several oil centers. An extensive literature review was done to collect previous data on the urban heat island of Alahsa Governorate. Data used for the purpose of this research were collected from authorized bodies that control the weather station networks over Alahsa Governorate, Eastern Province, Saudi Arabia. Although the number of weather stations within the region is very limited, and analysis using GIS software and its techniques is therefore difficult and limited, the data analyzed confirm an increase in temperature of more than 2 °C from 2004 to 2014. Such an increase is considerable wherever human health and comfort are the concern. The increase in temperature within one decade confirms the presence of urban heat islands. The study concludes that Alahsa Governorate is vulnerable to creating urban heat islands, and more attention should be drawn to strategic planning of the governorate, which is developing at a high pace and with considerably increasing levels of urbanization.
Keywords: Alahsa Governorate, population density, urban heat island, weather station
Procedia PDF Downloads 254
22792 The Impact of Agricultural Product Export on Income and Employment in Thai Economy
Authors: Anucha Wittayakorn-Puripunpinyoo
Abstract:
The research objectives were 1) to study the situation and trend of Thailand's agricultural product exports, 2) to study the impact of agricultural product exports on income in the Thai economy, 3) to study the impact of agricultural product exports on employment in the Thai economy, and 4) to derive recommendations for Thailand's agricultural product export policy. In this research, secondary data were collected as yearly time series from 1990 to 2016, accounting for 27 years, from the Bank of Thailand database. Primary data were collected from the stakeholders of Thailand's agricultural product export policy. Descriptive statistics such as the arithmetic mean and standard deviation were applied. The forecasting of agricultural product exports applied the Monte Carlo simulation technique as well as time trend analysis. In addition, the impact of agricultural product exports on income and employment was estimated by applying an econometric model whose parameters were estimated by the ordinary least squares technique. The research results revealed that 1) the agricultural product export value of Thailand from 1990 to 2016 was 338,959.5 million Thai baht, with a yearly growth rate of 4.984 percent; moreover, the forecast agricultural product export value of Thailand increases, but its growth rate declines; 2) agricultural product exports have a positive impact on income in the Thai economy: a 1 percent increase in Thailand's agricultural product exports would increase income by 0.0051 percent; 3) agricultural product exports have a positive impact on employment in the Thai economy: a 1 percent increase in Thailand's agricultural product exports would increase employment by 0.079 percent; and 4) in the future, agricultural product export policy should focus on finished or semi-finished agricultural products instead of raw materials, applying technology and innovation to add value to agricultural product exports. Public agricultural product export policy should support private-sector exporters in order to encourage them as agricultural exporters in Thailand.
Keywords: agricultural product export, income, employment, Thai economy
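Elasticities of the kind reported above (a 1 percent rise in exports raising income by 0.0051 percent) are consistent with a log-log specification estimated by ordinary least squares. The sketch below shows such an estimation on synthetic yearly data; the abstract does not give the exact specification or control variables.

```python
# Illustrative log-log OLS elasticity estimate (synthetic data; the
# abstract does not give the exact econometric specification).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
exports = rng.lognormal(mean=12, sigma=0.3, size=27)   # 27 yearly observations
income = exports ** 0.005 * rng.lognormal(mean=10, sigma=0.05, size=27)

X = sm.add_constant(np.log(exports))
model = sm.OLS(np.log(income), X).fit()
print(model.params[1])   # slope = elasticity of income w.r.t. exports
```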
Procedia PDF Downloads 310
22791 Seafloor and Sea Surface Modelling in the East Coast Region of North America
Authors: Magdalena Idzikowska, Katarzyna Pająk, Kamil Kowalczyk
Abstract:
Seafloor topography is a fundamental issue in geological, geophysical, and oceanographic studies. Single-beam or multibeam sonars attached to the hulls of ships are used to emit a hydroacoustic signal from transducers and reproduce the topography of the seabed. This solution provides relevant accuracy and spatial resolution. Bathymetric data from ship surveys are provided by the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration. Unfortunately, most of the seabed is still unidentified, as there are still many gaps to be explored between ship survey tracks. Moreover, such measurements are very expensive and time-consuming. The solution is raster bathymetric models shared by The General Bathymetric Chart of the Oceans. The offered products are a compilation of different sets of data, raw or processed. Measurements of gravity anomalies also serve as indirect data for the development of bathymetric models. Some forms of seafloor relief (e.g., seamounts) increase the force of the Earth's pull, leading to changes in the sea surface. Based on satellite altimetry data, sea surface height and marine gravity anomalies can be estimated, and from the anomalies it is possible to infer the structure of the seabed. The main goal of the work is to create regional bathymetric models and models of the sea surface in the area of the east coast of North America - a region of seamounts and undulating seafloor. The research includes an analysis of the methods and techniques used, an evaluation of the interpolation algorithms used, densification of the models, and the creation of grid models. The data used are raster bathymetric models in NetCDF format, survey data from multibeam soundings in MB-System format, and satellite altimetry data from the Copernicus Marine Environment Monitoring Service. The methodology includes data extraction, processing, mapping, and spatial analysis. Visualization of the obtained results was carried out with Geographic Information System tools. The result is an extension of the state of knowledge of the quality and usefulness of the data used for seabed and sea surface modeling, and of the accuracy of the generated models. Sea level is averaged over time and space (excluding waves, tides, etc.). Its changes, along with knowledge of the topography of the ocean floor, inform us indirectly about the volume of the entire ocean. The true shape of the ocean surface is further varied by such phenomena as tides, differences in atmospheric pressure, wind systems, thermal expansion of water, or phases of ocean circulation. Depending on the location of the point, the greater the depth, the weaker the trend of sea level change. Studies show that combining data sets from different sources, with different accuracies, can affect the quality of sea surface and seafloor topography models.
Keywords: seafloor, sea surface height, bathymetry, satellite altimetry
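A minimal sketch of one step in such a workflow: loading a raster bathymetric grid from NetCDF and interpolating depths along a survey track. The file name and variable names are hypothetical; GEBCO-style grids typically expose lat/lon coordinates and an elevation variable.

```python
# Minimal sketch (hypothetical file and variable names): read a raster
# bathymetric grid in NetCDF and interpolate depth along a survey track.
import numpy as np
import xarray as xr

grid = xr.open_dataset("bathymetry_grid.nc")   # hypothetical GEBCO-style grid
track_lon = np.linspace(-65.0, -60.0, 50)      # example survey track
track_lat = np.linspace(38.0, 40.0, 50)

depths = grid["elevation"].interp(             # bilinear interpolation
    lon=xr.DataArray(track_lon, dims="point"),
    lat=xr.DataArray(track_lat, dims="point"),
)
print(depths.values)
```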
Procedia PDF Downloads 82
22790 Mobi-DiQ: A Pervasive Sensing System for Delirium Risk Assessment in Intensive Care Unit
Authors: Subhash Nerella, Ziyuan Guan, Azra Bihorac, Parisa Rashidi
Abstract:
Intensive care units (ICUs) provide care to critically ill patients in severe and life-threatening conditions. However, patient monitoring in the ICU is limited by the time and resource constraints imposed on healthcare providers. Many critical care indices, such as mobility, are still manually assessed, which can be subjective, prone to human error, and lacking in granularity. Other important aspects, such as environmental factors, are not monitored at all. For example, critically ill patients often experience circadian disruptions due to the absence of effective environmental “timekeepers” such as the light/dark cycle and the systemic effect of acute illness on chronobiologic markers. Although the occurrence of delirium is associated with circadian disruption risk factors, these factors are not routinely monitored in the ICU. Hence, there is a critical unmet need to develop systems for precise and real-time assessment through novel enabling technologies. We have developed the mobility and circadian disruption quantification system (Mobi-DiQ) by augmenting biomarker and clinical data with pervasive sensing data to generate cues related to mobility, nightly disruptions, and light and noise exposure. We hypothesize that Mobi-DiQ can provide accurate mobility and circadian cues that correlate with bedside clinical mobility assessments and circadian biomarkers, ultimately important for delirium risk assessment and prevention. The collected multimodal dataset consists of depth images, electromyography (EMG) data, patient extremity movement captured by accelerometers, ambient light levels, sound pressure level (SPL), and indoor air quality measured by volatile organic compounds and the equivalent CO₂ concentration. For delirium risk assessment, the system recognizes mobility cues (axial body movement features and body key points) and circadian cues, including nightly disruptions, ambient SPL, and light intensity, as well as other environmental factors such as indoor air quality. The Mobi-DiQ system consists of three major components: the pervasive sensing system, a data storage and analysis server, and a data annotation system. For data collection, six local pervasive sensing systems were deployed, each including a local computer and sensors. A video recording tool with a graphical user interface (GUI), developed in Python, was used to capture depth image frames for analyzing patient mobility. All sensor data are encrypted and then automatically uploaded to the Mobi-DiQ server through a secured VPN connection. Several data pipelines were developed to automate data transfer, curation, and preparation for annotation and model training. The data curation and post-processing are performed on the server. A custom secure annotation tool with a GUI was developed to annotate the depth activity data. The annotation tool is linked to a MongoDB database to record the annotations and to provide summarization. Docker containers are also utilized to manage the services and pipelines running on the server in an isolated manner. The processed clinical data and annotations are used to train and develop real-time pervasive sensing systems to augment clinical decision-making and promote targeted interventions. In the future, we intend to evaluate our system in a clinical implementation trial, as well as to refine and validate it using other data sources, including neurological data obtained through continuous electroencephalography (EEG).
Keywords: deep learning, delirium, healthcare, pervasive sensing
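As an illustration of the annotation-storage step described above, the sketch below writes one annotation record to MongoDB with pymongo. The database, collection, and field names are hypothetical; the paper does not publish its schema.

```python
# Illustrative only: storing a depth-activity annotation in MongoDB.
# The database/collection names and document fields are hypothetical.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
annotations = client["mobi_diq"]["annotations"]

record = {
    "patient_id": "anon-0001",
    "frame_start": 1200,
    "frame_end": 1320,
    "label": "axial body movement",
    "annotator": "rater-A",
    "created_at": datetime.now(timezone.utc),
}
annotations.insert_one(record)
print(annotations.count_documents({"patient_id": "anon-0001"}))
```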
Procedia PDF Downloads 94
22789 Delineation of the Geoelectric and Geovelocity Parameters in the Basement Complex of Northwestern Nigeria
Authors: M. D. Dogara, G. C. Afuwai, O. O. Esther, A. M. Dawai
Abstract:
The geology of northern Nigeria is under intense investigation, particularly that of the northwest, which is believed to be of the basement complex. The variability of the lithology is consistently inconsistent; hence the need for a close-range study. It is in view of the above that two geophysical techniques, vertical electrical sounding employing the Schlumberger array and the seismic refraction method, were used to delineate the geoelectric and geovelocity parameters of the basement complex of northwestern Nigeria. A total area of 400,000 m² was covered, with sixty geoelectric stations established and sixty sets of seismic refraction data collected using the forward and reverse method. The interpretation of the resistivity data suggests that the area is underlain by not more than five geoelectric layers of varying thicknesses and resistivities when a maximum half-electrode spread of 100 m is used. The interpreted seismic data revealed two geovelocity layers, with velocities ranging from 478 m/s to 1666 m/s in the first layer and from 1166 m/s to 7141 m/s in the second layer. The results of the two techniques suggest that the area of study has an undulating bedrock topography, with geoelectric and geovelocity layers composed of weathered rock materials.
Keywords: basement complex, delineation, geoelectric, geovelocity, Nigeria
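For reference, apparent resistivity in a Schlumberger sounding is computed from the injected current, the measured potential difference, and a geometric factor. The sketch below encodes the standard textbook relation, not the authors' processing code; the example spacings are illustrative.

```python
# Standard Schlumberger apparent-resistivity relation (textbook formula,
# not the authors' processing code). AB is the current-electrode spacing,
# MN the potential-electrode spacing, dV/I the measured resistance.
import math

def schlumberger_rho_a(ab_half: float, mn: float, dv: float, i: float) -> float:
    """Apparent resistivity (ohm.m) for a Schlumberger array."""
    k = math.pi * (ab_half**2 - (mn / 2.0)**2) / mn   # geometric factor
    return k * dv / i

# Example: AB/2 = 100 m (the maximum spread used in the study), MN = 10 m.
print(schlumberger_rho_a(ab_half=100.0, mn=10.0, dv=0.05, i=0.1))
```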
Procedia PDF Downloads 152
22788 The Thinking of Dynamic Formulation of Rock Aging Agent Driven by Data
Authors: Longlong Zhang, Xiaohua Zhu, Ping Zhao, Yu Wang
Abstract:
The construction of mines, railways, highways, water conservancy projects, etc., has formed a large number of high, steep slope wounds in China. Under the premise of slope stability and safety, minimum-cost, green, close-to-natural repair of these wound spaces has become a new problem. Nowadays, in-situ element testing and analysis, monitoring, field quantitative factor classification, and assignment evaluation produce vast amounts of data. Data processing and analysis will inevitably differentiate the morphology, mineral composition, and physicochemical properties between rock wounds, by which the appropriate restoration techniques and materials can be dynamically matched. In the present research, based on a grid partition of the slope surface, the content of the combined oxides of the rock minerals (SiO₂, CaO, MgO, Al₂O₃, Fe₃O₄, etc.) was tested, and the hardness and breakage of the rock texture were classified and assigned values. The data on the essential factors were interpolated and normalized in GIS, which formed a differential zoning map of the slope space. According to the physical and chemical properties and spatial morphology of the rocks in the different zones, organic acids (plant waste fruit, fruit residue, etc.), natural mineral powders (zeolite, apatite, kaolin, etc.), a water-retaining agent, and plant gum (melon powder) were mixed in different proportions to form rock aging agents. Spraying aging agents with different formulas on the slopes in different sections can effectively age the fresh rock wound, providing convenience for seed implantation and reducing the transformation of heavy metals in the rocks. Through much practical engineering experience, a dynamic data platform of the rock aging agent formula system is formed, which provides materials for the restoration of different slopes. It will also provide a guideline for the mixed use of various natural materials to solve the complex, non-uniform ecological restoration problem.
Keywords: data-driven, dynamic state, high steep slope, rock aging agent, wounds
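A minimal sketch of the interpolate-then-normalize step: inverse-distance weighting of point measurements onto a query location, followed by min-max normalization. The study performs this in GIS; the stand-in below uses hypothetical sample values.

```python
# Generic sketch of the interpolate-then-normalize step (the study did
# this in GIS; sample locations and values here are hypothetical).
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted estimate at a query point."""
    d = np.linalg.norm(xy_known - xy_query, axis=1)
    if np.any(d == 0):
        return float(values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # sample locations (m)
sio2 = np.array([61.2, 58.7, 64.9])                     # SiO2 content (%)
est = idw(pts, sio2, np.array([4.0, 4.0]))

normalized = (sio2 - sio2.min()) / (sio2.max() - sio2.min())  # min-max scaling
print(est, normalized)
```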
Procedia PDF Downloads 117
22787 Adult Language Learning in the Institute of Technology Sector in the Republic of Ireland
Authors: Una Carthy
Abstract:
A recent study of third-level institutions in Ireland reveals that both age and aptitude can be overcome by teaching methodologies that motivate second language learners. This PhD investigation gathered quantitative and qualitative data from 14 Institutes of Technology over a three-year period from 2011 to 2014. The fundamental research question was to establish the impact of institutional language policy on attitudes towards language learning. However, other related issues around second language acquisition arose in the course of the investigation. Data were collected from both lecturers and students, allowing interesting points of comparison to emerge from both datasets. Negative perceptions among lecturers regarding language provision were often associated with the view that language learning belongs to primary and secondary level and has no place in third-level education. This perception was offset by substantial data showing positive attitudes towards adult language learning. Lenneberg's Critical Age Theory postulated that the optimum age for learning a second language is before puberty. More recently, scholars have challenged this theory in their studies, revealing that mature learners can and do succeed at learning languages. With regard to aptitude, a preoccupation among lecturers with poor literacy skills among students emerged and was often associated with resistance to second language acquisition. This was offset by a preponderance of qualitative data from students highlighting the crucial role that teaching approaches play in the learning process. Interestingly, the data collected regarding learning disabilities reveal that, given the appropriate learning environments, individuals can be motivated to acquire second languages, and indeed succeed at learning them. These findings are in keeping with other recent studies regarding attitudes towards second language learning among students with learning disabilities. Both sets of findings reinforce the case for language policies in the Institutes of Technology (IoTs). Supportive and positive learning environments can be created in third-level institutions to motivate adult learners, thereby overcoming perceived obstacles relating to age and aptitude.
Keywords: age, aptitude, second language acquisition, teaching methodologies
Procedia PDF Downloads 124
22786 Integrating Deep Learning For Improved State Of Charge Estimation In Electric Bus
Authors: Ms. Hema Ramachandran, Dr. N. Vasudevan
Abstract:
Accurate estimation of the battery State of Charge (SOC) is essential for optimizing the range and performance of modern electric vehicles. This paper focuses on analysing historical driving data from electric buses, with an emphasis on feature extraction and data preprocessing of driving conditions. By selecting relevant parameters, a set of characteristic variables tailored to specific driving scenarios is established. A battery SOC prediction model based on a combination of a bidirectional long short-term memory (LSTM) architecture and a standard fully connected neural network (FCNN) is then proposed, and various hyperparameters are identified and fine-tuned to enhance prediction accuracy. The results indicate that, with optimized hyperparameters, the prediction achieves a Root Mean Square Error (RMSE) of 1.98% and a Mean Absolute Error (MAE) of 1.72%. This approach is expected to improve the efficiency of battery management systems and battery utilization in electric vehicles.
Keywords: long short-term memory (LSTM), battery health monitoring, data-driven models, battery charge-discharge cycles, adaptive SOC estimation, voltage and current sensing
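A minimal sketch of the proposed network shape, assuming a Keras implementation (the paper's framework and exact hyperparameters are not stated): a bidirectional LSTM over a window of driving features, followed by fully connected layers producing the SOC estimate.

```python
# Minimal sketch of a BiLSTM + fully connected SOC regressor in Keras.
# Window length, feature count, and layer sizes are illustrative; the
# paper's exact hyperparameters are not reproduced here.
import tensorflow as tf

window, n_features = 60, 6   # e.g., voltage, current, speed, temperature...

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_features)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(32, activation="relu"),    # FCNN head
    tf.keras.layers.Dense(1, activation="sigmoid"),  # SOC in [0, 1]
])
model.compile(optimizer="adam",
              loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError(),
                       tf.keras.metrics.MeanAbsoluteError()])
model.summary()
```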
Procedia PDF Downloads 9
22785 Cloud Monitoring and Performance Optimization Ensuring High Availability
Authors: Inayat Ur Rehman, Georgia Sakellari
Abstract:
Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.
Keywords: cloud computing, cloud monitoring, performance optimization, high availability, scalability, resource allocation, load balancing, auto-scaling, data security, data privacy
Procedia PDF Downloads 61
22784 The Use of Artificial Intelligence to Curb Corruption in Brazil
Authors: Camila Penido Gomes
Abstract:
Over the past decade, an emerging body of research has been pointing to artificial intelligence's great potential to improve the use of open data, increase transparency, and curb corruption in the public sector. Nonetheless, studies on this subject are scant and usually lack evidence to validate the effectiveness of AI-based technologies in addressing corruption, especially in developing countries. Aiming to fill this void in the literature, this paper sets out to examine how AI has been deployed by civil society to improve the use of open data and prevent congresspeople from misusing public resources in Brazil. Building on the current debates and carrying out a systematic literature review and extensive document analyses, this research reveals that AI should not be deployed as a silver bullet to fight corruption. Instead, this technology is more powerful when adopted by a multidisciplinary team as a civic tool in conjunction with other strategies. This study makes considerable contributions, bringing to the forefront of the discussion a more accurate understanding of the factors that play a decisive role in the successful implementation of AI-based technologies in anti-corruption efforts.
Keywords: artificial intelligence, civil society organization, corruption, open data, transparency
Procedia PDF Downloads 206
22783 Performance Study of Classification Algorithms for Consumer Online Shopping Attitudes and Behavior Using Data Mining
Authors: Rana Alaa El-Deen Ahmed, M. Elemam Shehab, Shereen Morsy, Nermeen Mekawie
Abstract:
With the growing popularity and acceptance of e-commerce platforms, users face an ever-increasing burden in actually choosing the right product from the large number of online offers. Thus, techniques for personalization and shopping guides are needed by users. For a pleasant and successful shopping experience, users need to know easily and with high confidence which products to buy. Since selling a wide variety of products has become easier due to the popularity of online stores, online retailers are able to sell more products than a physical store. The disadvantage is that customers might not find the products they need. In this research, the customer will be able to find the products he is searching for, because recommender systems are used on some e-commerce web sites. A recommender system learns from information about customers and products and provides appropriate personalized recommendations to help customers find the needed product. In this paper, eleven classification algorithms are comparatively tested to find the classifier that best fits consumer online shopping attitudes and behavior in the experimented dataset. The WEKA knowledge analysis tool, an open-source data mining workbench used for comparing conventional classifiers to find the best one, was used in this research. Using the data mining tool (WEKA) with the experimented classifiers, the results show that Decision Table and Filtered Classifier give the highest accuracy, and Classification Via Clustering and Simple CART the lowest.
Keywords: classification, data mining, machine learning, online shopping, WEKA
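The study itself used WEKA; the analogous workflow in Python's scikit-learn, shown below, makes the comparison procedure concrete (cross-validated accuracy per classifier). This is a stand-in sketch, not WEKA, and the classifiers and data listed are illustrative.

```python
# Analogous classifier-comparison workflow in scikit-learn (the study
# itself used WEKA; this is an illustrative stand-in on synthetic data).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, n_features=12, random_state=0)

classifiers = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive bayes": GaussianNB(),
    "k-nearest neighbours": KNeighborsClassifier(),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```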
Procedia PDF Downloads 353
22782 Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach
Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak
Abstract:
Computer vision systems recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each sample very important in learning. As a result, if not handled properly, label noise significantly degrades performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classification algorithm's performance in the presence of noisy labels.
Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity
Procedia PDF Downloads 162
22781 Predicting Aggregation Propensity from Low-Temperature Conformational Fluctuations
Authors: Hamza Javar Magnier, Robin Curtis
Abstract:
There have been rapid advances in the upstream processing of protein therapeutics, which have shifted the bottleneck to downstream purification and formulation. Finding liquid formulations with shelf lives of up to two years is increasingly difficult for some of the newer therapeutics, which have been engineered for activity but whose formulations are often viscous, can phase separate, and have a high propensity for irreversible aggregation. We explore means to develop improved predictive ability from a better understanding of how protein-protein interactions depend on formulation conditions (pH, ionic strength, buffer type, presence of excipients) and how these impact upon the initial steps in protein self-association and aggregation. In this work, we study the initial steps in the aggregation pathways using a minimal protein model based on square-well potentials and discontinuous molecular dynamics. The effect of the model parameters, including the range of interaction, stiffness, chain length, and chain sequence, implies that protein models fold according to various pathways. By reducing the range of interactions, the folding and collapse transitions come together, and the model follows a single-step folding pathway from the denatured to the native state. After parameterizing the model interaction parameters, we developed an understanding of low-temperature conformational properties and fluctuations, and their correlation to the folding transition of proteins in isolation. The model fluctuations increase with temperature. We observe a low-temperature point below which large fluctuations are frozen out. This implies that fluctuations at low temperature can be correlated to the folding transition at the melting temperature. Because proteins “breathe” at low temperatures, defining a native state as a single structure with conserved contacts and a fixed three-dimensional structure is misleading. Rather, we introduce a new definition of a native-state ensemble based on our understanding of core conservation, which takes into account the native fluctuations at low temperatures. This approach permits the study of the large range of length and time scales needed to link the molecular interactions to the macroscopically observed behaviour. In addition, the models studied are parameterized by fitting to experimentally observed protein-protein interactions characterized in terms of osmotic second virial coefficients.
Keywords: protein folding, native-ensemble, conformational fluctuation, aggregation
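The pair interaction underlying such minimal models is a square well; a sketch of the potential, with hypothetical parameter values, is given below. Reducing the well width (lambda times sigma, minus sigma) is the "range of interaction" the abstract refers to.

```python
# Square-well pair potential used in minimal protein models (sketch with
# hypothetical parameters): hard core at r < sigma, well of depth epsilon
# out to lambda*sigma, zero beyond.
import math

def square_well(r: float, sigma: float = 1.0, lam: float = 1.5,
                epsilon: float = 1.0) -> float:
    if r < sigma:
        return math.inf          # hard-core repulsion
    if r < lam * sigma:
        return -epsilon          # attractive well
    return 0.0

for r in (0.9, 1.2, 2.0):
    print(r, square_well(r))
```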
Procedia PDF Downloads 363
22780 Relational Attention Shift on Images Using Bu-Td Architecture and Sequential Structure Revealing
Authors: Alona Faktor
Abstract:
In this work, we present an NN-based computational model that can perform attention shifts according to a high-level instruction. The instruction specifies the type of attentional shift using an explicit geometrical relation. The instruction can also be of a cognitive nature, specifying a more complex human-human interaction, human-object interaction, or object-object interaction. Applying this approach sequentially allows obtaining a structural description of an image. A novel dataset of interacting humans and objects is constructed using a computer graphics engine. Using these data, we perform systematic research on relational segmentation shifts.
Keywords: cognitive science, attention, deep learning, generalization
Procedia PDF Downloads 200
22779 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture
Authors: Sajjad Akbar, Rabia Bashir
Abstract:
With the growth in the number of users, Internet usage has evolved. Due to its key design principle, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, trust, etc. Users are more interested in content than in the communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not suitable for certain types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient, as it depends on physical location. For this reason, Information Centric Networking (ICN) is considered the potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented rather than a sender-oriented approach. It introduces a name-based information system at the network layer. Although ICN is considered a future Internet architecture, there is a lot of criticism of it, mainly concerning how ICN will manage the most relevant content. Here, Web Content Mining (WCM) approaches can help in the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.
Keywords: agent based web content mining, content centric networking, information centric networking
Procedia PDF Downloads 477
22778 One-Class Classification Approach Using Fukunaga-Koontz Transform and Selective Multiple Kernel Learning
Authors: Abdullah Bal
Abstract:
This paper presents a one-class classification (OCC) technique based on the Fukunaga-Koontz Transform (FKT) for binary classification problems. The FKT is originally a powerful tool for feature selection and ordering in two-class problems. To utilize the standard FKT for the data domain description problem (i.e., one-class classification), in this paper a set of non-class samples lying outside the boundary of the positive (target) class, formed with limited training data, has been constructed synthetically. The tunnel-like decision boundary around the upper and lower borders of the target class samples has been designed using statistical properties of the feature vectors belonging to the training data. To capture higher-order statistics of the data and increase discrimination ability, the proposed method, termed one-class FKT (OC-FKT), has been extended to its nonlinear version via kernel machines, referred to as OC-KFKT for short. Multiple kernel learning (MKL) is a favorable family of machine learning methods that tries to find an optimal combination of a set of sub-kernels to achieve a better result. However, the discriminative ability of some of the base kernels may be low, and an OC-KFKT designed with this type of kernel leads to unsatisfactory classification performance. To address this problem, the quality of the sub-kernels should be evaluated, and the weak kernels must be discarded before the final decision-making process. MKL/OC-FKT and selective MKL/OC-FKT frameworks have been designed, inspired by ensemble learning (EL), to weight and then select the sub-classifiers using the discriminability and diversity measured by eigenvalue ratios. The eigenvalue ratios have been assessed based on their regions in the FKT subspaces. Comparative experiments, performed on various low- and high-dimensional data against state-of-the-art algorithms, confirm the effectiveness of our techniques, especially in the case of small sample size (SSS) conditions.
Keywords: ensemble methods, Fukunaga-Koontz transform, kernel-based methods, multiple kernel learning, one-class classification
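For readers unfamiliar with the transform, a minimal numerical sketch of the standard two-class FKT is given below: the summed covariance is whitened, after which the two classes share eigenvectors whose eigenvalue pairs sum to one, giving the discriminability ordering the paper exploits. This is the textbook construction, not the authors' OC-FKT code.

```python
# Textbook two-class Fukunaga-Koontz transform (sketch; not the paper's
# OC-FKT implementation). After whitening S1 + S2, both classes share
# eigenvectors and their eigenvalues pairwise sum to 1.
import numpy as np

rng = np.random.default_rng(3)
X1 = rng.normal(size=(200, 5)) @ np.diag([3, 1, 1, 1, 1])   # class 1 samples
X2 = rng.normal(size=(200, 5))                              # class 2 samples

S1 = np.cov(X1, rowvar=False)
S2 = np.cov(X2, rowvar=False)

vals, vecs = np.linalg.eigh(S1 + S2)
P = vecs @ np.diag(vals ** -0.5)        # whitening operator for S1 + S2

lam1, V = np.linalg.eigh(P.T @ S1 @ P)  # eigenvalues of whitened S1
lam2 = 1.0 - lam1                       # corresponding eigenvalues for S2
print(lam1, lam2)                       # near 1: discriminative for class 1
```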
Procedia PDF Downloads 25
22777 A Simple Algorithm for Real-Time 3D Capturing of an Interior Scene Using a Linear Voxel Octree and a Floating Origin Camera
Authors: Vangelis Drosos, Dimitrios Tsoukalos, Dimitrios Tsolis
Abstract:
We present a simple algorithm for capturing a 3D scene (focused on the use of mobile device cameras in the context of augmented/mixed reality) by using a floating origin camera solution and storing the resulting information in a linear voxel octree. Data are derived from point clouds captured by a mobile device camera. For the purposes of this paper, we assume a scene of fixed size (known to us or determined beforehand) and a fixed voxel resolution. The resulting data are stored in a linear voxel octree using a hash table. We commence by briefly discussing the logic behind floating origin approaches and the use of linear voxel octrees for efficient storage. Following that, we present the algorithm for translating captured feature points into voxel data in the context of a fixed-origin world and storing them. Finally, we discuss potential applications and areas of future development and improvement to the efficiency of our solution.
Keywords: voxel, octree, computer vision, XR, floating origin
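A minimal sketch of the storage scheme: quantize a point into voxel coordinates, interleave the bits into a Morton (octree) key, and use the key in a hash table. The resolution, scene bounds, and per-voxel payload are illustrative, not taken from the paper.

```python
# Minimal linear-voxel-octree storage sketch: voxel coordinates are
# interleaved into a Morton key that indexes a hash table (dict).
# Voxel size and payload are illustrative.

def morton_key(x: int, y: int, z: int, bits: int = 10) -> int:
    """Interleave the bits of x, y, z into a single octree key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

VOXEL_SIZE = 0.05                 # 5 cm voxels (illustrative)
octree = {}                       # hash table: Morton key -> voxel payload

def insert_point(px: float, py: float, pz: float) -> None:
    # Quantize relative to the floating origin (assumed at 0,0,0 here).
    vx, vy, vz = int(px / VOXEL_SIZE), int(py / VOXEL_SIZE), int(pz / VOXEL_SIZE)
    key = morton_key(vx, vy, vz)
    octree[key] = octree.get(key, 0) + 1   # e.g., count of points per voxel

insert_point(1.02, 0.34, 2.11)
insert_point(1.03, 0.33, 2.12)            # falls in the same voxel
print(len(octree), octree)
```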
Procedia PDF Downloads 134
22776 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution
Authors: Masomeh Jamshid Nejad
Abstract:
Nowadays, statistical literacy is no longer merely a desirable skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its applications is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in the fields of economics and business management to visualize and work with data following a normal distribution. Since technology is interconnected with education these days, it is important to teach statistics topics in the context of Python, RStudio, and Microsoft Excel to undergraduate students. This research endeavours to shed light on the effect of Excel-based instruction on learners' knowledge of statistics, specifically the central concept of the normal distribution. To this end, two groups of undergraduate students (from the Business Management program) were compared in this research study. One group underwent Excel-based instruction, and the other group relied only on traditional teaching methods. We analyzed experimental data and BBA participants' responses to statistics-related questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports learners in comprehending statistical concepts more effectively compared with the other group of learners (taught with the traditional method). In addition, students receiving Excel-based instruction showed greater ability in picturing and interpreting data concentrated on the normal distribution.
Keywords: statistics, Excel-based instruction, data visualization, pedagogy
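The kind of computation such students practice in Excel (for instance with NORM.DIST) can be mirrored in Python; a short sketch with illustrative numbers follows.

```python
# The Excel exercise NORM.DIST(x, mean, sd, TRUE) mirrored in Python
# (illustrative numbers, not the study's data).
from scipy.stats import norm

mean, sd = 100.0, 15.0          # e.g., a score distribution
x = 120.0

print(norm.cdf(x, loc=mean, scale=sd))        # P(X <= 120)
print(1 - norm.cdf(x, loc=mean, scale=sd))    # P(X > 120)
print(norm.ppf(0.95, loc=mean, scale=sd))     # 95th percentile
```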
Procedia PDF Downloads 56
22775 Novel Recommender Systems Using Hybrid CF and Social Network Information
Authors: Kyoung-Jae Kim
Abstract:
Collaborative Filtering (CF) is a popular technique for personalization in the e-commerce domain that reduces information overload. In general, CF recommends a list of items based on other similar users' preferences in the user-item matrix and uses them to predict the focal user's preference for particular items. Many real-world recommender systems use CF techniques because of their excellent accuracy and robustness. However, CF has some limitations, including sparsity problems and the complex dimensionality of the user-item matrix. In addition, traditional CF does not consider the emotional interaction between users. In this study, we propose recommender systems using social network information and singular value decomposition (SVD) to alleviate some of these limitations. The purpose of this study is to reduce the dimensionality of the data set using SVD and to improve the performance of CF by using emotional information from the social network data of the focal user. We test the usability of the hybrid CF, SVD, and social network information model using real-world data. The experimental results show that the proposed model outperforms conventional CF models.
Keywords: recommender systems, collaborative filtering, social network information, singular value decomposition
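A minimal sketch of the SVD step: factor the mean-centred user-item matrix, keep the top k singular values, and reconstruct to predict missing preferences. The social-network weighting the paper adds on top is omitted here, and the ratings are toy values.

```python
# Minimal SVD-based rating prediction sketch (the paper's additional
# social-network weighting is omitted). Zeros mark unobserved ratings.
import numpy as np

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 0, 5, 4]], dtype=float)

observed = np.where(R > 0, R, np.nan)
user_means = np.nanmean(observed, axis=1, keepdims=True)
R_centred = np.where(R > 0, R - user_means, 0.0)

U, s, Vt = np.linalg.svd(R_centred, full_matrices=False)
k = 2                                           # reduced dimensionality
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :] + user_means

print(R_hat.round(2))   # predicted preferences, including unobserved cells
```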
Procedia PDF Downloads 294
22774 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach
Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik
Abstract:
Background: Human balance control is often studied on the basis of the statokinesigram. In this study, the approach to human postural reaction analysis is based on combining the stabilometry output signal with retroreflective marker data signal processing, analysis, and understanding. The study also shows another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: The participants maintained quiet bipedal standing for 10 s on a stabilometry platform. Subsequently, bilateral vibration stimuli were applied to the Achilles tendons in a 20 s interval. The vibration stimuli caused the human postural system to take on a new pseudo-steady state. The vibration frequencies were 20, 60 and 80 Hz. The participants' body segments - head, shoulders, hips, knees, ankles and little fingers - were marked by 12 retroreflective markers. The marker positions were scanned by a six-camera BTS SMART DX system. Registration of the postural reaction lasted 60 s, with a sampling frequency of 100 Hz. The Method of Developed Statokinesigram Trajectory was used for processing the measured data. Regression analysis of the developed statokinesigram trajectory (DST) data and the retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals. Scaling coefficients (λ) between DST and DMT were also evaluated by linear regression analysis. Results: Scaling coefficients for the marker trajectories were identified for all body segments. The head marker trajectories reached the maximal value, and the ankle marker trajectories had the minimal value, of the scaling coefficient. The hips, knees and ankles markers were approximately symmetrical in terms of the scaling coefficient. Notable differences in the scaling coefficient were detected in the head and shoulders marker trajectories, which were not symmetrical. The model of postural system behavior was identified by MDST. Conclusion: The value of the scaling factor identifies which body segment is predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall human postural system response to vibration stimuli, then the marker data represent the particular postural responses. It can be assumed that the cumulative sum of the particular marker postural responses is equal to the statokinesigram.
Keywords: center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data
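Since the scaling coefficient λ between the developed statokinesigram trajectory (DST) and a developed marker trajectory (DMT) is obtained by linear regression, the step reduces to a least-squares slope; a minimal sketch with synthetic signals is below (not the study's data).

```python
# Sketch: the scaling coefficient (lambda) between a developed marker
# trajectory (DMT) and the developed statokinesigram trajectory (DST)
# as a least-squares slope. Signals here are synthetic.
import numpy as np

rng = np.random.default_rng(4)
dst = np.cumsum(rng.normal(size=600))   # shortened stand-in for a 60 s record
dmt = 1.8 * dst + rng.normal(scale=0.5, size=dst.size)  # head-like marker

lam, intercept = np.polyfit(dst, dmt, deg=1)
print(f"scaling coefficient lambda = {lam:.2f}")
```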
Procedia PDF Downloads 352
22773 Delivering on Infrastructure Maintenance for Socio-Economic Growth: Exploration of South African Infrastructure for a Sustained Maintenance Strategy
Authors: Deenadayalan Govender
Abstract:
In South Africa, as in nations globally, the prevailing tangible link between the people and the state is public infrastructure. Services delivered to the people and the state through infrastructure form a critical enabler for social development in communities and for economic development in the country. In this regard, infrastructure, being the backbone of a nation's prosperity, should ideally be effectively maintained for the seamless delivery of services. South African infrastructure is in a state of deterioration, which is leading to infrastructure dysfunction and collapse and is negatively affecting the development of the economy. This infrastructure deterioration stems from deficiencies in maintenance practices and strategies. From the birth of South African democracy, the government has pursued socio-economic transformation and the delivery of critical basic services to decrease the broadening boundaries of disparity. In this regard, the National Infrastructure Plan, borne from strategies encompassed in the National Development Plan, is given priority by the government in delivering strategic catalytic infrastructure projects. The National Infrastructure Plan is perceived to be the key to unlocking opportunities that generate economic growth, curb joblessness, alleviate poverty, create new entrepreneurial prospects, and mitigate population expansion and rapid urbanisation. However, the socio-economic transformation benefits of new infrastructure spend are not being realised as initially anticipated. Due to investor reluctance, solicitation of strategic infrastructure funding is progressively becoming a debilitating challenge for all government institutions. Exacerbating these circumstances further is the substandard functionality of existing infrastructure, a consequence of inadequate maintenance practices. This in-depth multi-sectoral study into the state of infrastructure aims to better understand the principal reasons for the regression of infrastructure functionality; furthermore, prioritised investigations into progressive maintenance strategies are focused upon. The resultant recommendations reveal enhanced maintenance strategies, with a vision to capitalise on infrastructure design life, and also give special emphasis to socio-economic development imperatives in the long term. The research method is principally based on descriptive methods (survey, historical, content analysis, qualitative).
Keywords: infrastructure, maintenance, socio-economic, strategies
Procedia PDF Downloads 143
22772 Text Emotion Recognition by Multi-Head Attention based Bidirectional LSTM Utilizing Multi-Level Classification
Authors: Vishwanath Pethri Kamath, Jayantha Gowda Sarapanahalli, Vishal Mishra, Siddhesh Balwant Bandgar
Abstract:
Recognition of emotional information is essential in any form of communication. The growth of HCI (Human-Computer Interaction) in recent times indicates the importance of understanding the emotions expressed, which becomes crucial for improving the system or the interaction itself. In this research work, textual data are used for emotion recognition. Text, being the least expressive among the multimodal resources, poses various challenges, such as contextual information and the sequential nature of language construction. In this research work, a neural architecture is proposed to recognize no fewer than 8 emotions from textual data derived from multiple datasets, using Google's pre-trained word2vec word embeddings and a multi-head attention-based bidirectional LSTM model with one-vs-all multi-level classification. The emotions targeted in this research are anger, disgust, fear, guilt, joy, sadness, shame, and surprise. Textual data from multiple datasets, such as the ISEAR, GoEmotions, and Affect datasets, were used to create the emotions dataset. Overlaps and conflicts among data samples were handled with careful preprocessing. Our results show a significant improvement with the proposed architecture, with as much as a 10-point improvement in recognizing some emotions.
Keywords: text emotion recognition, bidirectional LSTM, multi-head attention, multi-level classification, Google word2vec word embeddings
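A minimal Keras sketch of the described architecture: embeddings feed a bidirectional LSTM, whose outputs pass through multi-head self-attention before a one-vs-all (sigmoid) output over the 8 emotions. All dimensions are illustrative, and loading of the actual pre-trained word2vec weights is omitted.

```python
# Minimal sketch of the multi-head attention + BiLSTM emotion classifier
# (illustrative dimensions; pre-trained word2vec weight loading omitted).
import tensorflow as tf

vocab_size, seq_len, emb_dim, n_emotions = 20000, 80, 300, 8

tokens = tf.keras.layers.Input(shape=(seq_len,), dtype="int32")
x = tf.keras.layers.Embedding(vocab_size, emb_dim)(tokens)  # word2vec init in practice
x = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(128, return_sequences=True))(x)
x = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=64)(x, x)
x = tf.keras.layers.GlobalAveragePooling1D()(x)
outputs = tf.keras.layers.Dense(n_emotions, activation="sigmoid")(x)  # one-vs-all

model = tf.keras.Model(tokens, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```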
Procedia PDF Downloads 176
22771 The Mass Attenuation Coefficients, Effective Atomic Cross Sections, Effective Atomic Numbers and Electron Densities of Some Halides
Authors: Shivalinge Gowda
Abstract:
The total mass attenuation coefficients, μ/ρ, of some halides, such as NaCl, KCl, CuCl, NaBr, KBr, RbCl, AgCl, NaI, KI, AgBr, CsI, HgCl₂, CdI₂ and HgI₂, were determined at photon energies of 279.2, 320.07, 514.0, 661.6, 1115.5, 1173.2 and 1332.5 keV in a well-collimated, narrow-beam, good-geometry set-up using a high-resolution, hyperpure germanium detector. The mass attenuation coefficients and the effective atomic cross sections are found to be in good agreement with the XCOM values. From these mass attenuation coefficients, the effective atomic cross sections, σₐ, of the compounds were determined. The σₐ data so obtained were then used to compute the effective atomic numbers, Zeff. For this, the interpolation of the total attenuation cross sections of photons of energy E in elements of atomic number Z was performed using logarithmic regression analysis of the data measured by the authors and reported earlier for the above-said energies, along with XCOM data for standard energies. The best-fit coefficients in the photon energy ranges of 250 to 350 keV, 350 to 500 keV, 500 to 700 keV, 700 to 1000 keV and 1000 to 1500 keV, obtained by a piecewise interpolation method, were then used to find the Zeff of the compounds with respect to the effective atomic cross section σₐ, from the relation obtained by the piecewise interpolation method. Using these Zeff values, the electron densities, Nel, of the halides were also determined. The present Zeff and Nel values of the halides are found to be in good agreement with the values calculated from XCOM data and other available published values.
Keywords: mass attenuation coefficient, atomic cross-section, effective atomic number, electron density
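The piecewise interpolation step can be sketched as follows: within each energy band, regress log σ against log Z over the elements, then invert the fit to read off Zeff from a compound's measured effective cross section. The numbers below are synthetic placeholders, not the paper's data.

```python
# Sketch of the piecewise log-log interpolation used to obtain Zeff from
# an effective atomic cross section (synthetic placeholder numbers, not
# the measured data).
import numpy as np

Z = np.array([11, 17, 19, 29, 35, 47, 53, 55, 80])   # elements spanning the halides
sigma_el = 0.8 * Z**1.9                              # placeholder cross sections

b, a = np.polyfit(np.log(Z), np.log(sigma_el), deg=1)  # log sigma = a + b log Z

sigma_a_compound = 250.0      # measured effective cross section (placeholder)
z_eff = np.exp((np.log(sigma_a_compound) - a) / b)
print(f"Zeff ~ {z_eff:.1f}")
```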
Procedia PDF Downloads 378