Search results for: zonal metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 599

299 Generalized Approach to Linear Data Transformation

Authors: Abhijith Asok

Abstract:

This paper presents a generalized approach to the simple linear data transformation, Y=bX, through an integration of multidimensional coordinate geometry, vector space theory, and polygonal geometry. The scaling is performed by adding an additional 'Dummy Dimension' to the n-dimensional data, which helps plot two-dimensional component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of observations in any of the 2n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the 'Dummy Axis'. The derived scaling factor was found to depend on the coordinates of the common point of origin of the diverging straight lines and on the plane of extension, chosen on and perpendicular to the 'Dummy Axis', respectively. This result gives the linear data transformation a geometrical interpretation and hence allows a more informed choice of the factor b, based on a better choice of these coordinate values. The paper then identifies the effect of this transformation on certain popular distance metrics; for many of them, the distance metric retains the same scaling factor as the features.
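
The distance-scaling property described above can be checked with a minimal numpy sketch (not the paper's dummy-dimension construction): under Y=bX, the Euclidean distance between observations retains the factor |b|.

```python
import numpy as np

# Two observations in n = 3 dimensions
x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([4.0, 0.0, -1.0])
b = 2.5  # scaling factor

# Linear transformation Y = bX applied component-wise
y1, y2 = b * x1, b * x2

d_before = np.linalg.norm(x2 - x1)  # Euclidean distance of originals
d_after = np.linalg.norm(y2 - y1)   # distance after scaling

# For the Euclidean metric, the distance retains the scaling factor |b|
assert np.isclose(d_after, abs(b) * d_before)
print(d_before, d_after)
```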

Keywords: data transformation, dummy dimension, linear transformation, scaling

Procedia PDF Downloads 281
298 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we consider certain important factors and health parameters of diabetes patients, especially among children diabetic from birth (pediatric congenital); using the three methods above, we assess the importance of each attribute in the dataset and thereby determine the attribute most strongly correlated with diabetes among young patients. We use cost optimization, control chart, and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology, also used in the software development process to identify the complexity between the various modules of the software. Identifying the complexity is important because higher complexity means a higher chance of risk occurring in the software. With the use of control charts, the mean, variance, and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence we choose the Spearman, control chart, and cost optimization methods to assess the data efficiency in diabetes datasets.

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 236
297 Comprehensive, Up-to-Date Climate System Change Indicators, Trends and Interactions

Authors: Peter Carter

Abstract:

Comprehensive climate change indicators and trends inform the state of the climate (system) with respect to present and future climate change scenarios and the urgency of mitigation and adaptation. With data records now going back many decades, indicator trends can complement model projections. They are provided as datasets by several climate monitoring centers, reviewed by state-of-the-climate reports, and documented by the IPCC assessments. Up-to-date indicators are provided here. Rates of change are instructive, as are extremes. The indicators include greenhouse gas (GHG) emissions (natural and synthetic), cumulative CO2 emissions, atmospheric GHG concentrations (including CO2 equivalent), stratospheric ozone, surface ozone, radiative forcing, global average temperature increase, land temperature increase, zonal temperature increases, carbon sinks, soil moisture, sea surface temperature, ocean heat content, ocean acidification, ocean oxygen, glacier mass, Arctic temperature, Arctic sea ice (extent and volume), northern hemisphere snow cover, permafrost indices, Arctic GHG emissions, ice sheet mass, and sea level rise. Global warming is not the most reliable single metric for the climate state; radiative forcing, atmospheric CO2 equivalent, and ocean heat content are more reliable. Global warming does not capture future commitment, whereas atmospheric CO2 equivalent does. Cumulative carbon is used for estimating carbon budgets. The forcing of aerosols is briefly addressed. Indicator interactions are included. In particular, indicators can provide insight into several crucial global warming amplifying feedback loops, which are explained. All indicators are increasing (adversely), most as fast as ever and some faster. One particularly pressing indicator is rapidly increasing global atmospheric methane; in this respect, methane emissions and sources are covered in more detail. In their application, indicators used in assessing safe planetary boundaries are included. Indicators are considered with respect to recently published papers on possible catastrophic climate change and climate system tipping thresholds. They are climate-change-policy relevant. In particular, relevant policies include the 2015 Paris Agreement on “holding the increase in the global average temperature to well below 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5°C above pre-industrial levels” and the 1992 UN Framework Convention on Climate Change, whose objective is “stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system.”

Keywords: climate change, climate change indicators, climate change trends, climate system change interactions

Procedia PDF Downloads 80
296 Digital Image Steganography with Multilayer Security

Authors: Amar Partap Singh Pharwaha, Balkrishan Jindal

Abstract:

In this paper, a new method is developed for hiding an image within a digital image with multilayer security. In the proposed method, the secret image is first encrypted using a flexible matrix-based symmetric key to add the first layer of security. A second layer of security is then added by encrypting the ciphered data using a Pythagorean-theorem-based method. The ciphered data bits (4 bits) produced after double encryption are then embedded within the digital image in the spatial domain using Least Significant Bit (LSB) substitution. To improve the image quality of the stego-image, an improved form of pixel adjustment is proposed. To evaluate the effectiveness of the proposed method, image quality metrics including Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), entropy, correlation, mean value, and the Universal Image Quality Index (UIQI) are measured. It has been found experimentally that the proposed method provides higher security as well as robustness. In fact, the results of this study are quite promising.
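
As a simplified illustration of the embedding stage only, a plain LSB-substitution sketch in numpy (the paper's double encryption and pixel adjustment are omitted; the payload bits here stand in for the ciphered data):

```python
import numpy as np

def embed_lsb(cover: np.ndarray, bits: str) -> np.ndarray:
    """Embed a bit string into the least significant bits of an
    8-bit cover image (illustrative sketch only)."""
    stego = cover.flatten().copy()
    if len(bits) > stego.size:
        raise ValueError("payload too large for cover image")
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | int(bit)  # overwrite the LSB
    return stego.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> str:
    return "".join(str(p & 1) for p in stego.flatten()[:n_bits])

cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
payload = "1011001110001111"  # would be the doubly encrypted bits
stego = embed_lsb(cover, payload)
assert extract_lsb(stego, len(payload)) == payload

# PSNR, one of the quality metrics mentioned above
mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
psnr = 10 * np.log10(255**2 / mse) if mse > 0 else float("inf")
print(f"MSE={mse:.4f}, PSNR={psnr:.1f} dB")
```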

Keywords: Pythagorean theorem, pixel adjustment, ciphered data, image hiding, least significant bit, flexible matrix

Procedia PDF Downloads 313
295 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach

Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta

Abstract:

Across the globe there are many chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system for predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models using classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model with metrics such as accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of the different attributes. Among the models considered, Random Forest emerges as the standout performer, with an accuracy of 96.04% in our study.
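
A minimal scikit-learn sketch of the workflow described, with synthetic data standing in for the authors' dataset (feature count, split, and hyperparameters are illustrative assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a heart disease dataset (13 clinical attributes)
X, y = make_classification(n_samples=1000, n_features=13, n_informative=8,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Attribute importance analysis, as in the data analysis step above
for i, imp in enumerate(model.feature_importances_):
    print(f"feature {i}: {imp:.3f}")
```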

Keywords: support vector machines, decision tree, random forest

Procedia PDF Downloads 18
294 An Efficient Resource Management Algorithm for Mobility Management in Wireless Mesh Networks

Authors: Mallikarjuna Rao Yamarthy, Subramanyam Makam Venkata, Satya Prasad Kodati

Abstract:

The main objectives of the proposed work are to reduce the overall network traffic and packet delivery cost incurred by mobility management and to increase resource utilization. The proposed algorithm, An Efficient Resource Management Algorithm (ERMA) for mobility management in wireless mesh networks, relies on a pointer-based mobility management scheme. Whenever a mesh client moves from one mesh router to another, a pointer is set up dynamically between the previous mesh router and the current mesh router, subject to distance constraints. The algorithm is evaluated using signaling cost, data delivery cost, and total communication cost as performance metrics, and is demonstrated for both internet sessions and intranet sessions. The proposed algorithm yields significantly better performance in terms of signaling cost, data delivery cost, and total communication cost.
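
A toy sketch of the pointer-forwarding idea, with the distance constraint modeled as a maximum chain length (ERMA's actual cost model and thresholds are not reproduced here):

```python
class PointerMobility:
    """Pointer-forwarding sketch: add a pointer from the previous mesh
    router to the current one unless the chain exceeds max_hops, in
    which case a (costly) location update to the gateway is performed."""
    def __init__(self, max_hops: int = 3):
        self.max_hops = max_hops
        self.chain = []       # routers visited since the last update
        self.updates = 0      # signaling events toward the gateway

    def move(self, new_router: str):
        if len(self.chain) >= self.max_hops:
            self.updates += 1            # reset: register at the gateway
            self.chain = [new_router]
        else:
            self.chain.append(new_router)  # cheap local pointer setup

client = PointerMobility(max_hops=3)
for router in ["R1", "R2", "R3", "R4", "R5", "R6", "R7"]:
    client.move(router)
print(f"gateway updates: {client.updates}, current chain: {client.chain}")
```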

Keywords: data delivery cost, mobility management, pointer forwarding, resource management, wireless mesh networks

Procedia PDF Downloads 337
293 Generating Insights from Data Using a Hybrid Approach

Authors: Allmin Susaiyah, Aki Härmä, Milan Petković

Abstract:

Automatic generation of insights from data using insight mining systems (IMS) is useful in many applications, such as personal health tracking, patient monitoring, and business process management. Existing IMS face challenges in controlling insight extraction, scaling to large databases, and generalising to unseen domains. In this work, we propose a hybrid approach consisting of rule-based and neural components for generating insights from data while overcoming the aforementioned challenges. Firstly, a rule-based Data2CNL component is used to extract statistically significant insights from data and represent them in a controlled natural language (CNL). Secondly, a BERTSum-based CNL2NL component is used to convert these CNL statements into natural language texts. We improve the model using task-specific and domain-specific fine-tuning. Our approach has been evaluated using statistical techniques and standard evaluation metrics. We overcame the aforementioned challenges and observed significant improvement with domain-specific fine-tuning.
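
A toy sketch of the rule-based Data2CNL idea: test an insight candidate for statistical significance and, if it passes, emit a controlled-natural-language statement. The template and test choice are assumptions; the neural CNL2NL stage is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
week_a = rng.normal(6000, 800, 7)  # daily step counts, week A
week_b = rng.normal(7500, 800, 7)  # daily step counts, week B

# Rule: only statistically significant comparisons become insights
t, p = stats.ttest_ind(week_a, week_b)
if p < 0.05:
    direction = "LOWER" if week_a.mean() < week_b.mean() else "HIGHER"
    cnl = (f"THE MEAN OF steps IN week_A IS SIGNIFICANTLY {direction} "
           f"THAN IN week_B (p={p:.3f})")
    print(cnl)  # this CNL string would be passed to the CNL2NL component
```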

Keywords: data mining, insight mining, natural language generation, pre-trained language models

Procedia PDF Downloads 81
292 A Review of Routing Protocols for Mobile Ad-Hoc NETworks (MANET)

Authors: Hafiza Khaddija Saman, Muhammad Sufyan

Abstract:

The increase in availability and popularity of mobile wireless devices has led researchers to develop a wide variety of Mobile Ad-hoc Networking (MANET) protocols to exploit the unique communication opportunities presented by these devices. Devices are able to communicate directly using the wireless spectrum in a peer-to-peer fashion and to route messages through intermediate nodes. However, the nature of shared wireless communication and mobile devices results in many routing and security challenges which must be addressed before deploying a MANET. In this paper, we investigate the range of MANET routing protocols available and discuss the functionalities of several, ranging from early protocols such as DSDV to more advanced ones such as MAODV; our protocol study focuses on the works of Perkins in developing and improving MANET routing. A range of literature relating to the field of MANET routing was identified and reviewed. We also reviewed literature on securing AODV-based MANETs, as AODV may be the most popular MANET protocol. The literature review identified a number of trends within research papers, such as exclusive use of the random waypoint mobility model, exclusion of key metrics from simulation results, and failure to compare protocol performance against available alternatives.

Keywords: protocol, MANET, ad-Hoc, communication

Procedia PDF Downloads 222
291 Leveraging Quality Metrics in Voting Model Based Thread Retrieval

Authors: Atefeh Heydari, Mohammadali Tavakoli, Zuriati Ismail, Naomie Salim

Abstract:

Seeking and sharing knowledge on online forums has made them popular in recent years. Although online forums are valuable sources of information, retrieving reliable threads with high-quality content is an issue due to the variety of message sources. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features in order to investigate the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyse the influence of the features, we used an adapted version of the voting model for thread search as the retrieval system. We equipped it with each feature individually, and with various combinations of features, over multiple runs. The results show that incorporating the quality features significantly enhances the effectiveness of the retrieval system.
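
A toy sketch of a voting-model ranker with a quality prior (CombSUM-style aggregation; the exact voting technique and quality features used by the authors are not specified here, so the numbers and combination rule are illustrative):

```python
from collections import defaultdict

# Retrieved messages: (thread_id, message relevance score)
message_hits = [("t1", 0.9), ("t1", 0.4), ("t2", 0.8),
                ("t3", 0.6), ("t2", 0.3), ("t3", 0.1)]

# Hypothetical per-thread quality scores (completeness, politeness, ...)
quality = {"t1": 0.7, "t2": 0.95, "t3": 0.5}

scores = defaultdict(float)
for thread, rel in message_hits:
    scores[thread] += rel  # each retrieved message "votes" for its thread

# Combine relevance votes with the quality feature
ranked = sorted(scores, key=lambda t: scores[t] * quality[t], reverse=True)
print(ranked)  # ['t2', 't1', 't3']
```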

Keywords: content quality, forum search, thread retrieval, voting techniques

Procedia PDF Downloads 183
290 Predicting the Exposure Level of Airborne Contaminants in Occupational Settings via the Well-Mixed Room Model

Authors: Alireza Fallahfard, Ludwig Vinches, Stephane Halle

Abstract:

In the workplace, the exposure level of airborne contaminants should be evaluated due to health and safety issues. This can be done by numerical models or experimental measurements, and the numerical approach is useful when experiments are challenging to perform. One of the simplest models is the well-mixed room (WMR) model, which has proven useful for predicting inhalation exposure in many situations. However, since the WMR is limited to gases and vapors, it cannot be used to predict exposure to aerosols. The main objective is to modify the WMR model to expand its application to exposure scenarios involving aerosols. To reach this objective, the standard WMR model has been modified to consider the deposition of particles by gravitational settling and by Brownian and turbulent deposition. Three deposition models were implemented in the model. The time-dependent concentrations of airborne particles predicted by the model were compared to experimental results obtained in a 0.512 m³ chamber. Polystyrene particles of 1, 2, and 3 µm aerodynamic diameter were generated with a nebulizer under two air change rates (ACH). The well-mixed condition and chamber ACH were determined by the tracer gas decay method. The mean friction velocity on the chamber surfaces, an input variable for the deposition models, was determined by computational fluid dynamics (CFD) simulation. For the experimental procedure, particles were generated until the steady-state condition was reached (emission period); generation then stopped, and concentration measurements continued until the background concentration was reached (decay period). The tracer gas decay tests revealed that the ACHs of the chamber were 1.4 and 3.0 and that the well-mixed condition was achieved. The CFD results showed that the average mean friction velocities and their standard deviations for the lowest and highest ACH were (8.87 ± 0.36) × 10⁻² m/s and (8.88 ± 0.38) × 10⁻² m/s, respectively. The numerical results indicated that the difference between the deposition rates predicted by the three deposition models was less than 2%. The experimental and numerical aerosol concentrations were compared for the emission and decay periods. In both periods, the prediction accuracy of the modified model improved over the classic WMR model, although a difference between the actual and predicted values remains. In the emission period, the modified WMR results closely follow the experimental data; during the decay period, however, the model significantly overestimates the experimental results. This finding is mainly due to an underestimation of the deposition rate in the model and to uncertainties related to the measurement devices and particle size distribution. Comparing the experimental and numerical deposition rates revealed that the actual particle deposition rate is significant, but the deposition rate captured by the mechanisms considered in the model was roughly ten times lower than the experimental value. Thus, particle deposition is significant, affects the airborne concentration in occupational settings, and should be considered in airborne exposure prediction models. The role of other removal mechanisms should be investigated.
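
A minimal numerical sketch of the modified well-mixed room balance, assuming first-order losses (air exchange λ plus a lumped particle deposition rate β) and a constant source S during the emission period; all parameter values except the chamber volume are illustrative:

```python
# dC/dt = S/V - (lambda_ach + beta_dep) * C   (modified WMR balance)
V = 0.512      # chamber volume, m^3 (as in the experiments above)
ach = 3.0      # air changes per hour
beta = 0.5     # illustrative lumped particle deposition rate, 1/h
S = 1.0e6      # illustrative emission rate, particles/h

def simulate(t_end_h, c0, source, dt=1e-3):
    """Forward-Euler integration of the WMR concentration."""
    t, c, out = 0.0, c0, []
    while t <= t_end_h:
        out.append((t, c))
        c += dt * (source / V - (ach + beta) * c)
        t += dt
    return out

emission = simulate(2.0, 0.0, S)   # emission period, approaches steady state
c_ss = emission[-1][1]             # ~ S / (V * (ach + beta))
decay = simulate(2.0, c_ss, 0.0)   # decay period after generation stops
print(f"steady state ~ {c_ss:.3e} particles/m^3")
```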

Keywords: aerosol, CFD, exposure assessment, occupational settings, well-mixed room model, zonal model

Procedia PDF Downloads 82
289 Integrated Grey Relational Analysis-Standard Deviation Method for Handover in Heterogeneous Networks

Authors: Mohanad Alhabo, Naveed Nawaz, Mahmoud Al-Faris

Abstract:

The dense deployment of small cells is a promising solution to enhance the coverage and capacity of heterogeneous networks (HetNets). However, unplanned deployment can bring new challenges to the network, ranging from interference to unnecessary handovers and handover failures, causing a degradation in the quality of service (QoS) delivered to the end user. In this paper, we propose an integrated Grey Relational Analysis-Standard Deviation based handover method (GRA-SD) for HetNets. The proposed method integrates the Standard Deviation (SD) technique, to acquire the weights of the handover metrics, with the GRA method, to select the best handover base station. The performance of the GRA-SD method is evaluated and compared with traditional Multiple Attribute Decision Making (MADM) methods, including the Simple Additive Weighting (SAW) and VIKOR methods. Results reveal that the proposed method outperforms the other methods in minimizing frequent unnecessary handovers and handover failures, in addition to improving energy efficiency.
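
A compact numpy sketch of the two stages as described: standard-deviation-based weights for the handover metrics, then grey relational grades to rank candidate base stations (the metric set, values, and normalization convention are illustrative assumptions):

```python
import numpy as np

# Rows: candidate base stations; columns: handover metrics
# (e.g., RSRP, SINR, load, speed), oriented so that larger is better
D = np.array([[0.8, 0.6, 0.7, 0.9],
              [0.5, 0.9, 0.6, 0.4],
              [0.9, 0.5, 0.8, 0.7]])

# Normalize each metric to [0, 1]
N = (D - D.min(axis=0)) / (D.max(axis=0) - D.min(axis=0))

# SD weighting: metrics with higher dispersion get more weight
sd = N.std(axis=0)
w = sd / sd.sum()

# Grey relational coefficients against the ideal sequence (all ones)
rho = 0.5                        # distinguishing coefficient
delta = np.abs(1.0 - N)
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

grade = (xi * w).sum(axis=1)     # grey relational grade per station
print("best station:", int(np.argmax(grade)), "grades:", grade.round(3))
```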

Keywords: energy efficiency, handover, HetNets, MADM, small cells

Procedia PDF Downloads 94
288 Comparison of Techniques for Detection and Diagnosis of Eccentricity in the Air-Gap Fault in Induction Motors

Authors: Abrahão S. Fontes, Carlos A. V. Cardoso, Levi P. B. Oliveira

Abstract:

Induction motors are used worldwide in various industries. Several maintenance techniques are applied to increase the operating time and lifespan of these motors. Among these, predictive maintenance techniques such as Motor Current Signature Analysis (MCSA), Motor Square Current Signature Analysis (MSCSA), Park's Vector Approach (PVA), and Park's Vector Square Modulus (PVSM) are used to detect and diagnose faults in electric motors, characterized by patterns in the stator current frequency spectrum. In this article, these techniques are applied and compared on a real motor with an air-gap eccentricity fault. A theoretical model of a fault-free induction motor was used to assist the comparison between the stator current frequency spectrum patterns with and without faults. Metrics were proposed and applied to evaluate the fault-detection sensitivity of each technique. The results presented here show that the above techniques are suitable for the air-gap eccentricity fault, and the comparison between them showed the suitability of each one.
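
The MCSA idea can be illustrated with a synthetic stator current containing eccentricity sideband components at f_s ± f_r (the sideband placement follows the standard mixed-eccentricity signature; the frequencies and amplitudes here are made up):

```python
import numpy as np

fs_samp = 10_000                # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs_samp)
f_supply, f_rot = 60.0, 29.0    # supply and rotor mechanical frequencies, Hz

# Healthy fundamental plus small eccentricity sidebands at f_supply ± f_rot
i_stator = (np.sin(2 * np.pi * f_supply * t)
            + 0.02 * np.sin(2 * np.pi * (f_supply - f_rot) * t)
            + 0.02 * np.sin(2 * np.pi * (f_supply + f_rot) * t))

spectrum = np.abs(np.fft.rfft(i_stator)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs_samp)

# Inspect amplitudes near the expected sideband frequencies
for f in (f_supply - f_rot, f_supply, f_supply + f_rot):
    k = np.argmin(np.abs(freqs - f))
    print(f"{freqs[k]:6.1f} Hz -> {spectrum[k]:.4f}")
```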

Keywords: eccentricity in the air-gap, fault diagnosis, induction motors, predictive maintenance

Procedia PDF Downloads 325
287 Framework for Detecting External Plagiarism from Monolingual Documents: Use of Shallow NLP and N-Gram Frequency Comparison

Authors: Saugata Bose, Ritambhra Korpal

Abstract:

The internet has increased copy-paste scenarios amongst students as well as researchers, leading to different levels of plagiarized documents. For this reason, much research has focused on detecting plagiarism automatically. In this paper, an initiative is discussed where Natural Language Processing (NLP) techniques and supervised machine learning algorithms have been combined to detect plagiarized texts. The major emphasis is on constructing a framework which successfully detects external plagiarism in monolingual texts. To detect the plagiarism, an n-gram frequency comparison approach has been implemented to construct the model framework. The framework is based on 120 characteristics extracted while pre-processing the documents using an NLP approach. Afterwards, filter metrics have been applied to select the most relevant characteristics, and then a supervised classification learning algorithm has been used to classify the documents into four levels of plagiarism. A confusion matrix was built to estimate the false positives and false negatives. Our plagiarism framework achieved a very high accuracy score.
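
A minimal sketch of word n-gram frequency comparison between a suspicious and a source document; the Jaccard overlap is one illustrative similarity choice, and the paper's 120-characteristic feature set is not reproduced here:

```python
from collections import Counter

def word_ngrams(text: str, n: int = 3) -> Counter:
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def jaccard(a: Counter, b: Counter) -> float:
    inter = sum((a & b).values())   # multiset intersection
    union = sum((a | b).values())   # multiset union
    return inter / union if union else 0.0

source = "the quick brown fox jumps over the lazy dog"
suspect = "a quick brown fox jumps over a sleeping dog"

sim = jaccard(word_ngrams(source), word_ngrams(suspect))
print(f"trigram similarity: {sim:.2f}")  # one feature for the level classifier
```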

Keywords: lexical matching, shallow NLP, supervised machine learning algorithm, word n-gram

Procedia PDF Downloads 336
286 Aggregate Fluctuations and the Global Network of Input-Output Linkages

Authors: Alexander Hempfing

Abstract:

The desire to understand business cycle fluctuations, trade interdependencies, and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from empirical as well as theoretical perspectives. This paper empirically analyses how the production structure of the global economy and of several states developed over time, what their distributional properties are, and whether there are network-specific metrics that allow identifying structurally important nodes on a global, national, and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others, serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
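
A sketch of the centrality computation on a toy input-output network using networkx; the sector labels and flow weights are invented, and the WIOD data itself is not reproduced:

```python
import networkx as nx

# Toy directed input-output network: edge u -> v with weight = flow from u to v
flows = [("DEU_manuf", "USA_manuf", 5.0), ("CHN_manuf", "USA_manuf", 8.0),
         ("CHN_manuf", "DEU_manuf", 4.0), ("USA_services", "CHN_manuf", 3.0),
         ("DEU_manuf", "CHN_manuf", 2.0)]

G = nx.DiGraph()
G.add_weighted_edges_from(flows)

# Weighted eigenvector centrality identifies hub sectors
centrality = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)
for node, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node:13s} {c:.3f}")
```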

Keywords: economic integration, industrial organization, input-output economics, network economics, production networks

Procedia PDF Downloads 244
285 Bibliometrics of 'Community Garden' and Associated Keywords

Authors: Guilherme Reis Ranieri, Guilherme Leite Gaudereto, Michele Toledo, Luis Fernando Amato-Lourenco, Thais Mauad

Abstract:

Given the importance of urban sustainability and the growing relevance of the term ‘community garden’, this paper conducts a bibliometric analysis of the term. Using SCOPUS as the database, we analyzed 105 articles that contained the keyword ‘community garden’ and conducted a cluster analysis of the associated keywords. As results, we found 205 articles and 404 different keywords. Among the keywords, 334 appear only once, 44 appear twice, and 9 appear three times. The most frequent keywords are: community food systems (74), urban activism (14), communities of practice (6), food production (6), and public rhetoric (5). The areas containing the most articles are: social sciences (74), environmental science (29), and agricultural and biological sciences (24). The three countries that concentrate the most papers are the United States (54), Canada (15), and Australia (12). The main journal for these keywords is Local Environment (10). The first publication was in 1999, and the period up to 2010 concentrated 30.5% of the publications; the other 69.5% occurred from 2010 to 2015, indicating an increase in publication frequency. We can conclude that the papers, based on the distribution of the keywords, are still scattered across various research topics and present high variability between subjects.

Keywords: bibliometrics, community garden, metrics, urban agriculture

Procedia PDF Downloads 336
284 An Application of Lean Thinking at the Cargo Transport Area

Authors: Caroline Demartin, Natalia Camaras, Nelson Maestrelli, Max Filipe Gonçalves

Abstract:

This paper presents a case study of Lean Thinking in the cargo transport area. Lean Office principles are the application of Lean Thinking to the service area and are based on Lean Production concepts. Lean Production is a philosophy that was born and gained ground after the Second World War, when the Japanese Toyota Company developed a process for identifying and eliminating waste. Many researchers show that most companies that decide to adopt the principles created at Toyota do so in the manufacturing sector, and until the 1990s there were no major applications in the service sector. Due to increased competition and the need for competitive advantage, many companies began to observe the lean transformation and take it as a reference. In this study, a key process at a cargo transport company was analyzed using Lean Office tools and methods: a current-state map was developed, main wastes were identified, metrics were used to evaluate improvements, and a priority matrix was used to identify action plans. The results showed that Lean Office has great potential to be successfully applied in air cargo transport companies.

Keywords: lean production, lean office, logistic, service sector

Procedia PDF Downloads 163
283 Social Impact Evaluation in the Housing Sector

Authors: Edgard Barki, Tânia Modesto Veludo-de-Oliveira, Felipe Zambaldi

Abstract:

The social enterprise sector can be characterized as organizations that aim to solve social problems with financial sustainability and using market mechanisms. This sector has attracted increasing interest worldwide. Despite the growth and relevance of the sector, there is still a gap regarding the assessment of the social impact resulting from the initiatives of organizations in this field. A number of metrics have been designed worldwide to evaluate the impact of social enterprises (e.g., IRIS, GIIRS, BACO), and some ad hoc studies have been carried out, mainly in the microcredit sector, but there is still a gap to be filled in research on social impact evaluation. Therefore, this research seeks to evaluate the social impact of two social enterprises (Terra Nova and Vivenda) in the area of housing in Brazil. To evaluate these impacts and their dimensions, we conducted exploratory research through three focus groups, thirty in-depth interviews, and a survey of beneficiaries of both organizations. The results allowed us to evaluate how the two organizations were able to create a deep social impact on the populations served. Terra Nova has a more collective perspective, with a clear benefit of social inclusion and improvement of the community’s infrastructure, while Vivenda has a more individualized perspective, improving self-esteem, sociability, and family coexistence.

Keywords: Brazil, housing, social enterprise, social impact evaluation

Procedia PDF Downloads 411
282 Application of VE in Healthcare Services: An Overview of Healthcare Facility

Authors: Safeer Ahmad, Pratheek Sudhakran, M. Arif Kamal, Tarique Anwar

Abstract:

In healthcare facility design, efficient MEP (mechanical, electrical, and plumbing) services are crucial because the built environment affects not only patients and families but also healthcare staff and their outcomes. This paper covers the basics of value engineering (VE) and the different phases that can be applied to the MEP design stage for healthcare facility optimization; VE can also reduce unnecessary costs associated with healthcare services. The paper explores healthcare facility services and their value engineering job plan for the successful application of the VE technique through a workshop with end-users, the design team, and associated experts. The workshop uses concepts, tools, methods, and mechanisms developed to select what is appropriate and ideal among the many value engineering processes and tools that have long proven their ability to enhance value, following the concept of total quality management, while achieving the most efficient resource allocation to satisfy the key functions and requirements of the project without sacrificing the targeted level of service for all design metrics. A detailed study is discussed, with analysis carried out through this process to achieve a better outcome; various tools are used for the analysis of the product at different phases, and the results obtained after implementation of the techniques are discussed.

Keywords: value engineering, healthcare facility, design, services

Procedia PDF Downloads 165
281 Analysis of the Predictive Performance of Value at Risk Estimations in Times of Financial Crisis

Authors: Alexander Marx

Abstract:

Measuring and mitigating market risk is essential for the stability of enterprises, especially for major banking corporations and investment banks. For these risk measurement and mitigation processes, Value at Risk (VaR) is the risk metric most commonly used by practitioners. In past years, we have seen significant weaknesses in the predictive performance of VaR in times of financial market crisis. To address this issue, the purpose of this study is to investigate VaR estimation models and their predictive performance by applying a series of backtesting methods on the stock market indices of the G7 countries (Canada, France, Germany, Italy, Japan, UK, US) and Europe. The study employs parametric, non-parametric, and semi-parametric VaR estimation models and is conducted over three different periods which cover the most recent financial market crises: the overall period (2006–2022), the global financial crisis period (2008–2009), and the COVID-19 period (2020–2022). Since the regulatory authorities have introduced and mandated Conditional Value at Risk (Expected Shortfall) as an additional regulatory risk management metric, the study also analyzes and compares both risk metrics on their predictive performance.
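
A minimal sketch of one parametric (variance-covariance) VaR forecast combined with a Kupiec proportion-of-failures backtest; the return series, window, and confidence level are synthetic stand-ins for the G7 index data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, 2500)  # synthetic daily index returns

alpha = 0.01    # 1% VaR level
window = 250    # rolling estimation window

violations, n_tests = 0, 0
for t in range(window, len(returns)):
    mu = returns[t - window:t].mean()
    sigma = returns[t - window:t].std(ddof=1)
    var = mu + sigma * stats.norm.ppf(alpha)  # parametric (normal) 1-day VaR
    n_tests += 1
    violations += returns[t] < var            # loss exceeded the forecast

# Kupiec POF likelihood-ratio test: H0: violation rate == alpha
x, T, p = violations, n_tests, alpha
phat = x / T
lr = -2 * (np.log((1 - p) ** (T - x) * p ** x)
           - np.log((1 - phat) ** (T - x) * phat ** x))
print(f"violations {x}/{T}, LR={lr:.2f}, "
      f"p-value={1 - stats.chi2.cdf(lr, 1):.3f}")
```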

Keywords: value at risk, financial market risk, banking, quantitative risk management

Procedia PDF Downloads 65
280 Near Infrared Spectrometry to Determine the Quality of Milk, Experimental Design Setup and Chemometrics: Review

Authors: Meghana Shankara, Priyadarshini Natarajan

Abstract:

Infrared (IR) spectroscopy has revolutionized the way we look at the materials around us. Unraveling the patterns in the molecular spectra of materials to analyze their composition and properties has been one of the most interesting challenges in modern science. Applications of IR spectrometry are numerous in the fields of pharmaceuticals, health, food and nutrition, oils, agriculture, construction, polymers, beverages, fabrics, and much more, limited only by the curiosity of the people. Near Infrared (NIR) spectrometry is applied robustly to analyzing solid and liquid substances because it is a non-destructive analysis method. In this paper, we review the application of NIR spectrometry to milk quality analysis and present the measurement modes applied in NIRS setups, the Design of Experiment (DoE), and the classification/quantification algorithms used for milk composition prediction, such as Fat%, Protein%, Lactose%, and Solids Not Fat (SNF%), along with different approaches for adulterant identification. We also discuss the important NIR ranges for the chosen milk parameters. The performance metrics used in the comparison of the various chemometric approaches include Root Mean Square Error (RMSE), R², slope, offset, sensitivity, specificity, and accuracy.
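
A sketch of one common chemometric step from this review: PLS regression from NIR spectra to a composition parameter, with the RMSE and R² metrics mentioned. The spectra below are simulated with an arbitrary latent structure, not real milk scans:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 200, 300  # e.g., a 1000-2500 nm wavelength grid

# Simulated NIR spectra with a latent structure driving the target
latent = rng.normal(size=(n_samples, 3))
loadings = rng.normal(size=(3, n_wavelengths))
X = latent @ loadings + 0.05 * rng.normal(size=(n_samples, n_wavelengths))
fat = 3.5 + latent @ np.array([0.8, -0.3, 0.1])  # hypothetical Fat% target

X_tr, X_te, y_tr, y_te = train_test_split(X, fat, random_state=1)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

rmse = mean_squared_error(y_te, pred) ** 0.5
print(f"RMSE={rmse:.3f}, R^2={r2_score(y_te, pred):.3f}")
```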

Keywords: chemometrics, design of experiment, milk quality analysis, NIRS measurement modes

Procedia PDF Downloads 244
279 The Role of Named Entity Recognition for Information Extraction

Authors: Girma Yohannis Bade, Olga Kolesnikova, Grigori Sidorov

Abstract:

Named entity recognition (NER) is a building block for information extraction. Though the information extraction process has been automated using a variety of techniques to find and extract relevant information from unstructured documents, the discovery of targeted knowledge still poses a number of research difficulties because of the variability and lack of structure in Web data. NER, a subtask of information extraction (IE), emerged to ease this difficulty. It deals with finding proper names (named entities), such as names of persons, countries, locations, organizations, dates, and events in a document, and categorizing them with predetermined labels, which is an initial step in IE tasks. This survey paper presents the roles and importance of NER for IE from the perspective of different algorithms and application domains. It summarizes how researchers have implemented NER in particular application areas such as finance, medicine, defense, business, food science, and archeology. It also outlines the three types of sequence labeling algorithms for NER: feature-based, neural network-based, and rule-based. Finally, the state of the art and evaluation metrics of NER are presented.
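
As a quick illustration of NER as the first step of an IE pipeline, a spaCy sketch (this assumes the en_core_web_sm model is installed; any of the surveyed feature-based, neural, or rule-based labelers could sit behind the same interface):

```python
import spacy

# One-time model install: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

text = ("Apple acquired the London-based startup on 12 March 2020 "
        "for an undisclosed sum, CEO Tim Cook announced.")
doc = nlp(text)

# Each entity span gets one of the predetermined labels (ORG, GPE, DATE, ...)
for ent in doc.ents:
    print(f"{ent.text:25s} -> {ent.label_}")
```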

Keywords: the role of NER, named entity recognition, information extraction, sequence labeling algorithms, named entity application area

Procedia PDF Downloads 53
278 The Convergence of IoT and Machine Learning: A Survey of Real-time Stress Detection System

Authors: Shreyas Gambhirrao, Aditya Vichare, Aniket Tembhurne, Shahuraj Bhosale

Abstract:

In today's rapidly evolving environment, stress has emerged as a significant health concern across different age groups. Stress that isn't controlled, whether it comes from job responsibilities, health issues, or the never-ending news cycle, can have a negative effect on our well-being, and the problem is further aggravated by our constant connection to technology. In this high-tech age, identifying and controlling stress is vital. To address this health issue, the study focuses on three key metrics for stress detection: body temperature, heart rate, and galvanic skin response (GSR). These parameters, together with a Support Vector Machine classifier, allow the system to categorize stress into three groups: 1) stressed, 2) not stressed, and 3) moderate stress. In the proposed system, a NodeMCU combined with dedicated sensors collects data in real time and rapidly categorizes individuals by stress level. Real-time stress detection is made possible by this combination of hardware and software.
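
A sketch of the three-feature SVM classifier on the receiving side; the training readings and thresholds are invented, and in the described system the NodeMCU would stream the sensor values to this model:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Features: [body temperature (C), heart rate (bpm), GSR (microsiemens)]
X_train = np.array([[36.6, 70, 2.0], [36.8, 75, 2.5], [37.0, 88, 5.0],
                    [37.1, 92, 5.5], [37.4, 110, 9.0], [37.3, 105, 8.5]])
y_train = np.array([0, 0, 1, 1, 2, 2])  # 0=not stressed, 1=moderate, 2=stressed

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

# A new reading arriving from the NodeMCU
reading = np.array([[37.2, 100, 7.8]])
labels = {0: "not stressed", 1: "moderate stress", 2: "stressed"}
print(labels[int(clf.predict(reading)[0])])
```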

Keywords: real time stress detection, NodeMCU, sensors, heart-rate, body temperature, galvanic skin response (GSR), support vector machine

Procedia PDF Downloads 47
277 Diverted Use of Contraceptives in Madagascar

Authors: Josiane Yaguibou, Ngoy Kishimba, Issiaka V. Coulibaly, Sabrina Pestilli, Falinirina Razanalison, Hantanirina V. Andremanisa

Abstract:

Background: In Madagascar, the modern contraceptive prevalence rate increased from 18% in 2003 to 43% in 2021. Anecdotal evidence suggests that the increased use, and frequent stock-outs in public health facilities, of male condoms and medroxyprogesterone acetate (MPA) can be related to diverted use of these products. This study analyzed the use of contraceptives and the mode of utilization (correct or diverted) at the community level in Madagascar over the period 2019-2023. Methodology: The study included a literature review and a quantitative survey combined with a qualitative study. It was carried out in 10 of the country's 23 regions. Eight regions (Bongolava, Vakinankaratra, Itasy, Haute Matsiatra, Betsiboka, Diana, Sofia, and Anosy) were selected based on a study that showed the use of MPA in pigs. The remaining 2 regions were selected due to high mCPR (Atsimo Andrefana) and to ensure coverage of all geographical zones in the country (Alaotra Mangoro). A random sampling method was used, and the sample size was set at 300 individuals per region; zonal distribution is based on the urbanization rate of the region. Six focus group discussions were organized in 3 regions, equally distributed between rural and urban areas. Key findings: Overall, 67% of those surveyed, or their partner, are currently using contraception. Injectables (MPA) are the most popular choice (33%), followed by implants and male condoms (12% and 9%, respectively). The majority of respondents use condoms to prevent unwanted pregnancy and also to prevent STDs. Still, 43% of respondents use condoms for other purposes, reaching 52% of respondents in urban areas and 71.2% in the 15-18 age group. Diverted uses include hair growth (18.9%), as a toy (18.8%), cleaning the screens of electronic devices (10%), cleaning shoes (3.1%), and skincare (1.6%). Injectables are the preferred method of contraception in both rural areas (35%) and urban areas (21.2%). However, diverted use of injectables was confirmed by 4% of respondents, ranging from 3% in rural areas to 12% in urban areas. The diverted use of injectables in pig rearing was to avoid pregnancy and facilitate the pigs' growth. Program implications: The study confirmed the diverted use of some contraceptives. The misuse of male condoms is among the causes of stock-outs in public health facilities, limiting their availability for pregnancy and STD prevention. The misuse of injectables in pig rearing needs to be further studied to learn its full extent and eventual implications for meat consumption. The study highlights the importance of including messages on the correct use of products during sensitization activities. In particular, messages need to address the anecdotal and false beliefs about male condoms, especially amongst young people. For the misuse of injectables, it is critical to sensitize farmers and veterinarians to possible negative effects on humans.

Keywords: diverted use, injectables, male condoms, sensibilization

Procedia PDF Downloads 37
276 The Impact of Government Subsidies to Keep Residents Studying at Home

Authors: Melissa James Maceachern

Abstract:

This study examines a financial aid program designed to “keep residents at home” for higher education by providing financial aid as an incentive or discount in their first year of university following high school graduation. The study offers insight into financial matters for higher education students that can assist in providing policy direction for student financing. In particular, this study found that students appeared to value the bursary, but none of the key metrics related to participation or conversion to the home institution indicated that the bursary impacted enrolment or participation. One key metric, student loans received by direct-entry high school students, did indicate a decline in the number of recipients. This study also identified accessibility issues in higher education that are important when considering declining youth populations, future labour market needs, and the need to sustain higher education institutions. This is undoubtedly a challenging period given the changing social and demographic forces within Canada, and a comprehensive examination of the policies and programs to address these forces needs to be undertaken. This study highlights the importance of utilizing financial aid in combination with other policies to assist students in accessing higher education.

Keywords: accessibility, participation, financing, government

Procedia PDF Downloads 391
275 IoT Based Approach to Healthcare System for a Quadriplegic Patient Using EEG

Authors: R. Gautam, P. Sastha Kanagasabai, G. N. Rathna

Abstract:

The proposed healthcare system enables quadriplegic patients and people with severe motor disabilities to send commands to electronic devices and monitor their vitals. The growth of the Brain-Computer Interface (BCI) has led to rapid development in 'assistive systems' for the disabled, called 'assistive domotics'. A brain-computer interface is capable of reading the brainwaves of an individual and analysing them to obtain meaningful data. This processed data can be used to assist people with speech disorders and, sometimes, people with limited locomotion to communicate. In this project, an Emotiv EPOC headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands over the internet to the desired mobile phone user. Other vital information such as heart rate, blood pressure, ECG, and body temperature is monitored and uploaded to the server. Data analytics enables physicians to scan databases for a specific illness. The data is processed on an Intel Edison system-on-chip (SoC), and patient metrics are displayed via the Intel IoT Analytics cloud service.

Keywords: brain computer interface, Intel Edison, Emotiv EPOC, IoT analytics, electroencephalogram

Procedia PDF Downloads 166
274 Real-Time Network Anomaly Detection Systems Based on Machine-Learning Algorithms

Authors: Zahra Ramezanpanah, Joachim Carvallo, Aurelien Rodriguez

Abstract:

This paper aims to detect anomalies in streaming data using machine learning algorithms. In this regard, we designed two separate pipelines and evaluated the effectiveness of each separately. The first pipeline, based on supervised machine learning methods, consists of two phases. In the first phase, we trained several supervised models using the UNSW-NB15 dataset, measured the efficiency of each using different performance metrics, and selected the best model for the second phase. At the beginning of the second phase, we first sniffed a local area network using an Argus server. Several types of attacks were simulated, and the sniffed data was then sent to a running algorithm at short intervals. This algorithm can display the result for each packet of received data in real time using the trained model. The second pipeline presented in this paper is based on unsupervised algorithms, in which a Temporal Graph Network (TGN) is used to monitor a local network. The TGN is trained to predict the probability of future states of the network based on its past behavior. Our contribution in this section is introducing an indicator to identify anomalies from these predicted probabilities.
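
A sketch of the first pipeline's model-selection phase; synthetic features stand in for the UNSW-NB15 records, the candidate models and scoring are illustrative choices, and the real-time Argus stage is omitted:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for UNSW-NB15 flow features (normal vs. attack)
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1],
                           random_state=7)

candidates = {
    "logreg": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=7),
}

# Score each supervised model; the best one moves on to the real-time phase
for name, model in candidates.items():
    f1 = cross_val_score(model, X, y, cv=5, scoring="f1").mean()
    print(f"{name:14s} mean F1 = {f1:.3f}")
```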

Keywords: temporal graph network, anomaly detection, cyber security, IDS

Procedia PDF Downloads 78
273 Evaluating Performance of Value at Risk Models for the MENA Islamic Stock Market Portfolios

Authors: Abderrazek Ben Maatoug, Ibrahim Fatnassi, Wassim Ben Ayed

Abstract:

In this paper we investigate the issue of market risk quantification for Middle East and North Africa (MENA) Islamic equity markets. We use Value-at-Risk (VaR) as a measure of potential risk in the Islamic stock market, for long and short positions, based on the RiskMetrics model and conditional parametric ARCH-class volatility models with normal, Student, and skewed Student distributions. The sample consists of daily data for 2006-2014 for 11 Islamic stock market indices. We conduct Kupiec and Engle and Manganelli tests to evaluate the performance of each model. The main findings of our empirical results are that (i) VaR models based on the Student and skewed Student distributions perform best, at the significance level of α = 1%, for all Islamic stock market indices and for both long and short trading positions, and (ii) the RiskMetrics model and the VaR model based on conditional volatility with a normal distribution provide the most accurate VaR estimations for both long and short trading positions at a significance level of α = 5%.

Keywords: value-at-risk, risk management, islamic finance, GARCH models

Procedia PDF Downloads 565
272 Automated Ultrasound Carotid Artery Image Segmentation Using Curvelet Threshold Decomposition

Authors: Latha Subbiah, Dhanalakshmi Samiappan

Abstract:

In this paper, we propose denoising Common Carotid Artery (CCA) B-mode ultrasound images by a decomposition approach to curvelet thresholding, with automatic segmentation of the intima-media thickness and the adventitia boundary. Under decomposition, the local geometry of the image and its gradient directions are well preserved. The components are combined into a single vector-valued function, which removes noise patches. A double threshold is applied to inherently remove speckle noise in the image. The denoised image is segmented by active contours without specifying seed points; combined with level set theory, they provide sub-regions with continuous boundaries. The deformable contours match the shapes and motion of objects in the images: a curve or surface under constraints is evolved from the image so that it is pulled toward the necessary features of the image, and region-based and boundary-based information are integrated to achieve the contour. The method treats the multiplicative speckle noise in objective and subjective quality measurements and thus leads to better-segmented results. The proposed denoising method gives better performance metrics compared with other state-of-the-art denoising algorithms.

Keywords: curvelet, decomposition, levelset, ultrasound

Procedia PDF Downloads 313
271 Attention-Based ResNet for Breast Cancer Classification

Authors: Abebe Mulugojam Negash, Yongbin Yu, Ekong Favour, Bekalu Nigus Dawit, Molla Woretaw Teshome, Aynalem Birtukan Yirga

Abstract:

Breast cancer remains a significant health concern, necessitating advancements in diagnostic methodologies. Addressing this, our paper confronts notable challenges in breast cancer classification, particularly the imbalance in datasets and the constraints on the accuracy and interpretability of prevailing deep learning approaches. We propose an attention-based residual neural network (ResNet), which effectively combines the robust features of ResNet with an advanced attention mechanism. Enhanced through strategic data augmentation and positive weight adjustments, this approach specifically targets the issue of data imbalance. The proposed model is tested on the BreakHis dataset and achieves accuracies of 99.00%, 99.04%, 98.67%, and 98.08% at different magnifications (40X, 100X, 200X, and 400X, respectively). We evaluated performance using different evaluation metrics, such as precision, recall, and F1-score, and made comparisons with other state-of-the-art methods. Our experiments demonstrate that the proposed model outperforms existing approaches, achieving higher accuracy in breast cancer classification.
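
A PyTorch sketch of one way to graft channel attention onto a ResNet backbone (a squeeze-and-excitation-style block; the authors' exact attention design, augmentation, and weighting scheme are not specified here, so this is an assumed construction):

```python
import torch
import torch.nn as nn
from torchvision import models

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (illustrative)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        w = self.gate(x).view(x.size(0), x.size(1), 1, 1)
        return x * w  # reweight the feature maps per channel

backbone = models.resnet18(weights=None)
model = nn.Sequential(
    backbone.conv1, backbone.bn1, backbone.relu, backbone.maxpool,
    backbone.layer1, backbone.layer2, backbone.layer3, backbone.layer4,
    ChannelAttention(512),            # attention before global pooling
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(512, 2))                # benign vs. malignant

# Class imbalance handled via class weights in the loss, as described above
loss_fn = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 2.0]))
logits = model(torch.randn(4, 3, 224, 224))
print(logits.shape)  # torch.Size([4, 2])
```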

Keywords: residual neural network, attention mechanism, positive weight, data augmentation

Procedia PDF Downloads 54
270 Neural Networks-based Acoustic Annoyance Model for Laptop Hard Disk Drive

Authors: Yichao Ma, Chengsiong Chin, Wailok Woo

Abstract:

Over the last decade, there has been rapid growth in digital multimedia, such as high-resolution media files and three-dimensional movies; hence there is a need for large digital storage such as the Hard Disk Drive (HDD), and users expect a quieter HDD in their laptops. In this paper, a jury test was conducted on a group of 34 people, 17 of whom are students, representing potential consumers, while the remainder are engineers who know the HDD. A total of 13 HDD sound samples were selected from over a hundred HDD noise recordings, based on an agreed subjective feeling. The samples were played to the participants using a head-acoustics playback system, which enabled them to experience an environment as similar as possible to the one recorded. Analysis of the results indicated that different groups perceive the noises differently. Two neural-network-based acoustic annoyance models are established based on backpropagation neural networks. Four psychoacoustic metrics (loudness, sharpness, roughness, and fluctuation strength) are used as the inputs of the models, and the subjective evaluation results are taken as the output. The developed models are reasonably accurate in simulating both training and test samples.
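
A minimal sketch of such a four-input annoyance model using a backpropagation-trained MLP; the metric ranges and jury scores below are random placeholders, not the study's data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

# Inputs: [loudness (sone), sharpness (acum), roughness (asper),
#          fluctuation strength (vacil)] for each HDD sample
X = rng.uniform([1, 0.5, 0.0, 0.0], [6, 3.0, 0.5, 0.3], size=(13, 4))
# Placeholder mean jury annoyance ratings on a 1-10 scale
y = 1 + 9 * rng.random(13)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=3))
model.fit(X, y)

new_sample = [[3.2, 1.4, 0.12, 0.08]]  # metrics of an unseen HDD recording
print(f"predicted annoyance: {model.predict(new_sample)[0]:.2f}")
```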

Keywords: hdd noise, jury test, neural network model, psychoacoustic annoyance

Procedia PDF Downloads 403