Search results for: national models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10880

9380 Applying ARIMA Data Mining Techniques to ERP to Generate Sales Demand Forecasting: A Case Study

Authors: Ghaleb Y. Abbasi, Israa Abu Rumman

Abstract:

This paper modeled sales history, archived from 2012 to 2015 and aggregated into monthly bins, for five products of a medical supply company in Jordan. Consistent patterns were extracted from the sales demand history in the Enterprise Resource Planning (ERP) system, and sales demand forecasts were generated using the time series analysis technique known as the Auto Regressive Integrated Moving Average (ARIMA). ARIMA was used to model realistic sales demand patterns and to select the best forecasting model for each of the five products. The analysis revealed that the current replenishment system led to inventory overstocking.
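The ARIMA workflow the abstract describes can be sketched compactly. The snippet below is an illustrative from-scratch sketch, not the authors' R code: it fits an ARIMA(1,1,0) model (an AR(1) on the first-differenced series, estimated by least squares) to an invented monthly demand series and produces a one-step-ahead forecast.

```python
import numpy as np

def fit_arima_110(y):
    """Fit ARIMA(1,1,0): difference once, then fit AR(1) by least squares."""
    d = np.diff(y)                      # the "I" step: first difference
    x, z = d[:-1], d[1:]                # lagged pairs for the AR(1) fit
    phi = np.dot(x, z) / np.dot(x, x)   # least-squares AR(1) coefficient
    return phi

def forecast_next(y, phi):
    """One-step-ahead forecast: last level plus predicted next difference."""
    return y[-1] + phi * (y[-1] - y[-2])

# Hypothetical monthly demand for one product (illustrative numbers only)
demand = np.array([120, 130, 128, 140, 151, 149, 160, 172, 170, 181, 190, 188], float)
phi = fit_arima_110(demand)
print(round(forecast_next(demand, phi), 1))  # -> 187.5
```

In practice one would use a library such as R's forecast package or Python's statsmodels, which also handle order selection, seasonal terms, and diagnostics.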

Keywords: ARIMA models, sales demand forecasting, time series, R code

Procedia PDF Downloads 384
9379 Emerging Cyber Threats and Cognitive Vulnerabilities: Cyberterrorism

Authors: Oludare Isaac Abiodun, Esther Omolara Abiodun

Abstract:

The purpose of this paper is to demonstrate that cyberterrorism exists and poses a threat to computer security and national security. Nowadays, people have become heavily dependent upon computers, phones, the Internet, and Internet of Things systems to share information, communicate, conduct searches, and so on. However, these network systems are at risk from various sources, both known and unknown. These risks are created by malicious individuals, groups, organizations, or governments that take advantage of vulnerabilities in computer systems to harvest sensitive information from people, organizations, or governments. In doing so, they engage in computer threats, crime, and terrorism, thereby making the use of computers insecure for others. The threat of cyberterrorism takes various forms and varies from country to country. These threats include disrupting communications and information, stealing data, destroying data, leaking and breaching data, interfering with messages and networks, and, in some cases, demanding financial rewards for stolen data. Hence, this study identifies the many ways cyberterrorists utilize the Internet as a tool to advance their malicious missions, which negatively affects computer security and safety. One could identify causes of disparate anomalous behaviors as well as the theoretical, ideological, and current forms of the likelihood of cyberterrorism. Therefore, as a countermeasure, this paper proposes the use of previous and current computer security models found in the literature to help counter cyberterrorism.

Keywords: cyberterrorism, computer security, information, internet, terrorism, threat, digital forensic solution

Procedia PDF Downloads 96
9378 Implementation of the K-Means Algorithm for Grouping Districts/Cities in Central Java Based on Macroeconomic Indicators

Authors: Nur Aziza Luxfiati

Abstract:

Clustering partitions a data set into subsets or groups such that elements within one group share properties with a high level of similarity, while similarity between groups is low. The K-Means algorithm is one of the clustering algorithms most widely used as a grouping tool in scientific and industrial applications, because the basic idea of the k-means algorithm is very simple. This research applies the k-means clustering technique to the problem of national development imbalances between regions in Central Java Province, based on macroeconomic indicators. The data sample used is secondary data obtained from the Central Java Provincial Statistics Agency, consisting of macroeconomic indicator data from the publication of the 2019 National Socio-Economic Survey (Susenas). The data were normalized using the z-score, and the number of clusters (k) was determined using the elbow method. After the clustering process was carried out, the result was validated using the Between-Class Variation (BCV) and Within-Class Variation (WCV) methods. Outlier detection using z-score normalization found no outliers. In addition, the clustering test obtained a low ratio value of 0.011%. Two district/city clusters in Central Java Province show economic similarity based on the variables used: the first cluster, with a high economic level, consists of 13 districts/cities, and the second cluster, with a low economic level, consists of 22 districts/cities. Within the second (low-economy) cluster, the authors further grouped districts/cities by similarity on individual macroeconomic indicators: 20 districts by Gross Regional Domestic Product, 19 districts by the Poverty Depth Index, 5 districts by the Human Development Index, and 10 districts by the Open Unemployment Rate.
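The pipeline described above (z-score normalization, k-means, and a within/between-variation check) can be sketched with NumPy alone. The data below are invented stand-ins for the Susenas indicators, and the BCV/WCV formulas are one common variant, not necessarily the exact definitions used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def zscore(X):
    """Z-score normalization; |z| > 3 can also flag outliers."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means with a deterministic spread-out initialization."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Two synthetic "district" groups described by three indicator columns
X = zscore(np.vstack([rng.normal(0, 1, (13, 3)), rng.normal(5, 1, (22, 3))]))
labels, centers = kmeans(X, k=2)

within = sum(((X[labels == j] - c) ** 2).sum() for j, c in enumerate(centers))
between = ((centers - X.mean(axis=0)) ** 2).sum(-1) @ np.bincount(labels)
print(np.bincount(labels), within < between)  # separated clusters: WCV < BCV
```

With scikit-learn, KMeans(n_clusters=k).fit(X) plus an elbow plot of inertia_ over k replaces most of this sketch.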

Keywords: clustering, K-Means algorithm, macroeconomic indicators, inequality, national development

Procedia PDF Downloads 157
9377 Environmental Teachers’ Perceptions about Science-Technology-Society (STS) Education

Authors: Christiana Fwenji Zumyil, Toma Maina Antip

Abstract:

Environmental Science is currently not an independent subject taught in secondary schools in Nigeria, unlike Biology, Agricultural Science, Chemistry, Geography, and the other subjects in which students take final exams (West Africa Examination Council, WAEC; National Examination Council, NEC; National Board for Technical Education, NABTED). Instead, its elements, topics, and contents are integrated into the curricula of the subjects mentioned, and because of that, it becomes difficult to know what should be taught and how it should be taught. Currently, it is still difficult to implement student-centered strategies in the classroom. Through this study, we sought to diagnose the difficulties, advantages, and perceptions that Environmental Teachers experience when conceiving and implementing Science-Technology-Society (STS) strategies in SS2 classes at the Secondary Education level. Four semi-structured interviews were conducted with Secondary School Environmental Teachers. Despite the difficulties found, the advantages of this teaching perspective and the motivation and involvement it fosters in students lead teachers to continue developing and implementing STS strategies in the classroom.

Keywords: environment, science, technology, society, science-technology-society, science education, secondary teaching

Procedia PDF Downloads 99
9376 Validating Chronic Kidney Disease-Specific Risk Factors for Cardiovascular Events Using National Data: A Retrospective Cohort Study of the Nationwide Inpatient Sample

Authors: Fidelis E. Uwumiro, Chimaobi O. Nwevo, Favour O. Osemwota, Victory O. Okpujie, Emeka S. Obi, Omamuyovbi F. Nwoagbe, Ejiroghene Tejere, Joycelyn Adjei-Mensah, Christopher N. Ekeh, Charles T. Ogbodo

Abstract:

Several risk factors associated with cardiovascular events have been identified as specific to Chronic Kidney Disease (CKD). This study endeavors to validate these CKD-specific risk factors using up-to-date national-level data, thereby highlighting the crucial significance of confirming the validity and generalizability of findings obtained from previous studies conducted on smaller patient populations. The study utilized the nationwide inpatient sample database to identify adult hospitalizations for CKD from 2016 to 2020, employing validated ICD-10-CM/PCS codes. A comprehensive literature review was conducted to identify both traditional and CKD-specific risk factors associated with cardiovascular events. Risk factors and cardiovascular events were defined using a combination of ICD-10-CM/PCS codes and statistical commands. Only risk factors with specific ICD-10 codes and hospitalizations with complete data were included in the study. Cardiovascular events of interest included cardiac arrhythmias, sudden cardiac death, acute heart failure, and acute coronary syndromes. Univariate and multivariate regression models were employed to evaluate the association between chronic kidney disease-specific risk factors and cardiovascular events while adjusting for the impact of traditional CV risk factors such as old age, hypertension, diabetes, hypercholesterolemia, inactivity, and smoking. A total of 690,375 hospitalizations for CKD were included in the analysis. The study population was predominantly male (375,564, 54.4%) and primarily received care at urban teaching hospitals (512,258, 74.2%). The mean age of the study population was 61 years (SD 0.1), and 86.7% (598,555) had a CCI of 3 or more. At least one traditional risk factor for CV events was present in 84.1% of all hospitalizations (580,605), while 65.4% (451,505) included at least one CKD-specific risk factor for CV events. 
The incidence of CV events in the study was as follows: acute coronary syndromes (41,422; 6%), sudden cardiac death (13,807; 2%), heart failure (404,560; 58.6%), and cardiac arrhythmias (124,267; 18%). 91.7% (113,912) of all cardiac arrhythmias were atrial fibrillations. Significant odds of cardiovascular events on multivariate analyses included: malnutrition (aOR: 1.09; 95% CI: 1.06–1.13; p<0.001), post-dialytic hypotension (aOR: 1.34; 95% CI: 1.26–1.42; p<0.001), thrombophilia (aOR: 1.46; 95% CI: 1.29–1.65; p<0.001), sleep disorder (aOR: 1.17; 95% CI: 1.09–1.25; p<0.001), and post-renal transplant immunosuppressive therapy (aOR: 1.39; 95% CI: 1.26–1.53; p<0.001). The study validated malnutrition, post-dialytic hypotension, thrombophilia, sleep disorders, and post-renal transplant immunosuppressive therapy, highlighting their association with increased risk for cardiovascular events in CKD patients. No significant association was observed between uremic syndrome, hyperhomocysteinemia, hyperuricemia, hypertriglyceridemia, leptin levels, carnitine deficiency, anemia, and the odds of experiencing cardiovascular events.
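As a hedged illustration of how such adjusted odds ratios (aORs) are produced, the sketch below fits a logistic regression by Newton-Raphson on synthetic data: a binary "malnutrition" exposure and an age confounder with invented effect sizes, not the Nationwide Inpatient Sample data. The exponentiated coefficient is the odds ratio for the exposure adjusted for the confounder.

```python
import numpy as np

def logistic_fit(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson."""
    X = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)
        H = X.T @ (W[:, None] * X)              # observed information (Hessian)
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

rng = np.random.default_rng(1)
n = 5000
malnutrition = rng.integers(0, 2, n)            # hypothetical CKD-specific factor
age = rng.normal(60, 10, n)                     # traditional confounder
logit = -3 + 0.09 * malnutrition + 0.03 * (age - 60)
event = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)
beta = logistic_fit(np.column_stack([malnutrition, age - 60]), event)
aor = np.exp(beta[1])                           # aOR for malnutrition, age-adjusted
print(round(aor, 2))
```

In applied work this would be done with statsmodels or R's glm, which also report the 95% confidence intervals and p-values quoted in the abstract.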

Keywords: cardiovascular events, cardiovascular risk factors in CKD, chronic kidney disease, nationwide inpatient sample

Procedia PDF Downloads 78
9375 The Role of Gender Ideology in the Legality of Same-Sex Marriage: A Cross-National Analysis

Authors: Amber Salamanca-Blazek

Abstract:

This paper explores the connection between gender ideology and the legality of same-sex marriage cross-nationally. The author questions what role gender ideology plays in the cultural shift concerning same-sex marriage currently underway around the world, and in the variations in the legal treatment of same-sex marriage at the national level. Existing literature on gender, gender ideology, the role of gender ideology in traditional and same-sex marriage, and the extent to which this connection has previously been examined is explored. The author also examines the relationship between gender ideology and the legality of same-sex marriage in three countries with differing legal treatment of same-sex marriage: the United States, where same-sex marriage was legalized in 2015; Australia, where same-sex marriage was legalized in 2017; and Iran, where the death penalty for homosexuality still exists. A comparison of gender ideology frameworks and an analysis of the political rhetoric surrounding same-sex marriage in each country are performed. It is argued that the important role of gender ideology in the legality of same-sex marriage has been largely ignored and needs increased attention to assist gay rights activists in their framing. The link of gender ideology and patriarchal authority between the gay rights movement and the women's rights movement is subsequently discussed. The author argues that because of this linkage between movements, joint frameworks are necessary. Suggestions for future research are also provided.

Keywords: gender ideology, same-sex marriage, same-sex marriage legality, women's rights movement

Procedia PDF Downloads 242
9374 Advances in Machine Learning and Deep Learning Techniques for Image Classification and Clustering

Authors: R. Nandhini, Gaurab Mudbhari

Abstract:

Ranging from health care to self-driving cars, machine learning and deep learning algorithms have revolutionized fields where images and visual-oriented data are properly utilized. Segmentation, regression, classification, clustering, dimensionality reduction, and similar tasks have helped machine learning and deep learning models become state-of-the-art for domains where images are the key datasets. Among these tasks, classification and clustering are essential but difficult because of the intricate and high-dimensional characteristics of image data. This study examines and assesses advanced techniques in supervised classification and unsupervised clustering for image datasets, emphasizing the relative efficiency of Convolutional Neural Networks (CNNs), Vision Transformers (ViTs), Deep Embedded Clustering (DEC), and self-supervised learning approaches. Because of the distinctive structural attributes of images, conventional methods often fail to effectively capture spatial patterns, which has driven the development of models with more advanced architectures and attention mechanisms. In image classification, we investigated both CNNs and ViTs. The CNN, well known for its ability to detect spatial hierarchies, serves as one core model in our study. The ViT serves as the other core model, reflecting a modern classification approach built on a self-attention mechanism; self-attention makes these models more robust because it allows them to learn global dependencies in images without relying on convolutional layers. This paper evaluates the performance of these two architectures based on accuracy, precision, recall, and F1-score across different image datasets, analyzing their appropriateness for various categories of images.
In the domain of clustering, we assess DEC, Variational Autoencoders (VAEs), and conventional clustering techniques such as k-means applied to embeddings derived from CNN models. DEC, a prominent clustering model, has gained the attention of many ML engineers because it combines feature learning and clustering into a single framework, with the goal of improving clustering quality through better feature representation. VAEs, on the other hand, are well known for using latent embeddings to group similar images without requiring prior labels, by way of a probabilistic clustering approach.
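As a toy illustration of the "spatial pattern" advantage attributed to CNNs above, the NumPy sketch below applies a single hand-fixed convolutional filter (a Sobel-style vertical-edge detector, an invented example rather than anything from the paper) to a tiny image; a trained CNN learns many such filters and stacks them into spatial hierarchies.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: slide the kernel over the image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

# A 6x6 "image": dark left half, bright right half
img = np.repeat([[0.0, 0.0, 0.0, 1.0, 1.0, 1.0]], 6, axis=0)
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # vertical-edge filter
fmap = conv2d(img, sobel_x)
print(fmap.max())  # strongest response sits where the edge lies
```

A ViT would instead split the image into patches and let self-attention relate every patch to every other patch, trading this built-in spatial bias for global context.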

Keywords: machine learning, deep learning, image classification, image clustering

Procedia PDF Downloads 6
9373 The Use of Drones in Measuring Environmental Impacts of the Forest Garden Approach

Authors: Andrew J. Zacharias

Abstract:

The forest garden approach (FGA) was established by Trees for the Future (TREES) over the organization’s 30 years of agroforestry projects in Sub-Saharan Africa. This method transforms traditional agricultural systems into highly managed gardens that produce food and marketable products year-round. The effects of the FGA on food security, dietary diversity, and economic resilience have been measured closely, and TREES has begun to closely monitor the environmental impacts through the use of sensors mounted on unmanned aerial vehicles, commonly known as 'drones'. These drones collect thousands of pictures to create 3-D models in both the visible and the near-infrared wavelengths. Analysis of these models provides TREES with quantitative and qualitative evidence of improvements to the annual above-ground biomass and leaf area indices, as measured in-situ using NDVI calculations.
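The NDVI mentioned above has a standard definition, NDVI = (NIR - Red) / (NIR + Red), computed per pixel from the near-infrared and red bands; the reflectance values below are invented for illustration, not TREES measurements.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Hypothetical per-pixel reflectances: healthy canopy vs. bare soil
canopy = ndvi(nir=[0.45, 0.50], red=[0.05, 0.06])
soil = ndvi(nir=[0.30, 0.28], red=[0.25, 0.24])
print(canopy.round(2), soil.round(2))  # vegetation scores high, soil near zero
```

Applied band-by-band to a drone orthomosaic, this yields the vegetation-index maps from which biomass and leaf-area trends are derived.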

Keywords: agroforestry, biomass, drones, NDVI

Procedia PDF Downloads 155
9372 Eco-Politics of Infrastructure Development in and Around Protected Areas in Kenya: The Case of Nairobi National Park

Authors: Teresa Wanjiru Mbatia

Abstract:

On 7th June 2011, the government Minister of Roads in Kenya announced the proposed construction of a major highway, known as the southern bypass, to run along the northern border of the Nairobi National Park. The following day, 8th June 2011, the chairperson of the Friends of Nairobi National Park (FoNNaP) posted a protest statement on the organization's website, with the heading 'Nairobi Park is Not a Cake', alerting its members and conservation groups with the aim of rallying support for the campaign against the government's intention to hive off a section of the park for road construction. This was the first and earliest statement in a series of events that culminated in a campaign by conservationists and some other members of the public against the government's plan to hive off sections of the park to build road and railway infrastructure in or around it. Together with other non-state actors, mostly non-governmental organisations in conservation/environment and tourism businesses, FoNNaP issued a series of further statements on social, print, and electronic media to fight road and railway construction. This paper examined the strategies, outcomes, and interests of actors involved in opposing or proposing the development of transport infrastructure in and around the Nairobi National Park. Specifically, the objectives were to analyse: (1) the arguments put forward by the eco-warriors to protest infrastructure development; (2) the background and interests of the eco-warriors; (3) the needs, interests, and opinions of ordinary citizens on transport infrastructure development, particularly in and around the urban nature reserve; and (4) the final outcomes of the eco-politics surrounding infrastructure development in and around Nairobi National Park. The methodological approach used was environmental history and the social construction of nature.
The study collected qualitative data using four main approaches: the grounded theory approach, narratives, case studies, and a phenomenological approach. The information collected was analysed using critical discourse analysis. The major finding of the study was that, under the guise of "public participation," influential non-state actors have the capacity to perpetuate social-spatial inequalities by curtailing the majority's access to common public goods. A case in point in this study is how the efforts of powerful conservationists, environmentalists, and tourism businesspersons repeatedly stalled the construction of much-needed road and railway infrastructure through litigation in lengthy environmental court processes, involving injunctions and stop orders against the government bodies in charge. Moreover, powerful non-state actors were found to have formed informal, and sometimes formal, coalitions with politicians with selfish interests, which deepens exclusionary practices at the expense of the common good. The study concludes that non-state actors, mostly composed of certain types of elites (NGOs, business communities, politicians, and privileged social-cultural groups), have used participatory policies to advance their own interests at the expense of the majority whom they claim to represent. These practices are traced to the historically unjust social, political, and economic forces involved in the production of space in Nairobi.

Keywords: eco-politics, exclusion, infrastructure, Nairobi national park, non-state actors, protests

Procedia PDF Downloads 179
9371 An Adjusted Network Information Criterion for Model Selection in Statistical Neural Network Models

Authors: Christopher Godwin Udomboso, Angela Unna Chukwu, Isaac Kwame Dontwi

Abstract:

In selecting a statistical neural network model, the Network Information Criterion (NIC) has been observed to be sample biased because it does not account for sample size. The selection of a model from a set of fitted candidate models requires objective, data-driven criteria. In this paper, we derive and investigate the Adjusted Network Information Criterion (ANIC), based on Kullback's symmetric divergence, which is designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The analyses show that, in general, the ANIC improves model selection across more sample sizes than the NIC does.
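The abstract does not give the ANIC formula, so the snippet below illustrates the general idea with a classical analogue: how a small-sample correction (AICc vs. AIC) changes an information criterion's penalty as the sample size n shrinks. This is the same kind of sample-size adjustment the ANIC applies to the NIC, not the ANIC itself.

```python
def aic(log_lik, k):
    """Akaike Information Criterion: penalty 2k, no sample-size term."""
    return 2 * k - 2 * log_lik

def aicc(log_lik, k, n):
    """Small-sample corrected AIC: extra penalty grows as n shrinks."""
    return aic(log_lik, k) + 2 * k * (k + 1) / (n - k - 1)

# Same fit quality, two sample sizes: the correction matters only when n is small
print(aicc(-50.0, k=5, n=12) - aic(-50.0, k=5))   # large correction
print(aicc(-50.0, k=5, n=500) - aic(-50.0, k=5))  # negligible correction
```

An unadjusted criterion therefore tends to under-penalize complex models on small samples, which is the bias the ANIC is built to remove from the NIC.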

Keywords: statistical neural network, network information criterion, adjusted network, information criterion, transfer function

Procedia PDF Downloads 565
9370 Experimental Parameters’ Effects on the Electrical Discharge Machining Performances

Authors: Asmae Tafraouti, Yasmina Layouni, Pascal Kleimann

Abstract:

The growing market for Microsystems (MST) and Micro-Electromechanical Systems (MEMS) is driving the search for alternatives to microelectronics-based manufacturing techniques, which are generally expensive and time-consuming. Hot embossing and micro-injection molding of thermoplastics appear to be industrially viable processes. However, both require the use of master models, usually made of hard materials such as steel. These master models cannot be fabricated using standard microelectronics processes, so other micromachining processes are used, such as laser machining or micro-electrical discharge machining (µEDM). In this work, µEDM has been used. The principle of µEDM is based on a thin cylindrical micro-tool that erodes the workpiece surface. The two electrodes are immersed in a dielectric at a distance of a few micrometers (the gap). When an electrical voltage is applied between the two electrodes, electrical discharges are generated, which machine the material. In order to produce master models with high resolution and smooth surfaces, it is necessary to control the discharge mechanism well. However, several problems are encountered, such as the randomness of the electrical discharge process, fluctuation of the discharge energy, inversion of the electrodes' polarity, and wear of the micro-tool. The effect of different parameters, such as the applied voltage, the working capacitor, the micro-tool diameter, and the initial gap, has been studied. This analysis helps to improve machining performance, such as the workpiece surface condition and the lateral gap of the craters.
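For an RC-type discharge circuit like the one described, a common first-order estimate of the energy released per spark is E = ½CV², which makes the roles of the working capacitor and the applied voltage explicit; the component values below are illustrative, not the paper's settings.

```python
def discharge_energy(capacitance_f, voltage_v):
    """First-order energy per spark for an RC discharge: E = 1/2 * C * V^2 (joules)."""
    return 0.5 * capacitance_f * voltage_v ** 2

e_ref = discharge_energy(10e-12, 100)   # 10 pF charged to 100 V
e_low = discharge_energy(10e-12, 50)    # halving the voltage quarters the energy
print(e_ref, e_low)
```

Because crater size scales with discharge energy, reducing either the capacitor or the voltage is the usual lever for finer resolution and smoother surfaces, at the cost of slower material removal.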

Keywords: craters, electrical discharges, micro-electrical discharge machining, microsystems

Procedia PDF Downloads 72
9369 Electrical Load Estimation Using Estimated Fuzzy Linear Parameters

Authors: Bader Alkandari, Jamal Y. Madouh, Ahmad M. Alkandari, Anwar A. Alnaqi

Abstract:

A new formulation of the fuzzy linear estimation problem is presented. It is formulated as a linear programming problem. The objective is to minimize the spread of the data points, taking into consideration the type of membership function of the fuzzy parameters, so as to satisfy the constraints at each measurement point and to ensure that the original membership is included in the estimated membership. Different models are developed for a fuzzy triangular membership. The proposed models are applied to different examples from the area of fuzzy linear regression and finally to several examples of estimating the electrical load on a busbar. It was found that the proposed technique is well suited for electrical load estimation, since the nature of the load is characterized by uncertainty and vagueness.
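One well-known way to cast fuzzy linear estimation as a linear program is Tanaka-style possibilistic regression with triangular memberships: minimize the total spread of the fuzzy coefficients subject to every observation lying inside the estimated fuzzy band. The sketch below uses invented data and SciPy's linprog, and is not necessarily the exact formulation of the paper.

```python
import numpy as np
from scipy.optimize import linprog

def tanaka_fuzzy_fit(x, y):
    """Fit y ~ (a0, c0) + (a1, c1) * x with triangular fuzzy coefficients:
    minimize total spread subject to each y_i lying inside the fuzzy band."""
    ax = np.abs(x)
    cost = np.array([0.0, 0.0, len(x), ax.sum()])        # spreads only in objective
    A_ub, b_ub = [], []
    for xi, axi, yi in zip(x, ax, y):
        A_ub.append([-1.0, -xi, -1.0, -axi]); b_ub.append(-yi)  # band top >= y_i
        A_ub.append([ 1.0,  xi, -1.0, -axi]); b_ub.append( yi)  # band bottom <= y_i
    bounds = [(None, None), (None, None), (0, None), (0, None)]  # spreads c >= 0
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x                                          # a0, a1, c0, c1

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])                        # invented load measurements
a0, a1, c0, c1 = tanaka_fuzzy_fit(x, y)
```

The fitted center line (a0, a1) plays the role of a crisp regression, while the spreads (c0, c1) quantify the uncertainty band that the abstract attributes to load vagueness.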

Keywords: fuzzy regression, load estimation, fuzzy linear parameters, electrical load estimation

Procedia PDF Downloads 539
9368 Finite Element-Based Stability Analysis of Roadside Settlements Slopes from Barpak to Yamagaun through Laprak Village of Gorkha, an Epicentral Location after the 7.8Mw 2015 Barpak, Gorkha, Nepal Earthquake

Authors: N. P. Bhandary, R. C. Tiwari, R. Yatabe

Abstract:

This research employs the finite element method to evaluate the stability of roadside settlement slopes from Barpak to Yamagaun through Laprak village of Gorkha, Nepal, after the 7.8 Mw 2015 Barpak, Gorkha, Nepal earthquake. It covers three major villages of Gorkha, i.e., Barpak, Laprak, and Yamagaun, that were devastated by the 2015 Gorkha earthquake. The road-head distances from Barpak to Laprak and from Laprak to Yamagaun are about 14 and 29 km, respectively. The epicenters of the main shock of magnitude 7.8 and the aftershock of magnitude 6.6 were, respectively, 7 and 11 kilometers south-east of Barpak village, closer to Laprak and Yamagaun. It is also believed that the epicenter of the main shock was not in Barpak village, as reported until now, but somewhere near Yamagaun village. The devastation experienced in Yamagaun during the earthquake was much greater than in Barpak. In this context, we carried out a detailed study to investigate the stability of the Yamagaun settlement slope as a case study, where ground fissures, ground settlement, multiple cracks, and toe failures are most severe. In this regard, the stability issues of the existing settlements and the proposed road alignment on the Yamagaun village slope, which is surrounded by many newly activated landslides, are addressed. Given the importance of this issue, a field survey was carried out to understand the behavior of the ground fissures and the multiple failure characteristics of the slopes. The results suggest that the Yamagaun slope at Profiles 2-2, 3-3, and 4-4 is not safe enough for infrastructure development, even under normal soil slope conditions, for material models 2, 3, and 4; however, the slope seems quite safe at Profile 1-1 for all four material models. The results also indicate that the first three profiles are marginally safe for material models 2, 3, and 4, respectively, while Profile 4-4 is not safe for any of the four material models. Thus, Profile 4-4 needs special care to make the slope stable.

Keywords: earthquake, finite element method, landslide, stability

Procedia PDF Downloads 346
9367 A Bathtub Curve from Nonparametric Model

Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos

Abstract:

This paper presents a nonparametric method to obtain the hazard-rate "bathtub curve" for power system components. The model is a mixture of the three known phases of a component's life, the decreasing failure rate (DFR), the constant failure rate (CFR), and the increasing failure rate (IFR), each represented by a parametric Weibull model. The parameters are obtained by simultaneously fitting the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime are defined. To demonstrate the model, the historical time-to-failure data of distribution transformers were used as an example. The resulting bathtub curve shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
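The three-phase mixture the abstract describes can be illustrated by summing three Weibull hazards, h(t) = (β/η)(t/η)^(β−1), with shape parameters below, equal to, and above one; the parameter values here are invented for illustration, not fitted transformer data.

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate: h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def bathtub_hazard(t):
    """Additive mixture of three Weibull hazards: DFR + CFR + IFR phases."""
    return (weibull_hazard(t, beta=0.5, eta=2.0)     # infant mortality (beta < 1)
            + weibull_hazard(t, beta=1.0, eta=10.0)  # useful life (beta = 1, constant)
            + weibull_hazard(t, beta=3.0, eta=40.0)) # wear-out (beta > 1)

t = np.linspace(0.1, 60, 600)
h = bathtub_hazard(t)
# The curve falls, flattens, then rises: the classic bathtub shape
```

In the paper's setting, the six Weibull parameters would instead be chosen to match a kernel-smoothed empirical hazard, and the flat region between the two bends delimits the useful lifetime.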

Keywords: bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution

Procedia PDF Downloads 443
9366 Epigenetic Drugs for Major Depressive Disorder: A Critical Appraisal of Available Studies

Authors: Aniket Kumar, Jacob Peedicayil

Abstract:

Major depressive disorder (MDD) is a common and important psychiatric disorder. Several clinical features of MDD suggest an epigenetic basis for its pathogenesis. Since epigenetics (heritable changes in gene expression not involving changes in DNA sequence) may underlie the pathogenesis of MDD, epigenetic drugs such as DNA methyltransferase inhibitors (DNMTi) and histone deacetylase inhibitors (HDACi) may be useful for treating MDD. The available literature indexed in PubMed on preclinical trials of epigenetic drugs for the treatment of MDD was investigated. The search terms used were 'depression' or 'depressive' and 'HDACi' or 'DNMTi'. Among epigenetic drugs, it was found that there were 3 preclinical trials using HDACi and 3 using DNMTi for the treatment of MDD. All the trials were conducted on rodents (mice or rats). The animal models of depression used were the learned helplessness model, the forced swim test, the open field test, and the tail suspension test. One study used a genetic rat model of depression (the Flinders Sensitive Line). The HDACi tested were sodium butyrate, compound 60 (Cpd-60), and valproic acid. The DNMTi tested were 5-azacytidine and decitabine. All three preclinical trials using HDACi showed an antidepressant effect in animal models of depression, as did all three preclinical trials using DNMTi. Thus, epigenetic drugs, namely HDACi and DNMTi, may prove useful in the treatment of MDD and merit further investigation for the treatment of this disorder.

Keywords: DNA methylation, drug discovery, epigenetics, major depressive disorder

Procedia PDF Downloads 185
9365 A Biomechanical Model for the Idiopathic Scoliosis Using the Antalgic-Trak Technology

Authors: Joao Fialho

Abstract:

The mathematical modelling of idiopathic scoliosis has been studied throughout the years. The models presented in those papers are based on orthotic stabilization of idiopathic scoliosis, in which a transversal force is applied to the human spine in a continuous manner. When considering the ATT (Antalgic-Trak Technology) device, the existing models cannot be used, as the forces applied are no longer transversal nor applied continuously; in this device, vertical traction is applied. In this study, we propose to model idiopathic scoliosis using the ATT (Antalgic-Trak Technology) device and, with the parameters obtained from the mathematical modeling, to set up a case-by-case, individualized therapy plan for each patient.

Keywords: idiopathic scoliosis, mathematical modelling, human spine, Antalgic-Trak technology

Procedia PDF Downloads 268
9364 Birth Path and the Vitality of Caring Models in the Continuity of Midwifery

Authors: Elnaz Lalezari, Ramin Ghasemi Shaya

Abstract:

The birth path is affected by a fracture in the continuity of patient care. The pregnant woman has to interact with many professionals during pregnancy, childbirth, and the puerperium. However, during the last ten years there has been an increase in pregnancy care managed by the midwife, who is considered to be the practitioner with the right competences to take care of each pregnancy, drawing on other professionals' contributions in order to improve maternal and neonatal health outcomes. The aim was to verify whether there is evidence of effectiveness supporting the caseload midwifery care model, and whether it is possible to apply this model to the birth path in Italy. A literature review was conducted using several search engines (Google, Bing) and specific databases (MEDLINE, CINAHL, Embase, ClinicalTrials.gov). Italian regulations, national guidelines, and WHO recommendations were also discussed. Results: the search string, properly adapted to the three databases, returned the following results: MEDLINE 64 articles, CINAHL 94 articles, Embase 88 articles. From this selection, 14 articles were extracted: 1 systematic review, 3 randomized controlled trials, 7 observational studies, and 3 qualitative studies. Caseload midwifery appears to be an effective and reliable organizational and caring strategy. It meets the criteria of quality and safety and responds to the needs of women not only during pregnancy but also during the post-partum phase. For these reasons, it appears very valuable for the birth path in the Italian context as well.

Keywords: midwifery, care, caseload, maternity

Procedia PDF Downloads 129
9363 On the Use of Analytical Performance Models to Design a High-Performance Active Queue Management Scheme

Authors: Shahram Jamali, Samira Hamed

Abstract:

One of the open issues in the Random Early Detection (RED) algorithm is how to set its parameters to achieve high performance under dynamic network conditions. While the original RED uses fixed values for its parameters, this paper follows a model-based approach to upgrade the performance of the RED algorithm. It models the router's queue behavior using a Markov model and uses this model to predict future conditions of the queue. This prediction helps the proposed algorithm tune RED's parameters and provide better efficiency and performance. Extensive packet-level simulations confirm that the proposed algorithm, called Markov-RED, outperforms RED and FARED in terms of queue stability, bottleneck utilization, and dropped packet count.
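For context, the fixed parameters the abstract refers to appear in the classic RED drop function. The sketch below uses the standard RED formulation (thresholds min_th/max_th, maximum probability max_p, and EWMA queue averaging); it is illustrative only and does not reproduce the paper's Markov-RED tuning logic, which is not given here:

```python
def red_drop_probability(avg_q, min_th, max_th, max_p):
    """Classic RED piecewise-linear drop probability.

    min_th, max_th and max_p are exactly the fixed parameters that a
    model-based scheme such as the paper's Markov-RED would retune as
    network conditions change.
    """
    if avg_q < min_th:
        return 0.0  # no early dropping below the lower threshold
    if avg_q >= max_th:
        return 1.0  # drop every arrival above the upper threshold
    return max_p * (avg_q - min_th) / (max_th - min_th)


def ewma_queue(avg_q, instantaneous_q, weight=0.002):
    """EWMA smoothing RED applies to the instantaneous queue length."""
    return (1.0 - weight) * avg_q + weight * instantaneous_q
```

On each arrival, RED drops the packet with probability red_drop_probability(ewma_queue(...), ...); a model-based variant would adjust min_th, max_th, and max_p from the queue prediction instead of leaving them fixed.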

Keywords: active queue management, RED, Markov model, random early detection algorithm

Procedia PDF Downloads 539
9362 Ontology Expansion via Synthetic Dataset Generation and Transformer-Based Concept Extraction

Authors: Andrey Khalov

Abstract:

The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning techniques can be applied to fine-tune large language models (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
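The synthetic-dataset idea can be illustrated with a simple slot-filling generator that produces BIO-tagged training examples for entity recognition. This is a sketch of the general technique only; the templates and entity types below are invented, and the paper's actual LLM-based generation pipeline is not reproduced:

```python
import random


def make_synthetic_ner_examples(templates, entities, n=100, seed=0):
    """Fill "{ENT}" slots in templates with known entity surface forms
    and emit (tokens, BIO tags) pairs, yielding labelled NER training
    data without manual annotation.
    """
    rng = random.Random(seed)
    examples = []
    for _ in range(n):
        template = rng.choice(templates)
        ent_type = rng.choice(sorted(entities))
        surface = rng.choice(entities[ent_type])
        tokens, tags = [], []
        for word in template.split():
            if word == "{ENT}":
                parts = surface.split()
                tokens.extend(parts)
                # first token gets B-, any continuation tokens get I-
                tags.extend([f"B-{ent_type}"] + [f"I-{ent_type}"] * (len(parts) - 1))
            else:
                tokens.append(word)
                tags.append("O")
        examples.append((tokens, tags))
    return examples
```

Examples like these (hypothetical ITSM phrases such as "please restart the {ENT} service") would then be used to fine-tune a BERT-style token classifier.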

Keywords: ontology expansion, synthetic dataset, transformer fine-tuning, concept extraction, DOLCE, BERT, taxonomy, LLM, NER

Procedia PDF Downloads 12
9361 Using Historical Data for Stock Prediction

Authors: Sofia Stoica

Abstract:

In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of ten major tech companies: Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We experimented with a variety of models (a linear regression model, K-Nearest Neighbors (KNN), and a sequential neural network) and algorithms (Multiplicative Weight Update and AdaBoost). We found that the sequential neural network performed best, with a testing error of 0.18%. Interestingly, the linear model performed second best, with a testing error of 0.73%. These results show that historical data alone is enough to obtain high accuracy, and that a simple algorithm like linear regression has performance similar to more sophisticated models while taking less time and fewer resources to implement.
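The linear baseline the abstract highlights can be as simple as regressing tomorrow's price on today's. The sketch below fits that one-lag ordinary least squares model from scratch; it is an assumption-laden stand-in, since the paper does not specify its exact feature set:

```python
def fit_lag_regression(prices, lag=1):
    """Ordinary least squares fit of price[t] on price[t - lag].

    Returns (slope, intercept) minimizing squared prediction error.
    """
    xs = prices[:-lag]  # predictor: earlier prices
    ys = prices[lag:]   # target: later prices
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    intercept = my - slope * mx
    return slope, intercept


def predict_next(prices, slope, intercept):
    """One-step-ahead forecast from the most recent price."""
    return slope * prices[-1] + intercept
```

On a perfectly linear series such as [1, 2, 3, 4, 5], the fit recovers slope 1 and intercept 1 and forecasts 6 for the next step.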

Keywords: finance, machine learning, opening price, stock market

Procedia PDF Downloads 187
9360 Problems and Prospects of an Intelligent Investment in Kazakh Society

Authors: Sultanbayeva Gulmira Serikbayevna, Golovchun Aleftina Anatolyevna

Abstract:

The development of any nation is directly related to the development of its human capital. Human development means increasing a person's intellectual potential and its compliance with the requirements of present and future society. The demands of globalization cannot be allowed to constrain the processes of national tradition. The education system must be formed on the basis of international practice of cultural development. In Kazakhstan, where modernization changes are developing rapidly, the education system should be formed in two ways: first, on a national basis, and second, on the basis of global best practices. There is a need to recognize and promote the importance of education as a value. The world community regards spiritual values as a shared concern: along with individual values, spiritual values are also universal values. The formation of values such as young people's sense of respect for their homeland, social responsibility, and respect for the culture and traditions of their people is a more important task than the possession of material goods. In the formation of an intellectual nation, values in the fields of education and science become investments in the development of society, as education and science today have been transformed into the most important form of capital.

Keywords: human capital, humanitarian technology, intangible assets, intelligent nation, society of knowledge

Procedia PDF Downloads 317
9359 Downscaling Grace Gravity Models Using Spectral Combination Techniques for Terrestrial Water Storage and Groundwater Storage Estimation

Authors: Farzam Fatolazadeh, Kalifa Goita, Mehdi Eshagh, Shusen Wang

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) is a satellite mission with twin satellites for the precise determination of spatial and temporal variations in the Earth’s gravity field. The products of this mission are monthly global gravity models containing the spherical harmonic coefficients and their errors. These GRACE models can be used to estimate terrestrial water storage (TWS) variations across the globe at large scales, thereby offering an opportunity for surface and groundwater storage (GWS) assessments. Yet, the ability of GRACE to monitor changes at smaller scales is too limited for local water management authorities, largely due to the low spatial and temporal resolutions of its models (~200,000 km² and one month, respectively). High-resolution GRACE data products would substantially enrich the information needed by local-scale decision-makers while providing data for regions that lack adequate in situ monitoring networks, including northern parts of Canada. Such products could eventually be obtained through downscaling. In this study, we extended spectral combination theory to simultaneously downscale GRACE spatially, from its coarse 3° resolution to 0.25°, and temporally, from monthly to daily resolution. This method combines the monthly gravity field solutions of GRACE with daily hydrological model products, in the form of both low- and high-frequency signals, to produce high-spatiotemporal-resolution TWSA and GWSA products. The main contribution and originality of this study is to comprehensively and simultaneously consider GRACE and hydrological variables and their uncertainties when forming the estimator in the spectral domain. The downscaled products are therefore expected to reach acceptable accuracy.
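The underlying intuition (keep GRACE's low-frequency signal and borrow the high-frequency variability from the hydrological model) can be sketched in a few lines. This is a crude additive decomposition for illustration only, not the paper's spectral-domain estimator with uncertainty weighting:

```python
def downscale_monthly_to_daily(grace_monthly, model_daily):
    """Superimpose the hydrological model's daily departures from its
    own monthly mean (high-frequency content) onto the monthly GRACE
    value (low-frequency content).
    """
    model_mean = sum(model_daily) / len(model_daily)
    return [grace_monthly + (d - model_mean) for d in model_daily]
```

By construction, the mean of the daily output reproduces the GRACE monthly value, so the coarse signal is preserved while daily variability is added.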

Keywords: GRACE satellite, groundwater storage, spectral combination, terrestrial water storage

Procedia PDF Downloads 82
9358 The Legal and Regulatory Gaps of Blockchain-Enabled Energy Prosumerism

Authors: Karisma Karisma, Pardis Moslemzadeh Tehrani

Abstract:

This study aims to conduct a high-level strategic dialogue on the lack of consensus, consistency, and legal certainty regarding blockchain-based energy prosumerism so that appropriate institutional and governance structures can be put in place to address the inadequacies and gaps in the legal and regulatory framework. The drive to achieve national and global decarbonization targets is a driving force behind climate goals and policies under the Paris Agreement. In recent years, efforts to ‘demonopolize’ and ‘decentralize’ energy generation and distribution have driven the energy transition toward decentralized systems, invoking concepts such as ownership, sovereignty, and autonomy of RE sources. The emergence of individual and collective forms of prosumerism and the rapid diffusion of blockchain is expected to play a critical role in the decarbonization and democratization of energy systems. However, there is a ‘regulatory void’ relating to individual and collective forms of prosumerism that could prevent the rapid deployment of blockchain systems and potentially stagnate the operationalization of blockchain-enabled energy sharing and trading activities. The application of broad and facile regulatory fixes may be insufficient to address the major regulatory gaps. First, to the authors’ best knowledge, the concepts and elements circumjacent to individual and collective forms of prosumerism have not been adequately described in the legal frameworks of many countries. Second, there is a lack of legal certainty regarding the creation and adaptation of business models in a highly regulated and centralized energy system, which inhibits the emergence of prosumer-driven niche markets. There are also current and prospective challenges relating to the legal status of blockchain-based platforms for facilitating energy transactions, anticipated with the diffusion of blockchain technology. 
With the rise of prosumerism in the energy sector, the areas of (a) network charges, (b) energy market access, (c) incentive schemes, (d) taxes and levies, and (e) licensing requirements are still uncharted territories in many countries. The uncertainties emanating from this area pose a significant hurdle to the widespread adoption of blockchain technology, a complementary technology that offers added value and competitive advantages for energy systems. The authors undertake a conceptual and theoretical investigation to elucidate the lack of consensus, consistency, and legal certainty in the study of blockchain-based prosumerism. In addition, the authors set an exploratory tone to the discussion by taking an analytically eclectic approach that builds on multiple sources and theories to delve deeper into this topic. As an interdisciplinary study, this research accounts for the convergence of regulation, technology, and the energy sector. The study primarily adopts desk research, which examines regulatory frameworks and conceptual models for crucial policies at the international level to foster an all-inclusive discussion. With their reflections and insights into the interaction of blockchain and prosumerism in the energy sector, the authors do not aim to develop definitive regulatory models or instrument designs, but to contribute to the theoretical dialogue to navigate seminal issues and explore different nuances and pathways. Given the emergence of blockchain-based energy prosumerism, identifying the challenges, gaps and fragmentation of governance regimes is key to facilitating global regulatory transitions.

Keywords: blockchain technology, energy sector, prosumer, legal and regulatory

Procedia PDF Downloads 180
9357 Period Poverty: An Analysis of Sustainable Solutions to a Global Problem

Authors: Antonella Regueiro Fernandez

Abstract:

This paper examines the issue of period poverty and the innovative approaches – or lack thereof – that national systems are using to tackle the issue. Through a systems-thinking and economic approach, the paper analyzes the intricate relationship between proper systemic change and sustainable innovations for this global problem. The first part of the research introduces period poverty and the lack of sustainable options currently in place to resolve the issue. The second part delves into a comparison of existing technologies – single-use and reusable period products – and their benefits and deficiencies. It also compares two countries and their existing solution landscapes (Scotland and the United States), arguing that while Scotland has provided an innovative national solution to the problem, it still lacks a proper assessment of sustainability, while the United States continues to lag in offering any holistic solution at all. The last part concludes the research and affirms the importance of holistic policymaking approaches to period poverty, which have the potential to truly benefit half of the world’s population while also encouraging environmental preservation.

Keywords: gender equity, sustainability, period poverty, SDGs, sustainable development

Procedia PDF Downloads 120
9356 The Factors Affecting the Use of Massive Open Online Courses in Blended Learning by Lecturers in Universities

Authors: Taghreed Alghamdi, Wendy Hall, David Millard

Abstract:

Massive Open Online Courses (MOOCs) have recently gained widespread interest in the academic world, starting a wide-ranging discussion of a number of issues. One of these issues is the use of MOOCs in teaching and learning in higher education by integrating MOOC content with traditional face-to-face activities in a blended learning format, called blended MOOCs (bMOOCs), which is intended not to replace traditional learning but to enhance student learning. Most research on MOOCs has focused on students’ perceptions and institutional threats, whereas there is a lack of published research on academics’ experiences and practices. The first aim of the study is therefore to develop a classification of blended MOOC models by conducting a systematic literature review, classifying 19 different case studies, and identifying the broad types of bMOOC models, namely the Supplementary Model and the Integrated Model. The analysis phase will accordingly focus on these different types of bMOOC models in terms of the adoption of MOOCs by lecturers. The second aim of the study is to improve the understanding of lecturers’ acceptance of bMOOCs by investigating the factors that influence academics’ acceptance of using MOOCs in traditional learning, through an online survey distributed to lecturers who participate in MOOC platforms. These factors can help institutions encourage their lecturers to integrate MOOCs with their traditional courses in universities.

Keywords: acceptance, blended learning, blended MOOCs, higher education, lecturers, MOOCs, professors

Procedia PDF Downloads 130
9355 Assessment of Pre-Processing Influence on Near-Infrared Spectra for Predicting the Mechanical Properties of Wood

Authors: Aasheesh Raturi, Vimal Kothiyal, P. D. Semalty

Abstract:

We studied the mechanical properties of Eucalyptus tereticornis using FT-NIR spectroscopy. Firstly, spectra were pre-processed to eliminate useless information. Then, a prediction model was constructed by partial least squares regression. To study the influence of pre-processing on the prediction of mechanical properties in NIR analysis of wood samples, we applied various pre-treatment methods such as straight-line subtraction, constant offset elimination, vector normalization, min-max normalization, multiplicative scatter correction, first derivative, and second derivative, as well as their combinations with other treatments, such as first derivative + straight-line subtraction, first derivative + vector normalization, and first derivative + multiplicative scatter correction. By combining the pre-processing methods with different NIR regions, the RMSECV, RMSEP, and optimum factors/rank were obtained through the optimization process of model development. More than 350 combinations were evaluated during the optimization process. More than one pre-processing method gave good calibration/cross-validation and prediction/test models, but only the best calibration/cross-validation and prediction/test models are reported here. The results show that one can safely use the NIR region between 4000 and 7500 cm-1 with the straight-line subtraction, constant offset elimination, first derivative, and second derivative pre-processing methods, which were found to be the most appropriate for model development.
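Two of the pre-treatments named above have simple closed forms. The sketch below shows a finite-difference first derivative and an SNV-style vector normalization applied to a spectrum held as a list of floats; it is a minimal illustration (chemometrics software typically uses Savitzky-Golay smoothing derivatives rather than raw differences):

```python
def first_derivative(spectrum):
    """First-derivative pre-treatment via simple finite differences."""
    return [b - a for a, b in zip(spectrum, spectrum[1:])]


def vector_normalize(spectrum):
    """SNV-style vector normalization: center each spectrum on its own
    mean and scale by its own standard deviation, suppressing baseline
    offset and multiplicative scatter effects.
    """
    n = len(spectrum)
    mean = sum(spectrum) / n
    std = (sum((x - mean) ** 2 for x in spectrum) / n) ** 0.5
    return [(x - mean) / std for x in spectrum]
```

Combined treatments such as "first derivative + vector normalization" simply chain the two: vector_normalize(first_derivative(spectrum)).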

Keywords: FT-NIR, mechanical properties, pre-processing, PLS

Procedia PDF Downloads 357
9354 Economic Development Impacts of Connected and Automated Vehicles (CAV)

Authors: Rimon Rafiah

Abstract:

This paper will present a combination of two seemingly unrelated models: one for estimating the economic development impacts of transportation investment, and one for increasing CAV penetration in order to reduce congestion. Measuring economic development impacts resulting from transportation investments is becoming more recognized around the world. Examples include the UK’s Wider Economic Benefits (WEB) model, Economic Impact Assessments in the USA, various input-output models, and additional models around the world. The economic impact model is based on WEB and rests on the following premise: investments in transportation will reduce the cost of personal travel, enabling firms to be more competitive, creating additional throughput (the same road allows more people to travel), and reducing the cost of travel of workers to a new workplace. This reduction in travel costs was estimated in out-of-pocket terms in a given localized area and was then translated into additional employment based on regional labor supply elasticity. This additional employment was conservatively assumed to be at minimum wage levels, translated into GDP terms, and from there into direct taxation (i.e., an increase in tax taken by the government). The CAV model is based on economic principles such as CAV usage, supply, and demand. Usage of CAVs can increase capacity by a variety of means: increased automation (known as Level 1 through Level 4) and increased penetration and usage, which has been predicted to reach 50% by 2030 according to several forecasts, with possible full conversion by 2045-2050. Several countries have passed policies and/or legislation ending sales of gasoline-powered vehicles starting in 2030 or later. Supply was measured via increased capacity on given infrastructure as a function of both CAV penetration and implemented technologies.
The CAV model, as implemented in the USA, has shown significant savings in travel time and in vehicle operating costs, which can be translated into economic development impacts in terms of job creation, GDP growth, and salaries. The models also have policy implications and can be adapted for use in Japan.
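The employment chain described in the abstract (travel-cost saving, then additional jobs via labor supply elasticity, then GDP valued at minimum wage, then direct taxation) can be sketched as a back-of-envelope calculation. Every numeric value in the usage example below is a placeholder, not a figure from the paper:

```python
def web_style_impact(cost_saving_fraction, labor_supply_elasticity,
                     baseline_jobs, annual_min_wage, tax_rate):
    """Back-of-envelope version of the WEB-style chain in the abstract.

    Returns (additional jobs, GDP gain, direct tax gain), with the GDP
    contribution conservatively valued at minimum wage.
    """
    extra_jobs = labor_supply_elasticity * cost_saving_fraction * baseline_jobs
    gdp_gain = extra_jobs * annual_min_wage
    tax_gain = gdp_gain * tax_rate
    return extra_jobs, gdp_gain, tax_gain
```

For example, a 2% travel-cost saving in a region with 100,000 workers, an elasticity of 0.3, a $20,000 annual minimum wage, and a 25% tax rate yields roughly 600 jobs, $12M of GDP, and $3M of tax (illustrative numbers only).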

Keywords: CAV, economic development, WEB, transport economics

Procedia PDF Downloads 72
9353 Liquid-Liquid Equilibrium Study in Solvent Extraction of o-Cresol from Coal Tar

Authors: Dewi Selvia Fardhyanti, Astrilia Damayanti

Abstract:

Coal tar is a liquid by-product of coal gasification and carbonization processes, as well as of industries such as steel, power generation, and cement. This liquid oil mixture contains various useful compounds such as aromatic and phenolic compounds, which are widely used as raw materials for insecticides, dyes, medicines, perfumes, coloring matters, and many others. This research investigates the thermodynamic modelling of liquid-liquid equilibria (LLE) in the solvent extraction of o-Cresol from coal tar. The equilibria are modeled with the ternary Wohl, Van Laar, and Three-Suffix Margules models. The values of the parameters involved are obtained by curve-fitting to the experimental data. Based on the comparison between calculated and experimental data, it turns out that, among the three models studied, the Three-Suffix Margules model is the best at predicting the LLE of o-Cresol for these systems.
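The Three-Suffix (two-parameter) Margules activity-coefficient model has a standard closed form for a binary pair, sketched below; the ternary extension actually used in the paper and its fitted parameter values are not reproduced here:

```python
import math


def margules_gamma(x1, a12, a21):
    """Three-suffix (two-parameter) Margules activity coefficients for
    a binary mixture, where x1 is the mole fraction of component 1.

    ln(gamma1) = x2^2 * [A12 + 2*(A21 - A12)*x1]
    ln(gamma2) = x1^2 * [A21 + 2*(A12 - A21)*x2]
    """
    x2 = 1.0 - x1
    ln_g1 = x2 ** 2 * (a12 + 2.0 * (a21 - a12) * x1)
    ln_g2 = x1 ** 2 * (a21 + 2.0 * (a12 - a21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)
```

A useful sanity check when curve-fitting A12 and A21 to LLE data: at infinite dilution the model reduces to gamma1 = exp(A12) at x1 = 0 and gamma2 = exp(A21) at x1 = 1.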

Keywords: coal tar, o-Cresol, Wohl, Van Laar, Three-Suffix Margules

Procedia PDF Downloads 275
9352 AutoML: Comprehensive Review and Application to Engineering Datasets

Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili

Abstract:

The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely-used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
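The hyperparameter-optimization stage that AutoML automates can be illustrated with the simplest possible search strategy. The sketch below is a bare random search over a configuration space; real AutoML platforms use far more sophisticated strategies (Bayesian optimization, bandits, neural architecture search), so this only conveys the shape of the loop:

```python
import random


def random_search(objective, space, n_trials=50, seed=0):
    """Sample configurations uniformly from `space` (a dict mapping
    hyperparameter names to candidate values) and keep the one that
    minimizes `objective`.
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = objective(cfg)  # e.g. cross-validated loss of a model
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

In practice, `objective` would train a candidate model and return its validation error; here any callable taking a configuration dict works.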

Keywords: automated machine learning, uncertainty, engineering dataset, regression

Procedia PDF Downloads 60
9351 Effectiveness of the Resistance to Irradiance Test on Sunglasses Standards

Authors: Mauro Masili, Liliane Ventura

Abstract:

The effects of ultraviolet (UV) radiation on the ocular media are still controversial in the literature, but the World Health Organization has established safe limits on the exposure of eyes to UV radiation based on published reports. Sunglasses play an important role in providing safety, and their lenses should provide adequate UV filters. Regarding UV protection for the ocular media, the resistance-to-irradiance test for sunglasses under many national standards requires irradiating lenses for 50 uninterrupted hours with a 450 W solar simulator. This artificial aging test is meant to provide an evaluation corresponding to exposure to the sun. Calculating the direct and diffuse solar irradiance on a vertical surface and the corresponding radiant exposure for an entire year, we compare the latter with the 50-hour radiant exposure of the 450 W xenon arc lamp of a solar simulator required by national standards. Our calculations indicate that this stress test is ineffective in its present form. We provide evidence of the need to re-evaluate the test parameters to establish appropriate safe limits against UV radiation. This work is potentially significant for scientists and legislators in the field of sunglasses standards, helping to improve the requirements for sunglasses quality and safety.
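The comparison rests on the radiant exposure H = E * t (irradiance times duration). A small helper makes the unit bookkeeping explicit; the irradiance values in the usage note are illustrative inputs, not the authors' measured simulator or solar figures:

```python
def radiant_exposure_kj_per_m2(irradiance_w_per_m2, hours):
    """Radiant exposure H = E * t, converted from J/m^2 to kJ/m^2.

    1 W/m^2 sustained for 1 hour deposits 3600 J/m^2 = 3.6 kJ/m^2.
    """
    return irradiance_w_per_m2 * hours * 3600.0 / 1000.0
```

For example, a lens-plane irradiance of 1000 W/m² (a placeholder value) over the standard 50-hour test gives 180,000 kJ/m², a figure that can then be set against the yearly radiant exposure integrated from direct and diffuse solar irradiance on a vertical surface.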

Keywords: ISO 12312-1, solar simulator, sunglasses standards, UV protection

Procedia PDF Downloads 196