Search results for: feature combination
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4492

3832 Representation of Memory of Forced Displacement in Central and Eastern Europe after World War II in Polish and German Cinemas

Authors: Ilona Copik

Abstract:

The aim of this study is to analyze the representation of memories of the forced displacement of Poles and Germans from the eastern territories in 1945 as depicted by Polish and German feature films between the years 1945-1960. The aftermath of World War II and the Allied agreements concluded at Yalta and Potsdam (1945) resulted in changes in national borders in Central and Eastern Europe and the large-scale transfer of civilians. The westward migration became a symbol of the new post-war division of Europe, new spheres of influence separated by the Iron Curtain. For years it was a controversial topic in both Poland and Germany due to the geopolitical alignment (the socialist East and capitalist West of Europe), as well as the unfinished debate between the victims and perpetrators of the war. The research premise is to take a comparative view of the conflicted cultures of Polish and German memory, to reflect on the possibility of an international dialogue about the past recorded in film images, and to discover the potential of film as a narrative warning against totalitarian inclinations. Until now, films made between 1945 and 1960 in Poland and the German occupation zones have been analyzed mainly in the context of artistic strategies subordinated to ideology and historical politics. In this study, the intention is to take a critical approach leading to the recognition of how films work as collective memory media, how they reveal the mechanisms of memory/forgetting, and what settlement topoi and migration myths they contain. The main hypothesis is that feature films about forced displacement, in addition to the politics of history - separate in each country - reveal comparable transnational individual experiences: the chaos of migration, the trauma of losing one's home, the conflicts accompanying the familiar/foreign, the difficulty of cultural adaptation, the problem of lost identity, etc.

Keywords: forced displacement, Polish and German cinema, war victims, World War II

Procedia PDF Downloads 61
3831 Gas Transmission Pipeline Integrity Management System Through Corrosion Mitigation and Inspection Strategy: A Case Study of Natural Gas Transmission Pipeline from Wafa Field to Mellitah Gas Plant in Libya

Authors: Osama Sassi, Manal Eltorki, Iftikhar Ahmad

Abstract:

Poor integrity is one of the major causes of leaks and accidents in gas transmission pipelines. To ensure safe operation, an efficient and effective pipeline integrity management (PIM) system is essential. Corrosion management is one of the key aspects of a successful pipeline integrity management program, together with design, material selection, operations, risk evaluation and communication, to maintain pipelines in a fit-for-service condition. The objective of a corrosion management plan is to design a corrosion mitigation, monitoring, and inspection strategy and to carry out maintenance in a timely manner. This paper presents the experience of corrosion management of a gas transmission pipeline from the Wafa field to the Mellitah gas plant in Libya. The buried pipeline is 525.5 km long and 32 inches in diameter. External corrosion on the pipeline is controlled with a combination of coatings and cathodic protection, while internal corrosion is controlled with a combination of chemical inhibitors, periodic cleaning and process control. The monitoring and inspection techniques provide a way to measure the effectiveness of the corrosion control systems and give an early warning when changing conditions may be causing a corrosion problem. This paper describes the corrosion management system used by Mellitah Oil & Gas BV for its gas transmission pipeline, based on standard practices of corrosion mitigation and inspection.

Keywords: corrosion mitigation on gas transmission pipelines, pipeline integrity management, corrosion management of gas pipelines, prevention and inspection of corrosion

Procedia PDF Downloads 61
3830 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data; it works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, continuous data, such as porosity, must be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, the independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy but with the advantage of a completely transparent model. The result of an RA session with a data set is a report on every combination of variables and the associated probability of a landslide occurring. In this way, every informative combination of variable states can be examined.
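
A minimal sketch of the core RA workflow described above, binning continuous layers and tabulating the conditional probability of landslide occurrence for each combination of discrete IV states, might look like this in Python (the file name, column names and bin count are illustrative assumptions, not values from the study):

```python
import pandas as pd

# Illustrative raster-derived table: one row per map cell (hypothetical columns).
df = pd.read_csv("cells.csv")   # columns: soil_class, bedrock, porosity, landslide (0/1)

# Continuous layers must be binned before inclusion in an RA model.
df["porosity_bin"] = pd.qcut(df["porosity"], q=4, labels=False)

# Tabulate P(landslide | state combination) for one candidate set of IVs.
ivs = ["soil_class", "bedrock", "porosity_bin"]
table = (df.groupby(ivs)["landslide"]
           .agg(n="size", p_landslide="mean")
           .reset_index()
           .sort_values("p_landslide", ascending=False))
print(table.head())   # every IV-state combination with its observed landslide probability
```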

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 49
3829 A Study on the Effect of Design Factors of Slim Keyboard’s Tactile Feedback

Authors: Kai-Chieh Lin, Chih-Fu Wu, Hsiang Ling Hsu, Yung-Hsiang Tu, Chia-Chen Wu

Abstract:

With the rapid development of computer technology, the design of computers and keyboards moves towards a trend of slimness. The change of mobile input devices directly influences users’ behavior. Although multi-touch applications allow entering text through a virtual keyboard, the performance, feedback, and comfort of the technology are inferior to those of a traditional keyboard, and while manufacturers have launched mobile touch keyboards and projection keyboards, their performance has not been satisfactory. Therefore, this study discussed the design factors of slim pressure-sensitive keyboards. The factors were evaluated with an objective evaluation (accuracy and speed) and a subjective evaluation (operability, recognition, feedback, and difficulty) depending on the shape (circle, rectangle, and L-shaped), thickness (flat, 3 mm, and 6 mm), and force (35±10 g, 60±10 g, and 85±10 g) of the keyboard. Moreover, MANOVA and Taguchi methods (regarding signal-to-noise ratios) were conducted to find the optimal level of each design factor. The research participants were divided into two groups by their typing speed (30 words/minute). Considering the multitude of variables and levels, the experiments were implemented using a fractional factorial design, and a representative model of the research samples was established for the input task testing. The findings of this study showed that participants with low typing speed primarily relied on vision to recognize the keys, whereas those with high typing speed relied on tactile feedback, which was affected by the thickness and force of the keys. In the objective and subjective evaluations, a combination of keyboard design factors that might result in higher performance and satisfaction was identified (L-shaped, 3 mm, and 60±10 g) as the optimal combination. The learning curve was analyzed against a traditional standard keyboard to investigate the influence of user experience on keyboard operation; the results indicated that the optimal combination still provided input performance inferior to that of a standard keyboard. The results could serve as a reference for the development of related products in industry and could be applied to touch devices and input interfaces that people interact with.
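
For reference, the Taguchi signal-to-noise ratios mentioned above follow standard forms; assuming a larger-the-better response such as typing accuracy (this choice of form is an assumption, not stated by the authors), the ratio for a factor-level combination with responses $y_1, \dots, y_n$ is:

```latex
\[
\mathrm{S/N} = -10\,\log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n}\frac{1}{y_i^{2}}\right)
\qquad \text{(larger-the-better response)}
\]
```

The optimal level of each design factor is then the one with the highest average S/N across the fractional factorial runs.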

Keywords: input performance, mobile device, slim keyboard, tactile feedback

Procedia PDF Downloads 291
3828 Evaluation of Chemical Compositions and Biological Activities of Five Essential Oils

Authors: G. Ozturk, B. Demirci

Abstract:

It is well known that essential oils have been used for therapeutic purposes for many years. In this study, five different Pharmacopoeia-grade essential oils (Achillea millefolium L., Pimpinella anisum L., Matricaria recutita L., Eucalyptus globulus L., Salvia officinalis L.) obtained from commercial sources were evaluated for chemical composition, synergistic antimicrobial activity, and lipoxygenase enzyme inhibition. Volatile components were determined by gas chromatography/flame ionization detector and gas chromatography/mass spectrometer, simultaneously. The potential antimicrobial activity of the essential oils was tested against oral pathogenic standard strains such as Streptococcus mutans, Streptococcus sanguinis, Staphylococcus aureus, Corynebacterium striatum, Candida albicans and Candida krusei by broth microdilution methods. Ciprofloxacin and ketoconazole were used as positive controls. It was observed that the essential oils tested have moderate inhibitory antimicrobial activity against oral pathogens, with Minimum Inhibitory Concentrations of 0.625-20 mg/mL. The active essential oils were combined with antibiotics, synergistic effects were evaluated by the checkerboard method, and ΣFIC values were determined. In combination with tetracycline, M. recutita essential oil was shown to have a synergistic effect against S. aureus (ΣFIC 0.46). In addition, 5-LOX inhibitory activity was measured by modifying the spectrophotometric method developed by Baylac and Racine. As a result, the 5-LOX % inhibition of S. officinalis, E. globulus and M. recutita was calculated as 34.0 ± 6.66, 72.7 ± 2.78 and 27.7 ± 0.60, respectively.
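
For context, the ΣFIC index from the checkerboard assay is conventionally computed as follows (the ≤ 0.5 synergy cut-off given here is the common convention, not a value stated by the authors):

```latex
\[
\Sigma\mathrm{FIC} =
\frac{\mathrm{MIC}_{A}^{\text{in combination}}}{\mathrm{MIC}_{A}^{\text{alone}}}
+
\frac{\mathrm{MIC}_{B}^{\text{in combination}}}{\mathrm{MIC}_{B}^{\text{alone}}},
\qquad \Sigma\mathrm{FIC} \le 0.5 \;\Rightarrow\; \text{synergy}
\]
```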

Keywords: antimicrobial activity, essential oils, synergistic activity, 5-lipoxygenase inhibition

Procedia PDF Downloads 95
3827 Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight propagation, multipath, and weather conditions, GNSS does not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed method can significantly improve positioning performance and reduce radio map construction costs.
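
A minimal sketch of the fingerprint feature-extraction step described above, projecting hybrid WLAN/LTE signal vectors to a low-dimensional embedding with t-SNE, could look like this (the array shapes and parameters are illustrative assumptions, not the authors' configuration):

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

# Hypothetical fingerprint matrix: one row per reference point,
# columns are WLAN RSSI and LTE RSRP readings concatenated.
rng = np.random.default_rng(0)
fingerprints = rng.normal(size=(500, 60))

X = StandardScaler().fit_transform(fingerprints)

# t-SNE keeps the dominant neighbourhood structure while suppressing noisy dimensions.
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(X)
print(embedding.shape)  # (500, 2) low-dimensional features for the radio map
```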

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 58
3826 General Architecture for Automation of Machine Learning Practices

Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain

Abstract:

Data collection, data preparation, model training, model evaluation, and deployment are all processes in a typical machine learning workflow. Training data needs to be gathered and organised. This often entails collecting a sizable dataset and cleaning it to remove or correct any inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing it after it has been acquired. This often entails actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by determining metrics like accuracy, precision, and recall, utilising a test dataset. Every time a new model is built, both data pre-processing and model training—two crucial processes in the machine learning (ML) workflow—must be carried out. Thus, various machine learning algorithms can be employed with every single approach to data pre-processing, generating a large set of combinations to choose from. For example, for every method to handle missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of features selected, a different algorithm can be used. As a result, in order to get the optimum outcomes, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large “combination set of pre-processing steps and algorithms” into an automated workflow which simplifies the task of carrying out all possibilities.
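
As a rough illustration of the combination set the architecture organizes, the sketch below enumerates preprocessing and algorithm choices with scikit-learn pipelines; the specific operators in the pool and the benchmark dataset are examples for illustration, not the paper's operator pool:

```python
from itertools import product
from sklearn.datasets import load_breast_cancer
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Operator pool: every imputation x scaling x algorithm combination is a candidate workflow.
imputers = {"mean": SimpleImputer(strategy="mean"),
            "median": SimpleImputer(strategy="median")}
scalers = {"standard": StandardScaler(), "minmax": MinMaxScaler()}
models = {"logreg": LogisticRegression(max_iter=1000),
          "forest": RandomForestClassifier(n_estimators=100)}

results = {}
for (i_name, imp), (s_name, sc), (m_name, mdl) in product(
        imputers.items(), scalers.items(), models.items()):
    pipe = Pipeline([("impute", imp), ("scale", sc), ("model", mdl)])
    results[(i_name, s_name, m_name)] = cross_val_score(pipe, X, y, cv=5).mean()

# A scheduler would pick the best-performing combination automatically.
best = max(results, key=results.get)
print(best, round(results[best], 3))
```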

Keywords: machine learning, automation, AUTOML, architecture, operator pool, configuration, scheduler

Procedia PDF Downloads 42
3825 The Role of Vernacular Radio Stations in Enhancing Agricultural Development in Kenya; A Case of KASS FM

Authors: Thomas Kipkurgat, Silahs Chemwaina

Abstract:

Communication and ICT are crucial components in the realization of Vision 2030, and radio has played a key role in the dissemination of information to a mass audience. Since time immemorial, mass media has played a vital role in passing on information about agricultural development issues both locally and internationally. This paper aimed at assessing the role of community radio stations in enhancing agricultural development in Kenya. The paper sought to identify the main contributions of KASS FM radio to agricultural development, especially in rural areas, and to establish the appropriate adjustments in the editorial policies of KASS FM radio for promoting agricultural development related programmes in rural areas. Despite some weaknesses in radio programming and the mode of interaction with the rural people, the findings of this study showed that the rural communities are better off today than in the old days when FM radios were non-existent. KASS FM has come up with different developmental programmes that have positively contributed to changing the rural people’s ways of life. These programmes cover farming, health, marital values, environment, cultural issues, human rights, democracy, religious teachings, peace and reconciliation. Such programmes feature experts, professionals and opinion leaders who address numerous topics of interest to the community. The local people participate in the production of these programmes through letters to the editor and phone-ins, among others. Programmes such as political talk shows, which feature on KASS FM, have become one of the most important avenues of community participation. The interpretation and conclusions are based on the empirical data analysis and the theories of development advanced by international development communication scholars, as presented in the paper. The study ends with some recommendations on how KASS FM can best serve the interests of the poor people in rural areas and help improve their lives.

Keywords: agriculture, development, communication, KASS FM, radio, rural areas, Kenya

Procedia PDF Downloads 279
3824 Speech Emotion Recognition: A DNN and LSTM Comparison in Single and Multiple Feature Application

Authors: Thiago Spilborghs Bueno Meyer, Plinio Thomaz Aquino Junior

Abstract:

Through speech, which privileges the functional and interactive nature of the text, it is possible to ascertain the spatiotemporal circumstances, the conditions of production and reception of the discourse, and explicit purposes such as informing, explaining, convincing, etc. These conditions allow bringing human-robot interaction closer to the interaction between humans, making it natural and sensitive to information. However, it is not enough to understand what is said; it is necessary to recognize emotions for the desired interaction. The validity of the use of neural networks for feature selection and emotion recognition was verified. For this purpose, the use of neural networks and the comparison of models, such as recurrent neural networks and deep neural networks, is proposed in order to classify emotions from speech signals and verify the quality of recognition. The goal is to enable the deployment of robots in a domestic environment, such as the HERA robot from the RoboFEI@Home team, which focuses on autonomous service robots for the domestic environment. Tests were performed using only the Mel-Frequency Cepstral Coefficients (MFCC), as well as tests with several additional characteristics: Delta-MFCC, spectral contrast, and the Mel spectrogram. To carry out the training, validation and testing of the neural networks, the eNTERFACE’05 database was used, which has 42 speakers of 14 different nationalities speaking English. The data in the chosen database are videos that were converted into audio for use with the neural networks. As a result, a classification accuracy of 51.969% was found when using the deep neural network, while the recurrent neural network achieved an accuracy of 44.09%. The results are more accurate when only the Mel-Frequency Cepstral Coefficients are used for classification with the deep neural network; in only one case is a greater accuracy observed for the recurrent neural network, which occurs when several features are used with a batch size of 73 and 100 training epochs.
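
A minimal sketch of the single-feature setup described above, extracting MFCCs (and optionally delta-MFCCs and spectral contrast) from the converted audio before feeding a classifier, might look like this with librosa (the file name and parameter values are illustrative assumptions):

```python
import numpy as np
import librosa

# Load one utterance converted from the eNTERFACE'05 videos (hypothetical file).
signal, sr = librosa.load("utterance.wav", sr=16000)

# Mel-Frequency Cepstral Coefficients per frame.
mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=40)

# Optional extra characteristics used in the multi-feature tests.
delta = librosa.feature.delta(mfcc)
contrast = librosa.feature.spectral_contrast(y=signal, sr=sr)

# Average over time to obtain one fixed-length vector per utterance.
features = np.concatenate([mfcc.mean(axis=1), delta.mean(axis=1), contrast.mean(axis=1)])
print(features.shape)
```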

Keywords: emotion recognition, speech, deep learning, human-robot interaction, neural networks

Procedia PDF Downloads 149
3823 Predictors of Response to Interferon Therapy in Chronic Hepatitis C Virus Infection

Authors: Ali Kassem, Ehab Fawzy, Mahmoud Sef el-eslam, Fatma Salah-Eldeen, El zahraa Mohamed

Abstract:

Introduction: The combination of interferon (INF) and ribavirin is the preferred treatment for chronic hepatitis C virus (HCV) infection. However, non-response to this therapy remains common and is associated with several factors, such as HCV genotype and HCV viral load, in addition to host factors such as sex, HLA type and cytokine polymorphisms. Aim of the work: The aim of this study was to determine predictors of response to interferon therapy in chronic HCV-infected patients treated with INF alpha and ribavirin combination therapy. Patients and Methods: The present study included 110 patients (62 males, 48 females) with chronic HCV infection. Their ages ranged from 20 to 59 years. Inclusion criteria were organized according to the protocol of the Egyptian National Committee for Control of Viral Hepatitis. Patients included in this study were recruited to receive INF-ribavirin combination therapy; 54 patients received pegylated INF α-2a (180 μg) and weight-based ribavirin therapy (1000 mg if < 75 kg, 1200 mg if > 75 kg) for 48 weeks, and 53 patients received pegylated INF α-2b (1.5 μg/kg/week) and weight-based ribavirin therapy (800 mg if < 65 kg, 1000 mg if 65-75 kg and 1200 mg if > 75 kg). One hundred and seven liver biopsies were included in the study and submitted to histopathological examination. Hematoxylin and eosin (H&E) stained sections were used to assess both the grade and the stage of chronic viral hepatitis, in addition to the degree of steatosis. The modified hepatic activity index (HAI) grading, modified Ishak staging and Metavir grading and staging systems were used. Laboratory follow-up included HCV PCR at the 12th week, to assess the early virologic response (EVR), and again at the 24th week. HCV PCR was also done at the end of the course and 6 months later to document the end-of-treatment virologic response (ETR) and the sustained virologic response (SVR), respectively. Results: One hundred and seven patients, 62 males (57.9%) and 45 females (42.1%), completed the course and were included in this study. The age of the patients ranged from 20 to 59 years with a mean of 40.39 ± 10.03 years. Six months after the end of treatment, patients were categorized into two groups: Group (1), patients who achieved a sustained virological response (SVR), and Group (2), patients who did not achieve a sustained virological response (non-SVR), including non-responders, breakthrough and relapsers. In our study, 58 (54.2%) patients showed SVR, 18 (16.8%) patients were non-responders, 15 (14%) patients showed breakthrough and 16 (15%) patients were relapsers. Univariate binary regression analysis of the possible risk factors for non-SVR showed that the significant factors were higher age, higher fasting insulin level, higher Metavir stage and higher grade of hepatic steatosis. Multivariate binary regression analysis showed that the only independent risk factor for non-SVR was a high fasting insulin level. Conclusion: Younger age, lower Metavir stage, lower steatosis grade and lower fasting insulin level are good predictors of SVR and could be used in predicting the treatment response to pegylated interferon/ribavirin therapy.

Keywords: chronic HCV infection, interferon ribavirin combination therapy, predictors to antiviral therapy, treatment response

Procedia PDF Downloads 383
3822 Improving Security by Using Secure Servers Communicating via Internet with Standalone Secure Software

Authors: Carlos Gonzalez

Abstract:

This paper describes the use of the Internet to enhance the security of software that is to be distributed or sold to users potentially all over the world. By placing some of the features of the secure software in a secure server, we increase the security of that software. The communication between the protected software and the secure server is done by a double-lock algorithm. This paper also includes an analysis of intruders and describes possible responses to detect threats.

Keywords: internet, secure software, threats, cryptography process

Procedia PDF Downloads 314
3821 Online Dietary Management System

Authors: Kyle Yatich Terik, Collins Oduor

Abstract:

Information technology has made the current healthcare system more accessible and efficient, for example through computer algorithms that generate menus based on a diagnosis. While many systems like these have been created over the years, their main objective is to help healthy individuals calculate their calorie intake and assist them by providing food selections based on a pre-specified calorie target. Such applications have proven useful in some ways, but they are not suitable for monitoring, planning, and managing the dietary needs of hospital patients, especially those in critical condition. The system presented here addresses a number of objectives. The main objective is to design, develop and implement an efficient, user-friendly and interactive dietary management system. The specific design and development objectives include developing a system that provides a monitoring feature for users using graphs, provides system-generated reports to users, dietitians, and system admins, allows users to measure their BMI (Body Mass Index), and offers a food template feature that guides the user on a balanced diet plan. In order to develop the system, research was carried out in Nairobi County, Kenya, using online questionnaires as the preferred research design approach. From the 44 respondents, the major challenges of the manual dietary system could be identified; these include the lack of easily accessible information on the calorie content of food products and the expense of physically visiting a dietitian to create a tailored diet plan. Conclusively, the system has the potential of improving the quality of life of people as a whole by providing a standard for healthy living and allowing individuals to have readily available knowledge through food templates that guide people and allow users to create their own diet plans consisting of a balanced diet.
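
For reference, the BMI feature mentioned above follows the standard formula; a minimal sketch, assuming weight in kilograms and height in metres:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index = weight (kg) divided by height squared (m^2)."""
    return weight_kg / (height_m ** 2)

print(round(bmi(70, 1.75), 1))  # 22.9, within the commonly cited 18.5-24.9 'normal' range
```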

Keywords: DMS, dietitian, patient, administrator

Procedia PDF Downloads 146
3820 Degradation of Amitriptyline Hydrochloride, Methyl Salicylate and 2-Phenoxyethanol in Water Systems by the Combination UV/Cl2

Authors: F. Javier Benitez, Francisco J. Real, Juan Luis Acero, Francisco Casas

Abstract:

Three emerging contaminants (amitriptyline hydrochloride, methyl salicylate and 2-phenoxyethanol) frequently found in wastewaters were selected to be individually degraded in ultra-pure water by the combined advanced oxidation process constituted by UV radiation and chlorine. The influence of pH, initial chlorine concentration and the nature of the contaminants was first explored. The trend in the reactivity of the selected compounds was deduced: amitriptyline hydrochloride > methyl salicylate > 2-phenoxyethanol. A later kinetic study focused on the evaluation of the first-order rate constants and the determination of the partial contributions to the global reaction of the direct photochemical pathway and the radical pathway. A comparison of the rate constant values between photochemical experiments without and with the presence of Cl2 reveals a clear increase in the oxidation efficiency of the combined process with respect to the photochemical reaction alone. In a second stage, the simultaneous oxidation of mixtures of the selected contaminants in several types of water (ultrapure water, surface water from a reservoir, and two secondary effluents) was also performed by the same UV/Cl2 combination under more realistic operating conditions. The efficiency of this combined UV/Cl2 system was compared to other oxidants such as the UV/S2O8^2- and UV/H2O2 AOPs. The results confirmed that the UV/Cl2 system provides the highest elimination efficiencies among the AOPs tested.
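
For clarity, the first-order treatment referred to above and the partitioning of the observed rate constant between the direct photolysis and radical pathways take the usual form (the symbols here are generic, not the authors' notation):

```latex
\[
\ln\!\frac{[C]_0}{[C]_t} = k_{\mathrm{obs}}\, t,
\qquad
k_{\mathrm{obs}} \approx k_{\mathrm{UV}} + k_{\mathrm{radical}}
\]
```

Here $k_{\mathrm{UV}}$ is estimated from the chlorine-free photolysis runs, and the radical contribution $k_{\mathrm{radical}}$ is the additional rate observed when Cl2 is present.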

Keywords: emerging contaminants, UV/chlorine advanced oxidation process, amitriptyline, methyl salicylate, 2-phenoxyethanol, chlorination, photolysis

Procedia PDF Downloads 325
3819 A Discussion on Urban Planning Methods after Globalization within the Context of Anticipatory Systems

Authors: Ceylan Sozer, Ece Ceylan Baba

Abstract:

The reforms and changes that began with industrialization in cities and continued with globalization in the 1980s created many changes in urban environments. City centers that had been deserted due to industrialization began to get crowded with globalization and became the heart of technology, commerce and social activities. While immediate and intense alterations were planned around rigorous visions in developed countries, several urban areas where these processes were underestimated and no precautions were taken faced irreversible situations. When the effects of globalization on cities are examined, it is seen that some cities have anticipatory system plans for future problems. Several cities, such as New York, London and Tokyo, have planned to resolve probable future problems in a systematic scheme in order to decrease possible side effects during globalization. The decisions in urban planning and their applications are the main points in terms of sustainability and livability in such mega-cities. This article examines the effects of globalization on urban planning through three mega-cities and their planning applications. When the urban planning applications of the three mega-cities are investigated, it is seen that the city plans are generated in light of past experiences and predictions of a certain future. In urban planning, the past and present experiences of a city should be examined, and future projections can then be made together with current world dynamics in a systematic way. In this study, methods used in urban planning are discussed, the ‘Anticipatory System’ model is explained, and its relation to global urban planning is discussed. The concept of ‘anticipation’ is a phenomenon that means creating foresights and predictions about the future by combining past, present and future within an action plan. The main distinctive feature that separates anticipatory systems from other systems is this combination of past, present and future that concludes with an act. Urban plans that consist of various parameters and interactions are identified as ‘live’ and have systematic integrity. Urban planning with an anticipatory system might be alive and can foresee some ‘side effects’ in the design process. After globalization, cities became more complex and should be designed within an anticipatory system model. Such cities can be more livable and can have sustainable urban conditions today and in the future. In this study, the urban planning of Istanbul is analyzed in comparison with the city plans of New York, Tokyo and London in terms of anticipatory system models. The lack of such a system in İstanbul and its side effects are discussed. When past and present actions in urban planning are approached through an anticipatory system, it can give more accurate and sustainable results in the future.

Keywords: globalization, urban planning, anticipatory system, New York, London, Tokyo, Istanbul

Procedia PDF Downloads 138
3818 GAILoc: Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight propagation, multipath, and weather conditions, GNSS does not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed method can significantly improve positioning performance and reduce radio map construction costs.

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 54
3817 A Randomised Controlled Study to Compare Efficacy and Safety of Bupivacaine plus Dexamethasone Versus Bupivacaine plus Fentanyl for Caudal Block in Children

Authors: Ashwini Patil

Abstract:

Caudal block is one of the most commonly used regional anesthetic techniques in children. Currently, fentanyl is used as an adjuvant to bupivacaine to prolong analgesia, but fentanyl is a narcotic. Dexamethasone, a glucocorticoid with strong anti-inflammatory effects, provides improvement in post-operative analgesia and post-operative side effects. However, its analgesic efficacy and safety in comparison with fentanyl have not been extensively studied. The objective of this randomized controlled study is therefore to compare dexamethasone with fentanyl as an adjuvant to bupivacaine for caudal block in children in relation to the duration of caudal analgesia, post-operative analgesic requirement and incidence of post-operative nausea and vomiting. This study included 100 children, aged 1–6 years, undergoing lower abdominal surgeries. Patients were randomized into two groups of 50 each to receive a combination of dexamethasone 0.2 mg/kg with 1 ml/kg bupivacaine 0.25% (group A) or a combination of fentanyl 1 µg/kg with 1 ml/kg bupivacaine 0.25% (group B). In the post-operative period, pain was assessed using a Modified Objective Pain Scale (MOPS) until 12 hr after surgery, and rescue analgesia was administered when a MOPS score of 4 or more was recorded. Residual motor block, the number of analgesic doses required within 24 hr after surgery, sedation scores, intra-operative and post-operative hemodynamic variables, post-operative nausea and vomiting (PONV), and other adverse effects were recorded. Data were analysed using the unpaired t-test, and P < 0.05 was considered statistically significant. Group A showed a significantly longer time to first analgesic requirement than group B (p < 0.05). The number of rescue analgesic doses required in the first 24 h was significantly smaller in group A (p < 0.05). Group A showed significantly lower MOPS scores than group B (p < 0.05). Intra-operative and post-operative hemodynamic variables, Modified Bromage Scale scores, and sedation scores were comparable in both groups. Group A showed significantly fewer incidences of PONV compared with group B (p < 0.05). This study reveals that adding dexamethasone to bupivacaine prolongs the duration of postoperative analgesia and decreases the incidence of PONV compared to the combination of fentanyl and bupivacaine after a caudal block in pediatric patients.

Keywords: bupivacaine, caudal analgesia, dexamethasone, pediatric

Procedia PDF Downloads 196
3816 Implementation of Correlation-Based Data Analysis as a Preliminary Stage for the Prediction of Geometric Dimensions Using Machine Learning in the Forming of Car Seat Rails

Authors: Housein Deli, Loui Al-Shrouf, Hammoud Al Joumaa, Mohieddine Jelali

Abstract:

When forming metallic materials, fluctuations in material properties, process conditions, and wear lead to deviations in the component geometry. Several hundred features sometimes need to be measured, especially in the case of functional and safety-relevant components. These can only be measured offline due to the large number of features and the accuracy requirements. The risk of producing components outside the tolerances is minimized but not eliminated by the statistical evaluation of process capability and control measurements. The inspection intervals are based on the acceptable risk and are at the expense of productivity but remain reactive and, in some cases, considerably delayed. Due to the considerable progress made in the field of condition monitoring and measurement technology, permanently installed sensor systems in combination with machine learning and artificial intelligence, in particular, offer the potential to independently derive forecasts for component geometry and thus eliminate the risk of defective products - actively and preventively. The reliability of forecasts depends on the quality, completeness, and timeliness of the data. Measuring all geometric characteristics is neither sensible nor technically possible. This paper, therefore, uses the example of car seat rail production to discuss the necessary first step of feature selection and reduction by correlation analysis, as otherwise, it would not be possible to forecast components in real-time and inline. Four different car seat rails with an average of 130 features were selected and measured using a coordinate measuring machine (CMM). The run of such measuring programs alone takes up to 20 minutes. In practice, this results in the risk of faulty production of at least 2000 components that have to be sorted or scrapped if the measurement results are negative. Over a period of 2 months, all measurement data (> 200 measurements/ variant) was collected and evaluated using correlation analysis. As part of this study, the number of characteristics to be measured for all 6 car seat rail variants was reduced by over 80%. Specifically, direct correlations for almost 100 characteristics were proven for an average of 125 characteristics for 4 different products. A further 10 features correlate via indirect relationships so that the number of features required for a prediction could be reduced to less than 20. A correlation factor >0.8 was assumed for all correlations.
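
A compact sketch of the correlation-based reduction step described above, flagging characteristics whose pairwise correlation exceeds 0.8 so that only one representative per correlated group needs to be measured, could look like this (the CSV layout is an assumption for illustration):

```python
import pandas as pd

# One row per measured rail, one column per CMM characteristic (hypothetical file).
measurements = pd.read_csv("cmm_measurements.csv")

corr = measurements.corr().abs()
threshold = 0.8

keep, drop = [], set()
for col in corr.columns:
    if col in drop:
        continue
    keep.append(col)
    # Every characteristic strongly correlated with an already-kept one is redundant.
    partners = corr.index[(corr[col] > threshold) & (corr.index != col)]
    drop.update(partners)

print(f"{len(corr.columns)} characteristics -> {len(keep)} to measure directly")
```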

Keywords: long-term SHM, condition monitoring, machine learning, correlation analysis, component prediction, wear prediction, regressions analysis

Procedia PDF Downloads 22
3815 Co-Creational Model for Blended Learning in a Flipped Classroom Environment Focusing on the Combination of Coding and Drone-Building

Authors: A. Schuchter, M. Promegger

Abstract:

The outbreak of the COVID-19 pandemic has shown us that online education is so much more than just a cool feature for teachers – it is an essential part of modern teaching. In online math teaching, it is common to use tools to share screens, compute and calculate mathematical examples, while the students can watch the process. On the other hand, flipped classroom models are on the rise, with their focus on how students can gather knowledge by watching videos and on the teacher’s use of technological tools for information transfer. This paper proposes a co-educational teaching approach for coding and engineering subjects with the help of drone-building to spark interest in technology and create a platform for knowledge transfer. The project combines aspects from mathematics (matrices, vectors, shaders, trigonometry), physics (force, pressure and rotation) and coding (computational thinking, block-based programming, JavaScript and Python) and makes use of collaborative-shared 3D Modeling with clara.io, where students create mathematics knowhow. The instructor follows a problem-based learning approach and encourages their students to find solutions in their own time and in their own way, which will help them develop new skills intuitively and boost logically structured thinking. The collaborative aspect of working in groups will help the students develop communication skills as well as structural and computational thinking. Students are not just listeners as in traditional classroom settings, but play an active part in creating content together by compiling a Handbook of Knowledge (called “open book”) with examples and solutions. Before students start calculating, they have to write down all their ideas and working steps in full sentences so other students can easily follow their train of thought. Therefore, students will learn to formulate goals, solve problems, and create a ready-to use product with the help of “reverse engineering”, cross-referencing and creative thinking. The work on drones gives the students the opportunity to create a real-life application with a practical purpose, while going through all stages of product development.

Keywords: flipped classroom, co-creational education, coding, making, drones, co-education, ARCS-model, problem-based learning

Procedia PDF Downloads 109
3814 Evaluation of Correct Usage, Comfort and Fit of Personal Protective Equipment in Construction Work

Authors: Anna-Lisa Osvalder, Jonas Borell

Abstract:

There are several reasons behind the use, non-use, or inadequate use of personal protective equipment (PPE) in the construction industry. Comfort and accurate size support proper use, while discomfort, misfit, and difficulties in understanding how the PPE should be handled inhibit correct usage. The need to wear several items of protective equipment simultaneously might also create problems. The purpose of this study was to analyse the correct usage, comfort, and fit of different types of PPE used for construction work. Correct usage was analysed as guessability, i.e., human perceptions of how to don, adjust, use, and doff the equipment, and whether it is used as intended. The PPE tested, individually or in combination, comprised a helmet, ear protectors, goggles, respiratory masks, gloves, protective clothing, and safety harnesses. First, an analytical evaluation was performed with ECW (enhanced cognitive walkthrough) and PUEA (predictive use error analysis) to search for usability problems and use errors during handling and use. Then usability tests were conducted to evaluate guessability, comfort, and fit with 10 test subjects of different heights and body constitutions. The tests included observations during donning, five different outdoor work tasks, and doffing. The think-aloud method, short interviews, and subjective estimations were used. The analytical evaluation showed that some usability problems and use errors arise during donning and doffing, but with minor severity, mostly causing discomfort. A few use errors and usability problems arose for the safety harness, especially for novices, where some could lead to a high risk of severe incidents. The usability tests showed that discomfort arose for all test subjects when using a combination of PPE, increasing over time. For instance, the goggles, together with the face mask, caused pressure, chafing at the nose, and heat rash on the face. This combination also limited the field of vision. The helmet, in combination with the goggles and ear protectors, did not fit well and caused uncomfortable pressure at the temples. No major problems were found with the individual fit of the PPE. The ear protectors, goggles, and face masks could be adjusted to different head sizes. The guessability of how to don and wear the combination of PPE was moderate, but it took some time to adjust the items for a good fit. The guessability was poor for the safety harness; few clues in the design showed how it should be donned, adjusted, or worn on the skeletal bones. Discomfort occurred when the straps were tightened too much, and not all straps could be adjusted to every body constitution, leading to non-optimal safety. To conclude, if several types of PPE are used together, discomfort leading to pain is likely to occur over time, which can lead to misuse, non-use, or reduced performance. If people who are not regular users are to wear a safety harness correctly, the design needs to be improved for easier interpretation, correct positioning of the straps, and increased possibilities for individual adjustment. The results from this study can serve as a basis for re-design ideas for PPE, especially when items are to be used in combination.

Keywords: construction work, PPE, personal protective equipment, misuse, guessability, usability

Procedia PDF Downloads 75
3813 Radar Track-based Classification of Birds and UAVs

Authors: Altilio Rosa, Chirico Francesco, Foglia Goffredo

Abstract:

In recent years, the number of Unmanned Aerial Vehicles (UAVs) has increased significantly. The rapid development of commercial and recreational drones makes them an important part of our society. Despite the growing list of their applications, these vehicles pose a huge threat to civil and military installations: detection, classification and neutralization of such flying objects have become an urgent need. Radar is an effective remote sensing tool for detecting and tracking flying objects, but scenarios characterized by a high number of tracks related to flying birds make the drone detection task especially challenging: the operator's plan position indicator (PPI) is cluttered with a huge number of potential threats, and the operator's reaction time can be severely affected. Flying birds show velocity, radar cross-section and, in general, characteristics similar to those of UAVs. Building on the absence of any single feature able to distinguish UAVs from birds, this paper uses a multiple-feature approach in which an original feature selection technique is developed to feed binary classifiers trained to distinguish birds and UAVs. Radar tracks acquired in the field for different UAVs and birds performing various trajectories were used to extract specifically designed target-movement-related features based on velocity, trajectory and signal strength. An optimization strategy based on a genetic algorithm is also introduced to select the optimal subset of features and to estimate the performance of several classification algorithms (neural network, SVM, logistic regression, etc.), both in terms of the number of selected features and the misclassification error. Results show that the proposed methods are able to reduce the dimension of the data space and to remove almost all non-drone false targets with a suitable classification accuracy (higher than 95%).
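
A simplified sketch of the genetic-algorithm feature-subset search described above (binary masks over track features, cross-validated classification accuracy as the fitness) is given below; the population size, rates, logistic-regression stand-in classifier, and synthetic data are illustrative assumptions, not the authors' settings:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    # Cross-validated accuracy of a simple classifier on the selected features.
    if mask.sum() == 0:
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def ga_select(X, y, pop_size=20, generations=15, mutation_rate=0.05):
    n_features = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_features))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        # Tournament selection of parents.
        parents = pop[[max(rng.choice(pop_size, 2), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # One-point crossover followed by bit-flip mutation.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_features)
            children[i, cut:], children[i + 1, cut:] = (
                parents[i + 1, cut:].copy(), parents[i, cut:].copy())
        flips = rng.random(children.shape) < mutation_rate
        children[flips] = 1 - children[flips]
        pop = children
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()], scores.max()

# Hypothetical track features (velocity, trajectory and signal-strength statistics).
X = rng.normal(size=(300, 12))
y = rng.integers(0, 2, 300)
best_mask, best_score = ga_select(X, y)
print(best_mask, round(best_score, 3))
```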

Keywords: birds, classification, machine learning, UAVs

Procedia PDF Downloads 206
3812 Advancements in Predicting Diabetes Biomarkers: A Machine Learning Epigenetic Approach

Authors: James Ladzekpo

Abstract:

Background: The urgent need to identify new pharmacological targets for diabetes treatment and prevention has been amplified by the disease's extensive impact on individuals and healthcare systems. A deeper insight into the biological underpinnings of diabetes is crucial for the creation of therapeutic strategies aimed at these biological processes. Current predictive models based on genetic variations fall short of accurately forecasting diabetes. Objectives: Our study aims to pinpoint key epigenetic factors that predispose individuals to diabetes. These factors will inform the development of an advanced predictive model that estimates diabetes risk from genetic profiles, utilizing state-of-the-art statistical and data mining methods. Methodology: We have implemented a recursive feature elimination with cross-validation using the support vector machine (SVM) approach for refined feature selection. Building on this, we developed six machine learning models, including logistic regression, k-Nearest Neighbors (k-NN), Naive Bayes, Random Forest, Gradient Boosting, and Multilayer Perceptron Neural Network, to evaluate their performance. Findings: The Gradient Boosting Classifier excelled, achieving a median recall of 92.17% and outstanding metrics such as area under the receiver operating characteristics curve (AUC) with a median of 68%, alongside median accuracy and precision scores of 76%. Through our machine learning analysis, we identified 31 genes significantly associated with diabetes traits, highlighting their potential as biomarkers and targets for diabetes management strategies. Conclusion: Particularly noteworthy were the Gradient Boosting Classifier and Multilayer Perceptron Neural Network, which demonstrated potential in diabetes outcome prediction. We recommend future investigations to incorporate larger cohorts and a wider array of predictive variables to enhance the models' predictive capabilities.
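
A minimal sketch of the feature-selection step described above, SVM-based recursive feature elimination with cross-validation followed by one of the evaluated classifiers, could look like this in scikit-learn (the CSV file and column names are placeholders; the study's actual feature matrix is not reproduced here):

```python
import pandas as pd
from sklearn.feature_selection import RFECV
from sklearn.svm import SVC
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Hypothetical feature matrix (samples x epigenetic features) and diabetes labels.
data = pd.read_csv("epigenetic_features.csv")
X, y = data.drop(columns="diabetes"), data["diabetes"]

# RFE with cross-validation needs a linear kernel so feature weights are available.
selector = RFECV(estimator=SVC(kernel="linear"),
                 step=1,
                 cv=StratifiedKFold(5),
                 scoring="recall")
selector.fit(X, y)
print("selected features:", list(X.columns[selector.support_]))

# Evaluate the best-performing model from the study on the reduced feature set.
gb = GradientBoostingClassifier()
recall = cross_val_score(gb, X.loc[:, selector.support_], y, cv=5, scoring="recall")
print("median recall:", pd.Series(recall).median())
```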

Keywords: diabetes, machine learning, prediction, biomarkers

Procedia PDF Downloads 45
3811 Producing Graphical User Interface from Activity Diagrams

Authors: Ebitisam K. Elberkawi, Mohamed M. Elammari

Abstract:

A Graphical User Interface (GUI) is essential to programming, as is any other characteristic or feature, because GUI components provide the fundamental interaction between the user and the program. Thus, we must give more attention to the GUI during the building and development of systems, and greater attention to the user, who is the cornerstone of any interaction with the GUI. This paper introduces an approach for designing a GUI from one of the models of business workflows that describe the workflow behavior of a system, specifically activity diagrams (AD).

Keywords: activity diagram, graphical user interface, GUI components, program

Procedia PDF Downloads 448
3810 An Evaluation of Renewable Energy Sources in Green Building Systems for the Residential Sector in the Metropolis, Kolkata, India

Authors: Tirthankar Chakraborty, Indranil Mukherjee

Abstract:

The environmental aspect began to have a major effect on industrial decisions once the deteriorating condition of our surroundings due to industrial activities became apparent. Green buildings have been seen as a possible solution to reduce the carbon emissions from construction projects and the housing industry in general. Though this has been established in several areas, with many commercial buildings being designed green, the scope for expansion is still significant, and further information on the importance and advantages of green buildings is necessary. Several commercial green building projects have come up, and green buildings are mainly implemented in the residential sector when residential projects are constructed to furnish amenities to a large population. However, residential buildings, even those of medium size, can be designed to incorporate elements of sustainable design. In this context, this paper attempts to give a theoretical appraisal of the use of renewable energy systems in residential buildings of different sizes, considering the weather conditions (solar insolation and wind speed) of the metropolis, Kolkata, India. Three cases are taken: one with solar power, one with wind power, and one with a combination of the two. All the cases are considered in conjunction with conventional energy, and the efficiency of each in fulfilling the total energy demand is verified. The optimum combination for reducing the carbon footprint of the residential building is thus established. In addition, an assessment of the amount of money saved due to green buildings in metered water supply and the price of coal is also presented.

Keywords: renewable energy, green buildings, solar power, wind power, energy hybridization, residential sector

Procedia PDF Downloads 376
3809 Voice Liveness Detection Using Kolmogorov Arnold Networks

Authors: Arth J. Shah, Madhu R. Kamble

Abstract:

Voice biometric liveness detection aims to certify that the voice data presented during an authentication process is genuine and not a recording or a synthetic voice. With the rise of deepfakes and other equally sophisticated spoofing generation techniques, it is becoming challenging to ensure that the person on the other end is a live speaker. A Voice Liveness Detection (VLD) system is a group of security measures that detect and prevent voice spoofing attacks. Motivated by the recent development of the Kolmogorov-Arnold Network (KAN) based on the Kolmogorov-Arnold theorem, we propose KAN for the VLD task. To date, multilayer perceptron (MLP) based classifiers have been used for such classification tasks. We aim to capture not only the compositional structure of the model but also to optimize the values of the univariate functions. This study presents both a mathematical and an experimental analysis of KAN for VLD tasks, thereby opening a new perspective for scientists working on speech and signal processing-based tasks. The study emerges as a combination of traditional signal processing tasks and new deep learning models, which proved to be a better combination for VLD tasks. The experiments are performed on the POCO and ASVspoof 2017 V2 databases. We used Constant-Q transform, Mel, and short-time Fourier transform (STFT) based front-end features and used CNN, BiLSTM, and KAN as back-end classifiers. The best accuracy is 91.26% on the POCO database, using STFT features with the KAN classifier. On the ASVspoof 2017 V2 database, the lowest EER obtained was 26.42%, using CQT features and KAN as the classifier.

Keywords: Kolmogorov Arnold networks, multilayer perceptron, pop noise, voice liveness detection

Procedia PDF Downloads 23
3808 CPW-Fed Broadband Circularly Polarized Planar Antenna with Improved Ground

Authors: Gnanadeep Gudapati, V. Annie Grace

Abstract:

A broadband circular polarization (CP) feature is designed for a CPW-fed planar printed monopole antenna. A rectangular patch and an improved ground plane make up the antenna. The antenna's impedance bandwidth can be increased by adding a vertical stub and a horizontal slit in the ground plane. The measured results show that the proposed antenna has a wide 10-dB return loss bandwidth of 70.2% (4.35 GHz, 3.7-8.1 GHz) centered at 4.2 GHz.

Keywords: CPW-fed, circular polarised, FR4 epoxy, slit and stub

Procedia PDF Downloads 138
3807 Effect of Extraction Methods on the Fatty Acids and Physicochemical Properties of Serendipity Berry Seed Oil

Authors: Olufunmilola A. Abiodun, Adegbola O. Dauda, Ayobami Ojo, Samson A. Oyeyinka

Abstract:

Serendipity berry (Dioscoreophyllum cumminsii Diels) is a tropical dioecious rainforest vine native to tropical Africa. The vine grows during the rainy season and is used mainly as a sweetener. The sweetener in the berry is known as monellin, which is sweeter than sucrose. The sweetener is extracted from the fruits and the seed is discarded. The discarded seeds contain bitter principles but give a high yield of oil. Serendipity oil was extracted using three methods (n-hexane, expression, and expression/n-hexane), and the fatty acids and physicochemical properties of the oil obtained were determined. The oil obtained was clear and liquid and had an odour similar to hydrocarbons. The percentage oil yield was 38.59, 12.34 and 49.57% for the hexane, expression and expression-hexane methods, respectively. The seed contained a high percentage of oil, especially when the combination of expression and hexane was used, while a low percentage of oil was obtained using the expression method. The refractive index values obtained were 1.443, 1.442 and 1.478 for the hexane, expression and expression-hexane methods, respectively. The peroxide value obtained for expression-hexane was higher than those for hexane and expression. The viscosities of the oil were 125.8, 128.76 and 126.87 cm³/s for the hexane, expression and expression-hexane methods, respectively, which showed that the oil from the expression method was more viscous than the other oils. The major fatty acids in serendipity seed oil were oleic acid (62.81%), linoleic acid (22.65%), linolenic acid (6.11%), palmitic acid (5.67%) and stearic acid (2.21%), in decreasing order. Oleic acid, a monounsaturated fatty acid, had the highest value. Total unsaturated fatty acids were 91.574, 92.256 and 90.426% for hexane, expression, and expression-hexane, respectively. The combination of expression and hexane for the extraction of serendipity oil produced a high yield of oil. The oil could be refined for food and non-food applications.

Keywords: serendipity seed oil, expression method, fatty acid, hexane

Procedia PDF Downloads 260
3806 Analytical Development of a Failure Limit and Iso-Uplift Curves for Eccentrically Loaded Shallow Foundations

Authors: N. Abbas, S. Lagomarsino, S. Cattari

Abstract:

Examining existing experimental results for shallow rigid foundations subjected to a vertical centric load (N), accompanied or not by a bending moment (M), two main non-linear mechanisms governing the cyclic response of the soil-foundation system can be distinguished: foundation uplift and soil yielding. A soil-foundation failure limit is defined as a domain of resistance in the two-dimensional (2D) load space (N, M) inside of which lie all the admissible combinations of loads; the latter correspond to a pure elastic, non-linear elastic or plastic behavior of the soil-foundation system, while the points lying on the failure limit correspond to combinations of loads leading to failure of the soil-foundation system. In this study, the proposed resistance domain is constructed analytically based on mechanics. Original elastic limit, uplift initiation limit and iso-uplift limits are constructed inside this domain. These limits give a prediction of the mechanisms activated for each combination of loads applied to the foundation. A comparison of the proposed failure limit with experimental tests existing in the literature shows interesting results. Also, the developed uplift initiation limit and iso-uplift curves are compared with others already proposed in the literature and widely used due to the absence of other alternatives; remarkable differences are noted, showing evident errors in the past proposals and considerable accuracy for those given in the present work.

Keywords: foundation uplift, iso-uplift curves, resistance domain, soil yield

Procedia PDF Downloads 374
3805 An Approach to Solving Some Inverse Problems for Parabolic Equations

Authors: Bolatbek Rysbaiuly, Aliya S. Azhibekova

Abstract:

Problems concerning the interpretation of well testing results belong to the class of inverse problems of subsurface hydromechanics. The distinctive feature of such problems is that the additional information available depends on the capabilities of oilfield experiments. Another factor that should not be overlooked is the existence of errors in the test data. To determine reservoir properties, some inverse problems for parabolic equations were investigated. An approach to solving these inverse problems based on the method of regularization is proposed.
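
As a general illustration of the regularization approach mentioned above, reservoir parameters are typically recovered by minimizing a Tikhonov-type functional rather than fitting the noisy test data exactly (the notation here is generic, not the authors'):

```latex
\[
J_{\alpha}(q) \;=\; \bigl\| A(q) - d^{\delta} \bigr\|^{2} \;+\; \alpha \,\bigl\| q - q_{0} \bigr\|^{2}
\]
```

Here $A(q)$ maps candidate reservoir properties $q$ to the parabolic model's predicted well response, $d^{\delta}$ is the measured data with noise level $\delta$, $q_0$ is a prior estimate, and the parameter $\alpha > 0$ is chosen (for example, by the discrepancy principle) to balance data fit against stability; an iterative approach then updates $q$ until the residual is comparable to the noise level.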

Keywords: iterative approach, inverse problem, parabolic equation, reservoir properties

Procedia PDF Downloads 419
3804 Exceptionally Glauconite-Rich Strata from the Miocene Bejaoua Facies of Northern Tunisia: Origin, Composition, and Depositional Conditions

Authors: Abdelbasset Tounekti, Kamel Boukhalfa, Tathagata Roy Choudhury, Mohamed Soussi, Santanu Banerjee

Abstract:

The exceptionally glauconite-rich Miocene strata are superbly exposed throughout the front of the nappes zone of northern Tunisia. Each of the glauconitic fine-grained intervals coincides with the peak rise of a third-order sea-level cycle during Burdigalian-Langhian time. These deposits show coarsening- and thickening-upward glauconitic shale and sandstone, recording a shallowing-upward progression across offshore-shoreface settings. Petrographic investigation reveals that the glauconite originated from the alteration of fecal pellets and lithoclasts, including feldspar, volcanic particles, and quartz, and from the infilling of intraparticle pores. Mineralogical analysis of randomly oriented, air-dried, ethylene-glycolated, and heated glauconite pellets shows the low intensity of the (002) reflection peaks, indicating high iron substitution for aluminum in the octahedral sites. Geochemical characterization of the Miocene glauconite reveals a high K2O and variable Fe2O3 (total) content. A combination of the layer-lattice and verdissement theories explains the origin of the glauconite. The formation of glauconite was facilitated by the abundant supply of Fe through contemporaneous volcanism in Algeria and the surrounding areas, which accompanied the African-European plate convergence. Therefore, the occurrence of glauconite in the Miocene succession of Tunisia is influenced by the combination of eustasy and volcanism.

Keywords: glauconite, authigenic, volcanism, geochemistry, chamosite, northern Tunisia, miocene

Procedia PDF Downloads 279
3803 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancement over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
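
A simplified sketch of the report-text half of the pipeline described above, TF-IDF features from radiology report text feeding a Random Forest, might look like this (the file and column names are placeholders, and the actual study additionally merges image-derived and LLM-derived features):

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline

# Hypothetical table: one row per Indiana University report, with findings text and a label.
reports = pd.read_csv("iu_xray_reports.csv")   # columns: findings, abnormal (0/1)
X_train, X_test, y_train, y_test = train_test_split(
    reports["findings"], reports["abnormal"], test_size=0.2,
    random_state=42, stratify=reports["abnormal"])

model = make_pipeline(
    TfidfVectorizer(max_features=5000, ngram_range=(1, 2), stop_words="english"),
    RandomForestClassifier(n_estimators=300, random_state=42))
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```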

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 26