Search results for: machine translation
356 Discerning Divergent Nodes in Social Networks
Authors: Mehran Asadi, Afrand Agah
Abstract:
In data mining, partitioning is a fundamental tool for classification: by partitioning, we probe the structure of the data, which allows us to derive decision rules that can be applied in classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of a node being divergent. The analyses in this report were conducted in the R statistical computing language, and the data were obtained from the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks. In this work, we address overfitting and describe different approaches for evaluating and comparing the performance of different classification methods. In classification, the main objective is to categorize items and assign them to groups based on their properties and similarities. Estimating densities is hard, especially in high dimensions with limited data; although the true densities are unknown, they can be estimated using classical techniques. First, we calculated the correlation matrix of the dataset to see whether any predictors are highly correlated with one another. The correlation coefficients for the predictor variables show that density is strongly correlated with transitivity. We initialized a data frame to easily compare the quality of the resulting classification methods and utilized decision trees, with k-fold cross-validation to prune the tree. 
A decision tree is a non-parametric classification method that uses a set of rules to predict that each observation belongs to the most commonly occurring class label of the training observations in its region. Our method aggregates many decision trees to create an optimized model that is less susceptible to overfitting. When using a decision tree, however, it is important to prune the tree via cross-validation in order to narrow it down to the most important variables.
Keywords: online social networks, data mining, social cloud computing, interaction and collaboration
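The correlation screening step described above can be sketched in a few lines. The `density` and `transitivity` vectors below are made-up illustrative values, not the study's data, and the sketch uses Python rather than the R used by the authors.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical network-level predictors (not the paper's data).
density = [0.10, 0.15, 0.22, 0.30, 0.41]
transitivity = [0.08, 0.14, 0.20, 0.33, 0.40]

r = pearson(density, transitivity)
```

A pair of predictors with |r| near 1, as here, would be flagged as collinear before tree fitting.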
Procedia PDF Downloads 158
355 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data
Authors: Michelangelo Sofo, Giuseppe Labianca
Abstract:
In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the field of nutritional biology, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for the management of pathologies in the food sector (for example, intolerances and allergies, management of cholesterol metabolism, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this research work, a system was therefore created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centered design (ISO 9241-210); it therefore keeps pace with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task is to draw up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to nutritional data, together with a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. 
Furthermore, the system has multiple software modules, based on time series and visual analytics techniques, that allow the complete picture of the situation and the evolution of the assigned diet to be evaluated for specific pathologies.
Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm
Procedia PDF Downloads 25
354 “CheckPrivate”: Artificial Intelligence Powered Mobile Application to Enhance the Well-Being of Sexually Transmitted Disease Patients in Sri Lanka under Cultural Barriers
Authors: Warnakulasuriya Arachichige Malisha Ann Rosary Fernando, Udalamatta Gamage Omila Chalanka Jinadasa, Bihini Pabasara Amandi Amarasinghe, Manul Thisuraka Mandalawatta, Uthpala Samarakoon, Manori Gamage
Abstract:
The surge in sexually transmitted diseases (STDs) has become a critical public health crisis demanding urgent attention and action. Like many other nations, Sri Lanka is grappling with a significant increase in STDs due to a lack of education and awareness regarding their dangers. Presently, the available applications for tracking and managing STDs cover only a limited number of easily detectable infections, resulting in a significant gap in effectively controlling their spread. To address this gap and combat the rising STD rates, it is essential to leverage technology and data. Employing technology to enhance the tracking and management of STDs is vital to prevent their further propagation and to enable early intervention and treatment. This requires adopting a comprehensive approach that involves raising public awareness about the perils of STDs, improving access to affordable healthcare services for early detection and treatment, and utilizing advanced technology and data analysis. The proposed mobile application aims to cater to a broad range of users, including STD patients, recovered individuals, and those unaware of their STD status. By harnessing cutting-edge technologies like image detection, symptom-based identification, prevention methods, doctor and clinic recommendations, and virtual counselor chat, the application offers a holistic approach to STD management. In conclusion, the escalating STD rates in Sri Lanka and across the globe require immediate action. The integration of technology-driven solutions, along with comprehensive education and healthcare accessibility, is the key to curbing the spread of STDs and promoting better overall public health.
Keywords: STD, machine learning, NLP, artificial intelligence
Procedia PDF Downloads 81
353 Determination of Community Based Reference Interval of Aspartate Aminotransferase to Platelet Ratio Index (APRI) among Healthy Populations in Mekelle City Tigray, Northern Ethiopia
Authors: Getachew Belay Kassahun
Abstract:
Background: The aspartate aminotransferase to platelet ratio index (APRI) has become a biomarker for screening for liver fibrosis, since liver biopsy is an invasive procedure subject to variation in pathological interpretation. The Clinical and Laboratory Standards Institute recommends establishing age-, sex- and environment-specific reference intervals for biomarkers in a homogeneous population. The current study aimed to derive a community-based reference interval for APRI among people aged between 12 and 60 years in Mekelle city, Tigray, Northern Ethiopia. Method: Six hundred eighty-eight study participants were recruited from three districts in Mekelle city. The three districts were selected through a random sampling technique, and the sample was allocated to kebelles (small administrative units) in proportion to the number of households in each district. The lottery method was used at the household level when more than two eligible study participants were found for an age partition. In this community-based cross-sectional study, a total of 534 study participants, 264 males and 270 females, were included in the final laboratory and data analysis, while around 154 study participants were excluded through the exclusion criteria. Aspartate aminotransferase was analyzed on a Biosystems chemistry analyzer, and a Sysmex machine was used to analyze platelets. After excluding outliers using box-and-whisker plots, the non-parametric Mann-Whitney U test was used to assess statistical differences between the sexes. Result: The study found a statistically significant difference between the sexes in the APRI reference interval. The combined, male and female reference intervals in the current study were 0.098-0.390, 0.133-0.428 and 0.090-0.319, respectively. The upper and lower reference limits for males were higher than those for females in all age partitions, and there was no statistically significant difference between age partitions (p > 0.05). 
Conclusion: The current study showed that using sex-specific reference intervals for the APRI biomarker is important for result interpretation in clinical practice.
Keywords: reference interval, aspartate aminotransferase to platelet ratio index, Ethiopia, tigray
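For context, APRI is computed from two routine laboratory values. The sketch below uses the standard published formula with made-up example values; the upper limit of normal (ULN) for AST is laboratory dependent, and 40 IU/L is an assumption here, not a value from the study.

```python
def apri(ast_iu_l, ast_uln_iu_l, platelets_10e9_l):
    """Aspartate aminotransferase to platelet ratio index:
    APRI = (AST / upper limit of normal AST) * 100 / platelet count (10^9/L)."""
    return (ast_iu_l / ast_uln_iu_l) * 100 / platelets_10e9_l

# Hypothetical healthy adult: AST 24 IU/L, assumed ULN 40 IU/L,
# platelet count 250 x 10^9/L.
value = apri(24, 40, 250)  # -> 0.24, inside the combined interval reported above
```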
Procedia PDF Downloads 115
352 Machine That Provides Mineral Fertilizer Equal to the Soil on the Slopes
Authors: Huseyn Nuraddin Qurbanov
Abstract:
The reliable food supply of the population of the republic is one of the main directions of the state's economic policy, and grain growing, the basis of agriculture, is important in this area. In the cultivation of cereals on slopes, the application of an equal amount of mineral fertilizer under the soil before sowing is a very important technological process. The low level of technical equipment in this area prevents producers from providing the country with cereals of the necessary quality. Experience in the operation of modern technical means has shown that, at present, there is a need to apply an equal amount of fertilizer under the soil on slopes while fully meeting the agro-technical requirements. No fundamental changes have been made to the industrial machines that place fertilizer under the soil, so fertilizer is applied unevenly under the soil on slopes. This leads to the destruction of new seedlings and reduced productivity, because plants planted in the fall cannot tolerate frost during the winter. In particular climatic conditions, there is an optimal fertilization rate for each agricultural product, and proper placement of fertilizer in the soil is one of the conditions that increases its efficiency in the field. As can be seen, it is very important to develop a new technical proposal for fertilizing and plowing slopes in equal amounts, improving the technological and design parameters while taking into account the physical and mechanical properties of fertilizers. Taking the above-mentioned issues into account, a combined plough was developed in our laboratory. The combined plough carries out the pre-sowing technological operation in the cultivation of cereals, providing a smooth, equal amount of mineral fertilizer under the soil on slopes. Mathematical models of a smooth spreader that evenly distributes fertilizer in the field have been developed. 
Thus, diagrams and graphs of the distribution over the eight sections of the smooth spreader were constructed for different inclination angles of the slopes. The percentage of equal distribution in the field and the productivity were determined by practical and theoretical analysis.
Keywords: combined plough, mineral fertilizer, equal sowing, fertilizer norm, grain-crops, sowing fertilizer
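One common way to quantify how "equal" the distribution across the spreader's eight sections is — not necessarily the metric used by the authors — is Christiansen's uniformity coefficient. The per-section masses below are invented for illustration.

```python
def christiansen_cu(amounts):
    """Christiansen uniformity coefficient (%) for per-section fertilizer
    masses: CU = 100 * (1 - sum(|x_i - mean|) / (n * mean))."""
    n = len(amounts)
    mean = sum(amounts) / n
    mad = sum(abs(x - mean) for x in amounts)
    return 100 * (1 - mad / (n * mean))

# Hypothetical grams collected under each of the 8 spreader sections.
sections = [98, 102, 100, 97, 103, 101, 99, 100]
cu = christiansen_cu(sections)  # close to 100 % means a near-uniform spread
```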
Procedia PDF Downloads 138
351 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”
Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen
Abstract:
Exoplanet atmospheric parameter retrieval is a complex, computationally intensive, inverse modeling problem in which an exoplanet’s atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data. Runtimes are directly proportional to the number of parameters under consideration. These increased power and runtime requirements are difficult to accommodate in space missions, where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods therefore compromises either model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on the previous model’s speed and accuracy. We demonstrate the efficacy of artificial intelligence to quickly and reliably predict atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low-power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions. 
With the edge computing chip implementation, ARcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus.
Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval
Procedia PDF Downloads 170
350 Analysis of Residents’ Travel Characteristics and Policy Improving Strategies
Authors: Zhenzhen Xu, Chunfu Shao, Shengyou Wang, Chunjiao Dong
Abstract:
To improve the satisfaction of residents' travel, this paper analyzes the characteristics and influencing factors of urban residents' travel behavior. First, a Multinomial Logit Model (MNL) is built to analyze the characteristics of residents' travel behavior, reveal the influence of individual attributes, family attributes and travel characteristics on the choice of travel mode, and identify the significant factors; suggestions for policy improvement are then put forward. Finally, Support Vector Machine (SVM) and Multi-Layer Perceptron (MLP) models are introduced to evaluate the policy effect. This paper selects Futian Street in Futian District, Shenzhen City for investigation and research. The results show that gender, age, education, income, number of cars owned, travel purpose, departure time, journey time, travel distance and number of trips all have a significant influence on residents' choice of travel mode. Based on the above results, two policy improvement suggestions are put forward, centered on reducing travel times for public transportation and non-motorized modes, and the policy effect is evaluated. Before the evaluation, the prediction performance of the MNL, SVM and MLP models was assessed: after parameter optimization, the prediction accuracies of the three models were 72.80%, 71.42%, and 76.42%, respectively. The MLP model, with the highest prediction accuracy, was selected to evaluate the effect of the policy improvements. The results showed that after the implementation of the policy, the proportion of public transportation in plans 1 and 2 increased by 14.04% and 9.86%, respectively, while the proportion of private cars decreased by 3.47% and 2.54%, respectively. The proportion of car trips decreased markedly, while the proportion of public transport trips increased. 
It can be considered that the measures have a positive effect on promoting green trips and improving the satisfaction of urban residents, and they can provide a reference for relevant departments formulating transportation policies.
Keywords: neural network, travel characteristics analysis, transportation choice, travel sharing rate, traffic resource allocation
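In a multinomial logit model, the probability of choosing each travel mode follows from the modes' systematic utilities via a softmax. The utility values below are invented for illustration, not the calibrated Shenzhen coefficients.

```python
from math import exp

def mnl_probabilities(utilities):
    """Multinomial logit choice probabilities: P_i = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities.values())  # shift utilities for numerical stability
    expv = {mode: exp(v - m) for mode, v in utilities.items()}
    total = sum(expv.values())
    return {mode: e / total for mode, e in expv.items()}

# Hypothetical systematic utilities for one traveler.
V = {"public_transit": 0.8, "private_car": 1.1, "non_motorized": 0.2}
P = mnl_probabilities(V)  # probabilities sum to 1; car is most likely here
```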
Procedia PDF Downloads 138
349 Study on the Prediction of Serviceability of Garments Based on the Seam Efficiency and Selection of the Right Seam to Ensure Better Serviceability of Garments
Authors: Md Azizul Islam
Abstract:
A seam is the line joining two separate fabric layers for functional or aesthetic purposes, and different kinds of seams are used for assembling the different areas or parts of a garment to increase serviceability. To empirically support the importance of seam efficiency for the serviceability of garments, this study focuses on choosing the right type of seam for particular sewn parts of the garment, based on seam efficiency, to ensure better serviceability. Seam efficiency is the ratio of seam strength to fabric strength. Single jersey knitted finished fabrics of four different GSMs (grams per square meter) were used to make the test garments (T-shirts). Three distinct types of seam, superimposed, lapped and flat, were applied to the side seams of the T-shirts and sewn by lockstitch (stitch class 301) on a flat-bed plain sewing machine (maximum sewing speed: 5000 rpm) to make (3x4) 12 T-shirts. For experimental purposes, the needle thread count (50/3 Ne), bobbin thread count (50/2 Ne), stitch density (8-9 stitches per inch), needle size (16 in the Singer system), stitch length (31 cm), and seam allowance (2.5 cm) were kept the same for all specimens. The grab test (ASTM D5034-08) was performed on a universal tensile tester to measure seam strength and fabric strength. The produced T-shirts were given to 12 soccer players, who wore the shirts for 20 soccer matches (each of 90 minutes duration). Serviceability of the shirts was measured by visual inspection on a 5-point scale based on the seam conditions. The study found that T-shirts produced with the lapped seam show better serviceability, while T-shirts made with flat seams score lowest on serviceability. From the calculated seam efficiency (seam strength/fabric strength), it was obvious that the performance (in terms of strength) of the lapped and bound seams is higher than that of the superimposed seam, and the performance of the superimposed seam is far better than that of the flat seam. 
So it can be predicted that, to get a garment of high serviceability, lapped seams could be used instead of superimposed or other types of seam. In addition, less-stressed parts of garments can be assembled with other seams, such as superimposed or flat seams.
Keywords: seam, seam efficiency, serviceability, T-shirt
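The seam efficiency used above is a simple ratio. The strength values in this sketch are made-up examples, not the measured grab-test results from the study.

```python
def seam_efficiency(seam_strength_n, fabric_strength_n):
    """Seam efficiency (%) = seam strength / fabric strength * 100."""
    return seam_strength_n / fabric_strength_n * 100

# Hypothetical grab-test results (N) for one fabric GSM.
fabric = 300.0
seam_strengths = {"lapped": 255.0, "superimposed": 210.0, "flat": 150.0}
efficiencies = {seam: seam_efficiency(s, fabric)
                for seam, s in seam_strengths.items()}
# lapped -> 85 %, superimposed -> 70 %, flat -> 50 %
```

On these invented numbers the ordering matches the paper's finding: lapped above superimposed, superimposed above flat.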
Procedia PDF Downloads 202
348 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time
Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla
Abstract:
Society demands more reliable manufacturing processes capable of producing high-quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, in which Fault-Tolerant Control (FTC) plays a significant role: it is able to detect, isolate and adapt a system when a harmful or faulty situation appears. In this paper, a general overview of FTC characteristics is given, highlighting the properties a system must ensure to be considered faultless. In addition, research identifying the main FTC techniques is presented, with a classification based on their characteristics into two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses the techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprises the algorithms robust enough to bypass the fault without further modifications. The mentioned re-configuration requires two stages, one focused on the detection, isolation and identification of the fault source and the other in charge of re-designing the control algorithm by one of two approaches: fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic press. The developed model has been embedded in a real-time validation platform, which allows testing the FTC algorithms and analysing how the system will respond when a fault arises, in conditions similar to those a machine would experience on the factory floor. One AFTC approach has been selected as the methodology the system will follow in the fault recovery process. In a first instance, the fault will be detected, isolated and identified by means of a neural network. 
In a second instance, the control algorithm will be re-configured to overcome the fault and continue working without human interaction.
Keywords: fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time
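A minimal sketch of the detection stage, assuming a residual-threshold scheme rather than the paper's neural network: a fault is flagged at the first sample where the discrepancy between the model's predicted output and the measured output exceeds a threshold. All signals and the threshold are invented.

```python
def detect_fault(predicted, measured, threshold):
    """Return the index of the first sample whose absolute residual
    |predicted - measured| exceeds the threshold, or None if no fault."""
    for i, (p, m) in enumerate(zip(predicted, measured)):
        if abs(p - m) > threshold:
            return i
    return None

# Hypothetical press-ram position traces (mm); a fault appears at sample 3.
model_out = [0.0, 1.00, 2.00, 3.0, 4.0]
sensor_out = [0.0, 1.02, 1.97, 4.8, 6.1]
fault_at = detect_fault(model_out, sensor_out, threshold=0.5)  # -> 3
```

Once a fault index is returned, the AFTC layer would trigger isolation, identification, and controller re-configuration.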
Procedia PDF Downloads 178
347 Development of Bioplastic Disposable Food Packaging from Starch and Cellulose
Authors: Lidya Hailu, Ramesh Duraisamy, Masood Akhtar Khan, Belete Yilma
Abstract:
Disposable food packaging consists of single-use plastics: items designed to be used only once. In this context, this study aimed to prepare and evaluate a bioplastic food packaging material from avocado seed starch and sugarcane bagasse cellulose, and to characterize the avocado seed starch. The physicomechanical, structural, thermal, and biodegradability properties of the raw materials and the prepared bioplastic were determined using a universal tensile testing machine, FTIR, UV-Vis spectroscopy, TGA, XRD, and SEM. Results showed that an increasing amount of glycerol (3-5 mL) increases the water absorption, density, water vapor permeability, and elongation at break of the prepared bioplastic, while decreasing its % transmittance, thermal degradation temperature, and tensile strength. Likewise, the addition of cellulose fiber (0-15 %) increases the density (0.93±0.04-1.27±0.02 g/cm3), thermal degradation temperature (310.01-321.61 °C), and tensile strength (2.91±6.18-4.21±6.713 MPa) of the prepared bioplastic, with % transmittance ranging from 91.34±0.12 down to 63.03±0.05 %. On the other hand, it decreases the water absorption (14.4±0.25-9.40±0.007 %), water vapor permeability (9.306x10-12±0.3-3.57x10-12±0.15 g•s−1•m−1•Pa−1) and elongation at break (34.46±3.37-27.63±5.67 %) of the prepared bioplastic. All the prepared bioplastic films degraded rapidly in soil within the first 6 days, decomposed within 12 days with a diminutive leftover, and degraded completely within 15 days under an open soil atmosphere. The results showed that starch-derived bioplastic reinforced with 15 % cellulose fiber and plasticized with 3 mL of glycerol gave better results than other combinations of glycerol and bagasse cellulose with avocado seed starch. Thus, a biodegradable disposable food packaging cup was successfully produced at the lab scale using the studied approach. 
Biodegradable disposable food packaging materials have been successfully produced from avocado seed starch and sugarcane bagasse cellulose. Future work should address nanoscale production, since this study was conducted at the micro level.
Keywords: avocado seed, food packaging, glycerol, sugarcane bagasse
Procedia PDF Downloads 338
346 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society
Authors: Irene Yi
Abstract:
Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only to find that over the course of human history, derogatory and sexist adjectives are used significantly more frequently when describing females in history and literature than when describing males. This is extremely problematic, both as training data and as the outcome of natural language processing. As machines start to handle more responsibilities, it is crucial to ensure that they do not take with them historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. Results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text to be more mindful and reflect gender equality. Further, this paper deals with the idea of non-binary gender pronouns and how machines can process these pronouns correctly, given their semantic and syntactic context. This paper also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules, but also historically patriarchal societies. 
The progression of society comes hand in hand not only with its language, but with how machines process those natural languages. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.
Keywords: computational analysis, gendered grammar, misogynistic language, neural networks
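A toy version of the kind of corpus analysis described above — counting which adjectives co-occur with gendered words — might look like this. The sentences, word lists, and windowing rule are all invented for illustration; a real study would use lemmatization, part-of-speech tagging, and a large corpus.

```python
from collections import Counter

FEMALE = {"she", "her", "woman", "girl"}
MALE = {"he", "him", "his", "man", "boy"}
ADJECTIVES = {"hysterical", "shrill", "brilliant", "bossy", "wise"}

def adjective_counts(corpus, window=3):
    """Count adjectives appearing within `window` tokens of a gendered word."""
    fem, masc = Counter(), Counter()
    for sentence in corpus:
        tokens = sentence.lower().split()
        for i, tok in enumerate(tokens):
            if tok in ADJECTIVES:
                context = tokens[max(0, i - window): i + window + 1]
                if FEMALE & set(context):
                    fem[tok] += 1
                if MALE & set(context):
                    masc[tok] += 1
    return fem, masc

# Invented mini-corpus, not drawn from the paper's data.
corpus = [
    "she was hysterical",
    "her tone was shrill",
    "he was brilliant",
    "the man was wise",
    "the woman seemed bossy",
]
fem, masc = adjective_counts(corpus)
```

Comparing the two counters over a large corpus is one way to surface skewed adjective usage by gender.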
Procedia PDF Downloads 119
345 The Capacity of Bolted and Screw Connections in Cold-Formed Steel Truss Structure through Analytical and Experimental Method
Authors: Slamet Setioboro, Rahutami Kusumaningsih, Prabowo Setiyawan, Danna Darmayadi
Abstract:
The design of cold-formed steel connection capacity is often based on formulas intended for hot-rolled steel, so the calculated capacity no longer matches the actual capacity of the connection: cold-formed steel behaves differently under an axial tensile load, and design on that basis can lead to failures in cold-formed steel truss structures. This research aims to determine the actual capacity of cold-formed steel connections loaded by an axial tensile force. Tensile tests of connections made with bolted and screw fasteners were carried out, varying the type of connection (single and double lap), the number of fasteners, and the connection configuration. The bolted and screw connection failure modes observed in this research differ from each other: failure modes of bolted connections include sliding of the pivot plate, tearing of the plate, and shearing of the bolt head, while failure modes of screw connections include tilting, hole bearing, pull-over, and shearing of the screw body. The research was conducted in the materials testing laboratory of the Mechanical Engineering Department, Faculty of Engineering, UNNES, using an HW2-600S Universal Testing Machine in accordance with ASTM E8. The laboratory results were compared against theoretical calculations using the provisions of SNI 7971:2013, Cold-Formed Steel Structures. Based on the research, it can be concluded that the most effective connection in transferring force is the bolted connection, with either a single or a double plate, using 4 bolts in a configuration of 2 parallel rows. This connection sustains the highest maximum load (Pmax), has the lowest risk of failure, and exhibits few failure modes.
Keywords: axial load, cold-formed steel, capacity connections, bolted connections, screw connections
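As a generic illustration of the kind of capacity check involved — not the SNI 7971:2013 clause itself — a bolted connection in single shear can be screened by comparing the applied load with the bolts' shear capacity. The diameter, ultimate strength, resistance factor, and applied load below are all assumptions.

```python
from math import pi

def bolt_shear_capacity_kn(d_mm, fu_mpa, n_bolts, phi=0.75):
    """Design shear capacity (kN) of n bolts in single shear, using the
    common form phi * 0.6 * fu * A per bolt (A = shank area)."""
    area = pi * d_mm ** 2 / 4  # shank area, mm^2
    return phi * 0.6 * fu_mpa * area * n_bolts / 1000.0

# Hypothetical check: 4 bolts, 8 mm diameter, fu = 400 MPa, 20 kN applied.
capacity_kn = bolt_shear_capacity_kn(8.0, 400.0, 4)
ok = capacity_kn >= 20.0  # True: the 4-bolt group carries the applied load
```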
Procedia PDF Downloads 276
344 Design Optimisation of a Novel Cross Vane Expander-Compressor Unit for Refrigeration System
Authors: Y. D. Lim, K. S. Yap, K. T. Ooi
Abstract:
In recent years, environmental issues have been a hot topic worldwide; in particular, concern over the global warming effect of conventional, non-environmentally-friendly refrigerants has increased. Several studies of more energy-efficient and environmentally friendly refrigeration systems have been conducted to tackle the issue. In the search for a better refrigeration system, the CO2 refrigeration system has been proposed as a better option. However, the high throttling loss involved during the expansion process of the refrigeration cycle leads to relatively low efficiency, making the system impractical. To improve the efficiency of the refrigeration system, it has been suggested to replace the conventional expansion valve in the refrigeration system with an expander. On this basis, a new type of combined expander-compressor unit, named the Cross Vane Expander-Compressor (CVEC), was introduced to replace the compressor and the expansion valve of a conventional refrigeration system. A mathematical model was developed to calculate the performance of CVEC, and it was found that the machine is capable of reducing the energy consumption of a refrigeration system by as much as 18%. Apart from energy saving, CVEC is also geometrically simpler and more compact. To further improve its efficiency, an optimization study of the device was carried out. In this report, several design parameters of CVEC were chosen as the variables of the optimization study. The optimization was done in a simulation program using the complex optimization method, a direct-search, multi-variable, constrained optimization method. It was found that the main design parameter, the shaft radius, was reduced by around 8%, while the inner cylinder radius remained unchanged at its lower limit after optimization. Furthermore, the port sizes were increased to their upper limits after optimization. 
The changes in these design parameters resulted in a reduction of around 12% in the total frictional loss and a reduction of 4% in power consumption. Eventually, the optimization study resulted in an improvement of 4% in the mechanical efficiency of CVEC and an improvement of 6% in COP.
Keywords: complex optimization method, COP, cross vane expander-compressor, CVEC, design optimization, direct search, energy saving, improvement, mechanical efficiency, multi variables
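The complex method itself maintains a simplex-like set of candidate points. As a simpler stand-in that shares its direct-search, derivative-free, bound-constrained character, the sketch below runs a coordinate search on an invented two-variable friction-loss surrogate; it is not the CVEC model, and the bounds and optimum are made up.

```python
def coordinate_search(f, x, lower, upper, step=0.1, shrink=0.5, tol=1e-6):
    """Derivative-free minimization of f over box bounds: try +/- step moves
    along each coordinate, clamp to bounds, halve the step when stuck."""
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] = min(upper[i], max(lower[i], trial[i] + delta))
                if f(trial) < f(x):
                    x, improved = trial, True
        if not improved:
            step *= shrink
    return x

# Invented surrogate loss, minimized at shaft radius 0.46 with the
# (normalized) port size pushed to its upper bound of 1.0.
def loss(p):
    return (p[0] - 0.46) ** 2 + (1.0 - p[1]) ** 2

best = coordinate_search(loss, [0.5, 0.5], lower=[0.4, 0.2], upper=[0.6, 1.0])
```

On this surrogate the search reproduces the qualitative pattern reported above: one variable settles at an interior value while the other runs to its bound.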
Procedia PDF Downloads 373
343 Physics-Informed Neural Network for Predicting Strain Demand in Inelastic Pipes under Ground Movement with Geometric and Soil Resistance Nonlinearities
Authors: Pouya Taraghi, Yong Li, Nader Yoosef-Ghodsi, Muntaseer Kainat, Samer Adeeb
Abstract:
Buried pipelines play a crucial role in the transportation of energy products such as oil, gas, and various chemical fluids, ensuring their efficient and safe distribution. However, these pipelines are often susceptible to ground movements caused by geohazards like landslides, fault movements, lateral spreading, and more. Such ground movements can lead to strain-induced failures in pipes, resulting in leaks or explosions, leading to fires, financial losses, environmental contamination, and even loss of human life. Therefore, it is essential to study how buried pipelines respond when traversing geohazard-prone areas to assess the potential impact of ground movement on pipeline design. As such, this study introduces an approach called the Physics-Informed Neural Network (PINN) to predict the strain demand in inelastic pipes subjected to permanent ground displacement (PGD). This method uses a deep learning framework that does not require training data and makes it feasible to consider more realistic assumptions regarding existing nonlinearities. It leverages the underlying physics described by differential equations to approximate the solution. The study analyzes various scenarios involving different geohazard types, PGD values, and crossing angles, comparing the predictions with results obtained from finite element methods. The findings demonstrate a good agreement between the results of the proposed method and the finite element method, highlighting its potential as a simulation-free, data-free, and meshless alternative. This study paves the way for further advancements, such as the simulation-free reliability assessment of pipes subjected to PGD, as part of ongoing research that leverages the proposed method.
Keywords: strain demand, inelastic pipe, permanent ground displacement, machine learning, physics-informed neural network
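The core PINN idea — scoring a trial solution by how well it satisfies the governing differential equation at collocation points, rather than by fitting data — can be illustrated on a much simpler surrogate than the paper's pipe model: a beam-on-elastic-foundation-style equation EI·u'''' + k·u = q, with the fourth derivative taken by finite differences. The equation, parameters, and trial solutions below are illustrative stand-ins, not the authors' formulation.

```python
def physics_residual(u, h, ei, k, q):
    """Mean squared residual of EI*u'''' + k*u - q at interior collocation
    points, with u'''' approximated by central differences on a uniform grid."""
    res = []
    for i in range(2, len(u) - 2):
        d4 = (u[i - 2] - 4 * u[i - 1] + 6 * u[i] - 4 * u[i + 1]
              + u[i + 2]) / h ** 4
        res.append(ei * d4 + k * u[i] - q)
    return sum(r * r for r in res) / len(res)

# Uniform load q on a long pipe: the exact steady deflection is u = q / k.
EI, K, Q, H = 2.0, 50.0, 10.0, 0.1
n = 21
exact = [Q / K] * n        # residual ~ 0: this trial satisfies the physics
wrong = [0.5 * Q / K] * n  # residual > 0: penalized by the PINN-style loss
```

In a PINN, this residual (plus boundary terms) is the training loss minimized over the network weights, which is what makes the method data-free.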
Procedia PDF Downloads 61
342 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults
Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura
Abstract:
The diagnosis of sigmatism is mostly based on the observation of the articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular in the oral cavity of the patient. Speech processing can help objectify the therapy and simplify the verification of its progress. The described study proposes a methodology for classifying incorrectly pronounced realizations of the phoneme [s]. The recordings come from adults and were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bit. A database of pathological and normative speech has been collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz.: ASA, ESE, ISI, SPA, USU, YSY. Thirteen MFCC (mel-frequency cepstral coefficient) and RMS (root mean square) values are calculated within each frame of the analyzed phoneme. Additionally, 3 fricative formants, along with their corresponding amplitudes, are determined for the entire segment. To aggregate the information within the segment, the average value of each MFCC coefficient is calculated; all features of other types are aggregated by means of their 75th percentile. The proposed method of feature aggregation reduces the size of the feature vector used in the classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage: the first class consists of pathological phones, the other of normative ones. The proposed feature vector yields classification sensitivity and specificity above the 90% level for individual logatomes. Employing the fricative formant-based information improves the MFCC-only classification results by an average of 5 percentage points.
The study shows that the employment of specific parameters for the selected phones improves the efficiency of pathology detection compared with traditional methods of speech signal parameterization.
Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing
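The aggregation scheme described above, the mean per MFCC coefficient and the 75th percentile for the remaining features, followed by a binary SVM, can be sketched as follows. This is a minimal illustration on synthetic frames, not the authors' implementation; the frame counts, the 4 extra features, and the class offsets are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def aggregate_segment(mfcc_frames, other_feats):
    """Collapse per-frame features into one vector per phoneme segment:
    MFCCs by their mean, all remaining features by their 75th percentile."""
    mfcc_part = mfcc_frames.mean(axis=0)              # 13 mean MFCC values
    other_part = np.percentile(other_feats, 75, axis=0)
    return np.concatenate([mfcc_part, other_part])    # compact 17-D vector

# Toy data: 40 segments, each with 50 frames of 13 MFCCs plus 4 other features;
# label 1 ("pathological") segments are shifted to make the classes separable.
rng = np.random.default_rng(0)
X = np.array([aggregate_segment(rng.normal(size=(50, 13)) + label,
                                rng.normal(size=(50, 4)) + label)
              for label in (0, 1) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)                     # 0 = normative, 1 = pathological

clf = SVC(kernel="rbf").fit(X, y)                     # binary SVM classifier
print(clf.score(X, y))
```

The aggregation step is what shrinks a variable-length sequence of frames into a fixed-size vector the SVM can consume.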
Procedia PDF Downloads 283
341 A Normalized Non-Stationary Wavelet Based Analysis Approach for a Computer Assisted Classification of Laryngoscopic High-Speed Video Recordings
Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller
Abstract:
Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, the visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach allowing a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work it is shown that, using a machine learning approach, the derived measures are suitable for distinguishing automatically between healthy and pathological voices. Within the approach, the formation of the PCA space, and consequently the extracted quantitative measures, depend on the clinical data used to compute the principal components. Therefore, in the second part of the work we propose a strategy to normalize the PCA space by registering it to a coordinate system derived from a set of synthetically generated vibration patterns. The results show that, owing to the normalization step, potential ambiguity of the parameter space can be eliminated.
The normalization further allows a direct comparison of research results based on PCA spaces obtained from different clinical subjects.
Keywords: wavelet-based analysis, multiscale product, normalization, computer-assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram
Procedia PDF Downloads 265
340 Effects of Safety Intervention Program towards Behaviors among Rubber Wood Processing Workers Using Theory of Planned Behavior
Authors: Junjira Mahaboon, Anongnard Boonpak, Nattakarn Worrasan, Busma Kama, Mujalin Saikliang, Siripor Dankachatarn
Abstract:
Rubber wood processing is one of the most important industries in southern Thailand. The process presents several safety hazards, for example inadequate guarding of wood-cutting machines, wood dust, noise, and heavy lifting. However, occupational health and safety measures to promote workers’ safe behaviors are still limited. This quasi-experimental research aimed to determine the factors affecting workers’ safety behaviors, using the theory of planned behavior, after implementing a job safety intervention program. The purposes were (1) to determine the factors affecting workers’ behaviors and (2) to evaluate the effectiveness of the intervention program. The sample consisted of 66 workers from a rubber wood processing factory. The factors of the Theory of Planned Behavior (TPB) model were measured before and after the intervention: attitude towards the behavior, subjective norm, perceived behavioral control, intention, and behavior. First, a Job Safety Analysis (JSA) was conducted and Safety Standard Operation Procedures (SSOP) were established. A questionnaire was used to collect workers’ characteristics and the TPB factors. Then, a job safety intervention program to promote workers’ behavior according to the SSOP was implemented over a four-month period. The program included SSOP training, personal protective equipment use, and a safety promotional campaign. Afterwards, the TPB factors were collected again. Paired-sample t-tests and independent t-tests were used to analyze the data. The results revealed that attitude towards the behavior and intention increased significantly after the intervention at p<0.05. These factors also significantly determined the workers’ safety behavior according to the SSOP at p<0.05. However, subjective norm and perceived behavioral control were neither significantly changed nor related to safety behaviors. In conclusion, attitude towards the behavior and workers’ intention should be promoted to encourage workers’ safety behaviors. SSOP intervention programs, e.g.
short meetings, safety training, and promotional campaigns, should be continuously implemented on a routine basis to improve workers’ behavior.
Keywords: job safety analysis, rubber wood processing workers, safety standard operation procedure, theory of planned behavior
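The pre/post comparison described above rests on a paired-sample t-test, which tests whether the within-worker change between the two measurements differs from zero. A minimal sketch with SciPy; the attitude scores below are hypothetical illustration values, not the study's data:

```python
from scipy import stats

# Hypothetical pre/post attitude scores for the same 10 workers (1-5 scale)
pre  = [3.1, 2.8, 3.4, 3.0, 2.5, 3.2, 2.9, 3.3, 2.7, 3.0]
post = [3.8, 3.5, 3.9, 3.6, 3.1, 3.7, 3.4, 3.9, 3.3, 3.6]

# Paired-sample t-test: each worker is their own control
t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("significant change after the intervention")
```

Pairing the measurements removes between-worker variability, which is why the design can detect a modest shift with only 66 participants.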
Procedia PDF Downloads 193
339 Agricultural Cooperative Model: A Panacea for Economic Development of Small Scale Business Farmers in Ilesha, Osun State, Nigeria
Authors: Folasade Adegbaju, Olusola Arowolo, Olufisayo Onawumi
Abstract:
Owolowo ile-ege garri processing industry, a small-scale cassava processing business located in Ilesha, Osun State, was purposively selected as a case study because it is a cooperative business. The industry was established in 1991 by eight (8) men, mostly retirees. A researcher-made questionnaire was used to collect information from thirty (30) respondents: the manager, four official staff, and 25 randomly selected processors in the industry. The study found that within twelve years of the utilization of their self-raised initial capital of N240,000 (two hundred and forty thousand naira), this cassava-based industry had impacted on and attracted the involvement of many more people: within the period of the study (2007-2011), the processors had nearly quadrupled in number (from 8 to 30), and the facilities (equipment) in use had increased from one machine and a frying pot to many. This translated into being able to produce large quantities of fried garri, fufu, and also starch for marketing to the people of Ilesha and neighbouring cities like Ibadan, Lagos, etc., which is indicative of economic growth. The industry also became a source of employment for community members in the sense that, at the time of the study, four staff were employed to work in and coordinate the industry. It was observed that despite all the odds facing small-scale industry and the problem of people migrating from rural to urban areas, this agro-based industry still existed successfully in the community, and many such industries could be replicated by agricultural cooperative groups nationwide so as to further boost the productivity as well as the economy of the area and the nation at large.
However, government and individuals still have major roles to play in ensuring the growth and development of the nation in this respect. Local agricultural cooperative groups should form regional cooperative consortia with more networking for the farmers, in order to create more jobs for the young ones and to increase agricultural productivity in the country, thus resulting in a better and more sustainable economy.
Keywords: agricultural cooperative, cassava processing industry, model, small scale enterprise
Procedia PDF Downloads 290
338 Identification of Candidate Congenital Heart Defects Biomarkers by Applying a Random Forest Approach on DNA Methylation Data
Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang
Abstract:
Background and Significance of the Study: Congenital Heart Defects (CHDs) are the most common malformation at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications, such as DNA methylation, are thought to contribute to the pathogenesis of congenital heart defects. At present, no DNA methylation biomarkers are in use for early detection of CHDs, and existing CHD diagnostic techniques are time-consuming, costly, and can only diagnose CHDs after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs, with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome‐wide DNA methylation profiles of 24 infants diagnosed with congenital heart defects and 24 healthy infants without congenital heart defects. Primary pre-processing was conducted using the RnBeads and limma packages. The methylation levels of the top 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both the training and test sample sets. The functionalities of selected genes with high sensitivity and specificity were then assessed in molecular processes. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both training and validation data by random forests, with an average sensitivity and specificity of 85% and 95%. GO analyses for the top 600 genes showed that these putative differentially methylated genes were primarily associated with regulation of lipid metabolic process, protein-containing complex localization, and the Notch signalling pathway.
The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
Keywords: biomarker, congenital heart defects, DNA methylation, random forest
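The workflow above, a random forest trained on a pre-selected panel of methylation sites, ranked by feature importance and evaluated with ROC curves, can be sketched on synthetic data. The sample and feature counts mirror the study (48 infants, 600 sites), but the beta-values, the three planted "biomarker" sites, and the effect size are all fabricated for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic methylation beta-values: 48 infants x 600 pre-selected sites
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, size=(48, 600))
y = np.array([0] * 24 + [1] * 24)     # 0 = healthy, 1 = CHD
X[y == 1, :3] += 0.5                  # plant 3 informative "biomarker" sites

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# ROC AUC on held-out infants, and importance-ranked candidate biomarkers
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
top3 = np.argsort(rf.feature_importances_)[::-1][:3]
print(f"test AUC = {auc:.2f}, top sites = {sorted(top3)}")
```

Ranking by `feature_importances_` is one common way a random forest surfaces candidate biomarkers from a wide methylation panel; sensitivity and specificity at a chosen threshold would then be read off the ROC curve per site.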
Procedia PDF Downloads 158
337 The Development of a Digitally Connected Factory Architecture to Enable Product Lifecycle Management for the Assembly of Aerostructures
Authors: Nicky Wilson, Graeme Ralph
Abstract:
Legacy aerostructure assembly is defined by large components, low build rates, and manual assembly methods. With increasing demand for commercial aircraft and emerging markets such as the eVTOL (electric vertical take-off and landing) market, current methods of manufacturing are not capable of efficiently meeting these higher-rate demands. This project looks at how legacy manufacturing processes can be rate-enabled by taking a holistic view of data usage, focusing on how data can be collected to enable fully integrated digital factories and supply chains. The study focuses on how data flows both up and down the supply chain to create a digital thread specific to each part and assembly, while enabling machine learning through real-time, closed-loop feedback systems. The study also develops a bespoke architecture to enable connectivity both within the factory and with the wider PLM (product lifecycle management) system, moving away from the traditional point-to-point systems used to connect IO devices towards a hub-and-spoke architecture that exploits report-by-exception principles. This paper outlines the key issues facing legacy aircraft manufacturers, focusing on what future manufacturing will look like when adopting Industry 4.0 principles. The research also defines the data architecture of a PLM system to enable the transfer and control of a digital thread within the supply chain, and proposes a standardised communications protocol as a scalable solution for connecting IO devices within a production environment. This research comes at a critical time for aerospace manufacturers, who are seeing a shift towards the integration of digital technologies within legacy production environments while build rates continue to grow.
It is vital that manufacturing processes become more efficient in order to meet these demands while also securing future work for many manufacturers.
Keywords: Industry 4.0, digital transformation, IoT, PLM, automated assembly, connected factories
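The report-by-exception principle mentioned above, where an IO device publishes to the hub only when a reading changes meaningfully rather than streaming every sample, can be sketched as a small deadband filter. The class, deadband value, and readings below are hypothetical illustrations, not part of the proposed protocol:

```python
class ExceptionReporter:
    """Report-by-exception: forward a sensor reading to the hub only when it
    deviates from the last reported value by more than a deadband."""

    def __init__(self, deadband):
        self.deadband = deadband
        self.last_reported = None

    def submit(self, value):
        if self.last_reported is None or abs(value - self.last_reported) > self.deadband:
            self.last_reported = value
            return value          # would be published to the hub
        return None               # suppressed: change is within the deadband

reporter = ExceptionReporter(deadband=0.5)
readings = [20.0, 20.1, 20.3, 21.0, 21.2, 19.9]
published = [r for r in readings if reporter.submit(r) is not None]
print(published)                  # only the first reading and material changes
```

Filtering at the spoke is what keeps a hub-and-spoke architecture scalable: the hub sees state changes, not the raw sample stream.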
Procedia PDF Downloads 79
336 Management Practices and Economic Performance of Smallholder Dairy Cattle Farms in Southern Vietnam
Authors: Ngoc-Hieu Vu
Abstract:
Although dairy production in Vietnam is a relatively new agricultural activity, milk production has increased remarkably in recent years. Smallholders are still the main drivers of this development, especially in the southern part of the country. However, information on their farming practices is very limited. Therefore, this study aimed to characterize husbandry practices, educational experiences, decision-making practices, constraints, and income and expenses of smallholder dairy farms in Southern Vietnam. A total of 200 farms, located in the regions of Ho Chi Minh (HCM, N=80 farms), Lam Dong (N=40 farms), Binh Duong (N=40 farms), and Long An (N=40 farms), were included. Between October 2013 and December 2014, farmers were interviewed twice. On average, farms owned 3,200 m², 2,000 m², and 193 m² of pasture, cropping, and housing area, respectively. The numbers of total cows, milking cows, dry cows, heifers, and calves were 20.4, 11.6, 4.7, 3.3, and 2.9 head, respectively. The number of lactating dairy cows was higher (p<0.001) in HCM (15.5) and Lam Dong (14.7) than in Binh Duong (6.7) and Long An (10.7). Animals were mainly crossbred Holstein-Friesian (HF) cows with at least 75% HF origin (84%), with a higher (p<0.001) percentage of purebred HF in HCM and Lam Dong and of crossbreds in Binh Duong and Long An. Animals were mainly raised in tie-stalls (94%) and machine-milked (80%). Farmers used their own replacement animals (76%) and both genetic and phenotypic information (67%) for selecting sires. Farmers were predominantly educated at primary school level (53%). Major constraints for dairy farming were lack of capital (43%), diseases (17%), marketing (22%), lack of knowledge (8%), and feed (7%). Monthly profit per lactating cow was superior in Lam Dong (2,817 thousand VND) and HCM (2,798 thousand VND) compared to Long An (2,597 thousand VND) and Binh Duong (1,775 thousand VND).
Regional differences may be mainly attributed to environmental factors, urbanization, and particularly governmental support and the availability of extension and financial institutions. Results from this study provide important information on farming practices of smallholders in Southern Vietnam that are useful in determining regions that need to be addressed by authorities in order to improve dairy production.
Keywords: dairy farms, milk yield, Southern Vietnam, socio-economics
Procedia PDF Downloads 465
335 Translating the Australian National Health and Medical Research Council Obesity Guidelines into Practice in a Rural/Regional Setting in Tasmania, Australia
Authors: Giuliana Murfet, Heidi Behrens
Abstract:
Chronic disease is Australia’s biggest health concern, and obesity is the leading risk factor for many chronic diseases. Obesity and chronic disease have a higher representation in rural Tasmania, where levels of socio-disadvantage are also higher. People living outside major cities have less access to health services and poorer health outcomes. To help primary healthcare professionals manage obesity, the Australian NHMRC evidence-based clinical practice guidelines for the management of overweight and obesity in adults were developed. They include recommendations for practice and models for obesity management. To our knowledge, no research has been conducted that investigates translation of these guidelines into practice in rural-regional areas, where implementation can be complicated by limited financial and staffing resources. Moreover, the systematic review that informed the guidelines revealed a lack of evidence for chronic disease models of obesity care. The aim was to establish and evaluate a multidisciplinary model for obesity management in a group of adults with type 2 diabetes in a dispersed rural population in Australia. Extensive stakeholder engagement was undertaken both to garner support for an obesity clinic and to develop a sustainable model of care. A comprehensive nurse practitioner-led outpatient model for obesity care was designed. Multidisciplinary obesity clinics for adults with type 2 diabetes, including a dietitian, psychologist, physiotherapist, and nurse practitioner, were set up in two geographically rural towns in the north-west of Tasmania. Implementation was underpinned by the NHMRC guidelines, and recommendations focused on: assessment approaches; promotion of the health benefits of weight loss; identification of relevant programs for individualising care; medication and bariatric surgery options for obesity management; and the importance of long-term weight management.
A clinical pathway for adult weight management is delivered by the multidisciplinary team, with recognition of the impact of, and adjustments needed for, other comorbidities. The model allowed for intensification of intervention, such as bariatric surgery, according to recommendations, patient desires, and suitability. A randomised controlled trial is ongoing, with the aim of evaluating standard care (diabetes-focused management) compared with an obesity-related approach with additional dietetic, physiotherapy, psychology, and lifestyle advice. Key barriers and enablers to guideline implementation were identified that fall under the following themes: 1) health care delivery changes and the project framework development; 2) capacity and team-building; 3) stakeholder engagement; and 4) the research project and partnerships. Engagement of not only the local hospital but also state-wide health executives and the surgical services committee was paramount to the success of the project. Staff training and collective development of the framework allowed for shared understanding. Staff capacity was increased, with most taking on other activities (e.g., surgery coordination). Barriers were often related to differences of opinion on the focus of the project: a desire to remain evidence-based (e.g., in exercise prescription) without adjusting the model to allow for consideration of comorbidities. While barriers existed and challenges had to be overcome, the development of critical partnerships enabled a potential model of obesity care for rural and regional areas. Importantly, the findings contribute to the evidence base for models of diabetes and obesity care that coordinate limited resources.
Keywords: diabetes, interdisciplinary, model of care, obesity, rural regional
Procedia PDF Downloads 228
334 Analysis of Friction Stir Welding Process for Joining Aluminum Alloy
Authors: A. M. Khourshid, I. Sabry
Abstract:
Friction Stir Welding (FSW), a solid-state joining technique, is widely used for joining Al alloys in aerospace, marine, automotive, and many other applications of commercial importance. FSW was carried out using a vertical milling machine on Al 5083 alloy pipe. These pipe sections are relatively small in diameter, 5 mm, and relatively thin-walled, 2 mm. In the first part of this study, 5083 aluminum alloy pipes were welded as similar-alloy joints using the FSW process in order to investigate mechanical and microstructural properties, at a rotation speed of 1400 rpm and weld speeds of 10, 40, and 70 mm/min. In order to investigate the effect of welding speed on mechanical properties, metallographic and mechanical tests were carried out on the welded areas, including Vickers hardness profiles and tensile tests of the joints. As a metallurgical feasibility study of friction stir welding for joining Al 6061 aluminum alloy, welding was also performed on pipes with wall thicknesses of 2, 3, and 4 mm, at five rotational speeds (485, 710, 910, 1120, and 1400 rpm) and traverse speeds of 4, 8, and 10 mm/min. This work focuses on two methods, artificial neural networks, using the software Pythia, and response surface methodology (RSM), to predict the tensile strength, percentage elongation, and hardness of friction stir welded 6061 aluminum alloy. An artificial neural network (ANN) model was developed for the analysis of the friction stir welding parameters of 6061 pipe: the tensile strength, percentage elongation, and hardness of the weld joints were predicted as functions of tool rotation speed, material thickness, and travel speed. A comparison was made between measured and predicted data. Response surface methodology (RSM) models were also developed, and the values obtained for the responses tensile strength, percentage elongation, and hardness were compared with measured values.
The effect of the FSW process parameters on the mechanical properties of 6061 aluminum alloy has been analyzed in detail.
Keywords: friction stir welding (FSW), Al alloys, mechanical properties, microstructure
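The ANN regression described above, mapping (tool rotation speed, material thickness, travel speed) to a mechanical property, can be sketched with a small multilayer perceptron. Note that scikit-learn's MLPRegressor stands in for the Pythia network, and every input/output value below is a fabricated placeholder, not a measurement from the study:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical FSW trials: (rotation speed rpm, wall thickness mm, travel speed mm/min)
X = np.array([[485, 2, 4], [710, 2, 8], [910, 3, 4], [1120, 3, 10],
              [1400, 4, 8], [485, 4, 10], [910, 2, 10], [1400, 3, 4]], dtype=float)
# Hypothetical measured tensile strengths (MPa) for those trials
y = np.array([172.0, 181.0, 190.0, 196.0, 205.0, 168.0, 178.0, 207.0])

# Scale the inputs (rpm and mm/min live on very different ranges) and fit
X_s = StandardScaler().fit_transform(X)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                   random_state=1).fit(X_s, y)
pred = ann.predict(X_s)
print(np.round(pred, 1))
```

In practice the fitted surface would be compared against held-out measurements, and an RSM polynomial fitted to the same trials would serve as the second predictor for comparison.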
Procedia PDF Downloads 462
333 Rapid Detection of the Etiology of Infection as Bacterial or Viral Using Infrared Spectroscopy of White Blood Cells
Authors: Uraib Sharaha, Guy Beck, Joseph Kapelushnik, Adam H. Agbaria, Itshak Lapidot, Shaul Mordechai, Ahmad Salman, Mahmoud Huleihel
Abstract:
Infectious diseases have placed a significant burden on public health and the economic stability of societies all over the world for centuries. A reliable identification of the causative agent of an infection is not possible based on clinical features alone, since many of these infections have similar symptoms, including fever, sneezing, inflammation, vomiting, diarrhea, and fatigue. Moreover, physicians usually encounter difficulties in distinguishing between viral and bacterial infections based on symptoms. Therefore, there is an ongoing need for sensitive, specific, and rapid methods for identifying the etiology of an infection; this intricate issue has perplexed doctors and researchers because it has serious repercussions. In this study, we evaluated the potential of a mid-infrared spectroscopic method for rapid and reliable identification of bacterial and viral infections based on simple peripheral blood samples. Fourier transform infrared (FTIR) spectroscopy is considered a successful diagnostic method in the biological and medical fields. Many studies have confirmed the great potential of combining FTIR spectroscopy and machine learning as a powerful diagnostic tool in medicine, since it is a very sensitive method that can detect and monitor molecular and biochemical changes in biological samples. We believe that this method could play a major role in improving community health and in reducing the economic burden on the health sector resulting from the indiscriminate use of antibiotics. We collected peripheral blood samples from 364 young patients, of whom 93 were controls, 126 had bacterial infections, and 145 had viral infections, all younger than 18 years old and diagnosed with a fever-producing illness.
Our preliminary results show that it is possible to determine the infectious agent with high success rates of 82% for sensitivity and 80% for specificity, based on the WBC data.
Keywords: infectious diseases, FTIR spectroscopy, viral infections, bacterial infections
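Sensitivity and specificity, the two success rates reported above, are simple ratios over a confusion matrix: sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP). The counts below are hypothetical numbers chosen only to reproduce rates close to the reported 82%/80%; they are not the study's actual confusion matrix:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion counts consistent with roughly 82% / 80%
sens, spec = sensitivity_specificity(tp=119, fn=26, tn=74, fp=19)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```

Reporting both numbers matters clinically: sensitivity bounds how many true infections of the target class are missed, while specificity bounds how many of the other class are falsely flagged and might receive unnecessary antibiotics.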
Procedia PDF Downloads 139
332 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling
Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal
Abstract:
Datasets or collections are becoming important assets by themselves, and they can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper aims to present a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of this data is a notoriously tedious, time-consuming process; in addition, it requires experts in the area, who are mostly not available. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors, and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using five well-known measurements (Accuracy, Hamming Loss, Micro-F, Macro-F, and Macro-F). The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods tested, while the Classifier Chains method showed the worst performance.
To recap, the benchmark has achieved promising results, building on the preliminary exploratory data analysis performed on the collection, proposing new directions for research, and providing a baseline for future studies.
Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-label classification, text mining
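Binary Relevance, the first baseline listed above, trains one independent binary classifier per label; Hamming loss then counts the fraction of individual label slots predicted wrongly. A minimal sketch on synthetic data (the feature/label construction is hypothetical, and scikit-learn's MultiOutputClassifier serves as the one-classifier-per-label wrapper):

```python
import numpy as np
from sklearn.metrics import hamming_loss
from sklearn.multioutput import MultiOutputClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy multi-label data: 12 "objective" feature vectors, 3 outcome labels each
rng = np.random.default_rng(3)
X = rng.normal(size=(12, 5))
Y = (X[:, :3] > 0).astype(int)    # label j is "on" when feature j is positive

# Binary Relevance: fit one independent binary classifier per label
br = MultiOutputClassifier(DecisionTreeClassifier(random_state=0)).fit(X, Y)
print(hamming_loss(Y, br.predict(X)))   # fraction of wrong label slots
```

Binary Relevance ignores correlations between labels; methods like Classifier Chains and Label Powerset in the list above exist precisely to model those dependencies.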
Procedia PDF Downloads 173
331 Three Issues for Integrating Artificial Intelligence into Legal Reasoning
Authors: Fausto Morais
Abstract:
Artificial intelligence has been widely used in law. Programs are able to classify suits, identify decision-making patterns, predict outcomes, and formalize legal arguments. In Brazil, the artificial intelligence Victor has been classifying cases according to the Supreme Court’s standards. When these programs perform such tasks, they simulate a kind of legal decision and legal argument, raising doubts about how artificial intelligence can be integrated into legal reasoning. Taking this into account, the following three issues are identified: the problem of hypernormatization, the argument of legal anthropocentrism, and artificial legal principles. Hypernormatization can be seen in the Brazilian legal context in the Supreme Court’s usage of the Victor program. The program has generated efficiency and consistency; on the other hand, there is a real risk of over-standardizing factual and normative legal features. Legal clerks and programmers should therefore work together to develop an adequate way to model legal language in computational code. If this is possible, intelligent programs may enact legal decisions in easy cases automatically, and in this picture, the legal anthropocentrism argument takes place. This argument holds that only human beings should enact legal decisions, because human beings have a conscience, free will, and self-unity. In spite of that, it is possible to argue against the anthropocentrism argument and to show how intelligent programs may work around human limitations such as misleading cognition, emotions, and lack of memory. In this way, intelligent machines could pass legal decisions automatically by classification, as Victor does in Brazil, because they are bound by legal patterns and should not deviate from them. Notwithstanding, artificially intelligent programs can be helpful beyond easy cases.
In hard cases, they are able to identify legal standards and legal arguments by using machine learning. For that, a dataset of legal decisions regarding a particular matter must be available, which is a reality in the Brazilian Judiciary. Through this procedure, artificially intelligent programs can support a human decision in hard cases, providing legal standards and arguments based on empirical evidence. Those legal features carry argumentative weight in legal reasoning and should serve as references for judges when they must decide whether to maintain or overcome a legal standard.
Keywords: artificial intelligence, artificial legal principles, hypernormatization, legal anthropocentrism argument, legal reasoning
Procedia PDF Downloads 145
330 Effect of Primer on Bonding between Resin Cement and Zirconia Ceramic
Authors: Deog-Gyu Seo, Jin-Soo Ahn
Abstract:
Objectives: Recently, interest in adhesive primers for stable bonding between zirconia and resin cement has been increasing. The bond strength of zirconia-resin cement can be effectively increased by treatment with a primer composed of an adhesive monomer that can chemically bond with the oxide layer that forms on the surface of zirconia. 10-methacryloyloxydecyl dihydrogen phosphate (10-MDP), which contains a phosphate ester, and the acidic monomer 4-methacryloxyethyl trimellitic anhydride (4-META) have been suggested as monomers that can form a chemical bond with the surface oxide layer of zirconia, and both have proved to be effective zirconia surface treatments for bonding to resin cement. The purpose of this study is to evaluate the effects of primer treatment on the bond strength of zirconia-resin cement using three different primers on the market. Methods: Zirconia blocks were prepared into 60 disk-shaped specimens using a diamond saw. Specimens were divided into four groups: the first three groups were treated with Zirconia Liner (Sun Medical Co., Ltd., Furutaka-cho, Moriyama, Shiga, Japan), Alloy Primer (Kuraray Noritake Dental Inc., Sakaju, Kurashiki, Okayama, Japan), and Universal Primer (Tokuyama Dental Corp., Taitou, Taitou-ku, Tokyo, Japan), respectively; the last group was the control, with no surface treatment. Dual-cured resin cement (Biscem, Bisco Inc., Schaumburg, IL, USA) was luted to each group of specimens, and shear bond strengths were then measured with a universal testing machine. The significance of the results was statistically analyzed by one-way ANOVA and the Tukey test. The failure sites in each group were inspected under a magnifier. Results: Mean shear bond strengths were 0.60, 1.39, 1.03, and 1.38 MPa for the control, Zirconia Liner (ZL), Alloy Primer (AP), and Universal Primer (UP) groups, respectively.
All three primer-treated groups showed significantly higher shear bond strength than the control group (p < 0.05). Among the three treated groups, ZL and UP showed significantly higher shear bond strength than AP (p < 0.05), and there was no significant difference in mean shear bond strength between ZL and UP (p > 0.05). While most specimens in the control group showed adhesive failure (80%), most specimens in the three primer-treated groups showed cohesive or mixed failure (80%).
Keywords: primer, resin cement, shear bond strength, zirconia
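The one-way ANOVA step described above can be sketched as follows. This is a minimal illustration, not the study's analysis: the per-specimen values below are invented around the reported group means (0.60, 1.39, 1.03, 1.38 MPa); only those means come from the abstract.

```python
# Hedged sketch: one-way ANOVA on hypothetical shear-bond-strength
# measurements (MPa) for the four groups. Values are illustrative only.
from scipy.stats import f_oneway

control = [0.55, 0.62, 0.58, 0.65, 0.60]
zl      = [1.35, 1.42, 1.38, 1.44, 1.36]   # Zirconia Liner
ap      = [0.98, 1.05, 1.01, 1.08, 1.03]   # Alloy Primer
up      = [1.33, 1.41, 1.37, 1.43, 1.36]   # Universal Primer

# Test whether at least one group mean differs from the others
f_stat, p_value = f_oneway(control, zl, ap, up)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
```

A p-value below 0.05 indicates that at least one group mean differs; a post-hoc test such as Tukey's HSD, as used in the study, then locates which pairwise differences are significant.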
Procedia PDF Downloads 202
329 Kinematic Analysis of Human Gait for Typical Postures of Walking, Running and Cart Pulling
Authors: Nupur Karmaker, Hasin Aupama Azhari, Abdul Al Mortuza, Abhijit Chanda, Golam Abu Zakaria
Abstract:
Purpose: The purpose of gait analysis is to determine the biomechanics of the joints, the phases of the gait cycle, the graphical and analytical analysis of degrees of rotation, the electrical activity of muscles, and the force exerted on the hip joint during walking, running, and cart pulling. Methods and Materials: Visual gait analysis and electromyography were used to detect the degree of rotation of the joints and the electrical activity of the muscles. In the cinematography method, an object is observed from different sides and recorded on video. The cart-pulling recording was divided into time-indexed frames using video-splitter software. The phases of the gait cycle, degrees of joint rotation, EMG profiles, and force analyses during walking and running were taken from published papers; the gait cycle and degrees of joint rotation during cart pulling were obtained using a video camera, a stopwatch, video-splitter software, and Microsoft Excel. Results and Discussion: During cart pulling, the force exerted on the hip is the resultant of several forces: the vector sum of Fg = mg, due to the body weight of the person, and Fa = ma, due to the acceleration. The stance phase is longest during cart pulling and shortest during running. The hip joint shows its maximum rotation during cart pulling and its minimum during walking; the knee rotates most during running; and the ankle rotates most during cart pulling and least during running. During cart pulling, the dynamic force depends on the walking velocity, body weight, and load weight. Conclusions: 80% of people suffer from gait-related diseases as they age, so proper care should be taken during cart pulling. Establishing a gait laboratory would help in determining gait-related diseases.
If the way of cart pulling is changed, i.e., if the design of the cart-pulling machine and its load-bearing system is changed, it would be possible to reduce the risk of limb loss, flat-foot syndrome, and varicose veins in the lower limbs.
Keywords: kinematic, gait, gait lab, phase, force analysis
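The resultant-force calculation sketched in the abstract (Fg = mg combined with Fa = ma) can be written out as follows. The mass, acceleration, and the assumption that the two components are perpendicular are illustrative, not values from the study.

```python
# Hedged sketch of the force decomposition: the resultant force on the hip
# during cart pulling as the vector sum of the gravitational force Fg = m*g
# and the inertial force Fa = m*a. All numbers are assumptions.
import math

g = 9.81      # gravitational acceleration, m/s^2
m = 70.0      # body mass of the person, kg (assumed)
a = 1.2       # forward acceleration while pulling, m/s^2 (assumed)

Fg = m * g    # vertical component (weight)
Fa = m * a    # horizontal component (inertial)

# Magnitude of the resultant, assuming perpendicular components
F_resultant = math.hypot(Fg, Fa)
print(f"Fg = {Fg:.1f} N, Fa = {Fa:.1f} N, resultant = {F_resultant:.1f} N")
```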
Procedia PDF Downloads 576
328 Exploring Pre-Trained Automatic Speech Recognition Model HuBERT for Early Alzheimer’s Disease and Mild Cognitive Impairment Detection in Speech
Authors: Monica Gonzalez Machorro
Abstract:
Dementia is hard to diagnose because of the lack of early physical symptoms, and early recognition is key to improving patients’ living conditions. Speech technology is considered a valuable biomarker for this challenge. Recent works have utilized conventional acoustic features and machine learning methods to detect dementia in speech, with BERT-like classifiers reporting the most promising performance. One constraint, nonetheless, is that these studies rely either on human transcripts or on transcripts produced by automatic speech recognition (ASR) systems. The contribution of this research is to explore a method that does not require transcriptions to detect early Alzheimer’s disease (AD) and mild cognitive impairment (MCI). This is achieved by fine-tuning a pre-trained ASR model for the downstream early AD and MCI detection tasks. To do so, a subset of the thoroughly studied Pitt Corpus is customized; the subset is balanced for class, age, and gender, and data processing also involves cropping the samples into 10-second segments. For comparison purposes, a baseline model is defined by training and testing a Random Forest on 20 acoustic features extracted with the librosa library in Python: zero-crossing rate, MFCCs, spectral bandwidth, spectral centroid, root mean square, and short-time Fourier transform. The baseline model achieved 58% accuracy. To fine-tune HuBERT as a classifier, an average-pooling strategy is employed to merge the 3D audio representations into 2D representations, and a linear layer is added. The pre-trained model used is ‘hubert-large-ls960-ft’. Empirically, the number of epochs selected is 5, and the batch size is 1. Experiments show that the proposed method reaches 69% balanced accuracy.
This suggests that the linguistic and speech information encoded in the self-supervised ASR-based model is able to capture acoustic cues of AD and MCI.
Keywords: automatic speech recognition, early Alzheimer’s recognition, mild cognitive impairment, speech impairment
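The average-pooling classification head described above can be sketched in a few lines. This is a shape-level illustration with random weights, assuming a 1024-dimensional encoder output (as in HuBERT-large) and a roughly 50 Hz frame rate over a 10-second segment; it is not the fine-tuned model itself.

```python
# Hedged sketch of the classification head: encoder hidden states shaped
# (batch, time, hidden) are mean-pooled over time into (batch, hidden),
# then passed through a single linear layer producing class logits.
# Dimensions and weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
batch, time_steps, hidden, n_classes = 1, 499, 1024, 2  # ~10 s of audio

hidden_states = rng.standard_normal((batch, time_steps, hidden))

# Average pooling: collapse the 3D representation to 2D over the time axis
pooled = hidden_states.mean(axis=1)            # (batch, hidden)

# Linear classification layer: logits = pooled @ W + b
W = rng.standard_normal((hidden, n_classes)) * 0.01
b = np.zeros(n_classes)
logits = pooled @ W + b                        # (batch, n_classes)
print(pooled.shape, logits.shape)
```

In the actual fine-tuning setup, W and b would be trained jointly with the encoder on the AD/MCI labels rather than drawn at random.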
Procedia PDF Downloads 127
327 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices
Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays
Abstract:
Introduction and Aim: Chronic workplace stress and its health-related consequences, such as mental and cardiovascular diseases, have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be based on smartphone registrations, further processed by an automated computer algorithm. A field study including 100 office-based workers with high-level problem-solving tasks, such as managers and researchers, will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods, mainly multilevel statistical modelling for repeated data, will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about a global perceived state of chronic stress exposure, the EMA approach is expected to bring new insights into daily fluctuating work stress experiences, especially the micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
Keywords: ecological momentary assessment, real-time, stress, work
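The repeated-measures structure that motivates the planned multilevel models can be illustrated with a small simulation: each worker contributes many momentary stress ratings, so total variance splits into a between-person and a within-person part. The sample sizes, rating scale, and variance components below are illustrative assumptions, not project data; the study's own models would be fitted with dedicated multilevel software.

```python
# Hedged sketch: variance decomposition of simulated EMA stress ratings
# (100 workers x 30 prompts) via the one-way random-effects ANOVA
# estimator of ICC(1). All numbers are simulated assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_workers, n_prompts = 100, 30   # 100 office workers, 30 EMA prompts each

person_means = rng.normal(5.0, 1.0, size=n_workers)  # stable per-person level
ratings = person_means[:, None] + rng.normal(0, 0.5, size=(n_workers, n_prompts))

# Between-person and within-person mean squares
grand_mean = ratings.mean()
between_ms = n_prompts * ((ratings.mean(axis=1) - grand_mean) ** 2).sum() / (n_workers - 1)
within_ms = ((ratings - ratings.mean(axis=1, keepdims=True)) ** 2).sum() / (n_workers * (n_prompts - 1))

# ICC(1): share of variance due to stable between-person differences
icc = (between_ms - within_ms) / (between_ms + (n_prompts - 1) * within_ms)
print(f"ICC(1) = {icc:.2f}")
```

A high ICC means ratings cluster strongly within persons, which is exactly why single-level statistics are inappropriate here and multilevel models for repeated data are planned instead.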
Procedia PDF Downloads 161