Search results for: link data
23333 Optimal Design of Step-Stress Partially Accelerated Life Test Using Multiply Censored Exponential Data with Random Removals
Authors: Showkat Ahmad Lone, Ahmadur Rahman, Ariful Islam
Abstract:
The major assumption in accelerated life tests (ALT) is that the mathematical model relating the lifetime of a test unit to the stress is known or can be assumed. In some cases, such life–stress relationships are not known and cannot be assumed, i.e., ALT data cannot be extrapolated to use conditions. In such cases, a partially accelerated life test (PALT) is more suitable, in which tested units are subjected to both normal and accelerated conditions. This study deals with estimating information about failure times of items under step-stress partially accelerated life tests using progressive failure-censored hybrid data with random removals. The life data of the units under test are assumed to follow the exponential distribution, and the removals from the test are assumed to follow a binomial distribution. Point and interval maximum likelihood estimates are obtained for the unknown distribution parameters and the tampering coefficient. An optimum test plan is developed using the D-optimality criterion. The performance of the resulting estimators of the developed model parameters is evaluated and investigated using a simulation algorithm.
Keywords: binomial distribution, D-optimality, multiple censoring, optimal design, partially accelerated life testing, simulation study
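The exponential maximum likelihood estimate under censoring that this kind of study builds on has a well-known closed form: the estimated failure rate is the number of observed failures divided by the total time on test. A minimal illustrative sketch follows; the paper's full step-stress model with a tampering coefficient and binomial random removals is more involved, so this only shows the basic MLE and its asymptotic interval.

```python
import math

def exponential_mle(failure_times, censoring_times):
    """MLE of the exponential rate from a censored sample.

    lambda_hat = d / T, where d is the number of observed failures and
    T is the total time on test (failure times plus censoring times).
    Also returns an approximate 95% CI from the asymptotic normality
    of the MLE, Var(lambda_hat) ~ lambda^2 / d.
    """
    d = len(failure_times)
    total_time = sum(failure_times) + sum(censoring_times)
    lam = d / total_time
    half_width = 1.96 * lam / math.sqrt(d)
    return lam, (lam - half_width, lam + half_width)
```

For example, four failures at times 1–4 with two units censored at time 5 give a total time on test of 20 and a rate estimate of 0.2.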
Procedia PDF Downloads 323
23332 Animations for Teaching Food Chemistry: A Design Approach for Linking Chemistry Theory to Everyday Food
Authors: Paulomi (Polly) Burey, Zoe Lynch
Abstract:
In STEM education, students often have difficulty linking static images and words from textbooks or online resources, to the underlying mechanisms of the topic of study. This can often dissuade some students from pursuing study in the physical and chemical sciences. A growing movement in current day students demonstrates that the YouTube generation feel they learn best from video or dynamic, interactive learning tools, and will seek these out as alternatives to their textbooks and the classroom learning environment. Chemistry, and in particular visualization of molecular structures in everyday materials, can prove difficult to comprehend without significant interaction with the teacher of the content and concepts, beyond the timeframe of a typical class. This can cause a learning hurdle for distance education students, and so it is necessary to provide strong electronic tools and resources to aid their learning. As one of the electronic resources, an animation design approach to link everyday materials to their underlying chemistry would be beneficial for student learning, with the focus here being on food. These animations were designed and storyboarded with a scaling approach and commence with a focus on the food material itself and its component parts. This is followed by animated transitions to its underlying microstructure and identifying features, and finally showing the molecules responsible for these microstructural features. The animation ends with a reverse transition back through the molecular structure, microstructure, all the way back to the original food material, and also animates some reactions that may occur during food processing to demonstrate the purpose of the underlying chemistry and how it affects the food we eat. 
Using this cyclical approach of linking students’ existing knowledge of food to guide them toward more complex knowledge, and then reinforcing their learning by linking back to their prior knowledge again, enhances student understanding. Food is also an ideal material system for students to interact with in a hands-on manner to further reinforce their learning. These animations were launched this year in a second-year university Food Chemistry course with improved learning outcomes for the cohort.
Keywords: chemistry, food science, future pedagogy, STEM education
Procedia PDF Downloads 160
23331 A Real Time Development Study for Automated Centralized Remote Monitoring System at Royal Belum Forest
Authors: Amri Yusoff, Shahrizuan Shafiril, Ashardi Abas, Norma Che Yusoff
Abstract:
Nowadays, illegal logging is causing great damage to our forests, contributing to flash floods, landslides, global warming, and more. This understandably makes us wonder why, what, and who has made it happen; often, the cause is only known after it is too late. Even the Malaysian Royal Belum forest has not been spared from land clearing or illegal activity by the natives, although this area has been gazetted as a protected area preserved for future generations. Furthermore, because of its sizeable and wide area, these illegal activities are difficult to monitor and control. Critical action must be taken to prevent these harmful activities from recurring. Therefore, a remote monitoring device must be developed to capture critical real-time data such as temperature, humidity, gas, fire, and rain detection, which indicate the current state of the natural habitat in the forest. In addition, the device's location can be determined via GPS, with its latitude and longitude transmitted by SMS over the GSM network. All readings are sent in real time for data management and analysis. The results will benefit the monitoring bodies and relevant authorities in keeping the forest in its natural state. Furthermore, this research gathers unified data that will then be analyzed for comparison with an existing method.
Keywords: remote monitoring system, forest data, GSM, GPS, wireless sensor
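As an illustration of the kind of telemetry described above, a reading with GPS coordinates could be packed into a single SMS text before transmission over GSM. The field layout, station identifier, and separator characters below are assumptions for the sketch, not the authors' actual message format; only the 160-character single-SMS limit is a real GSM constraint.

```python
def build_sms_payload(station_id, lat, lon, temp_c, humidity_pct, fire, rain):
    """Pack one sensor reading into a compact SMS text.

    Hypothetical field layout: id;lat,lon;T=..;H=..;FIRE=..;RAIN=..
    A single GSM SMS carries at most 160 7-bit characters.
    """
    msg = (f"{station_id};{lat:.5f},{lon:.5f};"
           f"T={temp_c:.1f}C;H={humidity_pct:.0f}%;"
           f"FIRE={'Y' if fire else 'N'};RAIN={'Y' if rain else 'N'}")
    if len(msg) > 160:
        raise ValueError("payload exceeds single-SMS limit")
    return msg
```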
Procedia PDF Downloads 420
23330 Exploring Inter-Firm Collaboration and Supply Chain Innovation in the Pharmaceutical Industry
Authors: Fatima Gouiferda
Abstract:
Uncertainty and competitiveness are making firms' environments more complicated. Competition is moving to the supply chain level, and firms need to collaborate and innovate to survive. In the current economy, joint efforts between organizations and the mutual development of new capacities are the key resources in gaining collaborative advantage and enhancing supply chain performance. The purpose of this paper is to explore the different collaboration practices that exist in the pharmaceutical industry of Morocco, and to inquire how these practices affect supply chain performance. The exploration is based on an interpretivist research paradigm. Data were collected through semi-structured interviews with supply chain practitioners and analyzed with the Iramuteq software to explore the different themes of the study. The findings include a descriptive analysis resulting from data processing in Iramuteq, as well as a content analysis of the themes extracted from the interviews.
Keywords: inter-firm relationships, collaboration, supply chain innovation, Morocco
Procedia PDF Downloads 64
23329 Grid and Market Integration of Large Scale Wind Farms Using Advanced Predictive Data Mining Techniques
Authors: Umit Cali
Abstract:
The integration of intermittent energy sources like wind farms into the electricity grid has become an important challenge for the utilization and control of electric power systems because of the fluctuating behaviour of wind power generation. Wind power predictions improve the economic and technical integration of large amounts of wind energy into the existing electricity grid. Trading, balancing, grid operation, controllability, and safety issues increase the importance of predicting the power output of wind power operators. Therefore, wind power forecasting systems have to be integrated into the monitoring and control systems of the transmission system operator (TSO) and of wind farm operators/traders. Wind forecasts are relatively precise only for a time horizon of a few hours and are therefore most relevant to the spot and intraday markets. In this work, predictive data mining techniques are applied to identify a statistical and neural network model, or set of models, that can be used to predict the power output of large onshore and offshore wind farms. These advanced data analytic methods help us distill very large meteorological, oceanographic, and SCADA data sets into useful information and manageable systems. Accurate wind power forecasts are beneficial for wind plant operators, utility operators, and utility customers: an accurate forecast allows grid operators to schedule economically efficient generation to meet the demand of electrical customers. This study is also dedicated to an in-depth consideration of issues such as the comparison of day-ahead and short-term wind power forecasting results, determination of the accuracy of the wind power prediction, and evaluation of the energy-economic and technical benefits of wind power forecasting.
Keywords: renewable energy sources, wind power, forecasting, data mining, big data, artificial intelligence, energy economics, power trading, power grids
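A standard baseline when evaluating short-term wind power forecasting systems like the ones described above is persistence (the next value equals the last observed value), scored against the candidate model with RMSE. A minimal sketch follows; it is only the evaluation scaffolding, not the authors' statistical or neural network models.

```python
import math

def persistence_forecast(series, horizon=1):
    """Naive persistence baseline: every forecast equals the last observation."""
    return [series[-1]] * horizon

def rmse(actual, predicted):
    """Root mean squared error between observed and forecast values."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))
```

A forecasting model is only useful for spot/intraday trading if its RMSE beats this persistence baseline at the relevant horizon.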
Procedia PDF Downloads 520
23328 Climate Change and Its Impact on Water Security and Health in Coastal Community: A Gender Outlook
Authors: Soorya Vennila
Abstract:
The present study answers two questions: how does climate change affect water security in the drought-prone Ramanathapuram district, and what has water insecurity done to the health of the coastal community? The study area chosen is Devipattinam in Ramanathapuram district. Climate change has evidently wreaked havoc on the community through saltwater intrusion, water quality degradation, and water scarcity, with eventual economic and social consequences, such as power inequality within family and community, and health hazards. Climatological data (rainfall, minimum temperature, and maximum temperature) covering 14 years (1989-2002) were statistically analyzed for trend using the Mann-Kendall test. In addition, water quality samples were collected from Devipattinam to test their physical and chemical parameters and spatial variation; the results were mapped in ArcGIS, and a water quality index was framed from the water quality tests. Finally, key informant interviews and questionnaires were conducted to capture gender-specific perceptions and problems. The collected data were then interpreted using SPSS software to derive recommendations and suggestions for overcoming water scarcity and health problems.
Keywords: health, water security, water quality, climate change
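The Mann-Kendall trend test used on the climatological series above is based on the S statistic: the count of later-minus-earlier pairs that increase, minus those that decrease. A minimal sketch of S follows; the significance step (computing the variance of S and a Z score) is omitted.

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic for a time series.

    S sums sign(x_j - x_i) over all pairs i < j; a large positive S
    suggests an upward trend, a large negative S a downward trend.
    """
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s
```

A strictly increasing series of length n gives the maximum S = n(n-1)/2, e.g. S = 10 for n = 5.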
Procedia PDF Downloads 80
23327 Communication of Sensors in Clustering for Wireless Sensor Networks
Authors: Kashish Sareen, Jatinder Singh Bal
Abstract:
The use of wireless sensor networks (WSNs) has grown vastly in the last era, pointing out the crucial need for scalable and energy-efficient routing, data gathering, and aggregation protocols in large-scale environments. Wireless sensor networks have recently emerged as an important computing platform and continue to grow in diverse areas, providing new opportunities for networking and services. However, the energy-constrained and limited computing resources of the sensor nodes present major challenges in gathering data. The sensors collect data about their surroundings and forward it to a command centre through a base station. The past few years have witnessed increased interest in the potential use of WSNs, as they are very useful in target detection and other applications. Hierarchical clustering protocols have been the most widely used to improve overall system lifetime, scalability, and energy efficiency. In this paper, the state of the art in hierarchical clustering approaches for large-scale WSN environments is presented.
Keywords: clustering, DLCC, MLCC, wireless sensor networks
Procedia PDF Downloads 483
23326 Democratic Political Culture of the 5th and 6th Graders under the Authority of Dusit District Office, Bangkok
Authors: Vilasinee Jintalikhitdee, Phusit Phukamchanoad, Sakapas Saengchai
Abstract:
This research aims to study the level of democratic political culture, and the factors that affect it, among 5th and 6th graders under the authority of Dusit District Office, Bangkok, using stratified sampling (probability sampling) and purposive sampling (non-probability sampling), with data collected through questionnaires distributed to 300 respondents across all schools under the authority of Dusit District Office. The researcher analyzed the data using descriptive statistics (arithmetic mean and standard deviation) and inferential statistics (the Independent Samples T-test and One-Way ANOVA (F-test)). The researcher also collected data by interviewing the target groups and analyzed it through descriptive analysis. The results show that 5th and 6th graders under the authority of Dusit District Office, Bangkok exhibit a high level of democratic political culture overall. Considering each item, the statement with the highest mean is "the constitutional democratic governmental system is suitable for Thailand", and the statement with the lowest mean is "corruption (cheating and fraud) is normal in Thai society". The factors that affect democratic political culture are grade level, mothers' occupations, and attention to news and political movements.
Keywords: democratic, political culture, political movements, democratic governmental system
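The Independent Samples T-test used above reduces to a pooled-variance t statistic comparing two group means. A minimal sketch follows; degrees of freedom and the p-value lookup (which a package like SPSS handles) are omitted.

```python
import math

def independent_t(sample_a, sample_b):
    """Student's independent-samples t statistic (pooled variance).

    t = (mean_a - mean_b) / sqrt(s_p^2 * (1/n_a + 1/n_b)), where s_p^2
    is the pooled sample variance of the two groups.
    """
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / math.sqrt(pooled * (1 / na + 1 / nb))
```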
Procedia PDF Downloads 267
23325 Saudi Human Awareness Needs: A Survey on How Human Errors and Mistakes Lead to Leaked Confidential Data, with Proposed Solutions in Saudi Arabia
Authors: Amal Hussain Alkhaiwani, Ghadah Abdullah Almalki
Abstract:
Recently, human error has increasingly become a major factor in security breaches that may affect confidential data; most cyber data breaches are caused by human errors. Through one individual's mistake, an attacker can gain access to the entire network and bypass the implemented access controls without any immediate detection, and unaware employees are vulnerable to social engineering cyber-attacks. Providing security awareness to people is part of the company protection process; cyber risks cannot be reduced by implementing technology alone, and human security awareness will significantly reduce the risks by encouraging changes in staff cyber-awareness. In this paper, we focus on human awareness and the need to maintain the required level of security education; we review human errors and introduce a proposed solution to prevent breaches from recurring. Recently, Saudi Arabia has faced many attacks using different social engineering methods. As Saudi Arabia has become a target for many countries and individuals, we need to initiate a defense mechanism that begins with awareness, to keep our privacy and protect confidential data against possible intended attacks.
Keywords: cybersecurity, human aspects, human errors, human mistakes, security awareness, Saudi Arabia, security program, security education, social engineering
Procedia PDF Downloads 163
23324 Hepatocyte-Intrinsic NF-κB Signaling Is Essential to Control a Systemic Viral Infection
Authors: Sukumar Namineni, Tracy O'Connor, Ulrich Kalinke, Percy Knolle, Mathias Heikenwaelder
Abstract:
The liver is one of the pivotal organs in vertebrate animals, serving a multitude of functions such as metabolism, detoxification, and protein synthesis, and playing a predominant role in innate immunity. The innate immune mechanisms by which the liver controls viral infections have largely been attributed to the Kupffer cells, the locally resident macrophages. However, all liver cells are equipped with innate immune functions, including, in particular, the hepatocytes. Hence, our aim in this study was to elucidate the innate immune contribution of hepatocytes to viral clearance using mice lacking Ikkβ specifically in hepatocytes, termed IkkβΔᴴᵉᵖ mice. Blockade of Ikkβ activation in IkkβΔᴴᵉᵖ mice affects downstream canonical NF-κB signaling by preventing the nuclear translocation of NF-κB, an important step required for the initiation of innate immune responses. Interestingly, infection of IkkβΔᴴᵉᵖ mice with lymphocytic choriomeningitis virus (LCMV) led to strongly increased hepatic viral titers, mainly confined to clusters of infected hepatocytes. This was due to reduced interferon-stimulated gene (ISG) expression during the onset of infection and a reduced CD8+ T-cell-mediated response. Decreased ISG production correlated with increased liver LCMV protein and LCMV in hepatocytes isolated from IkkβΔᴴᵉᵖ mice. A similar phenotype was found in LCMV-infected mice lacking interferon signaling in hepatocytes (IFNARΔᴴᵉᵖ), suggesting a link between NF-κB and interferon signaling in hepatocytes. We also observed a failure of interferon-mediated inhibition of HBV replication in HepaRG cells treated with NF-κB inhibitors, corroborating our initial findings with LCMV infections.
Collectively, these results clearly highlight a previously unknown and influential role of hepatocytes in the induction of innate immune responses leading to viral clearance during a systemic viral infection with LCMV-WE.
Keywords: CD8+ T cell responses, innate immune mechanisms in the liver, interferon signaling, interferon stimulated genes, NF-κB signaling, viral clearance
Procedia PDF Downloads 192
23323 The Quality of Food and Drink Product Labels Translation from Indonesian into English
Authors: Rudi Hartono, Bambang Purwanto
Abstract:
The translation quality of food and drink labels from Indonesian into English is poor because the translations are inaccurate, unnatural, and difficult to read. Such label translations can be found on the packaging of food and drink products produced and marketed by several companies in Indonesia. If this problem is left unchecked, it will lead to misunderstanding of the translation results and confuse consumers. This study was conducted to analyze the translation errors on food and drink product labels and to formulate solutions for better translation quality. The research design was evaluation research with a holistic criticism approach. The data were words, phrases, and sentences translated from Indonesian into English and printed on food and drink product labels. The data were processed using Interactive Model Analysis, which involves three main steps: collecting, classifying, and verifying data. Furthermore, the data were analyzed using content analysis to assess the accuracy, naturalness, and readability of the translation. The results showed that the translation quality of food and drink product labels from Indonesian into English has a level of accuracy of 60%, a level of naturalness of 50%, and a level of readability of 60%. These findings call for an effective strategy for translating food and drink product labels in the future.
Keywords: translation quality, food and drink product labels, holistic criticism approach, interactive model, content analysis
Procedia PDF Downloads 377
23322 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes under the Presence of Different External Factors
Authors: Sudhir Kumar Singh, Debashish Chakravarty
Abstract:
Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure leads to loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the Factor of Safety. Numerous cases have been studied using the popular Finite Element Method, and the data thus obtained have been used to train supervised machine learning models: Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training set have been used for measuring the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out due to its accuracy of greater than 95%, providing a valuable tool that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
Keywords: finite element method, geotechnical engineering, machine learning, slope stability
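The Factor of Safety labels that such classifiers learn are produced here by FEM, but for intuition the classical closed-form infinite-slope model shows how saturation (pore water pressure) drives FS down. This is an illustrative textbook formula under stated simplifications, not the paper's FEM pipeline; parameter units are assumed as kPa, kN/m³, and metres.

```python
import math

def infinite_slope_fs(c, gamma, z, beta_deg, phi_deg, gamma_w=9.81, hw=0.0):
    """Factor of Safety from the classical infinite-slope model.

    c: effective cohesion (kPa); gamma: unit weight of soil (kN/m3);
    z: depth of slip surface (m); beta_deg: slope angle; phi_deg:
    effective friction angle; hw: height of the water table above the
    slip surface (m). FS >= 1 is conventionally labelled 'stable'.
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # Effective normal stress on the slip plane, reduced by pore pressure
    normal = (gamma * z - gamma_w * hw) * math.cos(beta) ** 2
    resisting = c + normal * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving
```

For a dry, cohesionless slope the formula collapses to FS = tan(phi)/tan(beta), a quick sanity check on any implementation.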
Procedia PDF Downloads 103
23321 A Patent Trend Analysis for Hydrogen Based Ironmaking: Identifying the Technology's Development Phase
Authors: Ebru Kaymaz, Aslı İlbay Hamamcı, Yakup Enes Garip, Samet Ay
Abstract:
The use of hydrogen as a fuel is important for decreasing carbon emissions. For the steel industry, reducing carbon emissions has recently become one of the most important agendas globally, and because of the Paris Agreement requirements, the European steel industry is studying green steel production. Although many literature reviews have analyzed this topic from the technological perspective of hydrogen-based ironmaking, very few studies have focused on patents covering the decarbonization of the steel industry. Hence, this study focuses on the technological progress of hydrogen-based ironmaking and on understanding the main trends through patent data. All available patent data were collected from Questel Orbit. A trend analysis of more than 900 patent documents was carried out using Questel Orbit Intellixir to analyze a large volume of data for scientific intelligence.
Keywords: hydrogen based ironmaking, DRI, direct reduction, carbon emission, steelmaking, patent analysis
Procedia PDF Downloads 147
23320 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Prediction of Future Data
Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill
Abstract:
Health-care management systems are of great interest because they provide simple and fast management of all aspects relating to a patient, not only medical ones. Moreover, there are more and more pathologies in which diagnosis and treatment can only be carried out using medical imaging techniques. With their ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets using algorithms and techniques drawn from statistics, machine learning, and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted from the independent variables, and forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences for a new product; such models provide a platform on which product developers can decide the engineering characteristics needed to satisfy consumer preferences before developing the product. Recent analysis shows that these fuzzy regression methods are commonly used to model client preferences. We propose testing the strength of an exponential regression model against a linear regression model.
Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function
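An exponential regression model of the kind proposed can be fitted by ordinary least squares on the log-transformed response, since y = a·exp(bx) implies log(y) = log(a) + bx. A minimal sketch follows; the fuzzy membership weighting discussed in the abstract is omitted.

```python
import math

def fit_exponential(xs, ys):
    """Fit y = a * exp(b * x) by OLS on log(y); requires all y > 0.

    Returns the estimated (a, b) from the closed-form simple-regression
    solution on the log-linear model.
    """
    n = len(xs)
    ly = [math.log(y) for y in ys]
    mx = sum(xs) / n
    my = sum(ly) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (l - my) for x, l in zip(xs, ly))
    b = sxy / sxx
    a = math.exp(my - b * mx)
    return a, b
```

On data generated exactly from y = 2·exp(0.5x), the fit recovers a = 2 and b = 0.5.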
Procedia PDF Downloads 280
23319 Evaluation of Symptoms, Laboratory Findings, and Natural History of IgE Mediated Wheat Allergy
Authors: Soudeh Tabashi, Soudabeh Fazeli Dehkordy, Masood Movahedi, Nasrin Behniafard
Abstract:
Introduction: Food allergy has increased over the last three decades. Since wheat is one of the major constituents of the daily meal in many regions throughout the world, wheat allergy is one of the 8 most common types of food allergy. Our information about the epidemiology and etiology of food allergies is limited; therefore, in this study we sought to evaluate the symptoms and laboratory findings in children with wheat allergy. Materials and methods: 23 patients aged up to 18 years with a diagnosis of IgE-mediated wheat allergy were enrolled in this study. Using a questionnaire, we collected their information and organized it into 4 categories: demographic data, signs and symptoms, comorbidities, and laboratory data. Patients were then followed up for 6 months, and their laboratory data were compared. Results: Most of the patients (82%) presented with symptoms of wheat allergy in the first year of life. The skin and the respiratory system were the most commonly involved organs, with an incidence of 86% and 78%, respectively. Most of the patients with wheat allergy were also sensitive to other types of foods, with sensitivity to egg being the most common (47%). In 57% of patients, IgE levels decreased during the 6-month follow-up period. Conclusion: We do not have enough data on the epidemiology of, and response to therapy for, wheat allergy, and to the best of our knowledge no study has addressed this issue in Iran so far. This study is the first source of information about IgE-mediated wheat allergy in Iran, and it can provide an opening for future studies on wheat allergy and its treatments.
Keywords: wheat allergy, food allergy, IgE
Procedia PDF Downloads 194
23318 Development of a Low-Cost Smart Insole for Gait Analysis
Authors: S. M. Khairul Halim, Mojtaba Ghodsi, Morteza Mohammadzaheri
Abstract:
Gait analysis is essential for diagnosing musculoskeletal and neurological conditions. However, current methods are often complex and expensive. This paper introduces a methodology for analysing gait parameters using a smart insole with a built-in accelerometer. The system measures stance time, swing time, step count, and cadence, and wirelessly transmits data to a user-friendly IoT dashboard for centralized processing. This setup enables remote monitoring and advanced data analytics, making it a versatile tool for medical diagnostics and everyday usage. Integration with IoT enhances the portability and connectivity of the device, allowing secure, encrypted data access over the Internet. This feature supports telemedicine and enables personalized treatment plans tailored to individual needs. Overall, the approach provides a cost-effective (approximately 25 GBP), accurate, and user-friendly solution for gait analysis, facilitating remote tracking and customized therapy.
Keywords: gait analysis, IoT, smart insole, accelerometer sensor
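Gait parameters like step count and cadence can be derived from step events detected in the accelerometer signal, for example by upward threshold crossings of the acceleration magnitude. The sketch below makes that concrete; the threshold value and the simple crossing detector are assumptions for illustration, not the device's actual firmware.

```python
def detect_steps(accel_mag, threshold=11.0):
    """Indices where the acceleration magnitude (m/s^2) crosses the
    threshold upward; each crossing is treated as one step event."""
    return [i for i in range(1, len(accel_mag))
            if accel_mag[i] >= threshold > accel_mag[i - 1]]

def gait_parameters(step_timestamps):
    """Step count and cadence (steps/min) from step times in seconds."""
    steps = len(step_timestamps)
    if steps < 2:
        return steps, 0.0
    duration = step_timestamps[-1] - step_timestamps[0]
    cadence = (steps - 1) / duration * 60.0
    return steps, cadence
```

Stance and swing times would additionally need the insole's pressure or contact signal to mark heel-strike and toe-off, which this accelerometer-only sketch does not attempt.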
Procedia PDF Downloads 21
23317 Comparison between Some Robust Regression Methods and the OLS Method, with Application
Authors: Sizar Abed Mohammed, Zahraa Ghazi Sadeeq
Abstract:
The classic least squares (OLS) method is used to estimate the linear regression parameters when its assumptions hold, in which case the estimators have good properties such as unbiasedness, minimum variance, and consistency. Alternative statistical techniques have been developed to estimate the parameters when the data are contaminated with outliers; these are the robust (or resistant) methods. In this paper, three robust methods are studied: the maximum likelihood type estimator (M-estimator), the modified maximum likelihood type estimator (MM-estimator), and the least trimmed squares estimator (LTS-estimator), and their results are compared with the OLS method. These methods were applied to real data taken from the Duhok company for manufacturing furniture, and the results were compared using the criteria Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Mean Sum of Absolute Error (MSAE). Important conclusions of this study are: in the furniture-line data, all four methods detected a number of atypical values that were very close to the rest of the data, indicating that the standard errors are close to normally distributed; in the doors-line data, however, OLS detected fewer atypical values than the robust methods, meaning the distribution of the errors departs far from normality. Moreover, the parameter estimates from OLS are very far from those of the robust methods for the doors line: the LTS-estimator gave better results under the MSE criterion, the M-estimator gave better results under the MAPE criterion, and the MM-estimator was better under the MSAE criterion. The programs S-Plus (version 8.0, Professional 2007), Minitab (version 13.2), and SPSS (version 17) were used to analyze the data.
Keywords: robust regression, LTS, M-estimator, MSE
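The M-estimator compared above is typically computed by iteratively reweighted least squares (IRLS) with Huber weights and a MAD-based scale estimate. A minimal sketch for simple regression follows; the tuning constant c = 1.345 is the conventional 95%-efficiency choice and the scale/median details are simplifications, so this illustrates the idea rather than reproducing the paper's software output.

```python
def ols_slope(xs, ys):
    """Closed-form OLS slope for simple regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

def huber_fit(xs, ys, c=1.345, iters=50):
    """M-estimate of (intercept, slope) via IRLS with Huber weights.

    Residuals within c * scale get weight 1; larger residuals are
    downweighted by c * scale / |r|. Scale is re-estimated each
    iteration from the median absolute deviation (MAD / 0.6745).
    """
    n = len(xs)
    b1 = ols_slope(xs, ys)                      # start from OLS
    mx, my = sum(xs) / n, sum(ys) / n
    b0 = my - b1 * mx
    for _ in range(iters):
        r = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
        med = sorted(r)[n // 2]
        mad = sorted(abs(v - med) for v in r)[n // 2] / 0.6745 or 1.0
        w = [1.0 if abs(v) <= c * mad else c * mad / abs(v) for v in r]
        sw = sum(w)
        mxw = sum(wi * x for wi, x in zip(w, xs)) / sw
        myw = sum(wi * y for wi, y in zip(w, ys)) / sw
        sxx = sum(wi * (x - mxw) ** 2 for wi, x in zip(w, xs))
        sxy = sum(wi * (x - mxw) * (y - myw) for wi, x, y in zip(w, xs, ys))
        b1 = sxy / sxx
        b0 = myw - b1 * mxw
    return b0, b1
```

On a line y = 2x with one gross outlier, the Huber fit stays near slope 2 while the OLS slope is pulled far away, which is precisely the behaviour the paper's criteria (MSE, MAPE, MSAE) quantify.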
Procedia PDF Downloads 233
23316 Hybrid Fuzzy Weighted K-Nearest Neighbor to Predict Hospital Readmission for Diabetic Patients
Authors: Soha A. Bahanshal, Byung G. Kim
Abstract:
Identification of patients at high risk for hospital readmission is of crucial importance for quality health care and cost reduction. Predicting hospital readmissions among diabetic patients has been of great interest to many researchers and health decision makers. We build a model to predict hospital readmission for diabetic patients within 30 days of discharge. The core of the model is a modified k-Nearest Neighbor called the Hybrid Fuzzy Weighted k-Nearest Neighbor algorithm. The prediction is performed on a patient dataset consisting of more than 70,000 patients with 50 attributes. We applied data preprocessing using different techniques to handle class imbalance and to fuzzify the data to suit the prediction algorithm. The model has so far achieved a classification accuracy of 80%, compared to other models that use only the k-Nearest Neighbor.
Keywords: machine learning, prediction, classification, hybrid fuzzy weighted k-nearest neighbor, diabetic hospital readmission
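The core idea of a fuzzy weighted k-NN is to weight each neighbor's vote by inverse distance raised to 2/(m-1), where m is the fuzzifier. A minimal sketch follows; the paper's hybrid algorithm, fuzzification, and imbalance handling are more involved, and the labels below are illustrative placeholders.

```python
def fuzzy_weighted_knn(train, query, k=3, m=2.0):
    """Fuzzy weighted k-NN classification.

    train: list of (feature_vector, label) pairs. Each of the k nearest
    neighbors votes with weight 1 / d^(2/(m-1)); the label with the
    largest total weight wins. A small epsilon avoids division by zero
    when the query coincides with a training point.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    neighbors = sorted(train, key=lambda t: dist(t[0], query))[:k]
    votes = {}
    for feats, label in neighbors:
        d = dist(feats, query)
        w = 1.0 / (d ** (2.0 / (m - 1)) + 1e-9)
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)
```

With m = 2 the weight reduces to 1/d², so very close neighbors dominate the vote, which is what makes the weighted variant less sensitive to the choice of k than plain majority-vote k-NN.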
Procedia PDF Downloads 187
23315 The Comparison of Joint Simulation and Estimation Methods for the Geometallurgical Modeling
Authors: Farzaneh Khorram
Abstract:
This paper endeavors to construct a block model to assess grinding energy consumption (CCE) and pinpoint blocks with the highest potential for energy usage during the grinding process within a specified region. Leveraging geostatistical techniques, particularly joint estimation or simulation based on geometallurgical data from various mineral processing stages, our objective is to forecast CCE across the study area. The dataset encompasses variables obtained from 2754 drill samples and a block model comprising 4680 blocks. The initial analysis encompassed exploratory data examination, variography, multivariate analysis, and the delineation of geological and structural units. Subsequent analysis involved the assessment of contacts between these units and the estimation of CCE via cokriging, considering its correlation with SPI. The selection of blocks exhibiting maximum CCE holds paramount importance for cost estimation, production planning, and risk mitigation. The study conducted exploratory data analysis on lithology, rock type, and failure variables, revealing seamless boundaries between geometallurgical units. Simulation methods, such as Plurigaussian and Turning Bands, demonstrated more realistic outcomes than cokriging, owing to the inherent characteristics of geometallurgical data and the limitations of kriging methods.
Keywords: geometallurgy, multivariate analysis, plurigaussian, turning bands method, cokriging
Procedia PDF Downloads 70
23314 Early Transcriptome Responses to Piscine orthoreovirus-1 in Atlantic salmon Erythrocytes Compared to Salmonid Kidney Cell Lines
Authors: Thomais Tsoulia, Arvind Y. M. Sundaram, Stine Braaen, Øyvind Haugland, Espen Rimstad, Øystein Wessel, Maria K. Dahle
Abstract:
Fish red blood cells (RBC) are nucleated, and in addition to their function in gas exchange, they have been characterized as mediators of immune responses. Salmonid RBC are the major target cells of Piscine orthoreovirus (PRV), a virus associated with heart and skeletal muscle inflammation (HSMI) in farmed Atlantic salmon. The activation of antiviral response genes in RBC has previously been described in ex vivo and in vivo PRV-infection models, but not explored in the initial virus encounter phase. In the present study, mRNA transcriptome responses were explored in erythrocytes from individual fish, kept ex vivo, and exposed to purified PRV for 24 hours. The responses were compared to responses in macrophage-like salmon head kidney (SHK-1) and endothelial-like Atlantic salmon kidney (ASK) cells, neither of which supports PRV replication. The comparative analysis showed that the antiviral response to PRV was strongest in the SHK-1 cells, with a set of 80 significantly induced genes (≥ 2-fold upregulation). In RBC, 46 genes were significantly upregulated, while ASK cells were not significantly responsive. In particular, the transcriptome analysis of RBC revealed that PRV significantly induced interferon regulatory factor 1 (IRF1) and interferon-induced protein with tetratricopeptide repeats 5-like (IFIT9). However, several interferon-regulated antiviral genes that have previously been reported upregulated in PRV-infected RBC in vivo (myxovirus resistance (Mx), interferon-stimulated gene 15 (ISG15), toll-like receptor 3 (TLR3)) were not significantly induced after 24 h of virus stimulation. In contrast to RBC, these antiviral response genes were significantly upregulated in SHK-1. These results confirm that RBC are involved in the innate immune response to viruses, but with a delayed antiviral response compared to SHK-1.
A notable difference is that interferon regulatory factor 1 (IRF1) is the most strongly induced gene in RBC, but it is not among the significantly induced genes in SHK-1. Putative differences in the binding, recognition, and response to PRV, and any link to effects on the ability of PRV to replicate, remain to be explored.
Keywords: antiviral responses, Atlantic salmon, piscine orthoreovirus-1, red blood cells, RNA-seq
Procedia PDF Downloads 191
23313 Deriving Generic Transformation Matrices for Multi-Axis Milling Machine
Authors: Alan C. Lin, Tzu-Kuan Lin, Tsong Der Lin
Abstract:
This paper proposes a new method to find the equations of the transformation matrix for the rotation angles of the two rotational axes and the coordinates of the three linear axes of an orthogonal multi-axis milling machine. The approach provides intuitive physical meanings for the rotation angles of multi-axis machines, which can be used to evaluate the accuracy of the conversion from CL (cutter location) data to NC data.
Keywords: CAM, multi-axis milling machining, transformation matrix, rotation angles
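As an illustration of the "intuitive physical meaning" of such angles, the sketch below recovers the A/C rotation angles of a generic table-tilting five-axis configuration from a unit tool-axis vector (i, j, k) and verifies them against explicit rotation matrices. This is the standard textbook kinematics for one machine layout, not necessarily the paper's derivation; the angle convention chosen here is an assumption.

```python
import math

def rot_x(a):  # rotation matrix about the X (A) axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_z(a):  # rotation matrix about the Z (C) axis
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matvec(M, v):
    return tuple(sum(M[r][c] * v[c] for c in range(3)) for r in range(3))

def ac_angles(i, j, k):
    """Recover A/C angles from a unit tool-axis vector (i, j, k):
    A tilts the tool away from Z, C orients the tilt about Z."""
    A = math.acos(max(-1.0, min(1.0, k)))
    C = math.atan2(i, j) if (i, j) != (0.0, 0.0) else 0.0
    return A, C

# Round trip: rotate the spindle axis (0, 0, 1) by known angles,
# then recover those angles from the resulting tool-axis vector.
A_in, C_in = 0.5, 1.2
tool_axis = matvec(rot_z(-C_in), matvec(rot_x(-A_in), (0.0, 0.0, 1.0)))
A_out, C_out = ac_angles(*tool_axis)
```

The round trip makes the physical meaning concrete: the CL tool-axis vector is exactly what the two rotary axes must reproduce, and the inverse formulas fall out of the composed rotation matrices.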
Procedia PDF Downloads 483
23312 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models
Authors: Manisha Mukherjee, Diptarka Saha
Abstract:
Reliable forecasts of univariate time series data are often necessary in several contexts. ARIMA models are quite popular among practitioners in this regard. Hence, choosing correct parameter values for ARIMA is a challenging yet imperative task. Thus, a stepwise algorithm is introduced to provide automatic and robust estimates of the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. The process focuses on improving the overall quality of the estimates, and it alleviates the problems induced by the unidimensional nature of currently used methods such as auto.arima. The fast, automated search of the parameter space also ensures reliable estimates of the parameters with several desirable qualities, consequently resulting in higher test accuracy, especially in the case of noisy data. After rigorous testing on real as well as simulated data, the algorithm not only performs better than current state-of-the-art methods but also completely obviates the need for human intervention due to its automated nature.
Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function
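The stepwise idea can be sketched independently of any fitting library: hill-climb over the six orders, moving one parameter at a time and keeping any move that lowers the information criterion. In the sketch below, `score` stands in for an AICc (or similar) evaluated on real data; here it is a synthetic function with a known minimum, so this illustrates the search strategy only, not the paper's algorithm.

```python
def stepwise_search(score, start=(1, 0, 1, 0, 0, 0), max_order=3):
    """Hill-climb over seasonal ARIMA orders (p, d, q, P, D, Q).

    `score` is any information criterion evaluated on the data (lower is
    better); the search changes one order at a time and keeps improvements,
    mimicking auto.arima's stepwise mode.
    """
    best, best_score = start, score(start)
    improved = True
    while improved:
        improved = False
        for idx in range(6):
            for delta in (-1, 1):
                cand = list(best)
                cand[idx] += delta
                if not 0 <= cand[idx] <= max_order:
                    continue
                s = score(tuple(cand))
                if s < best_score:
                    best, best_score, improved = tuple(cand), s, True
    return best, best_score

# Synthetic criterion whose minimum is at orders (2, 1, 1, 1, 0, 1).
target = (2, 1, 1, 1, 0, 1)
best, best_score = stepwise_search(
    lambda o: sum((a - b) ** 2 for a, b in zip(o, target)))
```

Because the search never refits the full grid, it stays fast even when each `score` call is an expensive model fit.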
Procedia PDF Downloads 167
23311 Building Biodiversity Conservation Plans Robust to Human Land Use Uncertainty
Authors: Yingxiao Ye, Christopher Doehring, Angelos Georghiou, Hugh Robinson, Phebe Vayanos
Abstract:
Human development is a threat to biodiversity, and conservation organizations (COs) are purchasing land to protect areas for biodiversity preservation. However, COs have limited budgets and thus face hard prioritization decisions that are confounded by uncertainty in future human land use. This research proposes a data-driven sequential planning model to help COs choose land parcels that minimize the uncertain human impact on biodiversity. The proposed model is robust to uncertain development, and the sequential decision-making process is adaptive, allowing land purchase decisions to adapt to human land use as it unfolds. A cellular automata model is leveraged to simulate land use development based on climate data, land characteristics, and the development threat index from the NASA Socioeconomic Data and Applications Center; this simulation is used to model the uncertainty in the problem. The research leverages state-of-the-art techniques from the robust optimization literature to propose a computationally tractable reformulation of the model, which can be solved routinely by off-the-shelf solvers such as Gurobi or CPLEX. Numerical results based on real data on the jaguar in Central and South America show that the proposed method reduces conservation loss by 19.46% on average compared to standard approaches such as MARXAN, used in practice for biodiversity conservation. Our method may better guide the decision process in land acquisition and thereby allow conservation organizations to maximize the impact of limited resources.
Keywords: data-driven robust optimization, biodiversity conservation, uncertainty simulation, adaptive sequential planning
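A single step of a cellular-automaton development simulation of this kind might look as follows. The neighborhood rule and threshold are illustrative assumptions, since the abstract does not specify the transition rule used.

```python
def develop_step(grid, threat, threshold=0.5):
    """One synchronous cellular-automaton step: an undeveloped cell (0)
    becomes developed (1) if its threat index exceeds `threshold` and it
    has a developed cell in its 3x3 neighborhood."""
    n, m = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(n):
        for c in range(m):
            if grid[r][c]:
                continue  # already developed
            near_developed = any(
                grid[rr][cc]
                for rr in range(max(0, r - 1), min(n, r + 2))
                for cc in range(max(0, c - 1), min(m, c + 2)))
            if near_developed and threat[r][c] > threshold:
                new[r][c] = 1
    return new

grid = [[1, 0],
        [0, 0]]                 # one developed parcel
threat = [[0.0, 0.9],
          [0.2, 0.8]]           # per-parcel development threat index
next_grid = develop_step(grid, threat)
```

Running many such steps from many random seeds yields the scenario set of uncertain development trajectories that the robust optimization model then plans against.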
Procedia PDF Downloads 211
23310 Decision-Making Strategies on Smart Dairy Farms: A Review
Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh
Abstract:
Farm management and operations will change drastically with access to real-time data, real-time forecasting, and tracking of physical items, in combination with Internet of Things developments that further automate farm operations. Dairy farms have embraced technological innovations and procured vast amounts of permanent data streams during the past decade; however, the integration of this information to improve whole-farm management and decision-making does not yet exist. It is now imperative to develop a system that can collect, integrate, manage, and analyse on-farm and off-farm data in real time for practical and relevant environmental and economic actions. The systems developed, based on machine learning and artificial intelligence, need to be connected to produce useful output, a better understanding of farming as a whole, and its environmental impact. Evolutionary computing can be very effective in finding the optimal combination of sets of objects and, ultimately, in strategy determination. The system of the future should be able to manage the dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming as well as improve and maintain good animal welfare and the quality of dairy products. This review aims to provide insight into the state of the art of big data applications and evolutionary computing in relation to smart dairy farming and to identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.
Keywords: big data, evolutionary computing, cloud, precision technologies
Procedia PDF Downloads 190
23309 Hydrothermal Energy Application Technology Using Dam Deep Water
Authors: Yooseo Pang, Jongwoong Choi, Yong Cho, Yongchae Jeong
Abstract:
The climate crisis, and environmental problems related to energy supply in particular, is an increasingly pressing issue, so the use of renewable energy is essential to address these problems, which are governed mainly by the Paris Agreement, the international treaty on climate change. The government of the Republic of Korea announced that the key long-term goal of its low-carbon strategy is "carbon neutrality by 2050". Attention is focused on the role of internet data centers (IDC), which manage large amounts of data for artificial intelligence (AI) and big data applications driven by the 4th industrial revolution. The cooling system market for IDC was worth about 9 billion US dollars in 2020, and 15.6% growth a year is expected in Korea. Controlling the temperature in an IDC with an efficient air conditioning system is important, and hydrothermal energy is one of the best options for saving energy in the cooling system. In order to save energy and optimize operating conditions, the application of a dam deep-water air conditioning system has been considered. Deep water drawn from a specific level of a dam can supply water at a constant temperature year-round. The amount of energy saved will be tested and analyzed with a pilot plant that has 100 RT of cooling capacity. The target of this project is a PUE (Power Usage Effectiveness) of 1.2, which is the key parameter for checking the efficiency of the cooling system.
Keywords: hydrothermal energy, HVAC, internet data center, free-cooling
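PUE itself is simple arithmetic: total facility power divided by IT equipment power. The load figures below are hypothetical, chosen only to show how free cooling can move a facility toward the 1.2 target mentioned in the abstract.

```python
def pue(it_power_kw, cooling_power_kw, other_power_kw):
    """Power Usage Effectiveness: total facility power over IT power."""
    return (it_power_kw + cooling_power_kw + other_power_kw) / it_power_kw

# Hypothetical load figures for a 1 MW IT load:
conventional = pue(1000, 450, 100)  # chiller-based cooling
deep_water = pue(1000, 120, 80)     # dam deep-water free cooling
```

With these assumed loads, cutting cooling power from 450 kW to 120 kW is what takes the facility from a PUE of 1.55 down to the 1.2 target.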
Procedia PDF Downloads 82
23308 A Comparative Assessment of Some Algorithms for Modeling and Forecasting Horizontal Displacement of Ialy Dam, Vietnam
Authors: Kien-Trinh Thi Bui, Cuong Manh Nguyen
Abstract:
In order to simulate and reproduce the operational characteristics of a dam visually, it is necessary to capture the displacement at different measurement points and analyze the observed movement data promptly to forecast dam safety. The accuracy of the forecasts is further improved by applying machine learning methods to the data analysis process. In this study, horizontal displacement monitoring data from the Ialy hydroelectric dam (Vietnam) were analyzed with three machine learning algorithms for modelling and forecasting horizontal displacement: Gaussian processes (GP), multi-layer perceptron (MLP) neural networks, and the M5-Rules algorithm. The database used in this research was built by collecting time series data from 2006 to 2021 and was divided into two parts: a training dataset and a validation dataset. The final results show that all three algorithms perform well in both training and model validation, with the MLP being the best model. Their usability is further investigated by comparison with a benchmark model created by multi-linear regression. The results show that the performance obtained from the GP, MLP, and M5-Rules models is much better; therefore, these three models should be used to analyze and predict the horizontal displacement of the dam.
Keywords: Gaussian processes, horizontal displacement, hydropower dam, Ialy dam, M5-Rules, multi-layer perception neural networks
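The multi-linear regression benchmark can be sketched in its simplest single-predictor form, displacement as a linear function of time; the monitoring values below are invented for illustration and are not the Ialy dam data.

```python
def linear_benchmark(times, displacements):
    """Ordinary least squares for d = a + b * t, the single-predictor
    case of the multi-linear regression benchmark."""
    n = len(times)
    mean_t = sum(times) / n
    mean_d = sum(displacements) / n
    b = (sum((t - mean_t) * (d - mean_d)
             for t, d in zip(times, displacements))
         / sum((t - mean_t) ** 2 for t in times))
    a = mean_d - b * mean_t
    return a, b

# Invented monitoring series: displacement (mm) at yearly epochs.
a, b = linear_benchmark([0, 1, 2, 3], [1.0, 3.0, 5.0, 7.0])
```

A benchmark this simple cannot capture nonlinear load and temperature effects, which is exactly why the GP, MLP, and M5-Rules models outperform it in the study.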
Procedia PDF Downloads 214
23307 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process, and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools for analyzing such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research that leverages existing high-performance computing resources and analysis techniques currently available or under development. It builds these into The Ark, an open-source web-based system designed to manage medical data.
SPARK provides a next-generation biomedical data management solution based upon a novel micro-service architecture and Big Data technologies. The system serves to demonstrate the applicability of micro-service architectures to the development of high-performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (e.g., importing a GWAS dataset) and for the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases of the kind driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets and enabling cutting-edge analysis approaches that have previously been out of reach for many medical researchers.
Keywords: biomedical research, genomics, information systems, software
Procedia PDF Downloads 272
23306 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, as well as other information essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper introduces two contributions. First, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by clinicalBERT or Bio_Discharge_Summary_BERT. The model, which was based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, precision 0.96). Second, we perform data augmentation using clinical contextual word embeddings, based on a pre-trained clinical model, to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
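The augmentation itself cannot be reproduced without the pre-trained clinical model, but the class-balancing logic around it can be sketched: oversample the minority class, passing each sampled note through a substitution function that stands in for the contextual-embedding word replacement. Everything here is a hedged sketch; `augment` is a stub, not the paper's method, and the sample notes are invented.

```python
import random

def balance_by_augmentation(samples, augment, seed=0):
    """Oversample the minority class until all classes are equal in size.

    samples: list of (text, label); `augment` is any text -> text function
    standing in for contextual-embedding word substitution."""
    rng = random.Random(seed)
    by_label = {}
    for text, label in samples:
        by_label.setdefault(label, []).append(text)
    target = max(len(texts) for texts in by_label.values())
    out = list(samples)
    for label, texts in by_label.items():
        while sum(1 for _, l in out if l == label) < target:
            out.append((augment(rng.choice(texts)), label))
    return out

# Imbalanced toy corpus: 3 "survived" vs. 1 "deceased" note.
notes = [("patient stable", "survived"), ("improving", "survived"),
         ("extubated", "survived"), ("deteriorating", "deceased")]
balanced = balance_by_augmentation(notes, lambda t: t + " [augmented]")
```

In the paper's setup, the stub would be replaced by a masked-language-model substitution so that each synthetic note is a fluent paraphrase rather than a verbatim copy.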
Procedia PDF Downloads 210
23305 The Impact of AI on Higher Education
Authors: Georges Bou Ghantous
Abstract:
This literature review examines the transformative impact of Artificial Intelligence (AI) on higher education, highlighting both the potential benefits and challenges associated with its adoption. The review reveals that AI significantly enhances personalized learning by tailoring educational experiences to individual student needs, thereby boosting engagement and learning outcomes. Automated grading systems streamline assessment processes, allowing educators to focus on improving instructional quality and student interaction. AI's data-driven insights provide valuable analytics, helping educators identify trends in at-risk students and refine teaching strategies. Moreover, AI promotes enhanced instructional innovation through the adoption of advanced teaching methods and technologies, enriching the educational environment. Administrative efficiency is also improved as AI automates routine tasks, freeing up time for educators to engage in research and curriculum development. However, the review also addresses the challenges that accompany AI integration, such as data privacy concerns, algorithmic bias, dependency on technology, reduced human interaction, and ethical dilemmas. This balanced exploration underscores the need for careful consideration of both the advantages and potential hurdles in the implementation of AI in higher education.
Keywords: administrative efficiency, data-driven insights, data privacy, ethical dilemmas, higher education, personalized learning
Procedia PDF Downloads 28
23304 Formulating a Definition of Hate Speech: From Divergence to Convergence
Authors: Avitus A. Agbor
Abstract:
Numerous incidents, ranging from the trivial to the catastrophic, come to mind when one reflects on hate. The victims of these incidents belong to specific identifiable groups within communities. These experiences evoke discussions on Islamophobia, xenophobia, homophobia, anti-Semitism, racism, ethnic hatred, atheism, and other brutal forms of bigotry. Common to all of these is an invisible but potent force that drives them: hatred. Such hatred is usually fueled by a profound degree of intolerance (of diversity) and the zeal to impose on others beliefs and practices that the perpetrators consider to be the conventional norm. More importantly, the perpetuation of these hateful acts is the unfortunate outcome of an overplay of invectives and hate speech which, to a great extent, cannot be divorced from hate. From a legal perspective, acknowledging the existence of an undeniable link between hate speech and hate is quite easy. However, both within and without legal scholarship, the notion of “hate speech” remains a conundrum: a phrase more easily explained through experience than by propounding a watertight definition that captures its entire essence and nature. The problem is further compounded by a few factors. First, within the international human rights framework, the notion of hate speech is not used. In limiting the right to freedom of expression, the ICCPR simply excludes specific kinds of speech (but does not refer to them as hate speech). Regional human rights instruments are not so different, except for the subsequent developments in the European Union, where the notion has been carefully delineated and a much clearer picture of what constitutes hate speech is now provided. The legal architecture in domestic legal systems clearly shows differences in approach and regulation, making matters more difficult. In short, what may be hate speech in one legal system may very well be acceptable speech in another.
Lastly, the cornucopia of academic voices on the issue of hate speech reflects this divergence. Yet, in the absence of a well-formulated and universally acceptable definition, it is important to consider how hate speech can be defined. Taking an evidence-based approach, this research looks into the issue of defining hate speech in legal scholarship, and into how and why such a formulation is of critical importance in the prohibition and prosecution of hate speech.
Keywords: hate speech, international human rights law, international criminal law, freedom of expression
Procedia PDF Downloads 78