Search results for: random factor
6599 A Mathematical Analysis of a Model in Capillary Formation: The Roles of Endothelial, Pericyte and Macrophages in the Initiation of Angiogenesis
Authors: Serdal Pamuk, Irem Cay
Abstract:
Our model is based on the theory of reinforced random walks coupled with Michaelis-Menten mechanisms, which view endothelial cell receptors as the catalysts for transforming both tumor- and macrophage-derived tumor angiogenesis factor (TAF) into proteolytic enzyme, which in turn degrades the basal lamina. The model consists of two main parts. The first part has seven differential equations (DEs) in one space dimension over the capillary, whereas the second part has the same number of DEs in two space dimensions in the extracellular matrix (ECM). We connect these two parts via boundary conditions that move the cells into the ECM in order to initiate capillary formation. But when does this movement begin? To address this question, we estimate the thresholds that activate the transport equations in the capillary. We do this by using a steady-state analysis of the TAF equation under some assumptions. Once these equations are activated, endothelial, pericyte, and macrophage cells begin to move into the ECM to initiate angiogenesis. We believe that our results play an important role in understanding the mechanisms of cell migration that are crucial for tumor angiogenesis. Furthermore, we estimate the long-time tendency of these three cell types and find that they tend to the transition probability functions as time evolves. We provide numerical solutions which are in good agreement with our theoretical results.
Keywords: angiogenesis, capillary formation, mathematical analysis, steady-state, transition probability function
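For context, a generic sketch of the two ingredients named above, written in illustrative notation that is not necessarily the paper's: a Michaelis-Menten rate for the receptor-catalysed conversion of TAF (concentration c) into proteolytic enzyme, and a one-dimensional reinforced-random-walk equation for a cell density η with transition probability function τ:

\[
\text{conversion rate} \;\propto\; \frac{V_{\max}\, c}{K_m + c},
\qquad
\frac{\partial \eta}{\partial t} \;=\; D\,\frac{\partial}{\partial x}\!\left[\eta\,\frac{\partial}{\partial x}\ln\!\frac{\eta}{\tau(c)}\right].
\]

In equations of this type, the long-time behaviour of η is governed by τ, which is consistent with the tendency toward the transition probability functions reported in the abstract.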
Procedia PDF Downloads 156
6598 Radiation Safety Factor of Education and Research Institution in Republic of Korea
Authors: Yeo Ryeong Jeon, Pyong Kon Cho, Eun Ok Han, Hyon Chul Jang, Yong Min Kim
Abstract:
This study surveyed the recognition of radiation safety among radiation safety managers and workers employed in education and research institutions in the Republic of Korea. At present, South Korea has no radiation safety guideline or manual for education and research institutions. Therefore, we tried to find an educational basis for the development of a radiation safety guideline and manual. To check the level of knowledge, attitude, and behavior regarding radiation safety, we used a questionnaire consisting of 29 questions on knowledge, attitude, and behavior and 4 questions on self-efficacy and expectation, based on the four factors (radiation source, human, organizational, and physical environment) of Haddon's matrix. Responses were collected between May 4 and June 30, 2015. We analyzed the questionnaire using IBM SPSS/WIN 15, a well-known statistical package for the social sciences. The data were analyzed with means, standard deviations, Pearson's correlation, ANOVA (analysis of variance), and regression analysis. 180 copies of the questionnaire were returned from 60 workplaces. The overall mean behavior level was relatively lower than the knowledge and attitude levels. In particular, the organizational environment factor of radiation safety management showed the lowest behavior level. Most of the factors were correlated in Pearson's correlation analysis, especially knowledge of human factors and behavior of human factors (Pearson's correlation coefficient 0.809, P<.01). When the analysis was performed by main radiation source type, institutions using only open radioisotopes (RI) showed the lowest behavior level among all subjects. Finally, knowledge of the radiation source factor (β=0.556, P<.001) and the human factor (β=0.376, P<.001) had the greatest impact on behavior in practice. Radiation safety managers and workers think positively about radiation safety management but are poorly informed about the organizational environment of their institutions. Thus, each institution needs to make efforts to establish a radiation safety culture. Also, pedagogical interventions for improving knowledge of radiation safety are needed for the prevention of safety accidents.
Keywords: radiation safety management, factor analysis, SPSS, Republic of Korea
Procedia PDF Downloads 364
6597 Stock Prediction and Portfolio Optimization Thesis
Authors: Deniz Peksen
Abstract:
This thesis aims to predict the trend movement of the closing price of stocks and to maximize a portfolio by utilizing the predictions. In this context, the study aims to define a stock portfolio strategy from models created using Logistic Regression, Gradient Boosting, and Random Forest. Recently, predicting the trend of stock prices has gained a significant role in making buy and sell decisions and generating returns with investment strategies formed by machine-learning-based decisions. There are plenty of studies in the literature on the prediction of stock prices in capital markets using machine learning methods, but most of them focus on closing prices instead of the direction of the price trend. Our study differs from the literature in terms of target definition. Ours is a classification problem focusing on the market trend over the next 20 trading days. To predict the trend direction, fourteen years of data were used for training, the following three years for validation, and the last three years for testing. Training data are between 2002-06-18 and 2016-12-30, validation data are between 2017-01-02 and 2019-12-31, and testing data are between 2020-01-02 and 2022-03-17. We determine the Hold Stock Portfolio, the Best Stock Portfolio, and the USD-TRY exchange rate as benchmarks which we should outperform. We compared our machine-learning-based portfolio return on the test data with the returns of the Hold Stock Portfolio, the Best Stock Portfolio, and the USD-TRY exchange rate. We assessed model performance with the help of the ROC-AUC score and lift charts. We used Logistic Regression, Gradient Boosting, and Random Forest with a grid search approach to fine-tune hyper-parameters. As a result of the empirical study, the existence of uptrends and downtrends of five stocks could not be predicted by the models. When we used these predictions to define buy and sell decisions in order to generate a model-based portfolio, the model-based portfolio failed on the test dataset. It was found that model-based buy and sell decisions generated a stock portfolio strategy whose returns cannot outperform non-model portfolio strategies on the test dataset. We found that any effort to predict a trend formulated on the stock price is a challenge. We found the same result that the Random Walk Theory claims, namely that stock prices and price changes are unpredictable. Our model iterations failed on the test dataset. Although we built several good models on the validation dataset, we failed on the test dataset. We implemented Random Forest, Gradient Boosting, and Logistic Regression. We discovered that complex models did not provide an advantage or additional performance compared with Logistic Regression. More complexity did not lead to better performance. Using a complex model is not an answer to the stock-related prediction problem. Our approach was to predict the trend instead of the price, which converted our problem into classification. However, this labeling approach does not solve the stock prediction problem, nor does it deny or refute the accuracy of the Random Walk Theory for stock prices.
Keywords: stock prediction, portfolio optimization, data science, machine learning
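For illustration, a minimal sketch (placeholder data and assumed hyper-parameter grids, not the thesis code) of grid-searched trend classifiers with a chronological train/validation/test split and ROC-AUC evaluation, as described above:

```python
# A minimal sketch: chronological split, grid search on the validation block,
# ROC-AUC on the held-out test block. X and y are placeholders for the real
# features and the 20-day trend labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import GridSearchCV, PredefinedSplit
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X, y = rng.normal(size=(5000, 10)), rng.integers(0, 2, size=5000)  # placeholder data

n = len(X)
train = slice(0, int(0.70 * n))              # ~14 years of data in the real setting
valid = slice(int(0.70 * n), int(0.85 * n))  # next ~3 years
test = slice(int(0.85 * n), n)               # last ~3 years

models = {
    "logreg": (LogisticRegression(max_iter=1000), {"C": [0.1, 1, 10]}),
    "gb": (GradientBoostingClassifier(), {"n_estimators": [100, 300]}),
    "rf": (RandomForestClassifier(), {"n_estimators": [200], "max_depth": [5, None]}),
}

# use the validation block (not random CV folds) to pick hyper-parameters
fold = np.full(valid.stop, -1)
fold[train.stop:] = 0
cv = PredefinedSplit(fold)

for name, (est, grid) in models.items():
    search = GridSearchCV(est, grid, cv=cv, scoring="roc_auc").fit(X[:valid.stop], y[:valid.stop])
    auc = roc_auc_score(y[test], search.predict_proba(X[test])[:, 1])
    print(name, search.best_params_, f"test ROC-AUC = {auc:.3f}")
```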
Procedia PDF Downloads 80
6596 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic
Authors: Diogen Babuc
Abstract:
The motivation for this work is the question of how many devote themselves to discovering something in the world of science, where much is discerned and revealed, but at the same time, much is unknown. Methods: The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. In correlation to the given key, the string is divided into several groups of substrings, and each substring has a length of k characters. The next step involves encoding each substring from the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters. However, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm is executed, following the same procedure, until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it works better than the other methods in terms of execution time and storage space.
Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison
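A minimal sketch of the ciphering step as described above (a lower-case Latin alphabet is assumed, and padding leftover characters with 'x' is an assumption that may differ from the authors' treatment of the x character):

```python
import random
import string

ALPHABET = string.ascii_lowercase

def latin_djokovic_encrypt(text, a):
    """Sketch of the described scheme: split into substrings of length k,
    Caesar-shift each one, incrementing the shift and wrapping past b + 1."""
    b = a + 3
    k0 = random.randint(a, b)            # random key between a and b = a + 3, kept constant
    if len(text) % k0:                   # assumed padding with 'x' (see note above)
        text += "x" * (k0 - len(text) % k0)
    groups = [text[i:i + k0] for i in range(0, len(text), k0)]
    shift, out = k0, []
    for g in groups:
        out.append("".join(ALPHABET[(ALPHABET.index(c) + shift) % 26] for c in g))
        shift += 1                       # incremented when moving to the next substring
        if shift > b + 1:                # wraps back to its initial value
            shift = k0
    return "".join(out), k0              # the key is needed again for deciphering

ciphertext, key = latin_djokovic_encrypt("latindjokovic", 3)
print(ciphertext, key)
```

Deciphering follows the same traversal with the shifts applied in reverse.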
Procedia PDF Downloads 103
6595 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014
Authors: Alexiou Dimitra, Fragkaki Maria
Abstract:
The objective of the paper is the study of geographic, economic, and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis, and Factor Analysis have been used. The cross-tabulation tables of data consist of the values of seven variables for the 28 countries for 2014. The data are processed using the CHIC Analysis V 1.1 software package. The results of this program using MFCA and Ascending Hierarchical Classification are given in numerical and graphical form. For comparison, the Factor procedure of the statistical package IBM SPSS 20 has been applied to the same data. The numerical and graphical results, presented with tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relation between the 28 countries and the position of each country in groups or clouds, which are formed according to the values of the corresponding variables.
Keywords: Multiple Factorial Correspondence Analysis, Principal Component Analysis, Factor Analysis, E.U.-28 countries, Statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu Statistics
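For illustration, a minimal sketch (placeholder 28 × 7 data matrix, not the CHIC or SPSS runs) of principal component scores followed by an ascending hierarchical classification of the countries:

```python
# A minimal sketch: PCA scores for a 28-country x 7-variable table, then
# Ward (ascending hierarchical) clustering of the countries into groups.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
X = rng.normal(size=(28, 7))           # placeholder for the Eurostat values

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
clusters = fcluster(linkage(scores, method="ward"), t=4, criterion="maxclust")
print(clusters)                        # group (cloud) membership of each country
```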
Procedia PDF Downloads 511
6594 Accuracy of VCCT for Calculating Stress Intensity Factor in Metal Specimens Subjected to Bending Load
Authors: Sanjin Kršćanski, Josip Brnić
Abstract:
The Virtual Crack Closure Technique (VCCT) is a method for calculating the stress intensity factor (SIF) of a cracked body that is easily implemented on top of basic finite element (FE) codes and can therefore be applied to various component geometries. It is a relatively simple method that does not require any special finite elements and is usually used for calculating stress intensity factors at the crack tip for components made of brittle materials. This paper studies the applicability and accuracy of VCCT applied to standard metal specimens containing a through-thickness crack subjected to an in-plane bending load. Finite element analyses were performed using regular 4-node, regular 8-node, and modified quarter-point 8-node 2D elements. The stress intensity factor was calculated from the FE model results for a given crack length, using data available from the FE analysis and a custom-programmed algorithm based on the virtual crack closure technique. The influence of finite element size on the accuracy of the calculated SIF was also studied. The final part of this paper includes a comparison of the calculated stress intensity factors with results obtained from analytical expressions found in the available literature and in the ASTM standard. Results calculated by this VCCT-based algorithm were found to be in good agreement with results obtained from the aforementioned analytical expressions.
Keywords: VCCT, stress intensity factor, finite element analysis, 2D finite elements, bending
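For context, the textbook one-step VCCT relations for regular 4-node 2D elements are reproduced below in generic notation (quarter-point and 8-node variants use additional node pairs); F_y is the crack-tip nodal force, Δv the opening displacement of the node pair immediately behind the tip, Δa the element length at the crack tip, and t the thickness:

\[
G_I = \frac{F_y\,\Delta v}{2\,\Delta a\,t},
\qquad
K_I = \sqrt{E'\,G_I},
\qquad
E' =
\begin{cases}
E & \text{(plane stress)}\\
E/(1-\nu^{2}) & \text{(plane strain).}
\end{cases}
\]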
Procedia PDF Downloads 305
6593 Evaluation of Osteoprotegrin (OPG) and Tumor Necrosis Factor A (TNF-A) Changes in Synovial Fluid and Serum in Dogs with Osteoarthritis; An Experimental Study
Authors: Behrooz Nikahval, Mohammad Saeed Ahrari-Khafi, Sakineh Behroozpoor, Saeed Nazifi
Abstract:
Osteoarthritis (OA) is a progressive and degenerative condition of the articular cartilage and other joint structures. It is essential to diagnose this condition as early as possible. The present research was performed to measure osteoprotegerin (OPG) and tumor necrosis factor α (TNF-α) in the synovial fluid and blood serum of dogs with a surgically transected cruciate ligament as a model of OA, to evaluate whether measuring these parameters can be used for early diagnosis of OA. In the present study, four mature, clinically healthy dogs were selected to investigate the effect of experimental OA on OPG and TNF-α as a means of early detection of OA. OPG and TNF-α were measured in synovial fluid and blood serum on days 0, 14, 28, 90, and 180 after surgical transection of the cranial cruciate ligament in one stifle joint. Statistical analysis of the results showed that there was a significant increase in TNF-α in both synovial fluid and blood serum. OPG showed a decrease two weeks after OA induction; however, it fluctuated afterward. In conclusion, TNF-α in both synovial fluid and blood serum could be used for early detection of OA; however, further research still needs to be conducted on OPG values in OA detection.
Keywords: osteoarthritis, osteoprotegerin, tumor necrosis factor α, synovial fluid, serum, dog
Procedia PDF Downloads 318
6592 Comparative Study between Angiotensin Converting Enzyme Inhibitors and Angiotensin Receptor Blockers on Ulcerative Colitis Induced Experimentally in Rats
Authors: Azza H. El-Medany, Hanan H. Hagar, Jamila H. El-Medany
Abstract:
Ulcerative colitis (UC) is a chronic inflammatory disease of unknown etiology that primarily affects the colon. Some research papers have mentioned the possibility of using drugs that affect angiotensin II to reduce the complications of ulcerative colitis. The aim of the present study is to evaluate the potential protective and therapeutic effects of captopril and valsartan on ulcerative colitis induced experimentally in rats using acetic acid. The results were assessed by histological examination of colonic tissues and measurement of malondialdehyde (MDA), tumor necrosis factor α (TNF-α), transforming growth factor β1 (TGF-β1), angiotensin converting enzyme (ACE), reduced glutathione (GSH), and platelet activating factor (PAF) levels in colonic tissues. Oral pre-treatment with captopril or valsartan at a dose of 30 mg kg⁻¹ body weight for 2 weeks before induction of colitis (prophylactic groups) and continuously for 2 weeks after induction (therapeutic groups) significantly reduced MDA, TNF-α, PAF, TGF-β1, and ACE levels in colonic tissues as compared to the acetic acid control group. Also, a significant increase in GSH level was observed in colonic tissues. Captopril and valsartan attenuated the macroscopic and microscopic colonic damage induced by acetic acid. These results suggest that either captopril or valsartan may be effective as prophylaxis or treatment for UC through inhibition of ACE and a scavenging effect on oxygen-derived free radicals.
Keywords: captopril, valsartan, angiotensin converting enzyme, reduced glutathione, tumor necrosis factor
Procedia PDF Downloads 269
6591 Descriptive Study of Role Played by Exercise and Diet on Brain Plasticity
Authors: Mridul Sharma, Praveen Saroha
Abstract:
In today's world, everyone has become so busy with their to-do tasks and daily routine that they tend to ignore some of the basic components of life, including exercise and diet. This comparative study analyzes the pathways of the relationship between exercise and brain plasticity and also includes diet as another variable, studying the effects of diet on learning by answering questions such as which diet is known to best support learning and what the recommended quantities are. Further, this study looks into the interrelation between diet and exercise, as well as pathways other than Brain-Derived Neurotrophic Factor (BDNF) through which diet and exercise relate to learning.
Keywords: brain derived neurotrophic factor, brain plasticity, diet, exercise
Procedia PDF Downloads 141
6590 Parameter Estimation for Contact Tracing in Graph-Based Models
Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar
Abstract:
We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited for realistic situations (small contact tracing probability, or small probability of detecting index cases). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.
Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference
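A simplified illustration of the kind of likelihood involved, not the paper's estimator: here the number of detectees per index case is treated as Binomial(degree, p) with a negative-binomial degree distribution, and the parameters (including the tracing probability p) are fitted numerically; the parameterisation and the placeholder data are assumptions.

```python
# A simplified sketch: maximum-likelihood fit of a negative-binomial degree
# distribution (r, q) and a tracing probability p from detectee counts.
import numpy as np
from scipy.stats import nbinom, binom
from scipy.optimize import minimize

def neg_log_lik(theta, counts, d_max=200):
    r, q, p = theta                                   # assumed parameterisation
    d = np.arange(d_max)
    pd = nbinom.pmf(d, r, q)                          # degree distribution of the tree
    lik = np.array([np.sum(pd * binom.pmf(k, d, p)) for k in counts])
    return -np.sum(np.log(lik + 1e-300))

counts = np.random.default_rng(2).poisson(2.0, size=500)   # placeholder tracing data
fit = minimize(neg_log_lik, x0=[2.0, 0.5, 0.3], args=(counts,),
               bounds=[(0.1, 50), (0.01, 0.99), (0.01, 0.99)], method="L-BFGS-B")
print(fit.x)
```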
Procedia PDF Downloads 77
6589 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forest, and Support Vector Machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitudes 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The main reason for choosing this database is the recent increase in the seismicity rate of these states attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. The accuracy of the models in predicting intensity measures, the generalization capability of the models for future data, as well as the usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and, in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available.
Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
Procedia PDF Downloads 122
6588 Role of von Willebrand Factor and ADAMTS13 In The Prediction of Thrombotic Complications In Patients With COVID-19
Authors: Nataliya V. Dolgushina, Elena A. Gorodnova, Olga S. Beznoshenco, Andrey Yu Romanov, Irina V. Menzhinskaya, Lyubov V. Krechetova, Gennady T. Suchich
Abstract:
In patients with COVID-19, generalized hypercoagulability can lead to the development of severe coagulopathy, accompanied by a pronounced inflammatory reaction. This observational prospective study included 39 patients with mild COVID-19 and 102 patients with moderate and severe COVID-19. Patients were then stratified into groups depending on their risk of venous thromboembolism. The vWF-to-ADAMTS-13 concentration and activity ratios were significantly higher in patients at high risk of venous thromboembolism among those with moderate and severe COVID-19.
Keywords: ADAMTS-13, COVID-19, hypercoagulation, thrombosis, von Willebrand factor
Procedia PDF Downloads 89
6587 Identifying Mitigation Plans in Reducing Usability Risk Using Delphi Method
Authors: Jayaletchumi T. Sambantha Moorthy, Suhaimi bin Ibrahim, Mohd Naz’ri Mahrin
Abstract:
Most quality models have defined usability as a significant factor that leads to improving product acceptability, increasing user satisfaction, improving product reliability, and also financially benefiting companies. Usability is also the best factor for balancing the technical and human aspects of a software product, which is an important aspect in defining quality during the software development process. A usability risk can be defined as a potential usability risk factor whereby a chosen action or activity may lead to a possible loss or an undesirable outcome. This could impact the usability of a software product, thereby contributing to negative user experiences and possible software product failure. Hence, it is important to mitigate and reduce usability risks in the software development process itself. By managing the usability risks involved in the software development process, software product failure can be reduced. Therefore, this research uses the Delphi method to identify mitigation plans for reducing potential usability risks. The Delphi method is conducted with seven experts from the fields of risk management and software development.
Keywords: usability, usability risk, risk management, risk mitigation, delphi study
Procedia PDF Downloads 466
6586 Application of Multilayer Perceptron and Markov Chain Analysis Based Hybrid-Approach for Predicting and Monitoring the Pattern of LULC Using Random Forest Classification in Jhelum District, Punjab, Pakistan
Authors: Basit Aftab, Zhichao Wang, Feng Zhongke
Abstract:
Land use and land cover change (LULCC) is a critical environmental issue that has significant effects on biodiversity, ecosystem services, and climate change. This study examines the spatiotemporal dynamics of land use and land cover (LULC) across a three-decade period (1992–2022) in the district area. The goal is to support sustainable land management and urban planning by utilizing the combination of remote sensing, GIS data, and observations from Landsat satellites 5 and 8 to provide precise predictions of the trajectory of urban sprawl. In order to forecast LULCC patterns, this study suggests a hybrid strategy that combines the Random Forest method with Multilayer Perceptron (MLP) and Markov Chain analysis. To predict the dynamics of LULC change for the year 2035, a hybrid technique based on Multilayer Perceptron and Markov Chain Model Analysis (MLP-MCA) was employed. The area of developed land has increased significantly, while the amounts of bare land, vegetation, and forest cover have all decreased, because the principal land types have changed due to population growth and economic expansion. The study also found that between 1998 and 2023, the built-up area increased by 468 km² as a result of the replacement of natural land cover. It is estimated that urbanization in the study area will increase by 25.04% by 2035. The performance of the model was confirmed with an overall accuracy of 90% and a kappa coefficient of around 0.89. It is important to use advanced predictive models to guide sustainable urban development strategies. The study provides valuable insights for policymakers, land managers, and researchers to support sustainable land use planning, conservation efforts, and climate change mitigation strategies.
Keywords: land use land cover, Markov chain model, multi-layer perceptron, random forest, sustainable land, remote sensing
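For illustration, a minimal sketch of the Markov-chain component of such a hybrid (the class list and transition probabilities below are illustrative assumptions, not the study's calibrated values; in MLP-MCA the perceptron typically supplies the spatial transition potentials while the chain supplies the class quantities):

```python
# A minimal sketch: project LULC area shares forward with a row-stochastic
# transition matrix estimated between two reference dates.
import numpy as np

classes = ["built-up", "vegetation", "bare land", "water"]   # assumed classes
P = np.array([
    [0.95, 0.03, 0.02, 0.00],
    [0.10, 0.80, 0.09, 0.01],
    [0.15, 0.10, 0.74, 0.01],
    [0.02, 0.03, 0.05, 0.90],
])
state = np.array([0.20, 0.45, 0.30, 0.05])                   # current area shares

for _ in range(2):        # e.g., two epochs ahead toward the 2035 horizon
    state = state @ P
print(dict(zip(classes, np.round(state, 3))))
```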
Procedia PDF Downloads 33
6585 Development of an Experiment for Impedance Measurement of Structured Sandwich Sheet Metals by Using a Full Factorial Multi-Stage Approach
Authors: Florian Vincent Haase, Adrian Dierl, Anna Henke, Ralf Woll, Ennes Sarradj
Abstract:
Structured sheet metals and structured sandwich sheet metals are three-dimensional, lightweight structures with increased stiffness which are used in the automotive industry. The impedance, a measure of a structure's resistance to vibrations, is determined for plain sheets, structured sheets, and structured sandwich sheets. The aim of this paper is to generate an experimental design in order to minimize the cost and duration of the experiments. The design of experiments is used to reduce the large number of single tests required for determining the correlation between the impedance and its influencing factors. Full and fractional factorials are applied in order to systematize and plan the experiments. Their major advantages are high-quality results from a relatively small number of trials and the ability to determine the most important influencing factors, including their specific interactions. The full factorial experimental design developed for the study of plain sheets includes three factor levels. In contrast to the study of plain sheets, the respective impedance analysis of structured sheets and structured sandwich sheets is split into three phases. The first phase consists of preliminary tests which identify relevant factor levels. These factor levels are subsequently employed in main tests, which have the objective of identifying complex relationships between the parameters and the reference variable. Possible post-tests can follow in case additional study of factor levels or other factors is necessary. By using full and fractional factorial experimental designs, the required number of tests is reduced by half. In the context of this paper, the benefits of applying design of experiments are presented. Furthermore, a multi-stage approach is shown to take into account unrealizable factor combinations and minimize experiments.
Keywords: structured sheet metals, structured sandwich sheet metals, impedance measurement, design of experiment
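For illustration, a minimal sketch (hypothetical factor names and levels, not the experimental plan of this study) of how a three-level full factorial plan of the kind used for the plain sheets can be enumerated:

```python
# A minimal sketch: enumerate a 3-level full factorial design; a fractional
# factorial would keep only a defined subset of these runs.
from itertools import product

factors = {                                  # assumed factor names and levels
    "excitation_frequency": ["low", "mid", "high"],
    "clamping_condition": ["free", "clamped", "mixed"],
    "sheet_thickness": ["t1", "t2", "t3"],
}

runs = list(product(*factors.values()))
print(len(runs), "single tests")             # 3^3 = 27 runs
for run in runs[:3]:
    print(dict(zip(factors, run)))
```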
Procedia PDF Downloads 374
6584 De Broglie Wavelength Defined by the Rest Energy E0 and Its Velocity
Authors: K. Orozović, B. Balon
Abstract:
In this paper, we take a different approach to the de Broglie wavelength, as we relate it to relativistic physics. The quantum energy of the photon radiated by a body with de Broglie wavelength, as it moves with velocity v, can be defined within relativistic physics by the rest energy E₀. In this way, we can show the connection between the quantum of radiation energy of the body and the rest energy E₀ and thus combine what has been incompatible so far, namely relativistic and quantum physics. Accordingly, we discuss the unification of relativistic and quantum physics by introducing a factor k that is analogous to the Lorentz factor in Einstein's theory of relativity.
Keywords: de Broglie wavelength, relativistic physics, rest energy, quantum physics
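For context, a standard-relativity sketch of how the de Broglie wavelength can be written in terms of the rest energy E₀ and the velocity v (the factor k introduced in the paper may be defined differently from the Lorentz factor γ used here):

\[
\lambda = \frac{h}{p} = \frac{h}{\gamma m_0 v},
\qquad
E_0 = m_0 c^{2}
\;\Rightarrow\;
\lambda = \frac{h\,c^{2}}{\gamma\,E_0\,v},
\qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}.
\]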
Procedia PDF Downloads 156
6583 The Sensitivity of Credit Defaults Swaps Premium to Global Risk Factor: Evidence from Emerging Markets
Authors: Oguzhan Cepni, Doruk Kucuksarac, M. Hasan Yilmaz
Abstract:
Changes in the global risk appetite cause co-movement in emerging market risk premiums. However, the sensitivity of changes in risk premium to the global risk appetite may vary across emerging markets. In this study, how the global risk appetite affects Credit Default Swap (CDS) premiums in emerging markets is analyzed using Principal Component Analysis (PCA) and rolling regressions. The PCA results indicate that the first common component derived by the PCA accounts for almost 76 percent of the common variation in CDS premiums. Additionally, the explanatory power of the first factor appears to be high over the sample period. However, the sensitivity to the global risk factor tends to change over time and across countries. In this regard, fixed effects panel regressions are used to identify the macroeconomic factors driving the heterogeneity across emerging markets. The panel regression results point to the significance of the government debt-to-GDP and international reserves-to-GDP ratios in explaining this sensitivity. Accordingly, countries with lower government debt and higher reserves tend to be less subject to variations in the global risk appetite.
Keywords: credit default swaps, emerging markets, principal components analysis, sovereign risk
Procedia PDF Downloads 378
6582 Thermohydraulic Performance Comparison of Artificially Roughened Rectangular Channels
Authors: Narender Singh Thakur, Sunil Chamoli
Abstract:
The use of roughness geometry in a rectangular channel duct is an effective technique to enhance the rate of heat transfer to the working fluid. The present research concentrates on the performance comparison of a rectangular channel with different roughness geometries on the test plate. The performance enhancement is compared by considering the statistical correlations developed by various investigators for the Nusselt number and friction factor. Among all the investigated geometries, the multiple V-shaped rib-roughened rectangular channel was found to be thermohydraulically better than the other investigated geometries under similar flow and operating conditions.
Keywords: Nusselt number, friction factor, thermohydraulic, performance parameter
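For context, the thermohydraulic performance parameter commonly used in the roughened-duct literature for such comparisons at equal pumping power is given below for orientation (the paper's exact definition may differ); the subscripts r and s denote the roughened and smooth channels:

\[
\eta = \frac{Nu_r / Nu_s}{\left( f_r / f_s \right)^{1/3}}.
\]

A value of η above unity indicates that the roughness enhances heat transfer more than it penalises the pressure drop.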
Procedia PDF Downloads 422
6581 Modeling Biomass and Biodiversity across Environmental and Management Gradients in Temperate Grasslands with Deep Learning and Sentinel-1 and -2
Authors: Javier Muro, Anja Linstadter, Florian Manner, Lisa Schwarz, Stephan Wollauer, Paul Magdon, Gohar Ghazaryan, Olena Dubovyk
Abstract:
Monitoring the trade-off between biomass production and biodiversity in grasslands is critical to evaluate the effects of management practices across environmental gradients. New generations of remote sensing sensors and machine learning approaches can model grasslands' characteristics with varying accuracies. However, studies often fail to cover a sufficiently broad range of environmental conditions, and evidence suggests that prediction models might be case-specific. In this study, biomass production and biodiversity indices (species richness and Fisher's α) are modeled in 150 grassland plots at three sites across Germany. These sites represent a North-South gradient and are characterized by distinct soil types, topographic properties, climatic conditions, and management intensities. The predictors used are derived from Sentinel-1 and -2 and a set of topoedaphic variables. The transferability of the models is tested by training and validating at different sites. The performance of feed-forward deep neural networks (DNN) is compared to a random forest algorithm. While biomass predictions across gradients and sites were acceptable (r² = 0.5), predictions of biodiversity indices were poor (r² = 0.14). DNN showed higher generalization capacity than random forest when predicting biomass across gradients and sites (relative root mean squared error of 0.5 for DNN vs. 0.85 for random forest). DNN also achieved high performance when using the Sentinel-2 surface reflectance data rather than different combinations of spectral indices, Sentinel-1 data, or topoedaphic variables, simplifying dimensionality. This study demonstrates the necessity of training biomass and biodiversity models using a broad range of environmental conditions and ensuring spatial independence to obtain realistic and transferable models in which plot-level information can be upscaled to the landscape scale.
Keywords: ecosystem services, grassland management, machine learning, remote sensing
Procedia PDF Downloads 218
6580 Anemia and Nutritional Status as Dominant Factor of the Event Low Birth Weight in Indonesia: A Systematic Review
Authors: Lisnawati Hutagalung
Abstract:
Background: Low birth weight (LBW) is one cause of newborn death. Babies with low birth weight tend to have slower cognitive development and growth retardation and are more at risk of infectious disease and death. Objective: To identify the risk factors and dominant factors that influence the incidence of LBW in Indonesia. Method: This research used several public health databases, such as Google Scholar, UGM journals, UI journals, and UNAND journals, for 2012-2015. Data were filtered using the keywords 'Risk Factors' AND 'Cause LBW', yielding 2,757 studies. The filtering resulted in 5 public health studies that met the criteria. Results: Risk factors associated with LBW include environmental factors (exposure to cigarette smoke and residence), sociodemographic factors (age and socioeconomic status), and maternal factors (anemia, placental abnormalities, maternal nutritional status, antenatal examinations, preeclampsia, parity, and complications in pregnancy). Anemia and nutritional status were the dominant factors affecting LBW. Conclusions: The risk factors that affect LBW are most commonly found among the maternal factors. The dominant factors with a large effect on LBW are anemia and the nutritional status of the mother during pregnancy.
Keywords: low birth weight, anemia, nutritional status, the dominant factor
Procedia PDF Downloads 365
6579 Predicting the Diagnosis of Alzheimer's Disease: Development and Validation of Machine Learning Models
Authors: Jay L. Fu
Abstract:
Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow down its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbors, support vector machine, random forest, and neural network, were developed. Data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the various models were compared to arrive at the best prediction model. Among these models, the random forest model appeared to be the best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important factors contributing to the detection of Alzheimer's. Among all the models used, the percentage of testing inputs for which at least 4 of the 5 models shared the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to early treatment of these patients.
Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction
Procedia PDF Downloads 143
6578 Enhanced Test Scheme based on Programmable Write Time for Future Computer Memories
Authors: Nor Zaidi Haron, Fauziyah Salehuddin, Norsuhaidah Arshad, Sani Irwan Salim
Abstract:
Resistive random access memories (RRAMs) are one of the main candidates for future computer memories. However, due to their tiny size and immature device technology, the quality of outgoing RRAM chips is seen as a serious issue. Defective RRAM cells might behave differently from existing semiconductor memories (Dynamic RAM, Static RAM, and Flash), meaning that they are difficult to detect using existing test schemes. This paper presents an enhanced test scheme, referred to as Programmable Short Write Time (PSWT), that is able to improve the detection of faulty RRAM cells. It is developed by applying multiple weak write operations, each with a different duration. The test circuit embedded in the RRAM chip is made programmable in order to supply different weak write times during testing. The RRAM electrical model is described in the Verilog-AMS language and simulated using HSPICE. Simulation results show that the proposed test scheme offers better open-resistive fault detection compared to existing test schemes.
Keywords: memory fault, memory test, design-for-testability, resistive random access memory
Procedia PDF Downloads 387
6577 The Impact of Social Support on Anxiety and Depression under the Context of COVID-19 Pandemic: A Scoping Review and Meta-Analysis
Authors: Meng Wu, Atif Rahman, Eng Gee Lim, Jeong Jin Yu, Rong Yan
Abstract:
Context: The COVID-19 pandemic has had a profound impact on mental health, with increased rates of anxiety and depression observed. Social support, a critical factor in mental well-being, has also undergone significant changes during the pandemic. This study aims to explore the relationship between social support, anxiety, and depression during COVID-19, taking into account various demographic and contextual factors. Research Aim: The main objective of this study is to conduct a comprehensive systematic review and meta-analysis to examine the impact of social support on anxiety and depression during the COVID-19 pandemic. The study aims to determine the consistency of these relationships across different age groups, occupations, regions, and research paradigms. Methodology: A scoping review and meta-analytic approach were employed in this study. A search was conducted across six databases from 2020 to 2022 to identify relevant studies. The selected studies were then subjected to random effects models, with pooled correlations (r and ρ) estimated. Homogeneity was assessed using Q and I² tests. Subgroup analyses were conducted to explore variations across different demographic and contextual factors. Findings: The meta-analysis of both cross-sectional and longitudinal studies revealed significant correlations between social support, anxiety, and depression during COVID-19. The pooled correlations (ρ) indicated a negative relationship between social support and anxiety (ρ = -0.30, 95% CI = [-0.333, -0.255]) as well as depression (ρ = -0.27, 95% CI = [-0.370, -0.281]). However, further investigation is required to validate these results across different age groups, occupations, and regions. Theoretical Importance: This study emphasizes the multifaceted role of social support in mental health during the COVID-19 pandemic. It highlights the need to reevaluate and expand our understanding of social support's impact on anxiety and depression. The findings contribute to the existing literature by shedding light on the associations and complexities involved in these relationships. Data Collection and Analysis Procedures: The data collection involved an extensive search across six databases to identify relevant studies. The selected studies were then subjected to rigorous analysis using random effects models and subgroup analyses. Pooled correlations were estimated, and homogeneity was assessed using Q and I² tests. Question Addressed: This study aimed to address the question of the impact of social support on anxiety and depression during the COVID-19 pandemic. It sought to determine the consistency of these relationships across different demographic and contextual factors. Conclusion: The findings of this study highlight the significant association between social support, anxiety, and depression during the COVID-19 pandemic. However, further research is needed to validate these findings across different age groups, occupations, and regions. The study emphasizes the need for a comprehensive understanding of social support's multifaceted role in mental health and the importance of considering various contextual and demographic factors in future investigations.
Keywords: social support, anxiety, depression, COVID-19, meta-analysis
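For illustration, a minimal sketch (synthetic correlations and sample sizes, not the review's data) of random-effects pooling of correlations via Fisher's z with DerSimonian-Laird between-study variance, including the Q and I² heterogeneity statistics mentioned above:

```python
# A minimal sketch: DerSimonian-Laird random-effects pooling of correlations.
import numpy as np

r = np.array([-0.25, -0.32, -0.28, -0.35, -0.22])   # placeholder study correlations
n = np.array([210, 480, 150, 320, 95])              # placeholder sample sizes

z = np.arctanh(r)                 # Fisher z transform
v = 1.0 / (n - 3)                 # within-study variances
w = 1.0 / v                       # fixed-effect weights

Q = np.sum(w * (z - np.sum(w * z) / np.sum(w)) ** 2)
df = len(r) - 1
I2 = max(0.0, (Q - df) / Q) * 100
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1.0 / (v + tau2)           # random-effects weights
z_pooled = np.sum(w_re * z) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
lo, hi = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
print(f"pooled rho = {np.tanh(z_pooled):.3f}, 95% CI = [{lo:.3f}, {hi:.3f}], "
      f"Q = {Q:.2f}, I2 = {I2:.1f}%")
```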
Procedia PDF Downloads 62
6576 Conscious Intention-based Processes Impact the Neural Activities Prior to Voluntary Action on Reinforcement Learning Schedules
Authors: Xiaosheng Chen, Jingjing Chen, Phil Reed, Dan Zhang
Abstract:
Conscious intention can be a promising entry point for grasping consciousness and orienting voluntary action. The current study adopted a random ratio (RR) and yoked random interval (RI) reinforcement learning schedule instead of the previous highly repeatable, single-decision-point paradigms, aiming to induce voluntary action with conscious intention that evolves from the interaction between short-range intention and long-range intention. Readiness potential (RP)-like EEG amplitude and inter-trial EEG variability decreased significantly prior to voluntary action compared to cued action, with the effect on inter-trial EEG variability featured mainly during the earlier stage of neural activities. Notably, RP-like EEG amplitudes decreased significantly prior to responses at higher RI reward rates, in which participants formed a higher plane of conscious intention. The present study suggests a possible contribution of conscious intention-based processes to the neural activities from the earlier stage prior to voluntary action on reinforcement learning schedules.
Keywords: reinforcement learning schedule, voluntary action, EEG, conscious intention, readiness potential
Procedia PDF Downloads 78
6575 Stochastic Modeling and Productivity Analysis of a Flexible Manufacturing System
Authors: Mehmet Savsar, Majid Aldaihani
Abstract:
Flexible Manufacturing Systems (FMS) are used to produce a variety of parts on the same equipment. Therefore, their utilization is higher than that of traditional machining systems. Higher utilization, on the other hand, results in more frequent equipment failures and additional need for maintenance. Therefore, it is necessary to carefully analyze the operational characteristics and productivity of FMS, or Flexible Manufacturing Cells (FMC), which are smaller configurations of FMS, before installation or during their operation. Appropriate models should be developed to determine production rates based on operational conditions, including equipment reliability, availability, and repair capacity. In this paper, a stochastic model is developed for an automated FMC system, which consists of two machines served by two robots and a single repairman. The model is used to determine system productivity and equipment utilization under different operational conditions, including random machine failures, random repairs, and limited repair capacity. The results are compared to previous study results for an FMC system with sufficient repair capacity assigned to each machine. The results show that the model will be useful for design engineers and operational managers in analyzing the performance of manufacturing systems at the design or operational stage.
Keywords: flexible manufacturing, FMS, FMC, stochastic modeling, production rate, reliability, availability
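For illustration, a minimal sketch of the kind of stochastic (continuous-time Markov chain) reasoning involved, not the authors' formulation: two identical machines with exponential failures and a single repairman, with steady-state probabilities obtained from the generator matrix; the failure and repair rates are assumed values.

```python
# A minimal sketch: states = number of failed machines (0, 1, 2); lam is the
# failure rate per working machine, mu the repair rate of the single repairman.
import numpy as np

lam, mu = 0.02, 0.5                     # assumed rates per hour
Q = np.array([
    [-2 * lam,       2 * lam,  0.0],    # both machines up
    [      mu, -(mu + lam),    lam],    # one down, one still working
    [     0.0,          mu,    -mu],    # both down, repaired one at a time
])

# steady state: solve pi Q = 0 with probabilities summing to 1
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

capacity_availability = pi[0] + 0.5 * pi[1]   # average fraction of machines working
print(np.round(pi, 4), round(capacity_availability, 4))
```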
Procedia PDF Downloads 516
6574 Analysis of 3 dB Directional Coupler Based On Silicon-On-Insulator (SOI) Large Cross-Section Rib Waveguide
Authors: Nurdiani Zamhari, Abang Annuar Ehsan
Abstract:
The 3 dB directional coupler is designed using a silicon-on-insulator (SOI) large cross-section rib waveguide and simulated by the Beam Propagation Method at the communication wavelengths of 1.55 µm and 1.48 µm. The geometry has a rib height (H) of 6 µm and is varied in step factor (r) over the values 0.5, 0.6, 0.7, and 0.8. The waveguide spacing is fixed at 5 µm and the slab width is symmetrical. In general, the 3 dB coupling lengths for the four different cross-sections are several millimetres long. The 1.48 µm wavelength gives a longer coupling length than 1.55 µm at the same step factor (r). In addition, low-loss propagation is achieved, with less than 2% propagation loss.
Keywords: 3 dB directional couplers, silicon-on-insulator, symmetrical rib waveguide, OptiBPM 9
Procedia PDF Downloads 516
6573 Improving the Quality of Staff Performance with a Talent-Driven Approach: Case Study of SAIPA Automotive Manufacturing Company in Iran
Authors: Abdolmajid Mosleh, Afzal Ghasimi
Abstract:
The purpose of this research is to investigate and identify effective factors that can improve the quality of personal performance in industrial companies. In the present study, it was assumed that the hidden variables of talent management could explain an important part of the variance in improving the quality of employee performance. In terms of its objective, this is applied research. The statistical population of the research is the SAIPA automobile company (N = 10,291); a sample of 380 people was selected from the employees by random sampling based on the Cochran formula. The measurement tool in this research was a 33-item questionnaire with a control questionnaire that covered two talent management components (talent identification and talent exploitation) and improvement of staff performance (enhancement of technical and specialized capabilities, managerial capability, organizational interaction, and communication). Reliability was confirmed through internal consistency using Cronbach's alpha coefficient and the split-half method. In order to determine the validity of the questionnaire structure, confirmatory factor analysis was used. Based on the results of the data analysis, the effect of talent management on improving the quality of staff performance was confirmed. Based on the results of inferential statistics and the structural equations of the proposed model, the model showed a good fit.
Keywords: employee performance, talent management, performance improvement, SAIPA automobile manufacturing company
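For context, a common form of the Cochran sample-size formula with finite-population correction is shown below (the specific z, p, q, and e values used by the authors are not reported in the abstract; with the conventional choices z = 1.96, p = q = 0.5, and e = 0.05, the result for N = 10,291 lies roughly between 370 and 385 depending on whether the correction is applied, i.e., of the order of the 380 respondents reported):

\[
n_0 = \frac{z^{2}\,p\,q}{e^{2}},
\qquad
n = \frac{n_0}{1 + \dfrac{n_0 - 1}{N}}.
\]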
Procedia PDF Downloads 90
6572 Analysis and Design of Offshore Triceratops under Ultra-Deep Waters
Authors: Srinivasan Chandrasekaran, R. Nagavinothini
Abstract:
Offshore platforms for ultra-deep waters are form-dominant by design; hybrid systems with large flexibility in the horizontal plane and high rigidity in the vertical plane are preferred due to functional complexities. The offshore triceratops is a relatively new-generation offshore platform, whose deck is partially isolated from the supporting buoyant legs by ball joints. They allow the transfer of partial displacements of the buoyant legs to the deck but restrain the transfer of rotational response. The buoyant legs are in turn taut-moored to the sea bed using pre-tensioned tethers. The present study discusses a detailed dynamic analysis and preliminary design of the chosen geometry, which is necessary as a proof of validation for such design applications. A detailed numerical analysis of the triceratops at 2400 m water depth under random waves is presented. The preliminary design confirms member-level design requirements under various modes of failure. The tether configuration proposed in the study confirms no pull-out of tethers, as the stress variation is considerably lower than the yield value. The presented study shall aid offshore engineers and contractors in understanding the suitability of the triceratops in terms of design and dynamic response behaviour.
Keywords: offshore structures, triceratops, random waves, buoyant legs, preliminary design, dynamic analysis
Procedia PDF Downloads 204
6571 Design of Reinforced Concrete (RC) Walls Considering Shear Amplification by Nonlinear Dynamic Behavior
Authors: Sunghyun Kim, Hong-Gun Park
Abstract:
In performance-based design (PBD), the actual performance of the structure is evaluated by using nonlinear dynamic analysis (NDA). Unlike in frame structures, in wall structures the base shear force resulting from the NDA is greatly amplified compared with that from elastic analysis. This shear amplification effect causes repeated design iterations, which make it difficult for designers to apply PBD. Therefore, in this paper, the factors which affect shear amplification were studied. NDA was performed for a 20-story wall model, and a base shear amplification factor was proposed from the analysis results.
Keywords: performance based design, shear amplification factor, nonlinear dynamic analysis, RC shear wall
Procedia PDF Downloads 378
6570 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements in order to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment. The methodology is based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is applied to abstract the main cues from the data, and the SelectKBest strategy is used to select the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, are investigated to accomplish precise locomotion classification and to evaluate the performance of the system. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
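For illustration, a minimal sketch (synthetic IMU windows and assumed hand-crafted features, not the paper's benchmark dataset or feature set) of the windowing, feature extraction, SelectKBest selection, and classification pipeline described above:

```python
# A minimal sketch: per-window statistical features, SelectKBest feature
# selection, and a random forest classifier evaluated by cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
windows = rng.normal(size=(600, 128, 6))   # 600 windows, 128 samples, 6 IMU channels
y = rng.integers(0, 4, size=600)           # 4 hypothetical locomotion classes

# simple hand-crafted features per channel: mean, std, min, max
feats = np.concatenate(
    [windows.mean(1), windows.std(1), windows.min(1), windows.max(1)], axis=1
)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=12)),            # keep the 12 best features
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
print(cross_val_score(pipe, feats, y, cv=5).mean())
```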
Procedia PDF Downloads 11