Search results for: adaptive soc estimation
1112 Clarifying the Possible Symptomatic Pathway of Comorbid Depression, Anxiety, and Stress Among Adolescents Exposed to Childhood Trauma: Insight from the Network Approach
Authors: Xinyuan Zou, Qihui Tang, Shujian Wang, Yulin Huang, Jie Gui, Xiangping Liu, Gang Liu, Yanqiang Tao
Abstract:
Childhood trauma can have a long-lasting influence on individuals and contribute to mental disorders, including depression and anxiety. The current study aimed to explore the symptomatic and developmental patterns of depression, anxiety, and stress among adolescents who have suffered from childhood trauma. A total of 3,598 college students (female = 1,617 (44.94%), mean age = 19.68 years, SD = 1.35) in China completed the Childhood Trauma Questionnaire (CTQ) and the Depression, Anxiety, and Stress Scales (DASS-21), and 2,337 participants met the selection standard based on the cut-off scores of the CTQ. The symptomatic network and directed acyclic graph (DAG) network approaches were used. The results revealed that males reported experiencing significantly more physical abuse, physical neglect, emotional neglect, and sexual abuse than females. However, females scored significantly higher than males on all items of the DASS-21 except for “Worthless”. No significant difference between the two genders was observed in network structure or global strength. Meanwhile, among all participants, “Down-hearted” and “Agitated” appeared to be the most interconnected symptoms, the bridge symptoms in the symptom network, as well as the most vital symptoms in the DAG network. Apart from that, “No-relax” also served as the most prominent symptom in the DAG network. The results suggested that interventions aimed at helping adolescents develop more adaptive strategies for coping with stress and regulating emotion could help alleviate comorbid depression, anxiety, and stress.
Keywords: symptom network, childhood trauma, depression, anxiety, stress
Procedia PDF Downloads 62
1111 Generalized Extreme Value Regression with Binary Dependent Variable: An Application for Predicting Meteorological Drought Probabilities
Authors: Retius Chifurira
Abstract:
The logistic regression model is the most widely used regression model for predicting meteorological drought probabilities. When the dependent variable is extreme, the logistic model fails to adequately capture drought probabilities. In order to adequately predict drought probabilities, we use the generalized linear model (GLM) with the quantile function of the generalized extreme value distribution (GEVD) as the link function. Maximum likelihood estimation is used to estimate the parameters of the generalized extreme value (GEV) regression model. We compare the performance of the logistic and the GEV regression models in predicting drought probabilities for Zimbabwe. The performance of the regression models is assessed using goodness-of-fit measures, namely the relative root mean square error (RRMSE) and the relative mean absolute error (RMAE). Results show that the GEV regression model performs better than the logistic model, thereby providing a good alternative candidate for predicting drought probabilities. This paper provides the first application of a GLM derived from extreme value theory to predict drought probabilities for a drought-prone country such as Zimbabwe.
Keywords: generalized extreme value distribution, general linear model, mean annual rainfall, meteorological drought probabilities
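To illustrate the kind of model this abstract describes, the following minimal sketch fits a binary drought indicator with a GEV-quantile link by maximizing the likelihood with SciPy. The data, the coefficients, and the fixed shape parameter xi = -0.1 are hypothetical assumptions for the example; this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def gev_link_prob(eta, xi=-0.1):
    # GEV-quantile link: P(Y=1 | eta) = 1 - GEV_cdf(-eta); xi -> 0 recovers the cloglog link.
    if abs(xi) < 1e-8:
        return 1.0 - np.exp(-np.exp(eta))
    arg = np.maximum(1.0 - xi * eta, 1e-10)
    return 1.0 - np.exp(-arg ** (-1.0 / xi))

def neg_loglik(beta, X, y, xi):
    p = np.clip(gev_link_prob(X @ beta, xi), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# toy data: binary drought indicator against standardized mean annual rainfall (hypothetical)
rng = np.random.default_rng(0)
rain = rng.normal(size=200)
X = np.column_stack([np.ones(200), rain])
y = (rng.random(200) < gev_link_prob(X @ np.array([-1.0, -1.5]))).astype(float)

fit = minimize(neg_loglik, x0=np.zeros(2), args=(X, y, -0.1), method="BFGS")
print("estimated coefficients:", fit.x)
```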
Procedia PDF Downloads 201
1110 Hybridized Approach for Distance Estimation Using K-Means Clustering
Authors: Ritu Vashistha, Jitender Kumar
Abstract:
Clustering using the K-means algorithm is a very common way to understand and analyze output data. Grouping similar objects together is the basis of clustering. The data contain a number of objects that are partitioned into clusters, where the number of clusters is always less than the number of objects and each cluster has its own centroid; the major problem is how to identify whether a cluster is correct based on the data. Forming the clusters is not a one-step task for every tuple, row, record, or entity; it is done by an iterative process in which each record, tuple, or entity is checked and its similarity and dissimilarity are examined. This iterative process is therefore very lengthy, and it may fail to deliver an optimal set of clusters within an acceptable time. To overcome this drawback, we propose a formula to find the clusters at run time, so that the approach can give optimal results. The proposed approach uses the Euclidean distance formula to find the minimum distance between slots (which we technically call clusters), and the same approach has also been applied to the Ant Colony Optimization (ACO) algorithm, which results in the production of two- and multi-dimensional matrices.
Keywords: ant colony optimization, data clustering, centroids, data mining, k-means
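As context for the discussion above, here is a minimal baseline sketch of K-means with Euclidean distances to the assigned centroids, using scikit-learn on synthetic two-dimensional data; it is not the run-time formula proposed in the abstract.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# two synthetic groups of points in the plane
points = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])

km = KMeans(n_clusters=2, n_init=10, random_state=1).fit(points)
# Euclidean distance from each point to its assigned centroid
d = np.linalg.norm(points - km.cluster_centers_[km.labels_], axis=1)
print("mean within-cluster distance:", round(d.mean(), 3))
```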
Procedia PDF Downloads 128
1109 Estimation of Population Mean Using Characteristics of Poisson Distribution: An Application to Earthquake Data
Authors: Prayas Sharma
Abstract:
This paper proposes a generalized class of estimators, an exponential class of estimators based on the adaptation of Sharma and Singh (2015) and Solanki and Singh (2013), and a simple difference estimator for estimating the unknown population mean in the case of a Poisson-distributed population under simple random sampling without replacement. The expressions for the mean square errors of the proposed classes of estimators are derived to the first order of approximation. It is shown that the adapted version of Solanki and Singh (2013), the exponential class of estimators, is always more efficient than the usual estimator and the ratio, product, exponential ratio, and exponential product type estimators, and is equally efficient to the simple difference estimator. Moreover, the adapted version of Sharma and Singh's (2015) estimator is always more efficient than all the estimators available in the literature. In addition, the theoretical findings are supported by an empirical study showing the superiority of the constructed estimators over the others, with an application to earthquake data from Turkey.
Keywords: auxiliary attribute, point bi-serial, mean square error, simple random sampling, Poisson distribution
Procedia PDF Downloads 156
1108 Vulnerability Risk Assessment of Non-Engineered Houses Based on Damage Data of the 2009 Padang Earthquake in Padang City, Indonesia
Authors: Rusnardi Rahmat Putra, Junji Kiyono, Aiko Furukawa
Abstract:
Several powerful earthquakes have struck Padang during recent years, one of the largest of which was an M 7.6 event that occurred on September 30, 2009 and caused more than 1,000 casualties. Following the event, we conducted a 12-site microtremor array investigation to gain a representative determination of the soil condition of subsurface structures in Padang. From the dispersion curves of the array observations, the central business district of Padang corresponds to a relatively soft soil condition with Vs30 less than 400 m/s. Because only one accelerometer record existed, we simulated the 2009 Padang earthquake to obtain the peak ground acceleration for all sites in Padang city. By considering the damage data of the 2009 Padang earthquake, we produced seismic vulnerability estimates of non-engineered houses for rock, medium, and soft soil conditions. We estimated the loss ratio based on the ground response, the seismic hazard of Padang, and the existing damage data for non-engineered houses from the 2009 Padang earthquake, for several return periods of earthquake events.
Keywords: profile, Padang earthquake, microtremor array, seismic vulnerability
Procedia PDF Downloads 410
1107 Modeling of Coupled Mechanical State and Diffusion in Composites with Impermeable Fibers
Authors: D. Gueribiz, F. Jacquemin, S. Fréour
Abstract:
During their service life, composite materials are subjected to humid environments. The moisture absorbed by their polymer matrix induces internal stresses which can lead to multi-scale damage and may reduce the lifetime of composite structures. The estimation of internal stresses relies, first of all, on a realistic evaluation of the diffusive behavior of composite materials. Generally, the modeling and simulation of the diffusive behavior of composite materials are extensively investigated through decoupled models based on the assumption of Fickian behavior. In these approaches, the concentration and the deformation (or stresses), the two state variables of the problem, are governed by independent equations which are solved separately. In this study, a model coupling the diffusive behavior with the stress state is proposed for a polymer matrix composite reinforced with impermeable fibers; the investigation of the diffusive behavior is based on a more general thermodynamic approach which introduces a dependence of the diffusive behavior on the internal stress state. The coupled diffusive behavior model was established first for a homogeneous and isotropic matrix and is thereafter extended to impermeable unidirectional composites.
Keywords: composites materials, moisture diffusion, effective moisture diffusivity, coupled moisture diffusion
Procedia PDF Downloads 309
1106 Statistical Analysis of Rainfall Change over the Blue Nile Basin
Authors: Hany Mustafa, Mahmoud Roushdi, Khaled Kheireldin
Abstract:
Rainfall variability is an important feature of semi-arid climates. Climate change is very likely to increase the frequency, magnitude, and variability of extreme weather events such as droughts, floods, and storms. The Blue Nile Basin is facing extreme climate change-related events such as floods and droughts, and possible impacts on ecosystems, livelihoods, agriculture, livestock, and biodiversity are expected. Rainfall variability is a threat to food production in the Blue Nile Basin countries. This study investigates the long-term variations and trends of seasonal and annual precipitation over the Blue Nile Basin for a 102-year period (1901-2002). Six statistical trend analyses of precipitation were performed with the nonparametric Mann-Kendall test and Sen's slope estimator. On the other hand, four statistical absolute homogeneity tests, the Standard Normal Homogeneity Test, the Buishand Range test, the Pettitt test, and the Von Neumann ratio test, were applied to test the homogeneity of the rainfall data using the XLSTAT software; results with p-values less than alpha = 0.05 were considered significant. The percentages of significant trends obtained for each parameter in the different seasons are presented. The study recommends that adaptation strategies be streamlined into relevant policies to enhance local farmers' adaptive capacity for facing future climate change effects.
Keywords: Blue Nile basin, climate change, Mann-Kendall test, trend analysis
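As a concrete illustration of the trend tests named above, the sketch below computes the Mann-Kendall S statistic (with the no-ties variance) and Sen's slope via SciPy's Theil-Sen estimator on a synthetic 1901-2002 rainfall series; the series and the trend magnitude are invented for the example.

```python
import numpy as np
from scipy.stats import norm, theilslopes

def mann_kendall(x):
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0            # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))                 # S and two-sided p-value

rng = np.random.default_rng(0)
years = np.arange(1901, 2003)
rainfall = 800 + 0.5 * (years - 1901) + rng.normal(0, 60, len(years))  # synthetic annual series

s, p = mann_kendall(rainfall)
slope, intercept, lo, hi = theilslopes(rainfall, years)   # Sen's (Theil-Sen) slope estimate
print(f"Mann-Kendall S={s}, p={p:.3f}, Sen's slope={slope:.2f} mm/yr")
```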
Procedia PDF Downloads 551
1105 Fuzzy Multi-Objective Approach for Emergency Location Transportation Problem
Authors: Bidzina Matsaberidze, Anna Sikharulidze, Gia Sirbiladze, Bezhan Ghvaberidze
Abstract:
In the modern world, emergency management decision support systems are actively used by state organizations which are concerned with extreme and abnormal processes and which provide optimal and safe management of the supplies needed by civil and military facilities in geographical areas affected by disasters, earthquakes, fires and other accidents, weapons of mass destruction, terrorist attacks, etc. Obviously, these kinds of extreme events cause significant losses and damage to infrastructure. In such cases, the use of intelligent support technologies is very important for quick and optimal location-transportation of emergency services in order to avoid new losses caused by these events. Timely servicing from emergency service centers to the affected disaster regions (response phase) is a key task of the emergency management system. Scientific research in this field occupies an important place among decision-making problems. Our goal was to create an expert knowledge-based intelligent support system which serves as an assistant tool providing optimal solutions for the above-mentioned problem. The inputs to the mathematical model of the system are objective data as well as expert evaluations. The outputs of the system are solutions of the Fuzzy Multi-Objective Emergency Location-Transportation Problem (FMOELTP) for disaster regions. The development and testing of the intelligent support system were done on the example of an experimental disaster region (for a geographical zone of Georgia) which was generated using simulation modeling. Four objectives are considered in our model. The first objective is to minimize the expectation of the total transportation duration of needed products. The second objective is to minimize the total selection unreliability index of the opened humanitarian aid distribution centers (HADCs). The third objective minimizes the number of agents needed to operate the opened HADCs. The fourth objective minimizes the non-covered demand over all demand points. Possibility chance constraints and objective constraints were constructed based on objective-subjective data. The FMOELTP was constructed in a static and fuzzy environment, since the decisions to be made are taken immediately after the disaster (within a few hours) with the information available at that moment. It is assumed that the requests for products are estimated by homeland security organizations, or their experts, based upon their experience and their evaluation of the disaster's seriousness. Estimated transportation times take into account the routing access difficulty of the region and the infrastructure conditions. We propose an epsilon-constraint method for finding exact solutions of the problem, as sketched below. It is proved that this approach generates the exact Pareto front of the multi-objective location-transportation problem addressed. For large problem dimensions, the exact method sometimes requires long computing times; thus, we propose an approximate method that imposes a number of stopping criteria on the exact method. For large dimensions of the FMOELTP, an approach based on the Estimation of Distribution Algorithm (EDA) is also developed.
Keywords: epsilon-constraint method, estimation of distribution algorithm, fuzzy multi-objective combinatorial programming problem, fuzzy multi-objective emergency location/transportation problem
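To make the epsilon-constraint idea concrete, here is a minimal sketch on an invented bi-objective linear program: one objective is minimized while the other is turned into a constraint f2 <= epsilon, and sweeping epsilon traces out Pareto-optimal points. The objectives and feasible set are purely illustrative and are not the FMOELTP formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Two toy linear objectives over a shared feasible set {x1 + x2 >= 4, x >= 0}.
c1 = np.array([1.0, 2.0])          # e.g. expected transportation duration (illustrative)
c2 = np.array([3.0, 1.0])          # e.g. selection unreliability index (illustrative)
A_ub = np.array([[-1.0, -1.0]])    # -x1 - x2 <= -4  (i.e. x1 + x2 >= 4)
b_ub = np.array([-4.0])

pareto = []
for eps in np.linspace(4.0, 12.0, 9):
    # minimize f1 subject to the epsilon-constraint f2 <= eps
    res = linprog(c1,
                  A_ub=np.vstack([A_ub, c2]),
                  b_ub=np.append(b_ub, eps),
                  bounds=[(0, None), (0, None)],
                  method="highs")
    if res.success:
        pareto.append((round(c1 @ res.x, 2), round(c2 @ res.x, 2)))
print("Pareto points (f1, f2):", pareto)
```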
Procedia PDF Downloads 321
1104 Fast Prediction Unit Partition Decision and Accelerating the Algorithm Using CUDA for Intra and Inter Prediction of HEVC
Authors: Qiang Zhang, Chun Yuan
Abstract:
Since the PU (Prediction Unit) decision process is the most time-consuming part of the emerging HEVC (High Efficiency Video Coding) standard in intra- and inter-frame coding, this paper proposes a fast PU decision algorithm and speeds it up using CUDA (Compute Unified Device Architecture). In intra-frame coding, the fast PU decision algorithm uses texture features to skip the intra-frame prediction or to terminate the intra-frame prediction for smaller PU sizes. In inter-frame coding of HEVC, the fast PU decision algorithm makes use of the similarity of the motion vectors of its own two Nx2N size PUs and the hierarchical structure of the CU (Coding Unit) partition to skip some modes of the PU partition, so as to reduce the number of motion estimations. The accelerated algorithm using CUDA is based on the fast PU decision algorithm and uses the GPU so that the motion search and the gradient computation can be computed in parallel. The proposed algorithm achieves up to 57% time saving compared to HM 10.0 with little rate-distortion loss (0.043 dB drop and 1.82% bitrate increase on average).
Keywords: HEVC, PU decision, inter prediction, intra prediction, CUDA, parallel
Procedia PDF Downloads 399
1103 Developing Proof Demonstration Skills in Teaching Mathematics in the Secondary School
Authors: M. Rodionov, Z. Dedovets
Abstract:
The article describes a theoretical concept for teaching secondary school students proof demonstration skills in mathematics. It describes in detail different levels of mastery of the concept of proof, which correspond to Piaget's idea of there being three distinct and progressively more complex stages in the development of human reflection. Lessons for each level contain a specific combination of visual-figurative components and deductive reasoning. It is vital at the transition point between levels to carefully and rigorously recalibrate teaching to reflect the development of more complex reflective understanding. This can apply even within the same age range, since students will develop at different speeds and to different potentials. The authors argue that this requires an aware and adaptive approach to lessons to reflect this complexity and variation. The authors also contend that effective teaching which enables students to properly understand the implementation of proof arguments must develop specific competences. These are: understanding the importance of completeness and generality in making a valid argument; being task focused; having an internalised locus of control; and being flexible in approach and evaluation. These criteria must be correlated with the systematic application of the corresponding methodologies which are most likely to achieve success. The particular pedagogical decisions which are made to deliver this objective are illustrated by concrete examples from existing secondary school mathematics courses. The proposed theoretical concept formed the basis of the development of methodological materials which have been tested in 47 secondary schools.
Keywords: education, teaching of mathematics, proof, deductive reasoning, secondary school
Procedia PDF Downloads 242
1102 Climate Adaptability of Vernacular Courtyards in Jiangnan Area, Southeast China
Authors: Yu Bingqing
Abstract:
Based on the meteorological observation data of conventional meteorological stations in the Jiangnan area from 2001 to 2020 and a digital elevation model (DEM), the "golden section" comfort index calculation method was used to refine the spatial estimation of climate comfort in the Jiangnan area under undulating terrain on the GIS platform, and its spatiotemporal distribution characteristics in the region were analyzed. The results can provide a reference for the development and utilization of climate resources in the Jiangnan area. The results show that: ① there is a significant spatial difference between winter and summer climate comfort from low latitude to high latitude; ② there is a significant trend of decreasing climate comfort from low altitude to high altitude in winter, but the opposite is true in summer; ③ there is a trend of decreasing climate comfort from offshore to inland in winter, but the difference is not significant in summer; ④ the climate comfort level in the natural lake area is higher in summer than in the surrounding areas, but not in winter; ⑤ in both winter and summer, altitude has the greatest influence on the difference in comfort level.
Keywords: vernacular courtyards, thermal environment, depth-to-height ratio, climate adaptability, Southeast China
Procedia PDF Downloads 59
1101 Estimation of Seismic Drift Demands for Inelastic Shear Frame Structures
Authors: Ali Etemadi, Polat H. Gulkan
Abstract:
The drift spectrum derived through continuous shear-beam and wave propagation theory is known to be a useful tool for measuring the demand of pulse-like near-field ground motions on building structures. In this regard, many old frame buildings with poor or non-ductile column elements pass their elastic limits and exhibit post-yielding hysteresis degradation responses when subjected to such impulsive ground motions. The drift spectrum, which is based on a linear system, cannot predict the drift demands arising from inelasticity in elastic-plastic systems. A simple procedure to estimate the drift demands in shear-type frames which respond beyond the elastic limits is described, and the effect of hysteresis degradation behavior on seismic demands is clarified. Modification factors are then proposed to incorporate the hysteresis degradation effects parametrically. These factors are defined with respect to the linear systems. The method can be applied for rapid assessment of existing poorly detailed, non-ductile buildings.
Keywords: drift spectrum, shear-type frame, stiffness and strength degradation, pinching, smooth hysteretic model, quasi static analysis
Procedia PDF Downloads 526
1100 The Application of AI in Developing Assistive Technologies for Non-Verbal Individuals with Autism
Authors: Ferah Tesfaye Admasu
Abstract:
Autism Spectrum Disorder (ASD) often presents significant communication challenges, particularly for non-verbal individuals who struggle to express their needs and emotions effectively. Assistive technologies (AT) have emerged as vital tools in enhancing communication abilities for this population. Recent advancements in artificial intelligence (AI) hold the potential to revolutionize the design and functionality of these technologies. This study explores the application of AI in developing intelligent, adaptive, and user-centered assistive technologies for non-verbal individuals with autism. Through a review of current AI-driven tools, including speech-generating devices, predictive text systems, and emotion-recognition software, this research investigates how AI can bridge communication gaps, improve engagement, and support independence. Machine learning algorithms, natural language processing (NLP), and facial recognition technologies are examined as core components in creating more personalized and responsive communication aids. The study also discusses the challenges and ethical considerations involved in deploying AI-based AT, such as data privacy and the risk of over-reliance on technology. Findings suggest that integrating AI into assistive technologies can significantly enhance the quality of life for non-verbal individuals with autism, providing them with greater opportunities for social interaction and participation in daily activities. However, continued research and development are needed to ensure these technologies are accessible, affordable, and culturally sensitive.
Keywords: artificial intelligence, autism spectrum disorder, non-verbal communication, assistive technology, machine learning
Procedia PDF Downloads 22
1099 Understanding Mental Constructs of Language and Emotion
Authors: Sakshi Ghai
Abstract:
The word ‘emotion’ has been microscopically studied through psychological, anthropological and biological lenses and has indubitably been one of the most researched concepts, since, in all situations and reactions that constitute human life, emotions form the very niche of our mutual existence. While understanding the social aspects of cognition, one can realize that emotions are deeply interwoven with language and are thereby pivotal in inducing human actions and behavior. Society, or the outward social structure, is the result of the inward psychological structure of our human relationships, for the individual is the result of the total experience, knowledge and conduct of man. The aim of this paper is threefold: first, to establish the relation between mental representations of emotions and their neuropsychological connection with language on a conscious and sub-conscious level; secondly, to describe how innate, basic and higher cognitive emotions affect the constantly changing state of an agent and how this assists in determining the moral compass within all beings. Lastly, in the course of this paper, the concept of the architecture of mind is explored, considering how it has developed an ability to display adaptive emotional states and responses which are in sync with the language of thought. Since every response to the social environment is deeply determined by the very social milieu in which one is situated, language has a fundamental role in constructing emotions and articulating behavior. Being linguistic beings, we tend to associate emotion, feelings and other aspects of inward mental states intrinsically with the language we use. This paper aims to devise a discursive approach to understand how emotions are fabricated, intertwined with mental constructs, and further expressed and communicated through the various units of language.
Keywords: mental representation, emotion, language, psychology
Procedia PDF Downloads 290
1098 Bottom-up Quantification of Mega Inter-Basin Water Transfer Vulnerability to Climate Change
Authors: Enze Zhang
Abstract:
Large numbers of inter-basin water transfer (IBWT) projects have been constructed or proposed all around the world as solutions to water distribution and supply problems. Nowadays, as climate change warms the atmosphere, alters the hydrologic cycle, and perturbs water availability, large-scale IBWTs, which are sensitive to these water-related changes, may carry significant risk. Given this reality, IBWTs have elicited great controversy, and assessments of their vulnerability to climate change are urgently needed worldwide. In this paper, we consider the South-to-North Water Transfer Project (SNWTP) in China as a case study and introduce a bottom-up vulnerability assessment framework. Key hazards and risks related to climate change that threaten future water availability for the SNWTP are first identified. Then a performance indicator is presented to quantify the vulnerability of the IBWT by taking three main elements (i.e., sensitivity, adaptive capacity, and exposure degree) into account. A probabilistic Budyko model is adapted to estimate water availability responses to a wide range of possible future climate conditions in each region of the study area. After quantifying the vulnerability bottom-up based on the estimated water availability, our findings confirm that the SNWTP would greatly alleviate geographical imbalances in water availability under some moderate climate change scenarios, but they raise questions about whether it is a long-term solution, because the donor basin has a high level of vulnerability under extreme climate change.
Keywords: vulnerability, climate change, inter-basin water transfer, bottom-up
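The following minimal sketch illustrates the bottom-up, Budyko-type reasoning described above: annual runoff (water availability) is computed from Fu's form of the Budyko curve over a sample of precipitation and potential-evapotranspiration changes, and the fraction of scenarios falling below a performance threshold is reported. The baseline climate, the parameter w, and the threshold are hypothetical, and this is not the authors' probabilistic model.

```python
import numpy as np

def fu_runoff(P, PET, w=2.6):
    """Annual runoff (water availability) from Fu's form of the Budyko curve."""
    phi = PET / P                                   # aridity index
    evap_ratio = 1.0 + phi - (1.0 + phi ** w) ** (1.0 / w)
    return P * (1.0 - evap_ratio)                   # runoff = P - E

rng = np.random.default_rng(0)
P0, PET0 = 550.0, 1000.0                            # hypothetical baseline climate (mm/yr)
dP = rng.uniform(-0.2, 0.2, 5000)                   # sampled precipitation changes
dPET = rng.uniform(0.0, 0.15, 5000)                 # sampled PET (warming) changes
R = fu_runoff(P0 * (1 + dP), PET0 * (1 + dPET))

threshold = 0.8 * fu_runoff(P0, PET0)               # illustrative performance threshold
print("fraction of scenarios below threshold:", np.mean(R < threshold))
```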
Procedia PDF Downloads 400
1097 An Intelligent Traffic Management System Based on the WiFi and Bluetooth Sensing
Authors: Hamed Hossein Afshari, Shahrzad Jalali, Amir Hossein Ghods, Bijan Raahemi
Abstract:
This paper introduces an automated clustering solution that applies to WiFi/Bluetooth sensing data and is later used for traffic management applications. The paper initially summarizes a number of clustering approaches and thereafter shows their performance for noise removal. In this context, clustering is used to recognize WiFi and Bluetooth MAC addresses that belong to passengers traveling on a public urban transit bus. The main objective is to build an intelligent system that automatically filters out MAC addresses that belong to persons located outside the bus for different routes in the city of Ottawa. The proposed intelligent system alleviates the need for defining restrictive thresholds, which would otherwise reduce the accuracy as well as the range of applicability of the solution across different routes. This paper moreover discusses the performance benefits of the presented clustering approaches in terms of accuracy, time and space complexity, and ease of use. Note that the results of clustering can further be used for origin-destination estimation of individual passengers, prediction of the traffic load, and intelligent management of urban bus schedules.
Keywords: WiFi-Bluetooth sensing, cluster analysis, artificial intelligence, traffic management
Procedia PDF Downloads 242
1096 The Classification of Parkinson Tremor and Essential Tremor Based on Frequency Alteration of Different Activities
Authors: Chusak Thanawattano, Roongroj Bhidayasiri
Abstract:
This paper proposes a novel feature set for classifying Parkinson tremor and essential tremor. Ten ET and ten PD subjects were asked to perform kinetic, postural and resting tests. Empirical mode decomposition (EMD) is used to decompose the collected tremor signal into a set of intrinsic mode functions (IMFs). The IMFs are used for reconstructing representative signals. The feature set is composed of the peak frequencies of the IMFs and of the reconstructed signals. Hypothesizing that the dominant frequency components of subjects with PD and ET change in different directions for different tests, the differences of the peak frequencies of the IMFs and reconstructed signals between pairwise tests (kinetic-resting, kinetic-postural and postural-resting) are considered as potential features. Sets of features are used to train and test classifiers, including the quadratic discriminant classifier (QDC) and the support vector machine (SVM). The best accuracy, the best sensitivity, and the best specificity are 90%, 87.5%, and 92.86%, respectively.
Keywords: tremor, Parkinson, essential tremor, empirical mode decomposition, quadratic discriminant, support vector machine, peak frequency, auto-regressive, spectrum estimation
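A minimal sketch of the feature idea above: peak frequencies are read from periodograms of synthetic tremor recordings, the kinetic-resting frequency shift is used as a single feature, and a linear SVM is cross-validated on it. The EMD/IMF step is omitted, the signals and frequency shifts are invented, and this is not the authors' pipeline.

```python
import numpy as np
from scipy.signal import periodogram
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def peak_frequency(sig, fs=100.0):
    f, p = periodogram(sig, fs=fs)
    return f[np.argmax(p[1:]) + 1]          # dominant frequency, skipping the DC bin

rng = np.random.default_rng(0)
def tremor(f0, n=1000, fs=100.0):
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=n)

# hypothetical subjects: feature = peak-frequency shift between kinetic and resting tests
X, y = [], []
for label, shift in [(0, 0.2), (1, 1.5)]:   # 0: ET-like, 1: PD-like (illustrative only)
    for _ in range(10):
        f_rest = 5.0 + rng.normal(0, 0.2)
        X.append([peak_frequency(tremor(f_rest + shift)) - peak_frequency(tremor(f_rest))])
        y.append(label)

print("cross-validated accuracy:",
      cross_val_score(SVC(kernel="linear"), np.array(X), np.array(y), cv=5).mean())
```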
Procedia PDF Downloads 443
1095 The Resource Curse Hypothesis: Relevance to the Nigerian Economy
Authors: Modupeoluwa Solawon, Folusho Oluwole
Abstract:
The resource curse hypothesis is a widely discussed topic which suggests that, despite expectations of boosting economic development and improving the well-being of citizens, natural resource wealth in a country can lead to negative outcomes. The study focused on the crude oil price, crude oil production, the pump price of petrol, agricultural production, and natural resource rents in Nigeria to determine the possible curse of these resources. The study also looked into the well-being of the citizens by employing gross domestic product per capita. The data used for the study were drawn from the World Bank Data Indicators in 2022, limited to annual data from 1981 to 2022, using the autoregressive distributed lag (ARDL) model as the main estimation technique. The findings of the study revealed that natural resource rents influenced GDP per capita detrimentally, indicating that natural resource rents have not led to better welfare for Nigerians. This effect is likely a result of corruption in the system, which prevents the rents from promoting better welfare in Nigeria. In conclusion, the study recommends reducing the cost of living in Nigeria and making productive use of the revenues generated from its natural resources.
Keywords: ARDL, corruption, natural resources, resource curse hypothesis
Procedia PDF Downloads 0
1094 Effect of Fractional Flow Curves on the Heavy Oil and Light Oil Recoveries in Petroleum Reservoirs
Authors: Abdul Jamil Nazari, Shigeo Honma
Abstract:
This paper evaluates and compares the effects of fractional flow curves on heavy oil and light oil recoveries in a petroleum reservoir. Fingering of the flowing water is one of the serious problems of oil displacement by water, and another problem is the estimation of the amount of recoverable oil in a petroleum reservoir. To address these problems, the fractional flow of heavy oil and light oil is investigated. The fractional flow approach treats the multiphase flow rate as a total mixed fluid and then describes the individual phases as fractions of the total flow. Laboratory experiments were implemented for two different types of oil, heavy oil and light oil, to experimentally obtain relative permeability and fractional flow curves. Application of the light oil fractional flow curve, which exhibits a regular S-shape, to the water flooding method showed that a large amount of the mobile oil in the reservoir is displaced by water injection. In contrast, the fractional flow curve of heavy oil does not display an S-shape because of its high viscosity. Although the advance of the injected waterfront is faster than in light oil reservoirs, a significant amount of mobile oil remains behind the waterfront.
Keywords: fractional flow, relative permeability, oil recovery, water fingering
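To illustrate the contrast discussed above, the sketch below builds water fractional flow curves from the standard relation fw = 1 / (1 + (kro/krw)(mu_w/mu_o)) with Corey-type relative permeability curves, for one light and one heavy oil viscosity. The endpoint values, exponents, and viscosities are illustrative assumptions, not the laboratory measurements reported in the paper.

```python
import numpy as np

def fractional_flow(Sw, mu_o, mu_w=1.0, Swc=0.2, Sor=0.2, krw0=0.3, kro0=0.8, nw=2, no=2):
    """Water fractional flow with Corey-type relative permeabilities (illustrative values)."""
    Se = np.clip((Sw - Swc) / (1 - Swc - Sor), 1e-9, 1 - 1e-9)   # normalized water saturation
    krw = krw0 * Se ** nw
    kro = kro0 * (1 - Se) ** no
    return 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o))

Sw = np.linspace(0.2, 0.8, 200)
fw_light = fractional_flow(Sw, mu_o=5.0)     # light oil: regular S-shaped curve
fw_heavy = fractional_flow(Sw, mu_o=500.0)   # heavy oil: curve rises almost immediately
print("fw at Sw=0.4  light: %.2f  heavy: %.2f" %
      (fractional_flow(0.4, 5.0), fractional_flow(0.4, 500.0)))
```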
Procedia PDF Downloads 303
1093 Comparison of Irradiance Decomposition and Energy Production Methods in a Solar Photovoltaic System
Authors: Tisciane Perpetuo e Oliveira, Dante Inga Narvaez, Marcelo Gradella Villalva
Abstract:
Installations of solar photovoltaic systems have increased considerably in the last decade. Therefore, it has been noticed that the monitoring of meteorological data (solar irradiance, air temperature, wind velocity, etc.) is important for predicting the solar energy production potential of a given geographical area. In this sense, the present work compares two computational tools that are capable of estimating the energy generation of a photovoltaic system through correlation analyses of solar radiation data: the PVsyst software and an algorithm based on the PVlib package implemented in MATLAB. In order to achieve this objective, it was necessary to obtain solar radiation data (measured and from a solarimetric database), analyze the decomposition of the global solar irradiance into direct normal and diffuse horizontal components, and model the devices of a photovoltaic system (solar modules and inverters) for energy production calculations. Simulated results were compared with experimental data in order to evaluate the performance of the studied methods. Errors in the estimation of energy production were less than 30% for the MATLAB algorithm and less than 20% for the PVsyst software.
Keywords: energy production, meteorological data, irradiance decomposition, solar photovoltaic system
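As an illustration of the irradiance decomposition step discussed above, the following sketch splits a few global horizontal irradiance readings into diffuse and direct normal components using the Erbs clearness-index correlation in plain NumPy; the irradiance values and solar geometry are invented, and this is a generic decomposition rather than the PVsyst or PVlib-MATLAB tool chains compared in the paper.

```python
import numpy as np

def erbs_diffuse_fraction(kt):
    """Erbs correlation: diffuse fraction of GHI as a function of the clearness index kt."""
    kt = np.asarray(kt, dtype=float)
    df = np.where(kt <= 0.22,
                  1.0 - 0.09 * kt,
                  0.9511 - 0.1604 * kt + 4.388 * kt**2 - 16.638 * kt**3 + 12.336 * kt**4)
    return np.where(kt > 0.8, 0.165, df)

ghi = np.array([150.0, 450.0, 820.0])       # measured global horizontal irradiance (W/m^2)
zenith = np.radians([70.0, 45.0, 20.0])     # corresponding solar zenith angles
extra = 1367.0                              # extraterrestrial irradiance, held constant here
kt = ghi / (extra * np.cos(zenith))         # clearness index
dhi = erbs_diffuse_fraction(kt) * ghi       # diffuse horizontal irradiance
dni = (ghi - dhi) / np.cos(zenith)          # direct normal irradiance by closure
print("DNI:", np.round(dni, 1), " DHI:", np.round(dhi, 1))
```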
Procedia PDF Downloads 143
1092 Evaluation of Geomechanical and Geometrical Parameters’ Effects on Hydro-Mechanical Estimation of Water Inflow into Underground Excavations
Authors: M. Mazraehli, F. Mehrabani, S. Zare
Abstract:
In general, mechanical and hydraulic processes are not independent of each other in jointed rock masses. Therefore, the study of the hydro-mechanical coupling of geomaterials should be a center of attention in rock mechanics. Rocks naturally contain discontinuities whose presence strongly influences the mechanical and hydraulic characteristics of the medium. Given this effect, experimental investigations on intact rock cannot help to identify jointed rock mass behavior. Hence, numerical methods are used for this purpose. In this paper, water inflow into a tunnel under a significant water table has been estimated using the hydro-mechanical discrete element method (HM-DEM). In addition, the effects of geomechanical and geometrical parameters, including the constitutive model, friction angle, joint spacing, dip of the joint sets, and stress factor, on the estimated inflow rate have been studied. Results demonstrate that the inflow rates are not identical for different constitutive models. Also, the inflow rate decreases with increased joint spacing and stress factor.
Keywords: distinct element method, fluid flow, hydro-mechanical coupling, jointed rock mass, underground excavations
Procedia PDF Downloads 166
1091 Pseudo Modal Operating Deflection Shape Based Estimation Technique of Mode Shape Using Time History Modal Assurance Criterion
Authors: Doyoung Kim, Hyo Seon Park
Abstract:
Studies of system identification (SI) based on structural health monitoring (SHM) have been actively conducted for structural safety. Recently, SI techniques have been rapidly developed within the output-only SI paradigm for estimating modal parameters. Representative output-only SI methods such as Frequency Domain Decomposition (FDD) and Stochastic Subspace Identification (SSI) use algorithms based on orthogonal decomposition, such as singular value decomposition (SVD). However, the SVD leads to a high level of computational complexity when estimating modal parameters. This paper proposes a technique to estimate the mode shape with a lower computational cost. The technique obtains a pseudo modal Operating Deflection Shape (ODS) through a bandpass filter and suggests a time history Modal Assurance Criterion (MAC). Finally, the mode shape can be estimated from the pseudo modal ODS and the time history MAC. Analytical simulations of vibration measurement were performed, and the resulting mode shapes and computation times of a representative SI method and the proposed method were compared.
Keywords: modal assurance criterion, mode shape, operating deflection shape, system identification
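The sketch below illustrates the two ingredients named above on synthetic data: a multi-channel response is bandpass-filtered around an identified frequency to extract a pseudo modal operating deflection shape, and the Modal Assurance Criterion compares it with a reference shape. The three-sensor signal, the 2.5 Hz mode, and the shape vector are invented, and this is not the authors' full time-history-MAC procedure.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def mac(phi1, phi2):
    """Modal Assurance Criterion between two shape vectors."""
    return np.abs(phi1 @ phi2) ** 2 / ((phi1 @ phi1) * (phi2 @ phi2))

fs = 200.0
t = np.arange(0, 20, 1 / fs)
true_shape = np.array([0.33, 0.67, 1.0])                        # hypothetical first-mode shape
acc = np.outer(np.sin(2 * np.pi * 2.5 * t), true_shape)          # 3-sensor response at 2.5 Hz
acc += 0.1 * np.random.default_rng(0).normal(size=acc.shape)     # measurement noise

# bandpass around the identified natural frequency to isolate a pseudo modal ODS
b, a = butter(4, [2.0 / (fs / 2), 3.0 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, acc, axis=0)
ods = filtered[np.argmax(np.abs(filtered[:, -1]))]               # deflection shape at a peak instant
print("MAC with the true shape:", round(mac(ods, true_shape), 3))
```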
Procedia PDF Downloads 410
1090 Artificial Neural Network Based Approach for Estimation of Individual Vehicle Speed under Mixed Traffic Condition
Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh
Abstract:
Developing a speed model is a challenging task, particularly under mixed traffic conditions, where the traffic composition plays a significant role in determining vehicular speed. The present research was conducted to model individual vehicular speed in the context of mixed traffic on an urban arterial. Traffic speed and volume data were collected from three midblock arterial road sections in New Delhi. Using the field data, a volume-based speed prediction model was developed adopting the Artificial Neural Network (ANN) methodology. The model developed in this work is capable of estimating the speed of each individual vehicle category. Validation results show a great deal of agreement between the observed speeds and the values predicted by the model. It has also been observed that the ANN-based model performs better than other existing models in terms of accuracy. Finally, a sensitivity analysis was performed using the model in order to examine the effects of traffic volume and its composition on individual speeds.
Keywords: speed model, artificial neural network, arterial, mixed traffic
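A minimal sketch of a volume-based ANN speed model is given below using scikit-learn's MLPRegressor on synthetic data; the input features (volume and two-wheeler share), the underlying speed relation, and the network size are assumptions for illustration, not the field data or network of the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
volume = rng.uniform(500, 4000, n)            # traffic volume (veh/h), synthetic
share_2w = rng.uniform(0.1, 0.6, n)           # two-wheeler share in the stream, synthetic
speed = 60 - 0.008 * volume - 12 * share_2w + rng.normal(0, 3, n)   # toy car-speed relation

X = np.column_stack([volume, share_2w])
X_tr, X_te, y_tr, y_te = train_test_split(X, speed, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```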
Procedia PDF Downloads 388
1089 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts
Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti
Abstract:
Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be found. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in the industrial manufacturing world. A material model that considers the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-code file. An Adaptive Mesh Refinement (AMR) based on the octree strategy is used in order to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization
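To make the choice of failure criterion concrete, here is a minimal sketch of the plane-stress Tsai-Wu failure index, which distinguishes tensile from compressive strengths, evaluated for a fixed uniaxial stress applied at several angles to the printing direction. The strength values and stress level are invented for illustration and are not the paper's FE-based optimization.

```python
import numpy as np

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index; failure is predicted when the index reaches 1."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * np.sqrt(F11 * F22)            # common default choice for the interaction term
    return F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2 + F66*t12**2 + 2*F12*s1*s2

# illustrative strengths (MPa) for an FFF specimen; tension and compression differ
Xt, Xc, Yt, Yc, S = 40.0, 35.0, 25.0, 30.0, 20.0
for angle in (0, 45, 90):                      # uniaxial stress rotated w.r.t. the printing direction
    a = np.radians(angle)
    s1, s2, t12 = 30*np.cos(a)**2, 30*np.sin(a)**2, 30*np.sin(a)*np.cos(a)
    print(angle, "deg -> Tsai-Wu index", round(tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S), 2))
```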
Procedia PDF Downloads 64
1088 Statistical Inferences for GQARCH-Itô-Jumps Model Based on the Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: one is the continuous-time jump-diffusion used to model high-frequency data, and the other is the discrete-time GQARCH employed to model low-frequency financial data, obtained by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the "GQARCH-Itô-Jumps model." We adopt realized range-based threshold estimation for high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for parametric estimation. The asymptotic theory is mainly established for the proposed estimators in the case of finite activity jumps. Moreover, simulation studies are implemented to check the finite sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approaches can be practically used on financial data.
Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
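As a small illustration of the realized range idea referenced above, the sketch below simulates intraday log-prices without jumps, records each interval's high-low range, and forms the range-based variance estimate with the Parkinson scaling 1/(4 ln 2); the sampling scheme and volatility level are invented, and the thresholding and jump handling from the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_intervals, steps = 390, 50              # e.g. one-minute intervals with 50 ticks each (illustrative)
sigma = 0.2 / np.sqrt(252)                # true daily volatility of the log-price

# simulate intraday log-price increments and record each interval's high-low range
increments = rng.normal(0.0, sigma / np.sqrt(n_intervals * steps), (n_intervals, steps))
paths = np.hstack([np.zeros((n_intervals, 1)), np.cumsum(increments, axis=1)])  # include the open
ranges = paths.max(axis=1) - paths.min(axis=1)

# realized range-based variance: sum of squared ranges scaled by 1/(4 ln 2) (Parkinson scaling);
# discrete sampling within each interval biases the observed range slightly downward
rrv = np.sum(ranges ** 2) / (4.0 * np.log(2.0))
print("realized range volatility: %.4f   true daily volatility: %.4f" % (np.sqrt(rrv), sigma))
```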
Procedia PDF Downloads 158
1087 Absorbed Dose Estimation of 68Ga-EDTMP in Human Organs
Authors: S. Zolghadri, H. Yousefnia, A. R. Jalilian
Abstract:
Bone metastases are observed in a wide range of cancers and lead to intolerable pain. Since early detection can help physicians decide on the type of treatment, various phosphonate-based radiopharmaceuticals such as 68Ga-EDTMP have been developed. In this work, due to the importance of the absorbed dose, the human absorbed dose of this new agent was calculated for the first time based on biodistribution data in wild-type rats. 68Ga was obtained from a 68Ge/68Ga generator with radionuclidic and radiochemical purity higher than 99%. The radiolabeled complex was prepared under optimized conditions. The radiochemical purity of the radiolabeled complex was checked by the instant thin layer chromatography (ITLC) method using Whatman No. 2 paper and saline. The results indicated a radiochemical purity higher than 99%. The radiolabeled complex was injected into the wild-type rats, and its biodistribution was studied up to 120 min. As expected, major accumulation was observed in the bone. The absorbed dose of each human organ was calculated from the rat biodistribution data using the RADAR method. The bone surface and bone marrow, with 0.112 and 0.053 mSv/MBq, respectively, received the highest absorbed doses. According to these results, the radiolabeled complex is a suitable and safe option for PET bone imaging.
Keywords: absorbed dose, EDTMP, ⁶⁸Ga, rats
Procedia PDF Downloads 195
1086 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment
Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai
Abstract:
Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, the quantification of the hazard becomes important in order to assess it. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that the damage to life and property is minimized. Seismic hazard analysis plays an important role in the earthquake-resistant design of structures by providing rational values of the input parameters. In this paper, both mathematical and computational methods adopted by researchers globally in the past five years will be discussed. Mathematical approaches involving the concepts of Poisson's ratio, convex set theory, the empirical Green's function, Bayesian probability estimation applied to seismic hazard, and the FOSM (first-order second-moment) algorithm will be discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study the dynamic soil-structure interaction problem, are also discussed in this paper. GIS-based tools, which are predominantly used in the assessment of seismic hazards, will also be discussed.
Keywords: computational methods, MATLAB, seismic hazard, seismic measurements
Procedia PDF Downloads 341
1085 On Estimating the Low Income Proportion with Several Auxiliary Variables
Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández
Abstract:
Poverty measurement is a very important topic in many studies in the social sciences. One of the most important indicators when measuring poverty is the low income proportion. This indicator gives the proportion of people in a population classified as poor. The indicator is generally unknown, and for this reason it is estimated from survey data obtained through official surveys carried out by many statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may contain several additional variables, also known as auxiliary variables, related to the variable of interest, and if this is the case, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. In this simulation study, we considered real data sets obtained from the 2011 European Union Survey on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
Keywords: inclusion probability, poverty, poverty line, survey sampling
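To illustrate the kind of comparison described above, the following Monte Carlo sketch estimates a low income proportion from repeated simple random samples, contrasting the naive sample proportion with a ratio estimator that uses one binary auxiliary variable whose population proportion is assumed known. The population, poverty line, and auxiliary variable are synthetic, and this is not the EU-SILC-based study or the authors' estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000
income = rng.lognormal(mean=9.5, sigma=0.6, size=N)
aux = income + rng.normal(0, np.std(income) * 0.3, N)    # auxiliary variable correlated with income

z = 0.6 * np.median(income)                              # poverty line at 60% of the median income
poor = (income < z).astype(float)                        # variable of interest (poor indicator)
aux_poor = (aux < 0.6 * np.median(aux)).astype(float)    # auxiliary poverty indicator
PX = aux_poor.mean()                                     # assumed known at the population level

def compare(n=500, reps=2000):
    naive, ratio = [], []
    for _ in range(reps):
        s = rng.choice(N, n, replace=False)              # simple random sample without replacement
        p_y, p_x = poor[s].mean(), aux_poor[s].mean()
        naive.append(p_y)
        ratio.append(p_y * PX / p_x if p_x > 0 else p_y) # ratio estimator of the proportion
    true = poor.mean()
    return np.mean((np.array(naive) - true) ** 2), np.mean((np.array(ratio) - true) ** 2)

print("MSE naive vs ratio estimator:", compare())
```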
Procedia PDF Downloads 458
1084 Chemical Characterization and Antioxidant Capacity of Flour From Two Soya Bean Cultivars (Glycine Max)
Authors: Meziani Samira, Menadi Noreddine, Labga Lahouaria, Chenni Fatima Zohra, Toumi Asma
Abstract:
A comparative study between two varieties of soya bean was carried out in this work. The method consisted of preparing a by-product (flour) from the two soybean varieties, including a Chinese variety imported and marketed in Algeria. The chemical composition in terms of ash, protein, and fat was determined in this study. The minerals, namely potassium and sodium, were measured by flame spectrophotometry. In addition, the estimation of the polyphenol content and the evaluation of the antioxidant activity by the Ferric Reducing Antioxidant Power (FRAP) assay of the methanol extracts of the flours were also carried out. The results revealed that the soy flour from the two cultivars, on average, contained 8% moisture, more than 50% protein, 1.58-1.87 g fat, and 0.28-0.30 g of ash. A slight difference was found for the contents of 489 mg/ml of K+ and 20 mg/ml of Na+. In addition, the phenolic content of the methanolic extracts gave a value of almost 37 mg EAG/g for both cultivars of soy flour. The estimated ferric reducing antioxidant power (FRAP) of the soy flour might be related to its richness in polyphenols, which is similar to that of the Chinese variety. The soya flour varieties tested contained a significant amount of protein and phenolic compounds with good antioxidant properties.
Keywords: soya beans, soya flour, protein, total polyphenols
Procedia PDF Downloads 92
1083 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model
Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis
Abstract:
In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value. No assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is only known to belong to a certain subset of all possible failures. This case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of the parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
Keywords: cause of failure, linear degradation path, reliability function, expectation-maximization algorithm, intensity, masked data
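The sketch below shows the EM idea for masked causes of failure in the simplest possible setting: two competing exponential causes with constant rates, where the E-step spreads a masked unit's cause over both candidates in proportion to the current rate estimates, and the M-step divides the expected cause-specific failure counts by the total time on test. The rates, sample size, and masking probability are invented, and the step-stress, degradation path, and TFR structure of the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
true_rates = np.array([0.02, 0.05])                 # two competing exponential causes (hypothetical)
n = 400
times = rng.exponential(1 / true_rates.sum(), n)    # observed failure times
causes = rng.choice(2, n, p=true_rates / true_rates.sum())
masked = rng.random(n) < 0.3                        # 30% of failures have a masked cause

rates = np.array([0.01, 0.01])                      # initial guess
for _ in range(200):
    # E-step: expected cause indicators; masked units are split in proportion to the current rates
    w = np.zeros((n, 2))
    w[~masked, causes[~masked]] = 1.0
    w[masked] = rates / rates.sum()
    # M-step: expected failures per cause divided by the total time on test
    rates = w.sum(axis=0) / times.sum()

print("estimated rates:", np.round(rates, 4), " true rates:", true_rates)
```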
Procedia PDF Downloads 334