Search results for: random effect approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27644


27494 Design of Microwave Building Block by Using Numerical Search Algorithm

Authors: Haifeng Zhou, Tsungyang Liow, Xiaoguang Tu, Eujin Lim, Chao Li, Junfeng Song, Xianshu Luo, Ying Huang, Lianxi Jia, Lianwee Luo, Qing Fang, Mingbin Yu, Guoqiang Lo

Abstract:

With the development of technology, countries have gradually allocated more and more frequency spectrum for civil and commercial use, especially the high radio frequency bands that offer high information capacity. Field effects become more and more prominent in microwave components as frequency increases, which invalidates transmission line theory and complicates the design of microwave components. Here a modeling approach based on a numerical search algorithm is proposed to design various building blocks for microwave circuits, avoiding complicated impedance matching and equivalent electrical circuit approximation. Concretely, a microwave component is discretized into a set of segments along the microwave propagation path. Each segment is initialized with random dimensions, which constructs a multi-dimensional parameter space. Numerical search algorithms (e.g., the pattern search algorithm) are then used to find the ideal geometrical parameters. The optimal parameter set is achieved by evaluating the fitness of the S parameters after a number of iterations. We have adopted this approach in our current projects and designed many microwave components, including sharp bends, T-branches, Y-branches, and microstrip-to-stripline converters. For example, a stripline 90° bend was designed in a 2.54 mm x 2.54 mm space for dual-band operation (Ka band and Ku band) with < 0.18 dB insertion loss and < -55 dB reflection. We expect that this approach can enrich the tool kit of microwave designers.
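The poll-and-shrink loop of a pattern search over a multi-dimensional parameter space can be sketched in a few lines. The quadratic fitness below is only a stand-in for a real S-parameter evaluation, and the segment dimensions and target values are hypothetical:

```python
import random

def pattern_search(fitness, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Minimal compass/pattern search: poll +/-step along each axis,
    keep any improving move, otherwise halve the step."""
    x, fx = list(x0), fitness(x0)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = fitness(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
    return x, fx

# Stand-in fitness: squared distance of segment dimensions from an (unknown)
# optimum, playing the role of an S-parameter mismatch score.
random.seed(0)
target = [2.0, 0.5, 1.3]
fit = lambda w: sum((wi - ti) ** 2 for wi, ti in zip(w, target))

x0 = [random.uniform(0.0, 3.0) for _ in target]   # random initial dimensions
best, score = pattern_search(fit, x0)
```

In the actual design flow each fitness call would involve a full-wave simulation of the discretized component, which is why the derivative-free pattern search is attractive.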

Keywords: microwave component, microstrip and stripline, bend, power division, numerical search algorithm

Procedia PDF Downloads 358
27493 Mediating Effect of Hopefulness on the Effect of Underdog Narratives to Subjective Well-Being among Local State University of Cavite

Authors: Quiza Pearl Senilla, Hannah Mercado, Francis Angelo Erosa

Abstract:

Underdog narratives not only provide viewers with models of determination and hard work; by inducing hope, they may also increase the likelihood that viewers will pursue their own goals in life. Although it has been shown that underdog narratives create a positive motivational state in viewers and can also induce hope, little attention has been given to whether underdog narratives affect the health outcomes or subjective well-being of viewers and whether hopefulness mediates that effect. To address this gap, using underdog narratives as a predictor and hope as a mediator, this study determined the effect of underdog narratives on the subjective well-being of the respondents, the relationship between hope and subjective well-being, and the mediating effect of hopefulness. This study is experimental research using a between-subjects design. Purposive random sampling was used, wherein respondents had to meet specified criteria to be part of the study. One hundred and twenty (N=120) Local State University students were assigned to different treatment conditions (underdog narrative, comedy, nature scenes) and a no-exposure control group. Results show a minimal, non-significant difference in the subjective well-being of respondents exposed to the different treatment conditions. The study also reveals a moderate positive correlation between hope and subjective well-being. Finally, the results show no mediating effect of hopefulness on the subjective well-being of subjects exposed to the underdog narrative.

Keywords: hope, hope theory, subjective well-being, underdog narratives

Procedia PDF Downloads 285
27492 Peeling Behavior of Thin Elastic Films Bonded to Rigid Substrate of Random Surface Topology

Authors: Ravinu Garg, Naresh V. Datla

Abstract:

We study the fracture mechanics of peeling of thin films perfectly bonded to a rigid substrate of arbitrary random surface topology using an analytical formulation. A generalized theoretical model has been developed to determine the peel strength of thin elastic films. It is demonstrated that an improvement in peel strength can be achieved by modifying the surface characteristics of the rigid substrate. A characterization study has been performed to analyze the effect of different parameters on the effective peel force from the rigid surface. Different surface profiles, such as circular and sinusoidal, have been considered to demonstrate the bonding characteristics of the film-substrate interface. The condition for instability in the debonding of the film is analyzed, where localized self-debonding arises depending on the film and surface characteristics. This study is a step towards improving the adhesion strength of thin films to rigid substrates using textured surfaces.

Keywords: debonding, fracture mechanics, peel test, thin film adhesion

Procedia PDF Downloads 424
27491 Application of Principal Component Analysis for Classification of Random Doppler-Radar Targets during Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations in war or peace time, the radar operator gets a scatter of targets over the screen. A target may be a tracked vehicle such as a T72 or BMP tank, a wheeled vehicle such as an ALS, TATRA, 2.5-tonne, or Shaktiman truck, or moving troops and convoys. The radar operator selects one of the promising targets in Single Target Tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the random target. But this process is cumbersome and depends solely on the skills of the operator, and may thus lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA), to identify random objects. The classification is based on transforming the audible signature of a target into music octave-notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. The whole study is based on live data.
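The mapping from an audible signature to octave-notes can be sketched as follows; a naive O(n²) DFT stands in for the FFT to keep the sketch dependency-free, and the test tone, sampling rate, and note convention (A4 = 440 Hz) are illustrative assumptions, not the paper's actual signal chain:

```python
import math

def dominant_frequency(samples, fs):
    """Return the frequency of the largest DFT magnitude bin
    (naive O(n^2) DFT used in place of an FFT for self-containment)."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):          # skip DC, positive frequencies only
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(-samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def to_octave_note(freq):
    """Map a frequency to the nearest equal-tempered note (A4 = 440 Hz)."""
    midi = round(69 + 12 * math.log2(freq / 440.0))
    return NOTES[midi % 12] + str(midi // 12 - 1)

fs, n = 8000, 512
tone = [math.sin(2 * math.pi * 440 * t / fs) for t in range(n)]  # 440 Hz test tone
f = dominant_frequency(tone, fs)
note = to_octave_note(f)
```

A real Doppler signature would yield several spectral peaks, each quantized to a note, and the resulting note pattern would then feed the PCA stage.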

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 327
27490 Random Access in IoT Using Naïve Bayes Classification

Authors: Alhusein Almahjoub, Dongyu Qiu

Abstract:

This paper deals with the random access procedure in next-generation networks and presents a solution that reduces total service time (TST), one of the most important performance metrics in current and future internet of things (IoT) based networks. The proposed solution focuses on the calculation of the optimal transmission probability, which maximizes the success probability and reduces TST. It uses the information of several idle preambles in every time slot and, based on it, estimates the number of backlogged IoT devices using Naïve Bayes estimation, a type of supervised learning in the machine learning domain. The estimation of backlogged devices is necessary since the optimal transmission probability depends on it and the eNodeB does not have this information. Simulations carried out in MATLAB verify that the proposed solution gives excellent performance.
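The backlog-from-idle-preambles idea can be sketched with a simple Bayesian posterior (an illustrative stand-in for the paper's Naïve Bayes estimator; the preamble count, prior range, and observed idle count are all assumptions):

```python
import math

def posterior_backlog(idle, M, n_max=60):
    """Discrete Bayes estimate of the backlog N from the count of idle
    preambles. Each of N devices picks one of M preambles uniformly, so a
    given preamble stays idle with probability (1 - 1/M)**N and the idle
    count is Binomial(M, (1 - 1/M)**N). Uniform prior over 0..n_max."""
    post = []
    for n in range(n_max + 1):
        q = (1 - 1 / M) ** n
        post.append(math.comb(M, idle) * q**idle * (1 - q) ** (M - idle))
    z = sum(post)
    return sum(n * p / z for n, p in enumerate(post))   # posterior mean

def optimal_tx_probability(n_est, M):
    """Throughput is maximized when the expected number of transmitting
    devices matches the number of available preambles."""
    return min(1.0, M / max(n_est, 1e-9))

M = 10
n_hat = posterior_backlog(idle=1, M=M)   # only 1 of 10 preambles was idle
p_tx = optimal_tx_probability(n_hat, M)
```

Few idle preambles imply a large backlog, so the eNodeB lowers the transmission probability accordingly; the posterior mean sits above the maximum-likelihood point estimate because the posterior is right-skewed.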

Keywords: random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation

Procedia PDF Downloads 126
27489 Improving Search Engine Performance by Removing Indexes of Malicious URLs

Authors: Durga Toshniwal, Lokesh Agrawal

Abstract:

As the web continues to play an increasing role in information exchange and in conducting daily activities, computer users have become the target of miscreants who infect hosts with malware or adware for financial gain. Unfortunately, even a single visit to a compromised web site enables the attacker to detect vulnerabilities in the user's applications and force the download of a multitude of malware binaries. We provide an approach to effectively scan for so-called drive-by downloads on the Internet. Drive-by downloads are the result of URLs that attempt to exploit their visitors and cause malware to be installed and run automatically. To scan the web for malicious pages, the first step is to use a crawler to collect URLs that live on the Internet, and then to apply fast prefiltering techniques to reduce the number of pages that need to be examined by precise, but slower, analysis tools (such as honeyclients or antivirus programs). Although this technique is effective, it requires a substantial amount of resources, mainly because the crawler encounters many pages on the web that are legitimate and need to be filtered out. In this paper, to characterize the nature of this rising threat, we present an implementation of a web crawler in Python and an approach to search the web more efficiently for pages that are likely to be malicious, filtering out benign pages and passing the remaining pages to an antivirus program for malware detection. Our approach starts from an initial seed of known malicious web pages. Using these seeds, our system generates search engine queries to identify other malicious pages that are similar to the ones in the initial seed. By doing so, it leverages the crawling infrastructure of search engines to retrieve URLs that are much more likely to be malicious than a random page on the web. The results show that this guided approach is able to identify malicious web pages more efficiently than random crawling-based approaches.
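The fast-prefilter/slow-analyzer split described above can be sketched as a two-stage pipeline. The heuristics, URLs, and page contents below are entirely hypothetical; real prefilters use far richer feature sets:

```python
import re

# Hypothetical lightweight heuristics for drive-by-download pages.
SUSPICIOUS = [
    r"eval\s*\(\s*unescape",       # packed/obfuscated JavaScript
    r"<iframe[^>]+width=[\"']?0",  # zero-size iframe injection
    r"document\.write\s*\(\s*unescape",
]

def prefilter(page_html):
    """Fast first stage: flag a page for slow analysis (honeyclient /
    antivirus) only if any cheap heuristic fires."""
    return any(re.search(pat, page_html, re.IGNORECASE) for pat in SUSPICIOUS)

def scan(pages, slow_analyzer):
    """Run the expensive analyzer only on prefiltered pages."""
    return [url for url, html in pages if prefilter(html) and slow_analyzer(html)]

pages = [
    ("http://benign.example/", "<html><body>hello</body></html>"),
    ("http://bad.example/", '<iframe src="x" width="0" height="0"></iframe>'),
]
flagged = scan(pages, slow_analyzer=lambda html: True)  # stub analyzer
```

The seed-guided query generation sits in front of this pipeline: it biases which `pages` the crawler fetches in the first place, so fewer benign pages ever reach even the prefilter.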

Keywords: web crawler, malware, seeds, drive-by downloads, security

Procedia PDF Downloads 216
27488 Random Walks and Option Pricing for European and American Options

Authors: Guillaume Leduc

Abstract:

In this paper, we describe a broad setting under which the error of random walk approximations to option prices can be quantified and controlled, and for which convergence occurs at a speed of n⁻¹ for European and American options. We describe how knowledge of the error allows for arbitrarily fast acceleration of the convergence.
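The n⁻¹ convergence can be observed with the standard Cox-Ross-Rubinstein binomial random walk against the Black-Scholes limit (a textbook illustration of the setting, not the paper's specific construction; all market parameters are arbitrary):

```python
import math

def bs_call(S, K, r, sigma, T):
    """Black-Scholes European call (closed form) used as the limit value."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def crr_call(S, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein random-walk (binomial) approximation."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)
    disc = math.exp(-r * dt)
    # terminal payoffs, then backward induction to time zero
    v = [max(S * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for _ in range(n):
        v = [disc * (p * v[j + 1] + (1 - p) * v[j]) for j in range(len(v) - 1)]
    return v[0]

exact = bs_call(100, 100, 0.05, 0.2, 1.0)
err_100 = abs(crr_call(100, 100, 0.05, 0.2, 1.0, 100) - exact)
err_400 = abs(crr_call(100, 100, 0.05, 0.2, 1.0, 400) - exact)
# n^-1 convergence suggests roughly a fourfold error drop from n=100 to n=400
```

Knowing the leading error coefficient, as the paper advocates, is what allows Richardson-style extrapolation of such sequences to accelerate convergence.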

Keywords: random walk approximation, European and American options, rate of convergence, option pricing

Procedia PDF Downloads 434
27487 Nonlinear Vibration of FGM Plates Subjected to Acoustic Load in Thermal Environment Using Finite Element Modal Reduction Method

Authors: Hassan Parandvar, Mehrdad Farid

Abstract:

In this paper, a finite element model is presented for large amplitude vibration of functionally graded material (FGM) plates subjected to combined random pressure and thermal load. The material properties of the plates are assumed to vary continuously in the thickness direction according to a simple power law distribution in terms of the volume fractions of the constituents. The material properties depend on temperature, whose distribution along the thickness can be expressed explicitly. The von Karman large deflection strain-displacement relations and the extended Hamilton's principle are used to obtain the governing equations of motion in structural node degrees of freedom (DOF) using the finite element method. A three-node triangular Mindlin plate element with shear correction factor is used. The nonlinear equations of motion in structural degrees of freedom are reduced using the modal reduction method. The reduced equations of motion are solved numerically by a fourth-order Runge-Kutta scheme. In this study, the random pressure is generated using the Monte Carlo method. The model is verified, and the nonlinear dynamic response of FGM plates is studied for various values of volume fraction and sound pressure level under different thermal loads. Snap-through behavior of FGM plates is also studied.
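The solution step can be sketched for a single reduced mode: a Duffing-type modal equation (the cubic term playing the role of the von Karman nonlinearity) integrated by fourth-order Runge-Kutta under Monte Carlo pressure samples. The modal parameters and forcing level are invented for illustration:

```python
import math, random

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# One reduced mode with cubic (von Karman type) stiffness:
#   q'' + 2*zeta*w*q' + w**2 * q + beta*q**3 = F(t)
zeta, w, beta = 0.02, 2 * math.pi * 10, 1e6    # assumed modal parameters
random.seed(1)

h, y, peak = 1e-4, [0.0, 0.0], 0.0
for i in range(20_000):                        # 2 s of response
    F = random.gauss(0, 50)                    # random pressure, held per step
    rhs = lambda t, s, F=F: [s[1],
                             F - 2 * zeta * w * s[1] - w**2 * s[0] - beta * s[0]**3]
    y = rk4_step(rhs, i * h, y, h)
    peak = max(peak, abs(y[0]))
```

In the full model many such modal equations are coupled, and the thermal load shifts the effective linear stiffness, which is what produces the snap-through behavior reported.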

Keywords: nonlinear vibration, finite element method, functionally graded material (FGM) plates, snap-through, random vibration, thermal effect

Procedia PDF Downloads 244
27486 The Impact of Inpatient New Boarding Policy on Emergency Department Overcrowding: A Discrete Event Simulation Study

Authors: Wheyming Tina Song, Chi-Hao Hong

Abstract:

In this study, we investigate the effect of a new boarding policy, short stay, on overcrowding in the emergency department (ED). The decision variable is the number of short-stay beds for the lowest-acuity ED patients. The performance measurements used are the national emergency department overcrowding score (NEDOCS) and the ED retention rate (the percentage of patients who stay in the ED for more than 48 hours in one month). Discrete event simulation (DES) is used as the analysis tool to evaluate the strategy, and the common random number (CRN) technique is applied to enhance the simulation precision. The DES model was based on a census of six months of patients treated in the ED of the National Taiwan University Hospital Yunlin Branch. Our results show that the new short-stay boarding policy significantly impacts both the NEDOCS and the ED retention rate when the number of short-stay beds is more than three.
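The benefit of common random numbers can be sketched outside any particular queueing model: comparing two policies on the same uniform stream makes their outputs positively correlated and shrinks the variance of the estimated difference. The two toy "policies" are placeholders, not the hospital model:

```python
import random

def service_time_policy_a(u):
    return 10 * u            # baseline service-time draw from a uniform u

def service_time_policy_b(u):
    return 9 * u + 0.5       # modified policy driven by the same kind of input

def mean_diff(n, crn):
    """Estimate E[A - B]. With CRN both policies consume the SAME uniform
    stream; without it they use independent streams."""
    ra = random.Random(42)
    rb = random.Random(42 if crn else 7)
    diffs = [service_time_policy_a(ra.random()) - service_time_policy_b(rb.random())
             for _ in range(n)]
    m = sum(diffs) / n
    var = sum((d - m) ** 2 for d in diffs) / (n - 1)
    return m, var

_, var_crn = mean_diff(2000, crn=True)
_, var_indep = mean_diff(2000, crn=False)
# CRN makes the two outputs positively correlated, shrinking Var(A - B)
```

In the DES study the same idea means replaying identical patient arrival streams against each candidate number of short-stay beds, so observed differences in NEDOCS are attributable to the policy rather than to sampling noise.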

Keywords: emergency department (ED), common random number (CRN), national emergency department overcrowding score (NEDOCS), discrete event simulation (DES)

Procedia PDF Downloads 328
27485 Debt Relief for Emerging Economies: An Empirical Investigation

Authors: Hummad Ch. Umar

Abstract:

Most developing economies, including Pakistan, are confronted with high levels of external debt, which adversely affects their economic performance. The debt overhang hypothesis is often used to assess the negative relationship between foreign debt and the economic growth of the indebted country. As the first objective of the present study, this hypothesis is tested using Pooled OLS (POLS), Generalized Method of Moments (GMM), Random Effect (RE), and Fixed Effect (FE) techniques. As the second objective, the study uses the concept of the debt Laffer curve to determine the eligibility of indebted countries for relief programs. According to this approach, countries lying on the right side of the Laffer curve are said to be trapped in strong debt overhang, leaving them unable to escape the vicious circle of low growth and high foreign debt. The empirical analysis confirms that only two countries out of twenty-two fully satisfy the eligibility conditions for debt relief. All other countries continue to face debt burdens of different magnitudes. The study further confirms that debt relief alone is not sufficient for overcoming the debt problem. Instead, sound economic policies and conducive investment decisions are required to lay the foundations of long-term growth and development. Debt relief should be an option only for those countries that meet a minimum measurable criterion of good governance, economic freedom, and consistency of policies.
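Of the estimators listed, the fixed effects (within) estimator is easy to sketch: demeaning each country's data removes the unobserved country effect before pooled OLS. The panel below is synthetic, with an assumed true debt coefficient of -0.3; the paper's GMM and RE estimators are not shown:

```python
import random
from statistics import mean

random.seed(3)
# Synthetic panel: 22 countries x 10 years, growth = alpha_i - 0.3*debt + noise
countries, years, beta_true = 22, 10, -0.3
data = []
for i in range(countries):
    alpha = random.gauss(2.0, 1.0)              # unobserved country fixed effect
    for t in range(years):
        debt = random.uniform(10, 120)          # external debt (% of GDP)
        growth = alpha + beta_true * debt + random.gauss(0, 0.5)
        data.append((i, debt, growth))

# Within (fixed effects) estimator: demean by country, then pooled OLS slope
by_country = {}
for i, x, y in data:
    by_country.setdefault(i, []).append((x, y))

num = den = 0.0
for obs in by_country.values():
    xbar = mean(x for x, _ in obs)
    ybar = mean(y for _, y in obs)
    for x, y in obs:
        num += (x - xbar) * (y - ybar)
        den += (x - xbar) ** 2
beta_fe = num / den   # recovers beta_true despite the unobserved alpha_i
```

A Hausman test would then compare this estimate against the random effects one to decide which specification the data support.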

Keywords: external debt, debt burden, debt overhang, debt Laffer curve, debt relief, investment decisions

Procedia PDF Downloads 308
27484 Internal Migration and Poverty Dynamic Analysis Using a Bayesian Approach: The Tunisian Case

Authors: Amal Jmaii, Damien Rousseliere, Besma Belhadj

Abstract:

We explore the relationship between internal migration and poverty in Tunisia. We present a methodology combining the potential outcomes approach with multiple imputation to highlight the effect of internal migration on poverty states. We find that the probability of being poor decreases when migrants leave the poorest regions (the west) for the richer regions (greater Tunis and the east).

Keywords: internal migration, potential outcomes approach, poverty dynamics, Tunisia

Procedia PDF Downloads 286
27483 Markov Random Field-Based Segmentation Algorithm for Detection of Land Cover Changes Using Uninhabited Aerial Vehicle Synthetic Aperture Radar Polarimetric Images

Authors: Mehrnoosh Omati, Mahmod Reza Sahebi

Abstract:

Information on land use/land cover change plays an essential role in environmental assessment, planning, and management for regional development. Remotely sensed imagery is widely used to provide information in many change detection applications. Polarimetric synthetic aperture radar (PolSAR) imagery, with its capability to discriminate between different scattering mechanisms, is a powerful tool for environmental monitoring applications. This paper proposes a new boundary-based segmentation algorithm as a fundamental step for land cover change detection. In this method, two PolSAR images are first segmented by integrating the marker-controlled watershed algorithm with a coupled Markov random field (MRF). Then, object-based classification is performed to determine changed/unchanged image objects. Compared with a pixel-based support vector machine (SVM) classifier, this novel segmentation algorithm significantly reduces the speckle effect in PolSAR images and improves the accuracy of binary classification at the object level. The experimental results on Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) polarimetric images show a 3% and 6% improvement in overall accuracy and kappa coefficient, respectively. The proposed method can also correctly distinguish homogeneous image parcels.

Keywords: coupled Markov random field (MRF), environment, object-based analysis, polarimetric SAR (PolSAR) images

Procedia PDF Downloads 198
27482 Bank Internal Controls and Credit Risk in Europe: A Quantitative Measurement Approach

Authors: Ellis Kofi Akwaa-Sekyi, Jordi Moreno Gené

Abstract:

Managerial actions that negatively profile banks and impair corporate reputation are addressed through effective internal control systems. Disregard for acceptable standards and procedures for granting credit has affected bank loan portfolios and could be cited in the crises in some European countries. The study intends to determine the effectiveness of internal control systems, investigate whether perceived agency problems exist on the part of board members, and establish the relationship between internal controls and credit risk among listed banks in the European Union. Drawing theoretical support from the behavioural compliance and agency theories, about seventeen internal control variables (drawn from the revised COSO framework) as well as bank-specific, country, stock market, and macroeconomic variables will be involved in the study. A purely quantitative approach will be employed to model internal control variables covering the control environment, risk management, control activities, information and communication, and monitoring. Panel data from 2005-2014 on listed banks from 28 European Union countries will be used for the study. Hypotheses will be tested, and a Generalized Least Squares (GLS) regression will be run to establish the relationship between the dependent and independent variables. The Hausman test will be used to choose between the random and fixed effects models. It is expected that listed banks will have sound internal control systems, but their effectiveness cannot be confirmed in advance. A perceived agency problem on the part of the board of directors is expected to be confirmed. The study expects a significant effect of internal controls on credit risk, uncovering another perspective of internal controls as not only an operational risk issue but a credit risk issue too.
Banks should be mindful that maintaining effective internal control systems is an ethical and socially responsible act, since the collapse of financial institutions as a result of excessive default is a major source of contagion. This study deviates from the usual primary-data approach to measuring internal control variables and instead models internal control variables quantitatively for the panel data. Thus, a grey area in approaching the revised COSO framework for internal controls is opened for further research. Most bank failures and crises could be averted if effective internal control systems were rigorously adhered to.

Keywords: agency theory, credit risk, internal controls, revised COSO framework

Procedia PDF Downloads 287
27481 Global Direct Search Optimization of a Tuned Liquid Column Damper Subject to Stochastic Load

Authors: Mansour H. Alkmim, Adriano T. Fabro, Marcus V. G. De Morais

Abstract:

In this paper, a global direct search optimization algorithm to reduce the vibration of a tuned liquid column damper (TLCD), a class of passive structural control device, is presented. The objective is to find optimized parameters for the TLCD under stochastic loads from different wind power spectral densities. A verification is made considering the analytical solution of an undamped primary system under white noise excitation. Finally, a numerical example considering a simplified wind turbine model is given to illustrate the efficacy of the TLCD. Results from the random vibration analysis are shown for four types of random wind excitation models, where the obtained response PSDs show good vibration attenuation.

Keywords: generalized pattern search, parameter optimization, random vibration analysis, vibration suppression

Procedia PDF Downloads 250
27480 Assessment of Pastoralist-Crop Farmers Conflict and Food Security of Farming Households in Kwara State, Nigeria

Authors: S. A. Salau, I. F. Ayanda, I. Afe, M. O. Adesina, N. B. Nofiu

Abstract:

Food insecurity is still a critical challenge among rural and urban households in Nigeria. The country's food insecurity situation has become more pronounced due to frequent conflict between pastoralists and crop farmers. Thus, this study assesses pastoralist-crop farmer conflict and the food security of farming households in Kwara state, Nigeria. The specific objectives are to measure the food security status of the respondents, quantify pastoralist-crop farmer conflict, determine the effect of this conflict on food security, and describe the effective coping strategies adopted by the respondents to reduce the effect of food insecurity. A combination of purposive and simple random sampling techniques will be used to select 250 farming households for the study. The analytical tools include descriptive statistics, a Likert scale, logistic regression, and a food security index. Using the food security index approach, the percentages of households that are food secure and food insecure will be known. Pastoralist-crop farmer conflict will be measured empirically by quantifying losses due to the conflict. The logistic regression will indicate whether pastoralist-crop farmer conflict is a critical determinant of food security among farming households in the study area. The coping strategies employed by the respondents in cushioning the effects of food insecurity will also be revealed. Empirical studies on the effect of pastoralist-crop farmer conflict on food security are rare in the literature. This study will quantify conflict and reveal the direction as well as the extent of the relationship between conflict and food security. It could contribute to the identification and formulation of strategies for minimizing conflict between pastoralists and crop farmers in an attempt to reduce food insecurity. Moreover, this study could serve as valuable reference material for future research and open up new areas for further study.
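A common construction of the food security index classifies a household as food secure if its per-capita food expenditure reaches two-thirds of the sample mean. This is a standard convention assumed here for illustration, not necessarily the study's exact formula, and the expenditure figures are hypothetical:

```python
from statistics import mean

def food_security_index(expenditures, threshold_frac=2 / 3):
    """Classify households against a food security line set at a fraction
    of mean per-capita food expenditure; return the line, per-household
    status, and the percentage of food-secure households."""
    line = threshold_frac * mean(expenditures)
    status = [e >= line for e in expenditures]
    return line, status, 100 * sum(status) / len(status)

spend = [120, 45, 80, 30, 150, 60, 95, 20]   # hypothetical per-capita values
line, status, pct_secure = food_security_index(spend)
```

The binary status produced here would then serve as the dependent variable in the logistic regression of food security on conflict losses and household characteristics.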

Keywords: agriculture, conflict, coping strategies, food security, logistic regression

Procedia PDF Downloads 153
27479 Efficient Internal Generator Based on Random Selection of an Elliptic Curve

Authors: Mustapha Benssalah, Mustapha Djeddou, Karim Drouiche

Abstract:

Random number generation (RNG) is of significant importance for the security and privacy of numerous applications, such as RFID technology and smart cards. The quality of the generated bit sequences is paramount: a weak internal generator, for example, can directly render the entire application insecure, making it pointless to employ strong algorithms elsewhere in the application. In this paper, we propose a new pseudo random number generator (PRNG), suitable for ECC-based cryptosystems, constructed by randomly selecting points from several randomly chosen elliptic curves. The main contribution of this work is to increase the number of the generator's internal states by extending the set of its output realizations to several automatically selected curves. The quality and statistical characteristics of the proposed PRNG are validated using the Chi-square goodness-of-fit test and the empirical Special Publication 800-22 statistical test suite issued by NIST.
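The core operation, deriving bits from points on an elliptic curve, can be sketched on a toy curve over a tiny prime field. The curve parameters and the parity-of-x extraction rule are illustrative assumptions only; they say nothing about the paper's actual curve selection or output function, and parameters this small are of course not secure:

```python
# Toy curve y^2 = x^3 + A*x + B over F_P (illustration only, not secure).
P, A, B = 97, 2, 3

def ec_add(p1, p2):
    """Affine point addition on the curve; None is the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                            # vertical line -> infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def prng_bits(seed_point, n_bits):
    """Walk the multiples of the seed point, emitting x-coordinate parity."""
    bits, pt = [], seed_point
    while len(bits) < n_bits:
        if pt is not None:
            bits.append(pt[0] & 1)
        pt = ec_add(pt, seed_point)
        if pt is None:                 # wrapped around the group; restart
            pt = seed_point
    return bits

# find a point on the curve by brute force over the small field
G = next((x, y) for x in range(P) for y in range(P)
         if (y * y - (x**3 + A * x + B)) % P == 0)
stream = prng_bits(G, 64)
```

The paper's generator goes further by hopping between several such curves, which enlarges the internal state space beyond what any single curve's group order allows.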

Keywords: PRNG, security, cryptosystem, ECC

Procedia PDF Downloads 423
27478 Reliability Analysis for Cyclic Fatigue Life Prediction in Railroad Bolt Hole

Authors: Hasan Keshavarzian, Tayebeh Nesari

Abstract:

The bolted rail joint is one of the most vulnerable areas in railway track. A comprehensive approach was developed for studying the reliability of fatigue crack initiation in the railroad bolt hole under random axle loads and random material properties. Operating conditions were also treated as stochastic variables. To obtain a comprehensive probability model for fatigue crack initiation life prediction in the railroad bolt hole, we used the finite element method (FEM), the response surface method (RSM), and reliability analysis. A combined energy-density-based and critical-plane-based fatigue concept is used for the fatigue crack prediction. The dynamic loads were calculated according to the axle load, speed, and track properties. The results show that the axle load is the most sensitive parameter, compared to Poisson's ratio, in fatigue crack initiation life. Also, the reliability index decreases slowly due to the high cycle fatigue regime in this area.

Keywords: rail-wheel tribology, rolling contact mechanic, finite element modeling, reliability analysis

Procedia PDF Downloads 367
27477 A Genetic Based Algorithm to Generate Random Simple Polygons Using a New Polygon Merge Algorithm

Authors: Ali Nourollah, Mohsen Movahedinejad

Abstract:

In this paper, a new algorithm to generate random simple polygons from a given set of points in a two-dimensional plane is designed. The proposed algorithm uses a genetic algorithm to generate polygons with few vertices. A new merge algorithm is presented which converts any two polygons into a simple polygon. This algorithm first converts the two polygons into a polygonal chain and then converts the polygonal chain into a simple polygon. The process of converting a polygonal chain into a simple polygon is based on the removal of intersecting edges. The merge algorithm has a time complexity of O((r + s)·l), where r and s are the sizes of the merging polygons and l is the number of intersecting edges removed from the polygonal chain. It will be shown that 1 < l < r + s. The experimental results show that the proposed algorithm is able to generate a great number of different simple polygons and performs better than celebrated algorithms such as space partitioning and steady growth.
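The primitive at the heart of the edge-removal step is the segment intersection test. A standard orientation-based version (a generic building block, not the paper's full merge algorithm) can be sketched as:

```python
def orient(a, b, c):
    """Sign of the cross product (b - a) x (c - a)."""
    v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (v > 0) - (v < 0)

def on_segment(a, b, c):
    """True if collinear point c lies within segment ab's bounding box."""
    return (min(a[0], b[0]) <= c[0] <= max(a[0], b[0]) and
            min(a[1], b[1]) <= c[1] <= max(a[1], b[1]))

def segments_intersect(p1, p2, p3, p4):
    """Standard orientation test used when purging intersecting edges
    from a merged polygonal chain."""
    d1, d2 = orient(p3, p4, p1), orient(p3, p4, p2)
    d3, d4 = orient(p1, p2, p3), orient(p1, p2, p4)
    if d1 != d2 and d3 != d4:
        return True                 # proper crossing
    # collinear boundary cases
    return any(d == 0 and on_segment(*seg)
               for d, seg in [(d1, (p3, p4, p1)), (d2, (p3, p4, p2)),
                              (d3, (p1, p2, p3)), (d4, (p1, p2, p4))])

crossing = segments_intersect((0, 0), (4, 4), (0, 4), (4, 0))
parallel = segments_intersect((0, 0), (1, 0), (0, 1), (1, 1))
```

Each of the l removed intersections in the merge step is detected by exactly this kind of test, which is what yields the O((r + s)·l) bound.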

Keywords: divide and conquer, genetic algorithm, merge polygons, random simple polygon generation

Procedia PDF Downloads 511
27476 The Effect of Sensory Integration in Reduction of Stereotype Behaviour in Autistic Children

Authors: Mohammad Khamoushi, Reza Mirmahdi

Abstract:

The aim of this research was to investigate the effect of sensory integration on the reduction of stereotyped behaviors in autistic children. The statistical population included 55 children aged 2.8-14 in the Esfahan Ordibehesht autistic center. Purposive sampling was used to select the sample group, and 20 children were randomly assigned to two groups: experimental and control. The research design was quasi-experimental, two-group, with pretest and posttest. Data collection tools included the Repetitive Behavior Scale-Revised, with six subscales: stereotyped behavior, self-injurious behavior, compulsive behavior, ritualistic behavior, sameness behavior, and restricted behavior. Analysis of covariance was used to test the hypotheses. Results show that the sensory integration procedure was effective in reducing stereotyped behavior, compulsive behavior, and self-injurious behavior in autistic children. According to the findings, it is suggested that the effect of the sensory integration procedure on the stereotyped behavior of autistic children be studied further and used in the treatment of other disabilities in these children.

Keywords: autism, sensory integration procedure, stereotyped behavior, compulsive behavior

Procedia PDF Downloads 551
27475 Modeling of Thermo Acoustic Emission Memory Effect in Rocks of Varying Textures

Authors: Vladimir Vinnikov

Abstract:

The paper proposes a model of an inhomogeneous rock mass with an initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium heated instantaneously to a predetermined temperature under the effect of the thermal field. Crack growth occurs according to the concepts of fracture mechanics, provided that the stress intensity factor K exceeds the critical value Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied by a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect, and the influence of humidity on this effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account and the influence of interference factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about past thermal impacts on rocks and to determine the degree of rock disturbance by means of non-destructive testing.
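The memory effect itself can be sketched with a coarse simplification: give each microcrack a random critical temperature (standing in for the K > Kc criterion) and let it emit one AE event only the first time its threshold is exceeded. The threshold distribution, crack count, and cycle peaks are all invented for illustration:

```python
import random

random.seed(7)
# Microcracks on grain boundaries: each gets a random critical temperature
# above which it grows and emits one AE event (a stand-in for K > Kc).
crack_thresholds = [random.uniform(50, 400) for _ in range(500)]

def heat_cycle(t_max, already_failed):
    """Return AE events in a heating cycle to peak temperature t_max; a
    crack emits only the FIRST time its threshold is exceeded, which is
    what produces the memory effect."""
    events = 0
    for i, thr in enumerate(crack_thresholds):
        if thr <= t_max and i not in already_failed:
            already_failed.add(i)
            events += 1
    return events

failed = set()
peaks = [100, 150, 150, 250]          # cycles with (mostly) increasing amplitude
counts = [heat_cycle(t, failed) for t in peaks]
# a repeated peak (150, 150) produces silence on the second pass;
# AE resumes only once a cycle exceeds every previous peak temperature
```

This is the acoustic analogue of the Kaiser effect: the silence of repeated cycles encodes the highest temperature the rock has previously experienced, which is what makes the effect usable for non-destructive assessment of past thermal impacts.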

Keywords: crack growth, cyclic heating and cooling, rock texture, thermo acoustic emission memory effect

Procedia PDF Downloads 255
27474 Estimation of Probabilistic Fatigue Crack Propagation Models of AZ31 Magnesium Alloys under Various Load Ratio Conditions by Using the Interpolation of a Random Variable

Authors: Seon Soon Choi

Abstract:

The essential purpose is to present a good fatigue crack propagation model describing the stochastic fatigue crack growth behavior in a rolled magnesium alloy, AZ31, under various load ratio conditions. Fatigue crack propagation experiments were carried out in laboratory air under four load ratio (R) conditions using AZ31 to investigate the crack growth behavior. The stochastic fatigue crack growth behavior was analyzed using the interpolation of a random variable, Z, introduced into an empirical fatigue crack propagation model. The empirical fatigue models used in this study are the Paris-Erdogan model, the Walker model, the Forman model, and the modified Forman model. It was found that the random variable is useful in describing the stochastic fatigue crack growth behavior under various load ratio conditions. A good probabilistic model describing this stochastic behavior under various load ratio conditions is also proposed.
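The role of the random variable can be sketched with the simplest of the listed models, Paris-Erdogan, where Z multiplicatively scatters the deterministic growth rate. The material constants, stress range, geometry factor, and lognormal distribution of Z below are illustrative assumptions, not AZ31 data:

```python
import math, random

def cycles_to_failure(a0, ac, C, m, d_sigma, Z=1.0, da=1e-5):
    """Integrate the Paris-Erdogan law da/dN = Z * C * (dK)^m numerically,
    with dK = d_sigma * sqrt(pi * a) (unit geometry factor assumed).
    Z is a random scatter factor multiplying the deterministic rate."""
    a, N = a0, 0.0
    while a < ac:
        dK = d_sigma * math.sqrt(math.pi * a)
        rate = Z * C * dK**m                    # crack growth per cycle
        N += da / rate                          # cycles to grow one increment
        a += da
    return N

random.seed(0)
C, m, d_sigma = 1e-11, 3.0, 80.0               # assumed Paris constants
N_det = cycles_to_failure(1e-3, 1e-2, C, m, d_sigma)
# a lognormal Z centered on 1 spreads the deterministic life both ways
lives = [cycles_to_failure(1e-3, 1e-2, C, m, d_sigma,
                           Z=random.lognormvariate(0, 0.3))
         for _ in range(50)]
```

Fitting the distribution of Z separately at each load ratio R, and interpolating between the fits, is what lets a single probabilistic model cover the various load ratio conditions.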

Keywords: magnesium alloys, fatigue crack propagation model, load ratio, interpolation of random variable

Procedia PDF Downloads 394
27473 Catastrophic Burden and Impoverishment Effect of WASH Diseases: A Ground Analysis of Bhadohi District Uttar Pradesh, India

Authors: Jyoti Pandey, Rajiv Kumar Bhatt

Abstract:

In the absence of proper sanitation, people suffer from high levels of infectious disease, leading to high incidences of morbidity and mortality. This directly affects a country's ability to maintain an efficient economy and implies great personal suffering among infected individuals and their families. This paper aims to estimate the catastrophic expenditure of households, in terms of the direct and indirect losses a person faces due to illness from WASH (water, sanitation, and hygiene) diseases; the severity of the scenario is assessed by estimating the impoverishment effect. We used a primary data survey for the objectives outlined, combining descriptive and analytical research designs, with a questionnaire formulated to capture all relevant variables and probable outcomes. A total of 300 households are covered in this study, selected by multistage random sampling. The cost of illness approach is followed for assessing economic impact. The study draws attention to the fact that a significant portion of total consumption expenditure is lost to the treatment of water- and sanitation-related diseases, and that a 2.02% loss in income could be regained if infectious and water vector-borne diseases were checked by providing the required sanitation facilities.
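The two headline indicators of such a study, the catastrophic-expenditure headcount and the impoverishment effect, can be computed as sketched below; the data format, threshold, and poverty line are assumptions for illustration, not the survey's actual definitions.

```python
def catastrophic_and_impoverishment(households, threshold=0.10, poverty_line=1500):
    """Sketch of two standard cost-of-illness indicators: a household's health
    spending is 'catastrophic' when it exceeds `threshold` of total
    consumption, and 'impoverishing' when consumption net of health spending
    falls below the poverty line although gross consumption was above it."""
    catastrophic = impoverished = 0
    for consumption, health_spend in households:
        if health_spend > threshold * consumption:
            catastrophic += 1
        if consumption >= poverty_line > consumption - health_spend:
            impoverished += 1
    n = len(households)
    return catastrophic / n, impoverished / n

# (total consumption, WASH-disease health spending) per household -- toy numbers
sample = [(2000, 300), (1600, 50), (1550, 200), (3000, 100)]
```

Here the third household is both catastrophic (200 > 10% of 1550) and impoverished (1550 - 200 falls below the 1500 line), giving headcounts of 0.5 and 0.25 respectively on the toy sample.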

Keywords: water, sanitation, impoverishment, catastrophic expenditure

Procedia PDF Downloads 62
27472 The Effect of Expressive Therapies on Children and Youth Impacted by Refugee Trauma: A Meta-Analysis

Authors: Brian Kristopher Cambra

Abstract:

Millions of displaced families are seeking refuge in countries that are not their own due to war, violence, persecution, political unrest, and natural disasters. This global crisis is forcing researchers and practitioners to consider how refugees are coping with the trauma associated with their migration process. Effective therapeutic approaches are needed in a global effort to address the traumatic impact of forced migration. This meta-analytical study investigates the effectiveness of expressive therapeutic modalities, including play, art, music, sandplay, theatre, and writing therapies, in helping children and adolescents cope with refugee trauma. Seventeen pre-post and between-group comparison studies were analyzed using a random-effects model. The combined effect size for pre-post comparisons was medium (g = 0.58), whereas the combined effect size for between-group comparisons was small (g = 0.32). Overall, art therapy was found to be most effective in treating stress symptoms. Heterogeneity tests, however, suggest these effect sizes cannot be interpreted as meaningful due to substantial variance. Nevertheless, the findings of this meta-analysis indicate that expressive therapies may be among the beneficial modalities to integrate with other trauma-informed approaches.
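A random-effects pooling of effect sizes like the one reported here is commonly done with the DerSimonian-Laird estimator; the sketch below shows that computation on toy inputs, not the seventeen studies analyzed in the paper.

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of effect sizes (e.g. Hedges'
    g values with their sampling variances).  Returns the pooled estimate,
    its standard error, and the between-study variance tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    q = sum(wi * (g - fixed) ** 2 for wi, g in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * g for wi, g in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

A large tau^2 relative to the within-study variances is exactly the substantial heterogeneity the abstract warns about.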

Keywords: expressive therapies, forced migration, meta-analysis, refugees, trauma

Procedia PDF Downloads 124
27471 A New Approach to Increase Consumer Understanding of Meal Quality: Food Focus Instead of Nutrient Focus

Authors: Elsa Lamy, Marília Prada, Ada Rocha, Cláudia Viegas

Abstract:

The traditional and widely used nutrition-focused approach to communicating with consumers is reductionist and makes it difficult for consumers to assess their food intake. Without sufficient nutrition knowledge and understanding, it is difficult to choose a healthful diet based only on nutritional recommendations. This study aimed to evaluate Portuguese consumers' understanding of how food/nutritional information is presented in menus, comparing the nutrient-focused approach (the currently used Nutrition Declaration) and the new food-focused approach (an infographic). For data collection, a questionnaire was distributed online using social media channels. A main effect of format was found on ratings of meal balance and completeness (balance: F(1,79) = 18.26, p < .001, ηp² = .188; completeness: F(1,67) = 27.18, p < .001, ηp² = .289). Overall, dishes paired with the nutritional information were rated as more balanced (M = 3.70, SE = .11) and more complete (M = 4.00, SE = .14) than meals with the infographic representation (balance: M = 3.14, SE = .11; completeness: M = 3.29, SE = .13). We also observed a main effect of the meal, F(3,237) = 48.90, p < .001, ηp² = .382, such that M1 and M2 were perceived as less balanced than M3 and M4, all p < .001. The use of a food-focused approach (the infographic) helped participants identify the lack of balance in the less healthful meals (dishes M1 and M2), allowing for a better understanding of meals' compliance with recommendations and contributing to better food choices and a healthier lifestyle.
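The reported ηp² values follow directly from each F statistic and its degrees of freedom, which makes them easy to verify:

```python
def partial_eta_squared(f, df_effect, df_error):
    """Partial eta squared recovered from an F statistic:
    eta_p^2 = (F * df_effect) / (F * df_effect + df_error)."""
    return (f * df_effect) / (f * df_effect + df_error)

# The abstract's F(1,79) = 18.26 for meal balance gives ~.188,
# F(1,67) = 27.18 for completeness gives ~.289, and
# F(3,237) = 48.90 for the meal effect gives ~.382.
```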

Keywords: food labelling, food and nutritional recommendations, infographics, portions based information

Procedia PDF Downloads 58
27470 Machine Learning-Assisted Selective Emitter Design for Solar Thermophotovoltaic Systems

Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko

Abstract:

Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. 
This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
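A minimal sketch of the genetic-algorithm half of this pipeline is shown below. In the actual workflow the fitness function would be the trained random-forest surrogate predicting spectral selectivity from the SiC/W/SiO2/W layer thicknesses; here a toy quadratic fitness and hypothetical thickness bounds stand in for it.

```python
import random

def genetic_optimize(fitness, bounds, pop_size=30, generations=60, seed=0):
    """Minimal genetic algorithm over layer thicknesses (nm).  Keeps the top
    half of each generation, fills the rest with averaged-crossover children,
    and applies a single-gene Gaussian mutation clamped to the bounds."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # higher fitness = better
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]       # crossover
            j = rng.randrange(len(child))                     # mutation
            lo, hi = bounds[j]
            child[j] = min(hi, max(lo, child[j] + rng.gauss(0, (hi - lo) * 0.05)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy stand-in fitness: selectivity "peaks" at a hypothetical thickness profile.
target = [300.0, 20.0, 100.0, 150.0]                 # SiC, W, SiO2, W (nm)
toy_fitness = lambda t: -sum((x - y) ** 2 for x, y in zip(t, target))
best = genetic_optimize(toy_fitness, [(10, 500)] * 4)
```

Swapping the toy fitness for a surrogate model's prediction is what couples the random-forest design step to the genetic optimization step described in the abstract.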

Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic

Procedia PDF Downloads 33
27469 Optimizing Load Shedding Schedule Problem Based on Harmony Search

Authors: Almahd Alshereef, Ahmed Alkilany, Hammad Said, Azuraliza Abu Bakar

Abstract:

From time to time, the electrical power grid is directed by the National Electricity Operator to conduct load shedding, which involves hours-long power outages in the area of this study, the Southern Electrical Grid of Libya (SEGL). Load shedding is conducted in order to alleviate pressure on the national electricity grid at times of peak demand. This study chose a set of categories for the load-shedding problem, considering the effect of demand priorities on the operation of the power system during emergencies. The classification of category regions for the load-shedding problem is solved by a new algorithm (the harmony algorithm) based on a randomly generated list of category regions, which provides a feasible solution close to the optimum. The obtained results show improvements over other heuristic approaches. The case studies are carried out on the SEGL.
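A minimal harmony search over a load-shedding schedule might look like the following; the region priorities, hour ranges, and cost function are toy assumptions, not the SEGL category model.

```python
import random

def harmony_search(cost, n_regions, hours_range=(0, 6), hms=20,
                   hmcr=0.9, par=0.3, iters=2000, seed=7):
    """Minimal harmony search: each region gets an integer number of outage
    hours; new harmonies are built per-variable from memory (with pitch
    adjustment) or at random, and replace the worst stored harmony when
    they score better under `cost`."""
    rng = random.Random(seed)
    lo, hi = hours_range
    memory = [[rng.randint(lo, hi) for _ in range(n_regions)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for j in range(n_regions):
            if rng.random() < hmcr:                    # memory consideration
                v = rng.choice(memory)[j]
                if rng.random() < par:                 # pitch adjustment
                    v = min(hi, max(lo, v + rng.choice((-1, 1))))
            else:                                      # random consideration
                v = rng.randint(lo, hi)
            new.append(v)
        worst = max(range(hms), key=lambda i: cost(memory[i]))
        if cost(new) < cost(memory[worst]):
            memory[worst] = new
    return min(memory, key=cost)

# Toy cost: total shed hours must cover a 12-hour deficit, and regions with
# higher demand priority (larger weights) should shed less.
priority = [3, 1, 2, 1, 2]
cost = lambda s: abs(sum(s) - 12) * 10 + sum(w * h for w, h in zip(priority, s))
best = harmony_search(cost, n_regions=5)
```

The weighted-priority term is a stand-in for the demand-priority categories the study uses to decide which regions bear the outage hours.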

Keywords: optimization, harmony algorithm, load shedding, classification

Procedia PDF Downloads 367
27468 Feature Extraction Technique for Prediction the Antigenic Variants of the Influenza Virus

Authors: Majid Forghani, Michael Khachay

Abstract:

In genetics, the impact of neighboring amino acids on a target site is referred to as the nearest-neighbor effect, or simply the neighbor effect. In this paper, a new method called wavelet particle decomposition, which represents the one-dimensional neighbor effect using wavelet packet decomposition, is proposed. The main idea lies in the known dependence of wavelet packet sub-bands on the location and order of neighboring samples. The method decomposes the value of a signal sample into small values, called particles, each representing a part of the neighbor-effect information. The results show that the information obtained from the particle decomposition can be used to create better model variables or features. As an example, the approach has been applied to improve the correlation between test-reference sequence distance and titer in the hemagglutination inhibition assay.
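The dependence of wavelet packet sub-bands on neighboring samples can be seen in a minimal Haar-based wavelet packet decomposition (a simple stand-in for whichever filters the paper actually uses):

```python
def haar_step(signal):
    """One Haar analysis step: split a band into approximation and detail halves."""
    a = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    d = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return a, d

def wavelet_packet(signal, depth):
    """Full wavelet packet tree: unlike the plain wavelet transform, BOTH the
    approximation and the detail branch are split at every level, so each of
    the 2**depth leaf sub-bands reflects a specific neighborhood pattern in
    the sequence (signal length must be a multiple of 2**depth)."""
    level = [list(signal)]
    for _ in range(depth):
        level = [half for band in level for half in haar_step(band)]
    return level

bands = wavelet_packet([4.0, 2.0, 6.0, 8.0], depth=2)  # 4 sub-bands
```

Each leaf value mixes a sample with its neighbors in a fixed order, which is the dependence the particle decomposition exploits to encode the neighbor effect.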

Keywords: antigenic variants, neighbor effect, wavelet packet, wavelet particle decomposition

Procedia PDF Downloads 134
27467 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder

Authors: Dua Hişam, Serhat İkizoğlu

Abstract:

Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study applies multiple machine learning (ML) models to gait sensory data collected from humans to classify between normal people and those suffering from Vestibular System (VS) problems, advancing the development of artificial intelligence (AI) algorithms in this area. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not previously been trained on raw data to perform feature extraction and identify VS disorders. In this study, three ML models, the Random Forest Classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbor (KNN), were trained to detect VS disorder, and the performance of the algorithms was compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest Classifier was the most accurate model.
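The four comparison metrics used in the study can be computed directly from predicted labels; the sketch below uses hypothetical labels, not the study's gait data.

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, recall, precision, and F1 from predicted labels -- the four
    scores used to compare the RF, XGB, and KNN classifiers."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if precision + recall else 0.0
    return accuracy, recall, precision, f1

# 1 = vestibular disorder, 0 = healthy (hypothetical predictions)
acc, rec, prec, f1 = classification_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1])
```

Reporting recall and precision alongside accuracy matters here because a clinical screen cares differently about missed patients (false negatives) and false alarms.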

Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting

Procedia PDF Downloads 48
27466 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature, and different representation approaches result in different outputs; some approaches might estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which carry inherent aleatory and epistemic uncertainties, from the responses (outputs) of a given computational model. We approach the problem with two methodologies: in the first, we use sampling-based uncertainty propagation with first order error analysis; in the second, we place emphasis on Percentile-Based Optimization (PBO). The NASA Langley MUQC subproblem A is constructed so that both aleatory and epistemic uncertainties must be managed. The challenge problem classifies each uncertain parameter as one of the following three types: (i) an aleatory uncertainty modeled as a random variable, with a fixed functional form and known coefficients; this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter that might be aleatory but for which sufficient data are not available to model it adequately as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties, each being an unknown element of a known interval; this uncertainty is reducible. From the study, it is observed that, due to practical limitations and computational expense, the sampling in the sampling-based methodology is not exhaustive, so it has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy that converts uncertainty described by interval data into a probabilistic framework is necessary; this is achieved in this study by using PBO.
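The double-loop (nested) sampling treatment of a distributional p-box described above can be sketched as follows; the model, intervals, and sample sizes are toy assumptions chosen to illustrate why non-exhaustive sampling tends to understate the true bounds.

```python
import random

def pbox_bounds(model, mean_iv, sd_iv, n_outer=50, n_inner=400, q=0.95, seed=3):
    """Double-loop sketch of a distributional p-box: the outer loop samples
    the epistemic parameters (a normal variable's mean and standard deviation,
    known only as intervals), the inner loop propagates the aleatory normal
    variable through `model`, and the envelope of the 95th-percentile
    estimates gives the output bounds."""
    rng = random.Random(seed)
    lo, hi = float("inf"), float("-inf")
    for _ in range(n_outer):
        mu = rng.uniform(*mean_iv)       # epistemic: reducible
        sd = rng.uniform(*sd_iv)
        ys = sorted(model(rng.gauss(mu, sd)) for _ in range(n_inner))  # aleatory
        p95 = ys[int(q * n_inner)]
        lo, hi = min(lo, p95), max(hi, p95)
    return lo, hi

# Toy model.  Randomly sampling the epistemic box is not exhaustive, so the
# true bounds (attained at the interval endpoints) are generally wider --
# the motivation for the optimization-based (PBO) strategy.
bounds = pbox_bounds(lambda x: x * x, mean_iv=(0.9, 1.1), sd_iv=(0.1, 0.3))
```

Replacing the outer random sampling with an optimization over the epistemic intervals is the essence of the percentile-based strategy the abstract advocates.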

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 216
27465 The Relationship among Personality, Culture Personality and Ideal Tourist/Business Destinations

Authors: Tamás Gyulavári, Erzsébet Malota

Abstract:

The main purpose of our study was to investigate the effect of congruence between the perceived self and perceived culture personality on the evaluation of the examined countries as ideal business/tourist destinations. A measure of Culture Personality (CP) was developed and implemented to assess the perception of French and Turkish culture. Results show that a very similar personality structure can be extracted for both cultures along the dimensions of Competence, Interpersonal approach, Aura, Life approach, and Rectitude. Regarding congruence theory, we found that attitude depends not on the similarity between the perceived culture personality and the actual self: rather, the more positively the culture personality is perceived relative to the perceived self, the more positive the individual's attitude toward the country as a business and tourist destination.

Keywords: culture personality, ideal business/tourist destination, personality, scale development

Procedia PDF Downloads 381