Search results for: subjective probability
1538 The Subjective Experiences of First-Time Chinese Parents' Transition to Parenthood and the Impact on Their Marital Satisfaction
Authors: Amy Yee Kai Wan
Abstract:
The arrival of a new baby to first-time parents is an exciting and joyous occasion, yet the daunting task of raising the baby and the uncertainty of how it will affect the couple's lives present a great challenge to them. This study examines the causes of conflict and the needs of new parents through a qualitative study of five pairs of new parents in Hong Kong. Semi-structured in-depth qualitative interviews were conducted to explore the changes babies brought to their marriages, the sources of support they received and found important, and the assistance they felt would help with their transition to parenthood. Thematic analysis was used to analyze the commonalities and differences between the five couples’ subjective experiences. Narrative analysis was used to compare the experiences of two parents who were the under-functioning parent of the couple, to study the different strategies they employed in response to the over-functioning parent and to analyze how the marital relationships were affected. Four main themes emerged from the study: 1) change and adjustment in the marital relationship, 2) parents’ level of involvement, 3) support in childcaring, and 4) challenges faced by the parents. Results from the study indicated that father involvement in childcaring is an important element in the mother’s marital satisfaction. The father’s marital satisfaction is dependent upon the mother, specifically her satisfaction with father involvement, which affects her marital satisfaction. Marital convergence and co-parenting alliance acted as moderators for marital satisfaction.
Implications from the study include: i) offering programmes that improve the couple relationship and enhance parenting efficacy in tandem to improve overall marital satisfaction, and ii) offering prenatal counselling services or providing education to new parents from the prenatal to the postnatal period to help couples reduce discrepancies between expectations and the realities of their marital relationship and parenting responsibilities after their baby is born.
Keywords: co-parenting alliance, father involvement, marital convergence, maternal gatekeeping, new parents, transition to parenthood
Procedia PDF Downloads 151
1537 Quality of Service of Transportation Networks: A Hybrid Measurement of Travel Time and Reliability
Authors: Chin-Chia Jane
Abstract:
In a transportation network, travel time refers to the transmission time from the source node to the destination node, whereas reliability refers to the probability of a successful connection from source to destination. With an increasing emphasis on quality of service (QoS), both performance indexes are significant in the design and analysis of transportation systems. In this work, we extend the well-known flow network model for transportation networks so that travel time and reliability are integrated into the QoS measurement simultaneously. In the extended model, in addition to the general arc capacities, each intermediate node has a time weight, which is the travel time per unit of commodity going through the node. Meanwhile, arcs and nodes are treated as binary random variables that switch between operation and failure with associated probabilities. For a pre-specified travel time limitation and demand requirement, the QoS of a transportation network is the probability that the source can successfully transport the demand requirement to the destination while the total transmission time stays under the travel time limitation. This work is pioneering: whereas the existing literature evaluates travel time reliability via a single optimal path, the proposed QoS focuses on the performance of the whole network system. To compute the QoS of transportation networks, we first transform the extended network model into an equivalent min-cost max-flow network model. In the transformed network, each original arc has a new travel time weight which takes the value 0. Each intermediate node is replaced by two nodes u and v and an arc directed from u to v. The newly generated nodes u and v are perfect nodes. The new direct arc has three weights: travel time, capacity, and operation probability. Then the universal set of state vectors is recursively decomposed into disjoint subsets of reliable, unreliable, and stochastic vectors until no stochastic vector is left.
The decomposition is made possible by applying an existing efficient min-cost max-flow algorithm. Because the reliable subsets are disjoint, the QoS can be obtained directly by summing their probabilities. Computational experiments are conducted on a benchmark network with 11 nodes and 21 arcs. Five travel time limitations and five demand requirements are set to compute the QoS value. For comparison, we test the exhaustive complete enumeration method. Computational results reveal that the proposed algorithm is much more efficient than complete enumeration. In this work, a transportation network is analyzed by an extended flow network model in which each arc has a fixed capacity, each intermediate node has a time weight, and both arcs and nodes are independent binary random variables. The quality of service of the transportation network is an integration of customer demand, travel time, and the probability of connection. We present a decomposition algorithm to compute the QoS efficiently. Computational experiments conducted on a prototype network show that the proposed algorithm is superior to the existing complete enumeration method.
Keywords: quality of service, reliability, transportation network, travel time
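The authors' exact method decomposes the state space with a min-cost max-flow routine; as a rough, illustrative substitute only (not the paper's algorithm), the connection-probability component of such a QoS measure can be estimated by Monte Carlo sampling of arc states on a toy network. The network, capacities, and operation probabilities below are invented for the sketch, and travel time limits are omitted for brevity.

```python
import random
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on a nested-dict capacity map (modified in place)."""
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:          # BFS for an augmenting path
            u = q.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        path, v = [], t                        # recover the path and its bottleneck
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(cap[u][v] for u, v in path)
        for u, v in path:                      # push flow, add residual arcs
            cap[u][v] -= b
            cap.setdefault(v, {}).setdefault(u, 0)
            cap[v][u] += b
        flow += b

def qos_estimate(arcs, demand, n_trials=2000, seed=0):
    """Monte Carlo estimate of P(max s-t flow >= demand) when each arc
    (u, v, capacity, p) operates independently with probability p."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_trials):
        cap = {}
        for u, v, c, p in arcs:
            if rng.random() < p:
                d = cap.setdefault(u, {})
                d[v] = d.get(v, 0) + c
        ok += max_flow(cap, 's', 't') >= demand
    return ok / n_trials
```

With perfectly reliable arcs this reduces to a deterministic feasibility check, which makes the sketch easy to sanity-test; the exact decomposition in the paper avoids sampling error entirely.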
Procedia PDF Downloads 221
1536 Exploring Military Crime in the Australian Imperial Force by Officers during the First World War
Authors: Des Lambley
Abstract:
The scope and scale of crime in the AIF is a subject largely overlooked by historians, who prefer to narrate macro-scale topics. This examination exposes some 17,000 military criminals, 414 of them officers, and illustrates how military law imposed itself. This subjective sociological perspective humanises the impacts of war upon soldiers. Examples of the crimes, their seriousness, punishments and military justice tell of cause-and-effect linkages between crime, stress and illness. The discourse is derived from original official military sources in the Australian Archives.
Keywords: Australia, AIF, military crime, WW1, officers
Procedia PDF Downloads 131
1535 Gender Differences in the Impact and Subjective Interpretation of Childhood Sexual Abuse Survivors
Authors: T. Borja-Alvarez, V. Jiménez-Borja, M. Jiménez Borja, C. J. Jiménez-Mosquera
Abstract:
Research on child sexual abuse has predominantly focused on female survivors. This has resulted in less research looking at the particular context in which this abuse takes place for boys and the impact this abuse may have on male survivors. The aim of this study is to examine the sex and age of the perpetrators of child sexual abuse and explore gender differences in the impact along with the subjective interpretation that survivors attribute to these experiences. The data for this study were obtained from Ecuadorian university students (M = 230, F = 293) who reported sexual abuse using the ISPCAN Child Abuse Screening Tool Retrospective version (ICAST-R). Participants completed Horowitz's Impact of Event Scale (IES) and were also requested to choose among neutral, positive, and negative adjectives to describe these experiences. The results indicate that in the case of males, perpetrators were both males (adults = 27%, peers = 20%, relatives = 10.3%, cousins = 7.4%) and young females (girlfriends or ex-girlfriends = 25.6%, neighborhood = 20.7%, school = 16.7%, cousins = 15.3%, strangers = 12.8%). In contrast, almost all females reported that adult males were the perpetrators (relatives = 29.6%, neighborhood = 11.9%, strangers = 19.9%, family friends = 9.7%). Regarding the impact of these events, significant gender differences emerged. More females (50%) than males (20%) presented symptoms of post-traumatic stress disorder (PTSD). Gender differences also surfaced in the way survivors interpret their experiences. Almost half of the male participants selected the word “consensual”, followed by the words “normal”, “helped me to mature”, “shameful”, “confusing”, and “traumatic”. In contrast, almost all females chose the word “non-consensual”, followed by the words “shameful”, “traumatic”, “scary”, and “confusing”.
In conclusion, the findings of this study suggest that young females and adult males were the most common perpetrators of sexual abuse of boys, whereas adult males were the most common perpetrators of sexual abuse of girls. The impact and the subjective interpretation of these experiences were more negative for girls than for boys. The factors that account for the gender differences in the impact and interpretation of these experiences need further exploration. It is likely that the cultural expectations of sexual behavior for boys and girls in Latin American societies may partially explain the differential impact in the way these childhood sexual abuse experiences are interpreted in adulthood. In Ecuador, as in other Latin American countries, the machismo culture not only accepts but encourages early sexual behavior in boys and negatively judges premature sexual behavior in females. The result of these different sexual expectations may be that sexually abused boys re-define these experiences as “consensual” and “normal” in adulthood, even though they were not consensual at the time of occurrence. Future studies are needed to understand more deeply the different contexts of sexual abuse for boys and girls in order to analyze the long-term impact of these experiences.
Keywords: abuse, child, gender differences, sexual
Procedia PDF Downloads 104
1534 Statistical Analysis of Extreme Flow (Regions of Chlef)
Authors: Bouthiba Amina
Abstract:
The estimation of statistics related to precipitation represents a vast domain that poses numerous challenges to meteorologists and hydrologists. Sometimes it is necessary to approximate extreme events, as well as their return periods, for sites where there are few or no data. The search for a frequency model of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods by estimating the amount of precipitation in past years. The best-known and most common approach is the statistical one. It consists of looking for the probability law that best fits the observed values of the random variable “maximum daily rainfall”, after a comparison of various probability laws and estimation methods by means of goodness-of-fit tests. Therefore, a frequency analysis of the annual series of maximum daily rainfall was carried out on the data of 54 rain-gauge stations of the high and middle basin. Five laws usually applied to the study and analysis of maximum daily rainfall were considered. The chosen period is from 1970 to 2013, and the analysis was used to forecast quantiles. The laws used are the three-parameter generalized extreme value (GEV) distribution, the two-parameter extreme value distributions (Gumbel and log-normal), and the three-parameter Pearson type III and log-Pearson type III distributions. In Algeria, Gumbel's law has long been used to estimate the quantiles of maximum flows; here, we check this practice and choose the most reliable law.
Keywords: return period, extreme flow, statistical laws, Gumbel, estimation
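As a minimal illustration of the frequency-analysis step, the sketch below fits a Gumbel distribution to a series of annual maximum daily rainfalls by the method of moments and returns the quantile for a given return period T. The sample values are invented; the study's comparison with GEV, log-normal and (log-)Pearson III fits is not reproduced here.

```python
import math
import statistics

def gumbel_quantile(sample, T):
    """Method-of-moments Gumbel fit; rainfall quantile for return period T.
    Uses mean = mu + gamma*beta and sd = pi*beta/sqrt(6), then inverts
    F(x) = exp(-exp(-(x - mu)/beta)) at F = 1 - 1/T."""
    m, s = statistics.mean(sample), statistics.stdev(sample)
    beta = s * math.sqrt(6) / math.pi      # scale parameter
    mu = m - 0.5772 * beta                 # location (Euler-Mascheroni constant)
    F = 1.0 - 1.0 / T                      # non-exceedance probability
    return mu - beta * math.log(-math.log(F))

# invented annual maximum daily rainfalls (mm)
annual_max = [40, 55, 62, 48, 70, 58, 65, 52, 60, 45]
q10, q100 = gumbel_quantile(annual_max, 10), gumbel_quantile(annual_max, 100)
```

A goodness-of-fit test (e.g. Kolmogorov-Smirnov) across candidate laws, as described in the abstract, would then decide whether the Gumbel fit is actually the most reliable one.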
Procedia PDF Downloads 78
1533 Performance Evaluation of a Prioritized, Limited Multi-Server Processor-Sharing System that Includes Servers with Various Capacities
Authors: Yoshiaki Shikata, Nobutane Hanayama
Abstract:
We present a prioritized, limited multi-server processor-sharing (PS) system in which each server has a different capacity and N (≥ 2) priority classes are allowed in each PS server. In each prioritized, limited server, a different service ratio is assigned to each class of requests, and the number of requests to be processed is limited to less than a certain number. Routing strategies for such prioritized, limited multi-server PS systems that take into account the capacity of each server are also presented, and a performance evaluation procedure for these strategies is discussed. Practical performance measures of these strategies, such as loss probability, mean waiting time, and mean sojourn time, are evaluated via simulation. In the PS server, at the arrival (or departure) of a request, the extension (shortening) of the remaining sojourn time of each request receiving service can be calculated from the number of requests of each class and the priority ratio. Utilising a simulation program which executes these events and calculations, the performance of the proposed prioritized, limited multi-server PS rule can be analyzed. From the evaluation results, the most suitable routing strategy for the loss or waiting system is identified.
Keywords: processor sharing, multi-server, various capacities, N priority classes, routing strategy, loss probability, mean sojourn time, mean waiting time, simulation
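The per-class service ratios described above behave like the weights of a discriminatory processor-sharing server. As a hedged sketch (the authors' exact allocation rule may differ), the instantaneous service rate each in-service request receives can be computed from the class weights and counts:

```python
def ps_rates(capacity, weights, counts):
    """Per-request service rate for each class in a weighted (discriminatory)
    processor-sharing server: a class-i request gets
    capacity * w_i / sum_j (w_j * n_j)."""
    total = sum(w * n for w, n in zip(weights, counts))
    if total == 0:
        return [0.0] * len(weights)
    return [capacity * w / total for w in weights]

# two classes, priority ratio 2:1, one high- and two low-priority requests
rates = ps_rates(1.0, [2, 1], [1, 2])   # [0.5, 0.25]
```

At each arrival or departure the counts change, and recomputing these rates yields the extension or shortening of each remaining sojourn time that the simulation tracks.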
Procedia PDF Downloads 331
1532 Building an Arithmetic Model to Assess Visual Consistency in Townscape
Authors: Dheyaa Hussein, Peter Armstrong
Abstract:
The phenomenon of visual disorder is prominent in contemporary townscapes. This paper provides a theoretical framework for the assessment of visual consistency in townscape in order to achieve more favourable outcomes for users. In this paper, visual consistency refers to the amount of similarity between adjacent components of a townscape. The paper investigates parameters which relate to visual consistency in townscape, explores the relationships between them and highlights their significance. The paper uses arithmetic methods from outside the domain of urban design to enable the establishment of an objective approach to assessment which considers subjective indicators, including users’ preferences. These methods involve the standard deviation, colour distance and the distance between points. The paper identifies urban space as a key representative of the visual parameters of townscape. It focuses on its two components, geometry and colour, in the evaluation of the visual consistency of townscape. Accordingly, this article proposes four measurements. The first quantifies the number of vertices, which are points in three-dimensional space that are connected by lines to represent the appearance of elements. The second evaluates the visual surroundings of urban space by assessing the location of their vertices. The last two measurements calculate the visual similarity in both vertices and colour in townscape by calculating their variation using methods including the standard deviation and colour difference. The proposed quantitative assessment is based on users’ preferences towards these measurements. The paper offers a theoretical basis for a practical tool which can alter the current understanding of architectural form and its application in urban space. This tool is currently under development.
The proposed method underpins expert subjective assessment and permits the establishment of a unified framework which adds to creativity through the achievement of a higher level of consistency and satisfaction among the citizens of evolving townscapes.
Keywords: townscape, urban design, visual assessment, visual consistency
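Two of the ingredients named above, colour distance and the standard deviation as a measure of variation, are easy to make concrete. The sketch below is an illustrative toy, not the authors' tool (which is still under development); the 0-1 consistency scale is an assumption of the sketch.

```python
import math
import statistics

def colour_distance(c1, c2):
    """Euclidean distance between two RGB colours."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def consistency_score(values):
    """Map the (population) standard deviation of a visual attribute,
    e.g. vertex counts of adjacent facades, onto a 0-1 scale where
    1 means perfectly consistent (zero variation)."""
    return 1.0 / (1.0 + statistics.pstdev(values))
```

For example, a row of facades with identical vertex counts scores 1, and the score decays as the variation between adjacent components grows.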
Procedia PDF Downloads 313
1531 In Exile but Not at Peace: An Ethnography among Rwandan Army Deserters in South Africa
Authors: Florence Ncube
Abstract:
This paper examines the military and post-military experiences of soldiers who deserted from the Rwanda Defence Force (RDF) and tried to make a living in South Africa. Because they are deserters, they try to hide their military identity, yet it is simultaneously, and somewhat coercively, ascribed to them by the Rwandan state and can put them in potential danger. The paper attends to the constructions, experiences, practices, and subjective understanding of the deserters’ being in exile to examine how, under circumstances of perceived threat, these men navigate real or perceived state-sponsored surveillance and threat in non-military settings in South Africa, where they have become potential political and disciplinary targets. To make sense of the deserters’ experiences in these circumstances, the paper stitches together a number of useful theoretical concepts, including Bourdieu’s (1992) theory of practice and Vigh’s (2009; 2018) concept of social navigation, because no single approach can coherently analyze the specificity of this study. Conventional post-military literature privileges an understanding of army desertion as a malignancy and somewhat problematic. Little is known about the military and post-military experiences of deserters who believe that army desertion is in fact a building block towards achieving subjective peace, even in the context of exile. The paper argues that the presence of Rwandan state agents in South Africa strips the context of exile of its capacity to provide the deserters with peace, safety, and security. This paper recenters army desertion in analyses of militarism, soldiering, and transition in African contexts and complicates commonsense understandings of army desertion which assume that it is entirely problematic. The paper draws on an ethnography conducted among 30 junior-rank Rwandan army deserters exiled in Johannesburg and Cape Town.
The researcher employed life histories, in-depth interviews, and deep hangouts to collect data.
Keywords: army deserter, military, identity, exile, peacebuilding, South Africa
Procedia PDF Downloads 69
1530 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment
Authors: Isabela Moreira Queiroz
Abstract:
Slope stability analyses are largely carried out by deterministic methods and evaluated through a single factor of safety. Although it is known that geotechnical parameters can present great dispersion, such analyses treat them as fixed and known. Probabilistic methods, in turn, incorporate the variability of key input parameters (random variables), resulting in a range of safety factor values, thus enabling the determination of the probability of failure, which is an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (first-order, second-moment), Rosenblueth (point estimates) and Monte Carlo. This paper presents a comparison between the results from deterministic and probabilistic analyses (FOSM, Rosenblueth and Monte Carlo) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and carry out a consequent risk analysis, which is used to calculate the risk and analyze solutions for its mitigation and control. It can be observed that the results obtained by the three probabilistic methods were quite close. It should be noted that the calculation of risk makes it possible to prioritize the implementation of mitigation measures. Therefore, it is recommended to carry out a good assessment of the geological-geotechnical model, incorporating uncertainty in viability, design, construction, operation and closure by means of risk management.
Keywords: probabilistic methods, risk assessment, risk management, slope stability
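As a minimal sketch of the Monte Carlo branch of such a comparison, the code below estimates the probability of failure P(FS < 1) for a purely hypothetical limit-state function with normally distributed cohesion and friction angle. The parameter values and the FS expression are invented for illustration; the FOSM and Rosenblueth estimates are not reproduced.

```python
import math
import random

def prob_failure(fs, dists, n=20000, seed=1):
    """Monte Carlo probability of failure P(FS < 1); `dists` maps each
    random parameter name to the (mean, std) of a normal distribution."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        sample = {k: rng.gauss(m, s) for k, (m, s) in dists.items()}
        fails += fs(**sample) < 1.0
    return fails / n

# hypothetical factor of safety: cohesion c (kPa), friction angle phi (rad)
fs = lambda c, phi: (c + 100.0 * math.tan(phi)) / 80.0
pf = prob_failure(fs, {'c': (30.0, 5.0),
                       'phi': (math.radians(30), math.radians(3))})
```

With the mean values alone the deterministic FS is about 1.1 (apparently "safe"), yet the spread of the inputs still yields a non-negligible probability of failure, which is the point the abstract makes about risk-based design.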
Procedia PDF Downloads 392
1529 Timing and Probability of Presurgical Teledermatology: Survival Analysis
Authors: Felipa de Mello-Sampayo
Abstract:
The aim of this study is to examine, from the patient’s perspective, the timing and probability of using teledermatology, comparing it with a conventional referral system. The dynamic stochastic model’s main value added consists of its concrete application to patients waiting for dermatological surgical intervention. Patients with low health-level uncertainty must use teledermatology treatment as soon as possible, which is precisely when teledermatology is least valuable. The results of the model were then tested empirically with the teledermatology network covering the area served by the Hospital Garcia da Horta, Portugal, which links the primary care centers of 24 health districts with the hospital’s dermatology department via the corporate intranet of the Portuguese healthcare system. Health-level volatility can be understood as the hazard of developing skin cancer, and the trend of health level as the bias of developing skin lesions. The results of the survival analysis suggest that the theoretical model can explain the use of teledermatology. It depends negatively on the volatility of patients' health and positively on the trend of health, i.e., the lower the risk of developing skin cancer and the younger the patients, the more presurgical teledermatology one expects to occur. Presurgical teledermatology also depends positively on out-of-pocket expenses and negatively on the opportunity costs of teledermatology, i.e., the lower the benefit missed by using teledermatology, the more presurgical teledermatology one expects to occur.
Keywords: teledermatology, wait time, uncertainty, opportunity cost, survival analysis
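The survival-analysis step can be illustrated with a Kaplan-Meier estimate of the probability of not yet having used teledermatology by time t. The waiting times and censoring flags below are invented, and the paper's actual analysis (with covariates such as volatility and out-of-pocket expenses) would use a regression model rather than this plain estimator.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates S(t) at each observed event time.
    times:  waiting time until teledermatology use (or censoring);
    events: 1 if the event (use) was observed, 0 if censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = at = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            at += 1
            d += data[i][1]
            i += 1
        if d:                                      # step down only at event times
            s *= 1.0 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= at
    return curve
```

Comparing the curves of the teledermatology and conventional-referral groups would then show which pathway reaches treatment faster.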
Procedia PDF Downloads 126
1528 The Effectiveness of an Occupational Therapy Metacognitive-Functional Intervention for the Improvement of Human Risk Factors of Bus Drivers
Authors: Navah Z. Ratzon, Rachel Shichrur
Abstract:
Background: Many studies have assessed and identified the risk factors of safe driving, but there is relatively little research-based evidence concerning the ability to improve the driving skills of drivers in general, and in particular of bus drivers, who are defined as a population at risk. Accidents involving bus drivers can endanger dozens of passengers and cause high direct and indirect damages. Objective: To examine the effectiveness of a metacognitive-functional intervention program for the reduction of risk factors among professional drivers relative to a control group. Methods: The study examined 77 bus drivers, aged 27-69, working for a large public company in the center of the country. Twenty-one drivers continued to the intervention stage; four of them dropped out before the end of the intervention. The intervention program we developed was based on previous driving models and the guiding occupational therapy practice framework model in Israel, while adjusting the model to professional driving in public transportation and its particular risk factors. Treatment focused on raising awareness of the safe-driving risk factors identified at prescreening (ergonomic, perceptual-cognitive and on-road driving data), with reference to the difficulties the driver raised, and on providing coping strategies. The intervention was customized for each driver and included three sessions of two hours each. The effectiveness of the intervention was tested using objective measures: in-vehicle data recorders (IVDR) for monitoring natural driving data and traffic accident data before and after the intervention, and subjective measures (an occupational performance questionnaire for bus drivers). Results: Statistical analysis found a significant difference in the degree of change in the rate of IVDR perilous events before and after the intervention (t(17) = 2.14, p = 0.046).
There was a significant difference in the number of accidents per year before and after the intervention in the intervention group (t(17) = 2.11, p = 0.05), but no significant change in the control group. Subjective ratings of the level of performance and of satisfaction with performance improved in all areas tested following the intervention. The change in the ‘human factors/person’ field was significant (performance: t = -2.30, p = 0.04; satisfaction with performance: t = -3.18, p = 0.009). The change in the ‘driving occupation/tasks’ field was not significant but showed a tendency toward significance (t = -1.94, p = 0.07). No significant differences were found in driving environment-related variables. Conclusions: The metacognitive-functional intervention significantly improved the objective and subjective measures of the safety of bus drivers’ driving. These novel results highlight the potential contribution of occupational therapists, using metacognitive-functional treatment, to preventing car accidents among the healthy driver population and improving the well-being of these drivers. This study also provides familiarity with advanced IVDR technologies and enriches the knowledge of occupational therapists with regard to using a wide variety of driving assessment tools and making best-practice decisions.
Keywords: bus drivers, IVDR, human risk factors, metacognitive-functional intervention
Procedia PDF Downloads 346
1527 A Multi-Objective Programming Model to Supplier Selection and Order Allocation Problem in Stochastic Environment
Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh
Abstract:
This paper aims at developing a multi-objective model for the supplier selection and order allocation problem in a stochastic environment, where the purchasing cost, the percentage of items delivered with delay and the percentage of rejected items provided by each supplier are supposed to be stochastic parameters following arbitrary probability distributions. In this regard, dependent chance programming is used, which maximizes the probability of the event that total purchasing cost, total delayed items and total rejected items are less than or equal to pre-determined values given by the decision maker. The above stochastic multi-objective programming problem is then transformed into a stochastic single-objective programming problem using the minimum deviation method. In the next step, this problem is solved by applying a genetic algorithm, which performs a simulation process in order to calculate the stochastic objective function as its fitness function. Finally, the impact of the stochastic parameters on the obtained solution is examined via a sensitivity analysis exploiting the coefficient of variation. The results show that the greater the coefficients of variation of the stochastic parameters, the more the value of the objective function in the stochastic single-objective programming problem deteriorates.
Keywords: supplier selection, order allocation, dependent chance programming, genetic algorithm
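The GA's simulation-based fitness can be sketched as follows: for a candidate order allocation, estimate by simulation the probability that total cost, delayed items and rejected items all stay within the decision maker's limits. The supplier parameters, the normal distributions, and the fixed rate noise here are assumptions of this sketch, not the paper's data or its exact formulation.

```python
import random

def chance_fitness(alloc, suppliers, budget, max_delay, max_reject,
                   n_sims=2000, seed=0):
    """Simulated dependent-chance fitness: the probability that total cost,
    delayed items and rejected items all stay within the limits, for a
    candidate order allocation `alloc` (units per supplier).
    Each supplier: (mean_price, sd_price, mean_delay_rate, mean_reject_rate),
    with normally distributed price and rates, truncated at zero."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_sims):
        cost = delay = reject = 0.0
        for q, (mp, sp, dr, rr) in zip(alloc, suppliers):
            cost += q * max(0.0, rng.gauss(mp, sp))
            delay += q * max(0.0, rng.gauss(dr, 0.02))
            reject += q * max(0.0, rng.gauss(rr, 0.02))
        ok += (cost <= budget and delay <= max_delay and reject <= max_reject)
    return ok / n_sims
```

A GA would then evolve `alloc` vectors, calling this estimator as the fitness of each chromosome, exactly in the spirit of the simulation step the abstract describes.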
Procedia PDF Downloads 313
1526 The Effect of Training and Development Practice on Employees’ Performance
Authors: Sifen Abreham
Abstract:
Employees are resources in organizations; as such, they need to be trained and developed properly to achieve an organization's goals and expectations. The initial development of the human resource management concept is based on the effective utilization of people, treating them as resources, leading to the realization of business strategies and organizational objectives. The study aimed to assess the effect of training and development practices on employee performance. The researcher used an explanatory research design, which helps to explain, understand, and predict the relationship between variables. To collect the data from the respondents, the study used probability sampling, specifically stratified random sampling, which divides the entire population into homogeneous groups. The results were analyzed and presented using the Statistical Package for the Social Sciences (SPSS) version 26. The major finding of the study was that training has an impact on employees' job performance in achieving organizational objectives. The district has a policy and procedure for training and development, but it is not actively applied and is not suitable; the district is advised to reform this policy and procedure and apply it actively. The district gives training to the majority of its employees, but most of the time the training is theoretical; the district is advised to use practical training methods to see positive change. The district evaluates employees after they take training and development, but the results are not encouraging; the district is advised to assess employees' skill gaps and fill those gaps. The district has a budget, but it is not adequate; the district is advised to strengthen its financial ground.
Keywords: training, development, employees, performance, policy
Procedia PDF Downloads 58
1525 A Mathematical Analysis of a Model in Capillary Formation: The Roles of Endothelial, Pericyte and Macrophages in the Initiation of Angiogenesis
Authors: Serdal Pamuk, Irem Cay
Abstract:
Our model is based on the theory of reinforced random walks coupled with Michaelis-Menten mechanisms, which view endothelial cell receptors as the catalysts for transforming both tumor- and macrophage-derived tumor angiogenesis factor (TAF) into proteolytic enzyme, which in turn degrades the basal lamina. The model consists of two main parts. The first part has seven differential equations (DEs) in one space dimension over the capillary, whereas the second part has the same number of DEs in two space dimensions in the extracellular matrix (ECM). We connect these two parts via boundary conditions that move the cells into the ECM in order to initiate capillary formation. But when does this movement begin? To address this question, we estimate the thresholds that activate the transport equations in the capillary. We do this by using a steady-state analysis of the TAF equation under some assumptions. Once these equations are activated, endothelial, pericyte and macrophage cells begin to move into the ECM for the initiation of angiogenesis. We believe that our results play an important role in understanding the mechanisms of cell migration, which are crucial for tumor angiogenesis. Furthermore, we estimate the long-time tendency of these three cell types and find that they tend to the transition probability functions as time evolves. We provide numerical solutions which are in good agreement with our theoretical results.
Keywords: angiogenesis, capillary formation, mathematical analysis, steady-state, transition probability function
Procedia PDF Downloads 156
1524 The Integrated Strategy of Maintenance with a Scientific Analysis
Authors: Mahmoud Meckawey
Abstract:
This research deals with one of the most important aspects of the maintenance field, namely maintenance strategy: the branch concerned with the concepts and schematic thinking behind how to manage maintenance and how to deal with defects in engineering products (buildings, machines, etc.) in general. The paper addresses the following: i) The engineering product and its technical systems: when we approach the maintenance process from a strategic view, we deal with an engineering product which consists of multiple integrated systems; in fact, there is no engineering product with only one system. We discuss and explain this topic, from which we derive a developed definition of the maintenance process. ii) The factors, or bases, of functional efficiency: the main factors that affect the functional efficiency of systems and engineering products, from which we give a technical definition of defects and how they occur. iii) The legality of the occurrence of defects (legal and illegal defects): here we assume that all the factors of functional efficiency have been applied, and then we discuss the results. iv) The guarantee, the functional span age and the technical surplus concepts: complementing the above topic, and in association with reliability theorems, we work with the probability of failure, which mostly concerns the design stages, that is, checking and adapting the design of the elements. In maintainability, however, we act differently, since we deal with the actual state of the systems: we work with the complementary part of the probability-of-failure term, which refers to the actual surplus of functionality of the systems.
Keywords: engineering product and technical systems, functional span age, legal and illegal defects, technical and functional surplus
Procedia PDF Downloads 475
1523 The Probability of Smallholder Broiler Chicken Farmers' Participation in the Mainstream Market within Maseru District in Lesotho
Authors: L. E. Mphahama, A. Mushunje, A. Taruvinga
Abstract:
Although broiler production does not generate large incomes in the smallholder community, it represents a main source of livelihood and part of nutritional requirements. As a result, the market for broiler meat is growing faster than that of any other meat product and is projected to continue growing in the coming decades. The implication, however, is that a multitude of factors shapes the transformation of smallholder broiler farmers into participants in mainstream markets. Using data from 217 smallholder broiler farmers, socio-economic and institutional factors in broiler farming were incorporated into a binary model to estimate the probability of broiler farmers' participation in mainstream markets within the Maseru district of Lesotho. Of the thirteen (13) predictor variables fitted into the model, six (6) variables (household size, number of years in the broiler business, stock size, access to transport, access to extension services, and access to market information) had significant coefficients, while seven (7) variables (level of education, marital status, price of broilers, poultry association membership, access to contracts, access to credit, and access to storage) did not have a significant impact. It is recommended that smallholder broiler farmers organize themselves into cooperatives, which would act as a vehicle through which they can access contracts and formal markets. These cooperatives would also facilitate training and workshops on broiler rearing and marketing through extension visits.
Keywords: broiler chicken, mainstream market, Maseru district, participation, smallholder farmers
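A binary model of this kind typically takes a logit form: the participation probability is a logistic function of the predictors. The sketch below shows only the functional form; the coefficients and predictor values are invented for illustration and are not the study's estimates:

```python
import math

def participation_probability(x, beta):
    """Binary logit: P(participate) = 1 / (1 + exp(-(b0 + b1*x1 + ... + bk*xk)))."""
    z = beta[0] + sum(b * xi for b, xi in zip(beta[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for (household size, stock size, market-info access)
beta = [-2.0, 0.1, 0.002, 1.5]
p = participation_probability([5, 300, 1], beta)
print(round(p, 3))  # estimated probability of mainstream-market participation
```

Significance of each coefficient would then be judged from its standard error, as the abstract describes for the six significant and seven non-significant predictors.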
Procedia PDF Downloads 152
1522 Electro-Fenton Degradation of Erythrosine B Using Carbon Felt as a Cathode: Doehlert Design as an Optimization Technique
Authors: Sourour Chaabane, Davide Clematis, Marco Panizza
Abstract:
This study investigates the oxidation of the food dye Erythrosine B (EB) by a homogeneous electro-Fenton process using iron(II) sulfate heptahydrate as catalyst, carbon felt as cathode, and Ti/RuO2 as anode. The treated synthetic wastewater contains 100 mg L⁻¹ of EB at pH 3. The effects of three independent variables were considered for process optimization: applied current intensity (0.1–0.5 A), iron concentration (1–10 mM), and stirring rate (100–1000 rpm). Their interactions were investigated using response surface methodology (RSM) based on a Doehlert design. EB removal efficiency and energy consumption were taken as model responses after 30 minutes of electrolysis. Analysis of variance (ANOVA) revealed that the quadratic model fitted the experimental data adequately: R² (0.9819), adjusted R² (0.9276), and a low Fisher probability (< 0.0181) for the EB removal model, and R² (0.9968), adjusted R² (0.9872), and a low Fisher probability (< 0.0014) for the energy consumption model, reflecting robust statistical significance. The energy consumption model depends significantly on current density, as expected. The RSM results led to the following optimal conditions for EB degradation: a current intensity of 0.2 A, an iron concentration of 9.397 mM, and a stirring rate of 500 rpm, which gave a maximum decolorization rate of 98.15% with a minimum energy consumption of 0.74 kWh m⁻³ at 30 min of electrolysis.
Keywords: electro-Fenton, erythrosine B, dye, response surface methodology, carbon felt
Procedia PDF Downloads 72
1521 Learning a Bayesian Network for Situation-Aware Smart Home Service: A Case Study with a Robot Vacuum Cleaner
Authors: Eu Tteum Ha, Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
The smart home environment, backed by IoT (Internet of Things) technologies, enables intelligent services based on awareness of the situation a user is currently in. One convenient sensor for recognizing situations within a home is the smart meter, which can monitor the status of each electrical appliance in real time. This paper aims at learning a Bayesian network that models the causal relationship between user situations and the status of the electrical appliances. Using such a network, we can infer the current situation from the observed status of the appliances. However, learning the conditional probability tables (CPTs) of the network requires many training examples, which cannot be obtained unless the user situations are closely monitored by some means. This paper proposes a method for learning the CPT entries of the network relying only on user feedback generated occasionally. In our case study with a robot vacuum cleaner, feedback arrives whenever the user gives the robot an order that contradicts its preprogrammed setting. Given a network with randomly initialized CPT entries, our proposed method uses this feedback information to adjust the relevant CPT entries in the direction of increasing the probability of recognizing the desired situations. Simulation experiments show that our method can rapidly improve the recognition performance of the Bayesian network using a relatively small number of feedbacks.
Keywords: Bayesian network, IoT, learning, situation-awareness, smart home
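The feedback-driven adjustment can be sketched as nudging the CPT entry for the desired situation upward and renormalizing. The specific update rule and the situation labels below are our assumptions for illustration, not the paper's algorithm:

```python
def update_cpt(cpt, desired, rate=0.2):
    """Nudge the probability of the user-indicated situation and renormalize.

    cpt maps situation -> P(situation | observed appliance status).
    """
    cpt = dict(cpt)
    cpt[desired] += rate * (1.0 - cpt[desired])  # move the entry toward 1
    total = sum(cpt.values())
    return {s: p / total for s, p in cpt.items()}  # keep a valid distribution

# Randomly initialized entries; user feedback says the situation was "cleaning"
cpt = {"cleaning": 0.3, "sleeping": 0.3, "away": 0.4}
cpt = update_cpt(cpt, "cleaning")
print(max(cpt, key=cpt.get))  # the desired situation is now the most probable
```

Repeating such updates over occasional feedbacks is what lets recognition improve without dense labeled monitoring data.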
Procedia PDF Downloads 523
1520 Constructions of Linear and Robust Codes Based on Wavelet Decompositions
Authors: Alla Levina, Sergey Taranov
Abstract:
The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient encoding and decoding algorithms, but they concentrate their detection and correction abilities on certain error configurations. Robust codes, by contrast, can protect against any configuration of errors with a predetermined probability; this is accomplished by using perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; some of its applications are cleaning signals of noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling-function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust code: the first class is based on the multiplicative inverse in a finite field; in the second construction, the redundancy part is the cube of the information part. The paper also investigates the characteristics of the proposed robust and linear codes.
Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability
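The second robust-code construction named above (redundancy = cube of the information part) can be sketched in a toy form. As a simplifying assumption we work in the prime field GF(257) rather than the binary extension field a real design would use, and the field size is invented:

```python
P = 257  # hypothetical prime field size; real designs use GF(2^m)

def encode(info):
    """Codeword = (information part, redundancy = information^3 in GF(P))."""
    return (info, pow(info, 3, P))

def check(word):
    """A word is accepted only if its redundancy equals the cube of its info part."""
    info, red = word
    return pow(info, 3, P) == red

cw = encode(42)
assert check(cw)                    # a valid codeword passes the check
corrupted = (cw[0] ^ 1, cw[1])      # flip a bit in the information part
print(check(corrupted))             # the error is detected
```

Because cubing is a nonlinear map, an error pattern is masked only for a small fraction of codewords, which is the "uniform protection with predetermined probability" property the abstract describes.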
Procedia PDF Downloads 489
1519 Effectiveness of Variable Speed Limit Signs in Reducing Crash Rates on Roadway Construction Work Zones in Alaska
Authors: Osama Abaza, Tanay Datta Chowdhury
Abstract:
As a driver's speed increases, so does the probability of an incident and the likelihood of injury. The presence of equipment, personnel, and a changing landscape in construction zones creates greater potential for incidents. This is especially concerning in Alaska, where summer construction activity, coinciding with peak annual traffic volumes, cannot be avoided. To reduce vehicular speeding in work zones, and therefore the probability of crashes and incidents, variable speed limit (VSL) systems can be implemented in the form of radar speed display trailers, since such trailers have been shown to be effective at reducing vehicular speed in construction zones. Deploying VSL helps reduce not only the 85th percentile speed but, predominantly, the mean speed as well. A total of 2,147 incidents, along with 385 crashes, occurred in a single month around construction zones in Alaska, a situation that demands serious attention. This research provides a thorough crash analysis to better understand the causes and propose countermeasures. Crashes were predominantly recorded as vehicle-object collisions and sideswipes, so a significant share of crashes fell into the no-injury to minor-injury severity classes. Still, 35 major crashes, including 7 fatal ones, in a one-month period call for immediate action such as implementing a VSL system, which has proved to reduce speed in construction zones on Alaskan roadways.
Keywords: speed, construction zone, crash, severity
Procedia PDF Downloads 251
1518 Supplier Selection and Order Allocation Using a Stochastic Multi-Objective Programming Model and Genetic Algorithm
Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh
Abstract:
In this paper, we develop a multi-objective supplier selection and order allocation model in a stochastic environment, in which the purchasing cost, the percentage of items delivered late, and the percentage of rejected items provided by each supplier are stochastic parameters following arbitrary probability distributions. To do so, we use dependent chance programming (DCP), which maximizes the probability of the event that total purchasing cost, total late-delivered items, and total rejected items are less than or equal to predetermined values given by the decision maker. After transforming this stochastic multi-objective programming problem into a stochastic single-objective problem using the minimum deviation method, we apply a genetic algorithm to solve the latter. The employed genetic algorithm performs a simulation process to calculate the stochastic objective function as its fitness function. Finally, we explore the impact of the stochastic parameters on the solution via a sensitivity analysis exploiting the coefficient of variation. The results show that as the stochastic parameters' coefficients of variation grow, the value of the objective function in the stochastic single-objective programming problem worsens.
Keywords: dependent chance programming, genetic algorithm, minimum deviation method, order allocation, supplier selection
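The simulation-based fitness evaluation can be sketched as a Monte Carlo estimate of one chance objective: the probability that total purchasing cost stays within budget. The cost distributions, order quantities, and budget below are illustrative assumptions, not the paper's data:

```python
import random

random.seed(0)

def chance_of_meeting_budget(order_qty, budget, trials=20000):
    """Monte Carlo estimate of P(total purchasing cost <= budget).

    Each supplier's unit cost is random; here two suppliers with assumed
    normal unit-cost distributions (mean, std) = (10, 1) and (12, 2).
    """
    hits = 0
    for _ in range(trials):
        cost = sum(q * random.gauss(mu, sigma)
                   for q, (mu, sigma) in zip(order_qty, [(10, 1), (12, 2)]))
        hits += cost <= budget
    return hits / trials

p = chance_of_meeting_budget([50, 30], budget=900)
print(0.0 <= p <= 1.0)
```

A genetic algorithm would call such an estimator as the fitness function for each candidate allocation `order_qty`, exactly the role the abstract assigns to simulation.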
Procedia PDF Downloads 256
1517 A Framework Based on Dempster-Shafer Theory of Evidence Algorithm for the Analysis of the TV-Viewers’ Behaviors
Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi
Abstract:
In this paper, we propose an approach for detecting the behavior of viewers of a TV program in a non-controlled environment. The proposed experiment is based on the use of three types of connected objects (a smartphone, a smart watch, and a connected remote control). Twenty-three participants were observed while watching their TV programs during three phases: before, during, and after watching a program. Their behaviors were detected using an approach based on Dempster-Shafer theory (DST) in two phases: the first dynamically approximates the mass functions using an approach based on the correlation coefficient; the second computes the approximate mass functions. To approximate the mass functions, two approaches were tested. The first divides each feature's data space into cells, each with a specific probability distribution over the behaviors; these distributions were computed statistically (estimated by the empirical distribution). The second predicts the TV-viewing behaviors using classification algorithms and adds uncertainty to the prediction based on the uncertainty of the model. Results showed that mixing the fusion rule with the computation of the initial approximate mass functions using a classifier led to overall success rates of 96%, 95%, and 96% for the first, second, and third TV-viewing phases, respectively. The results were also compared to those found in the literature. This study aims to anticipate certain actions in order to maintain the attention of TV viewers towards the proposed TV programs using ordinary connected objects, taking into account the various uncertainties that can be generated.
Keywords: IoT, TV-viewing behavior identification, automatic classification, unconstrained environment
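The fusion rule referred to above is Dempster's rule of combination. The sketch below combines two mass functions from two hypothetical sensors; the behavior hypotheses and the mass values are invented for illustration:

```python
def combine(m1, m2):
    """Dempster's rule: intersect focal elements, discard conflict, renormalize."""
    combined, conflict = {}, 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q
    k = 1.0 - conflict  # normalization constant (assumes k > 0)
    return {s: v / k for s, v in combined.items()}

watching = frozenset(["watching"])
away = frozenset(["away"])
m_phone = {watching: 0.7, away: 0.3}    # evidence from the smartphone
m_remote = {watching: 0.6, away: 0.4}   # evidence from the remote control
m = combine(m_phone, m_remote)
print(max(m, key=m.get) == watching)    # combined evidence favors "watching"
```

Concordant evidence from independent sensors reinforces the shared hypothesis, which is what makes DST fusion attractive for multi-device behavior detection.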
Procedia PDF Downloads 229
1516 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life
Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi
Abstract:
Pipelines are extensively used engineering structures that convey fluid from one place to another. Most of the time, pipelines are placed underground and are burdened by soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion, as a time-dependent deterioration process, into the structural and failure analysis of this type of pipe. Three probabilistic time-dependent reliability analysis methods are then discussed and developed: first passage probability theory, the gamma-distributed degradation model, and the Monte Carlo simulation technique. Sensitivity indexes that can identify the parameters with the greatest influence on pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines. The results can be used to obtain a cost-effective strategy for the management of the sewer system.
Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model
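The first-passage idea can be sketched with a Monte Carlo estimate of the probability that corrosion depth crosses a critical limit within a service horizon. The linear corrosion-growth model and every parameter value below are illustrative assumptions, not the paper's calibration:

```python
import random

random.seed(1)

def first_passage_probability(years, limit_mm=25.0, trials=20000):
    """P(corrosion depth reaches the critical limit within `years`).

    Assumes depth grows linearly at a pipe-specific random rate (mm/year).
    """
    fails = 0
    for _ in range(trials):
        rate = random.gauss(0.5, 0.15)  # assumed rate variability across pipes
        fails += rate * years >= limit_mm
    return fails / trials

# Failure probability must grow with the service horizon
print(first_passage_probability(40) < first_passage_probability(60))
```

A gamma-process version would replace the constant random rate with gamma-distributed increments, giving the monotone, jump-like degradation paths the second method in the abstract models.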
Procedia PDF Downloads 457
1515 A Study on the Impact of Employment Status of the Elderly on Their Mental Well-Being in India
Authors: Santosh B. Phad, Priyanka V. Janbandhu, Dhananjay W. Bansod
Abstract:
Population ageing is a growing concern for social scientists. Labor-force participation is higher among elderly males than elderly females. The critical questions are whether participation in work improves the quality of life of the elderly and what impact working status has on their mental well-being. In examining these questions, the present paper focuses on the workforce participation of the elderly and the reasons behind it, and determines the association between employment status and the mental well-being of the elderly. The study draws on two data sources: the Census of India (2001 and 2011) and the Study on Global Ageing and Adult Health (SAGE), a survey conducted in 2007. The Census data capture the trend of elderly workforce participation, while the SAGE data provide the other information associated with this issue. The analysis consists of univariate and bivariate methods, along with statistical techniques such as principal component analysis (PCA) and regression modeling, to investigate the association between workforce participation of the elderly and subjective well-being (SWB). The results show that the percentage of the elderly participating in the labor market is gradually declining, but the share of elderly workers within the overall workforce has increased; that is, the ratio of aged to non-aged workers is rising. The survey data indicate a considerable share of the elderly in the labor market: three-fourths of the employed elderly joined the workforce unwillingly, needing earnings mainly to afford medical expenses for themselves or their spouse, or to support economically inactive family members.
Apart from need, working hours are another vital aspect: more than 80 percent of the working elderly work six hours or more, and most are self-employed. However, more than one-third of the working elderly fall into the negative cluster of the subjective well-being (SWB) index, which is consistent with the result of the discriminant analysis. The SWB index was calculated from 12 items, with a reliability score of 0.89.
Keywords: ageing, workforce, census of India, SAGE
Procedia PDF Downloads 151
1514 Using Cyclic Structure to Improve Inference on Network Community Structure
Authors: Behnaz Moradijamei, Michael Higgins
Abstract:
Identifying community structure is a critical task in analyzing social media data sets, which are often modeled by networks. Statistical models such as the stochastic block model have proven able to explain the structure of communities in real-world network data. In this work, we develop a goodness-of-fit test that examines the existence of community structure by using a distinguishing property of networks: cyclic structures are more prevalent within communities than across them. To better understand how communities are shaped by the cyclic structure of the network, rather than just the number of edges, we introduce a novel method for deciding on the existence of communities. We exploit these structures by adding the renewal non-backtracking random walk (RNBRW) to an existing goodness-of-fit test. RNBRW is an important variant of the random walk in which the walk is prohibited from returning to a node in exactly two steps, and terminates and restarts once it completes a cycle. We investigate the use of RNBRW to improve the performance of existing goodness-of-fit tests for community detection algorithms based on the spectral properties of the adjacency matrix. Our proposed test of community structure is based on the probability distribution of the eigenvalues of the normalized retracing probability matrix derived from RNBRW. We make use of asymptotic results on this distribution when there is no community structure, i.e., the asymptotic distribution under the null hypothesis. Moreover, we provide a theoretical foundation for our statistic by obtaining the true mean and a tight lower bound for the variance of the RNBRW edge weights.
Keywords: hypothesis testing, RNBRW, network inference, community structure
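The RNBRW step rule described above can be sketched on a toy adjacency-list graph: never return to the node just left, and stop (renew) the moment the walk closes a cycle, recording the cycle-completing edge. The graph and the bookkeeping are our illustrative assumptions:

```python
import random

random.seed(2)

# Toy graph: a triangle {0, 1, 2} (a cycle-rich "community") plus a pendant node 3
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

def rnbrw_cycle_edge():
    """Run one walk; return the edge that completes a cycle (or None at a dead end)."""
    prev, node = None, random.choice(list(graph))
    visited = [node]
    while True:
        choices = [n for n in graph[node] if n != prev]  # non-backtracking step
        if not choices:
            return None
        prev, node = node, random.choice(choices)
        if node in visited:
            return (prev, node)  # cycle closed: renewal point
        visited.append(node)

# Retracing counts: how often each edge completes a cycle
counts = {}
for _ in range(500):
    e = rnbrw_cycle_edge()
    if e:
        counts[frozenset(e)] = counts.get(frozenset(e), 0) + 1
print(sum(counts.values()) > 0)  # only triangle edges ever close a cycle here
```

Normalizing these counts gives a toy version of the retracing probability matrix; it is the spectrum of that matrix, on real networks, that the proposed test examines.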
Procedia PDF Downloads 150
1513 Life Time Improvement of Clamp Structural by Using Fatigue Analysis
Authors: Pisut Boonkaew, Jatuporn Thongsri
Abstract:
In the hard disk drive manufacturing industry, removing unnecessary parts and qualifying part quality before assembly are important. A clamp was therefore designed and fabricated as a fixture for holding parts during the testing process. Testing by trial and error takes a long time to yield improvements, so simulation was introduced to improve the part and reduce the time taken. The problem is that the present clamp has a low life expectancy because of the critical stress that occurs in it. Simulation was therefore used to study the behavior of stress and compressive force across all candidate designs, 27 in total, excluding repeated designs. The design combinations were enumerated following the full factorial rules of the six sigma methodology. Six sigma is a well-structured method for improving quality by detecting and reducing the variability of the process, so that defects decrease while process capability increases. This research focuses on reducing stress and fatigue while the compressive force remains within the acceptable range set by the company. In the simulation, ANSYS models the 3D CAD geometry under the same conditions as the experiment, and the force at each displacement, from 0.01 to 0.1 mm, is recorded. The ANSYS setup was verified by a mesh convergence study and by comparing the percentage error against the experimental result, which must not exceed the acceptable range. The improvement process then focuses on the degree, radius, and length that reduce stress while keeping the force within the acceptable range. Fatigue analysis is carried out next, through ANSYS simulation, to guarantee that the lifetime is extended.
The simulation is also confirmed against the actual clamp, to observe the difference in fatigue between the two designs. This brings a lifetime improvement of up to 57% compared with the actual clamp in manufacturing. The study provides a setting precise and trustworthy enough to serve as a reference methodology for future designs. Thanks to the combination of the six sigma method, finite element, fatigue, and linear regression analysis, which leads to accurate calculation, this project is expected to save up to 60 million dollars annually.
Keywords: clamp, finite element analysis, structural, six sigma, linear regression analysis, fatigue analysis, probability
Procedia PDF Downloads 235
1512 Selecting the Best Risk Exposure to Assess Collision Risks in Container Terminals
Authors: Mohammad Ali Hasanzadeh, Thierry Van Elslander, Eddy Van De Voorde
Abstract:
About 90 percent of world merchandise trade by volume is carried by sea. Maritime transport remains the backbone of international trade and globalization, and all seaborne goods pass through at least two ports, as origin and destination. Among seaborne cargoes, container traffic is a prosperous market, accounting for about 16% of volume. Although containerized cargo is smaller in tonnage, containers carry the highest-value cargoes of all, which is why efficient container handling in ports is very important. Accidents are the foremost cause of port inefficiency and of surges in total transport cost. Although various port safety management systems (PSMS) are in place, statistics on port accidents show that numerous accidents still occur: some claim people's lives; others damage goods, vessels, port equipment, and/or the environment. Several accident investigations show that the most common accidents take place during transport operations, sometimes accounting for 68.6% of all events; providing a safer workplace therefore depends on reducing collision risk. To quantify risks in the port area, different variables can be used as exposure measures. One of the main motives for defining and using exposure in studies of infrastructure is to account for differences in intensity of use, so as to make comparisons meaningful. In various studies of container handling in ports and intermodal terminals, different risk exposures and event likelihoods have been selected; examples include vehicle collision within the port area (10⁻⁷ per kilometer of vehicle distance travelled) and dropping containers from cranes, forklift trucks, or rail-mounted gantries (1 × 10⁻⁵ per lift).
In line with the objective of the current research, three categories of accident were selected for collision risk assessment: a container falling during ship-to-shore operation, a container dropped during transfer operation, and collisions between vehicles and objects within the terminal area. Various consequences, exposures, and probabilities were then identified for each accident. Reducing collision risk thus relies profoundly on picking the right risk exposures and probabilities for the selected accidents; within the framework of risk calculations, such exposures and probabilities are also useful in assessing the effectiveness of safety programs in ports.
Keywords: container terminal, collision, seaborne trade, risk exposure, risk probability
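The exposure-based risk calculation implied above multiplies exposure by probability per exposure unit, and then by consequence, to get an expected annual loss. The exposure rates echo the abstract's examples, but the annual exposure volumes and consequence costs below are invented for illustration:

```python
# (accident, annual exposure, probability per exposure unit, cost per event)
accidents = [
    ("vehicle collision", 2_000_000, 1e-7, 50_000),   # exposure in km travelled
    ("dropped container", 150_000, 1e-5, 120_000),    # exposure in lifts
]

results = {}
for name, exposure, prob, cost in accidents:
    expected_events = exposure * prob          # expected accidents per year
    results[name] = expected_events * cost     # expected annual loss
    print(name, round(expected_events, 3), round(results[name], 2))
```

Comparing such expected losses before and after a safety program is one way the abstract's closing point, assessing program effectiveness, can be made quantitative.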
Procedia PDF Downloads 374
1511 Teleconnection between El Nino-Southern Oscillation and Seasonal Flow of the Surma River and Possibilities of Long Range Flood Forecasting
Authors: Monika Saha, A. T. M. Hasan Zobeyer, Nasreen Jahan
Abstract:
The El Nino-Southern Oscillation (ENSO) is an interaction between the atmosphere and the ocean in the tropical Pacific that causes alternating warm and cold conditions in the tropical central and eastern Pacific Ocean. Under climate change, ENSO events are becoming stronger, so studying the influence of ENSO is increasingly important in climate studies. Bangladesh, lying in a low deltaic floodplain, experiences severe consequences of flooding every year. To reduce the catastrophe of severe flooding events, non-structural measures such as flood forecasting can help in taking adequate precautions. Forecasting seasonal floods with a lead time of several months is a key component of flood damage control and water management. The objective of this research is to identify the strength of the teleconnection between ENSO and the flow of the Surma River, and to examine the potential for long-lead flood forecasting in the wet season. The Surma is one of the major rivers of Bangladesh and part of the Surma-Meghna river system. In this research, sea surface temperature (SST) is used as the ENSO index, with a lead time of at least a few months, which is greater than the basin response time. The teleconnection was assessed by correlation analysis between the July-August-September (JAS) flow of the Surma and the SST of the Nino 4 region for the corresponding months. The cumulative frequency distribution of the standardized JAS flow of the Surma was also determined as part of assessing the teleconnection. Discharge data for the Surma River from 1975 to 2015 were used in this analysis, and a remarkable increase in the correlation coefficient between flow and ENSO has been observed from 1985 onward. The cumulative frequency distribution of the standardized JAS flow shows that, in any year, the JAS flow has approximately a 50% probability of exceeding the long-term average JAS flow.
During an El Nino year (the warm episode of ENSO), this probability of exceedance drops to 23%, while in a La Nina year (the cold episode of ENSO) it increases to 78%. Discriminant analysis, known as 'categoric prediction', was performed to assess the possibility of long-lead flood forecasting; it categorizes the flow data (high, average, and low) based on the classification of the predicted SST (warm, normal, and cold). The discriminant analysis found that, for the Surma River, the probability of a high flood in the cold period is 75% and the probability of a low flood in the warm period is 33%. A synoptic parameter, the forecasting index (FI), was also calculated to judge forecast skill and to compare different forecasts. This study will help the concerned authorities and stakeholders take long-term water resources decisions and formulate river basin management policies that reduce damage to life, agriculture, and property.
Keywords: El Nino-Southern Oscillation, sea surface temperature, Surma river, teleconnection, cumulative frequency distribution, discriminant analysis, forecasting index
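The correlation step of the teleconnection assessment is a Pearson correlation between the SST series and the JAS flow series. The sketch below uses invented series (they are not the Nino 4 or Surma data) to show the computation; the negative sign mimics the abstract's finding that warm episodes reduce flow:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

sst_anomaly = [0.8, 1.2, -0.5, -1.0, 0.3, -0.2]        # invented Nino-4-style anomalies
jas_flow = [900, 850, 1300, 1450, 1000, 1150]          # invented flows: lower when warm
r = pearson(sst_anomaly, jas_flow)
print(round(r, 2))  # strongly negative: warm episodes coincide with low flow
```

On the real series, the strength and sign of `r`, and its change after 1985, are what justify using predicted SST as a long-lead predictor of seasonal flow.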
Procedia PDF Downloads 154
1510 Vulnerability Assessment of Reinforced Concrete Frames Based on Inelastic Spectral Displacement
Authors: Chao Xu
Abstract:
Selecting ground motion intensity measures reasonably is one of the most important issues affecting the selection of input ground motions and the reliability of vulnerability analysis results. In this paper, inelastic spectral displacement is used as an alternative intensity measure to characterize the damage potential of ground motion. The inelastic spectral displacement is calculated based on modal pushover analysis, and an incremental dynamic analysis based on inelastic spectral displacement is developed. Probabilistic seismic demand analyses of a six-story and an eleven-story RC frame are carried out through cloud analysis and advanced incremental dynamic analysis. The sufficiency and efficiency of inelastic spectral displacement are investigated by means of regression and residual analysis, and compared with elastic spectral displacement. Vulnerability curves are developed based on inelastic spectral displacement. The study shows that inelastic spectral displacement reflects the impact on the inelastic structural response of frequency components with periods larger than the fundamental period. It captures the damage potential of ground motion on structures whose fundamental period elongates because of structural softening. Compared with elastic spectral displacement, inelastic spectral displacement is a more sufficient and efficient intensity measure, reducing the uncertainty of vulnerability analysis and the impact of input ground motion selection on vulnerability analysis results.
Keywords: vulnerability, probabilistic seismic demand analysis, ground motion intensity measure, sufficiency, efficiency, inelastic time history analysis
Procedia PDF Downloads 354
1509 Enhancing the Pricing Expertise of an Online Distribution Channel
Authors: Luis N. Pereira, Marco P. Carrasco
Abstract:
Dynamic pricing is a revenue management strategy in which hotel suppliers set, over time, flexible and differentiated prices for their services for different potential customers, considering the profile of e-consumers and market demand and supply. The fundamentals of dynamic pricing thus rest on economic theory (the price elasticity of demand) and market segmentation. This study aims to define a dynamic pricing strategy and an offer contextualized to the e-consumer's profile, in order to increase the number of reservations in an online distribution channel. Segmentation methods (hierarchical and non-hierarchical) were used to identify and validate an optimal number of market segments. A profile of each market segment was studied, considering the characteristics of the e-consumers and their probability of reserving a room. In addition, the price elasticity of demand was estimated for each segment using econometric models. Finally, predictive models were used to define rules for classifying new e-consumers into the predefined segments. The empirical study illustrates how the intelligence of an online distribution channel system can be improved through an optimal dynamic pricing strategy and an offer contextualized to each new e-consumer's profile. A database of 11 million e-consumers of an online distribution channel was used in this study. The results suggest that an appropriate market segmentation policy in online reservation systems benefits service suppliers, because it brings a higher probability of reservation and generates more profit than fixed pricing.
Keywords: dynamic pricing, e-consumer segmentation, online reservation systems, predictive analytics
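The price elasticity of demand at the heart of such a strategy is the percent change in reservations divided by the percent change in price. A minimal sketch with invented segment numbers (not the study's estimates):

```python
def price_elasticity(p0, p1, q0, q1):
    """Point elasticity: (% change in quantity) / (% change in price)."""
    dq = (q1 - q0) / q0
    dp = (p1 - p0) / p0
    return dq / dp

# Hypothetical segment: a 10% price rise (100 -> 110) cuts reservations
# by 25% (400 -> 300), so demand in this segment is elastic (|e| > 1)
e = price_elasticity(100, 110, 400, 300)
print(abs(e) > 1)  # dynamic pricing should favor discounts for this segment
```

Estimating `e` per segment, as the econometric models in the study do, tells the pricing engine which segments respond to discounts and which tolerate price increases.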
Procedia PDF Downloads 234