Search results for: default probability
1107 Energy Detection Based Sensing and Primary User Traffic Classification for Cognitive Radio
Authors: Urvee B. Trivedi, U. D. Dalal
Abstract:
As wireless communication services grow quickly, the pressure on spectrum utilization has risen steadily. Cognitive radio has emerged as a technology to address today's spectrum scarcity problem. To support spectrum reuse, secondary users must sense the radio frequency environment and, once primary users are found to be active, vacate the channel within a certain amount of time. Spectrum sensing is therefore of significant importance. Once sensing is done, different prediction rules are applied to classify the traffic pattern of the primary user. Primary users follow two types of traffic patterns: periodic and stochastic ON-OFF patterns. A cognitive radio can learn the patterns in different channels over time. Two classification methods are discussed in this paper: one based on edge detection and one using the autocorrelation function. The edge detection method has high accuracy but cannot tolerate sensing errors. Autocorrelation-based classification is applicable in a real environment because it can tolerate some amount of sensing errors.
Keywords: cognitive radio (CR), probability of detection (PD), probability of false alarm (PF), primary user (PU), secondary user (SU), fast Fourier transform (FFT), signal to noise ratio (SNR)
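The sensing step the abstract describes can be sketched as a simple energy detector: decide that the primary user is present when the average signal energy exceeds a threshold. The sample count, SNR, and threshold below are illustrative assumptions, not the paper's parameters.

```python
import random

def energy_detect(samples, threshold):
    # decide "primary user present" (H1) when mean energy exceeds the threshold
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

random.seed(42)
noise = [random.gauss(0, 1) for _ in range(4096)]           # H0: noise only
signal = [1.5 + random.gauss(0, 1) for _ in range(4096)]    # H1: assumed PU signal plus noise
threshold = 1.5  # set between E[energy|H0] = 1 and E[energy|H1] = 3.25 to trade PF against PD
print(energy_detect(noise, threshold), energy_detect(signal, threshold))
```

Raising the threshold lowers the false alarm probability (PF) at the cost of the detection probability (PD), which is the trade-off the keywords refer to.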
Procedia PDF Downloads 345
1106 Occupational Diseases in the Automotive Industry in Czechia
Authors: J. Jarolímek, P. Urban, P. Pavlínek, D. Dzúrová
Abstract:
Industry constitutes a dominant economic sector in Czechia, and the automotive industry is the most important industrial sector in terms of gross value added and number of employees. The objective of this study was to analyse the occurrence of occupational diseases (OD) in the automotive industry in Czechia during the 2001-2014 period. Whereas the occurrence of OD in other sectors has generally been decreasing, it has been increasing in the automotive industry, with growing spatial discrepancies. Data on OD cases were retrieved from the National Registry of Occupational Diseases. Further, we conducted a survey in automotive companies with a focus on occupational health services and the positions of the companies in global production networks (GPNs). An analysis of OD distribution in the automotive industry was performed (age, gender, company size and its role in GPNs, regional distribution of the studied companies, and regional unemployment rate), accompanied by an assessment of the quality and range of occupational health services. Employees older than 40 years had a nearly 2.5 times higher probability of OD occurrence than employees younger than 40 years (OR 2.41; 95% CI: 2.05-2.85). The probability of OD occurrence was 3 times higher for women than for men (OR 3.01; 95% CI: 2.55-3.55). The OD incidence rate increased with company size. An association between OD incidence and the unemployment rate was not confirmed.
Keywords: occupational diseases, automotive industry, health geography, unemployment
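Odds ratios with 95% confidence intervals of the kind reported above can be computed from a 2x2 table with the standard Woolf (log-OR) method. The counts below are hypothetical, chosen only so the point estimate lands near the reported OR of 2.4.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # 2x2 table: a = exposed cases, b = exposed non-cases,
    #            c = unexposed cases, d = unexposed non-cases
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR), Woolf method
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: older-than-40 vs younger-than-40 workers with/without OD
or_, lo, hi = odds_ratio_ci(120, 400, 80, 640)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```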
Procedia PDF Downloads 251
1105 Risk and Impact of the COVID-19 Crisis on Real Estate
Authors: Tahmina Akhter
Abstract:
In the present work, we study the repercussions of the pandemic caused by COVID-19 on the real estate market. The disease has affected almost all sectors of the economy across countries around the world, including real estate markets. This documentary research focuses on 2021 and 2022, the strongest period of the pandemic. We carried out the study taking into account repercussions throughout the world, so the data we analyze draw on information from all continents where possible, particularly the US, Europe, and China, where the impact of COVID-19 has been of such proportions that it has fundamentally affected the housing market for middle-class housing. In addition, investment in this market has become riskier, because companies in the sector have in certain cases generated losses; in the Chinese case, Evergrande, one of the largest companies in the sector, fell into default.
Keywords: COVID-19, real estate market, statistics, pandemic
Procedia PDF Downloads 87
1104 The 10-year Risk of Major Osteoporotic and Hip Fractures Among Indonesian People Living with HIV
Authors: Iqbal Pramukti, Mamat Lukman, Hasniatisari Harun, Kusman Ibrahim
Abstract:
Introduction: People living with HIV have a higher risk of osteoporotic fracture than the general population. The purpose of this study was to predict the 10-year risk of fracture among people living with HIV (PLWH) using FRAX™ and to identify characteristics related to fracture risk. Methodology: This study included 75 subjects. The ten-year probability of major osteoporotic fractures (MOF) and hip fractures was assessed using the FRAX™ algorithm. Cross-tabulation was used to identify participant characteristics related to fracture risk. Results: The overall mean 10-year probability of fracture was 2.4% (1.7) for MOF and 0.4% (0.3) for hip fractures. Participants with a parental history of hip fracture, smoking behavior, or glucocorticoid use showed higher MOF scores than those without these factors (3.1 vs. 2.5; 4.6 vs. 2.5; and 3.4 vs. 2.5, respectively). Similarly, these participants showed higher hip fracture (HF) scores than those without (0.5 vs. 0.3; 0.8 vs. 0.3; and 0.5 vs. 0.3, respectively). Conclusions: The 10-year risk of fracture was higher among PLWH with several factors, including parental hip fracture history, smoking behavior, and glucocorticoid use. Further analysis using multivariate regression with a larger sample size is required to confirm the factors associated with high fracture risk.
Keywords: HIV, PLWH, osteoporotic fractures, hip fractures, 10-year risk of fracture, FRAX
Procedia PDF Downloads 49
1103 Computer Aided Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging
Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi
Abstract:
Introduction: Thyroid nodules have an incidence of 33-68% in the general population, and more than 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increase the cure rate and provide optimal treatment. Among medical imaging methods, ultrasound is the imaging technique of choice for the assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management has morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules on ultrasound imaging, in order to allow reliable diagnosis and monitoring of thyroid nodules in their early stages without the need for biopsy. Material and Methods: The thyroid ultrasound image database consists of 70 patients (26 benign and 44 malignant), reported by a radiologist and proven by biopsy. Two slices per patient were loaded in Mazda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within each ROI were normalized according to three schemes: N1, default or original gray levels; N2, +/- 3 sigma, i.e., dynamic intensity limited to µ +/- 3σ; and N3, intensity limited to the 1%-99% range. Up to 270 multiscale texture feature parameters per ROI per normalization scheme were computed from well-known statistical methods employed in the Mazda software. From a statistical point of view, not all calculated texture feature parameters are useful for texture analysis, so the features were reduced to the 10 best and most effective per normalization scheme, based on the maximum Fisher coefficient and the minimum probability of classification error and average correlation coefficients (POE+ACC).
These features were analyzed under two standardization states (standard (S) and non-standard (NS)) with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Non-Linear Discriminant Analysis (NDA). A 1NN classifier was used to distinguish between benign and malignant tumors. Confusion matrix and receiver operating characteristic (ROC) curve analysis were used to formulate more reliable criteria for the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and reduction methods on the effectiveness of the obtained features as discriminating descriptors and on the classification results. The subset of features selected under 1%-99% normalization, POE+ACC reduction, and NDA texture analysis yielded a high discrimination performance, with an area under the ROC curve (Az) of 0.9722 in distinguishing benign from malignant thyroid nodules, corresponding to a sensitivity of 94.45%, a specificity of 100%, and an accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method and can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules.
Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA
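A minimal sketch of the final classification step: z-score standardization (the "standard (S)" state) followed by a 1NN classifier. The feature vectors below are toy stand-ins, not the study's texture features.

```python
import math

def standardize(rows):
    # z-score each feature column (the "standard (S)" preprocessing state)
    cols = list(zip(*rows))
    means = [sum(c) / len(c) for c in cols]
    stds = [math.sqrt(sum((x - m) ** 2 for x in c) / len(c)) or 1.0
            for c, m in zip(cols, means)]
    return [[(x - m) / s for x, m, s in zip(r, means, stds)] for r in rows]

def nn1(train_x, train_y, query):
    # 1-nearest-neighbour label by squared Euclidean distance
    dists = [sum((a - b) ** 2 for a, b in zip(x, query)) for x in train_x]
    return train_y[dists.index(min(dists))]

# toy feature vectors standing in for selected texture features
X = [[1.0, 10.0], [1.2, 11.0], [5.0, 40.0], [5.5, 42.0]]
y = ["benign", "benign", "malignant", "malignant"]
Xs = standardize(X)  # each standardized column now has zero mean
print(nn1(X, y, [5.2, 41.0]))
```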
Procedia PDF Downloads 281
1102 An Empirical Investigation into the Effect of Macroeconomic Policy on Economic Growth in Nigeria
Authors: Rakiya Abba
Abstract:
This paper investigates the effect of the money supply, exchange rate, and interest rate on economic growth in Nigeria by applying the Augmented Dickey-Fuller technique to test the unit root property of the series and the Granger causality test of causation between GDP, money supply, exchange rate, and interest rate. The unit root results suggest that all variables in the model are stationary at the 1, 5, and 10 percent significance levels. The causality results suggest that the money supply and exchange rate Granger-cause the interest rate, and reveal two-way causation between M2 and EXR; IR Granger-causes GDP, so the null hypothesis is rejected, while GDP does not Granger-cause IR, as indicated by a probability value of 0.4805 and confirmed by an F-statistic of 0.75483. The results also reveal that M2 and EXR do not Granger-cause GDP, and the null hypothesis is accepted, as indicated by probability values of 0.7472 and 0.1830, respectively; likewise, GDP does not Granger-cause M2 or EXR. The Johansen cointegration result indicates that, although GDP does not Granger-cause M2, IR, or EXR, there exists one cointegrating equation, implying a long-run relationship between GDP, M2, IR, and EXR. A major policy implication of this result is that economic growth is a function of the money supply and the exchange rate; effective monetary policies should be directed at manipulating these instruments, and the justification for adopting a particular policy should be rationalized in order to increase growth in the economy.
Keywords: economic growth, money supply, interest rate, exchange rate, causality
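A Granger causality check of the kind used here can be sketched as an F-test comparing a restricted autoregression of y against one augmented by lagged x. The series below are synthetic, with x deliberately constructed to cause y; this is a one-lag illustration, not the paper's specification.

```python
import numpy as np

def granger_f(x, y, lag=1):
    # F-statistic for one restriction: do lagged x values improve an AR(1) model of y?
    y_t = y[lag:]
    restricted = np.column_stack([np.ones(len(y_t)), y[:-lag]])
    full = np.column_stack([restricted, x[:-lag]])
    rss = lambda A: np.sum((y_t - A @ np.linalg.lstsq(A, y_t, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(restricted), rss(full)
    df = len(y_t) - full.shape[1]
    return (rss_r - rss_f) / (rss_f / df)

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

f_xy, f_yx = granger_f(x, y), granger_f(y, x)
print(f_xy > f_yx)  # x Granger-causes y, not the reverse
```

A large F relative to the F(1, df) critical value corresponds to the small probability values (e.g. 0.4805 vs. 0.0X) that the abstract uses to accept or reject the null.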
Procedia PDF Downloads 269
1101 Democratic Political Culture of the 5th and 6th Graders under the Authority of Dusit District Office, Bangkok
Authors: Vilasinee Jintalikhitdee, Phusit Phukamchanoad, Sakapas Saengchai
Abstract:
This research aims to study the level of democratic political culture and the factors that affect the democratic political culture of 5th and 6th graders under the authority of the Dusit District Office, Bangkok. Stratified sampling was used for probability sampling and purposive sampling for non-probability sampling; data were collected through questionnaires distributed to 300 respondents, covering all of the schools under the authority of the Dusit District Office. The data were analyzed using descriptive statistics, including the arithmetic mean and standard deviation, and inferential statistics, namely the independent-samples t-test and one-way ANOVA (F-test). The researcher also collected data by interviewing the target groups and analyzed them using descriptive analysis. The results show that 5th and 6th graders under the authority of the Dusit District Office, Bangkok, have been exposed to a democratic political culture at a high level overall. Considering each item, the statement with the highest mean is "the constitutional democratic governmental system is suitable for Thailand", and the statement with the lowest mean is "corruption (cheating and fraud) is normal in Thai society". The factors that affect democratic political culture are grade level, mother's occupation, and attention to news and political movements.
Keywords: democratic, political culture, political movements, democratic governmental system
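The stratified probability sampling described above can be sketched as drawing the same fraction from each stratum (proportionate allocation). The roster, the two grade strata, and the 10% fraction below are hypothetical.

```python
import random

def stratified_sample(population, strata_key, fraction, seed=1):
    # proportionate stratified sampling: same fraction from every stratum
    random.seed(seed)
    strata = {}
    for item in population:
        strata.setdefault(strata_key(item), []).append(item)
    sample = []
    for group in strata.values():
        k = max(1, round(fraction * len(group)))
        sample.extend(random.sample(group, k))
    return sample

# hypothetical roster: (student_id, grade)
pupils = [(i, "5th" if i < 180 else "6th") for i in range(300)]
sample = stratified_sample(pupils, lambda p: p[1], 0.10)
print(len(sample))
```

Because each stratum contributes in proportion to its size, the 180 fifth-graders yield 18 sampled pupils and the 120 sixth-graders yield 12.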
Procedia PDF Downloads 266
1100 Failure Probability Assessment of Concrete Spherical Domes Subjected to Ventilation Controlled Fires Using BIM Tools
Authors: A. T. Kassem
Abstract:
Fires are considered a common hazard that any building may face. Most buildings' structural elements are designed with fire-safety precautions using deterministic design approaches. Public and highly important buildings are commonly designed to a standard fire rating and, in many cases, contain large compartments with central domes. Real fire scenarios are not commonly brought into the structural design of buildings because of complexities in both the scenarios and the analysis tools. This paper presents a modern approach to the analysis of spherical domes under real fire conditions via the implementation of building information modelling (BIM) and the adoption of a probabilistic approach. BIM has been used to bridge the gap between various software packages, enabling them to function interactively to model both the real fire and the corresponding structural response. Ventilation-controlled fire scenarios have been modeled using both Revit and Pyrosim. Monte Carlo simulation has been adopted to apply the probabilistic analysis approach in dealing with the various parameters. Conclusions are drawn regarding failure probability and fire endurance, as well as the effects of the various parameters.
Keywords: concrete, spherical domes, ventilation controlled fires, BIM, Monte Carlo simulation, Pyrosim, Revit
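The Monte Carlo step can be sketched as repeated sampling of a capacity-versus-demand check. The normal distributions and their moments below are assumptions for illustration only, not the paper's fire or structural model.

```python
import random

def monte_carlo_failure(n=100_000, seed=7):
    # P(fire-induced demand exceeds structural capacity), toy normal inputs
    random.seed(seed)
    failures = 0
    for _ in range(n):
        capacity = random.gauss(900, 90)   # resisting moment (kN*m), assumed
        demand = random.gauss(600, 150)    # fire-induced moment (kN*m), assumed
        if demand > capacity:
            failures += 1
    return failures / n

pf = monte_carlo_failure()
print(round(pf, 3))
```

With these assumed moments the safety margin is N(300, ~175), so the failure probability lands around a few percent; in the paper's workflow the sampled parameters would instead feed the coupled Pyrosim/structural models.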
Procedia PDF Downloads 96
1099 Impact of Violence against Women on Small and Medium Enterprises (SMEs) in Rural Sindh: A Case Study of Kandhkot
Authors: Mohammad Shoaib Khan, Abdul Sattar Bahalkani
Abstract:
This research investigates violence against women and its impact on SMEs in Sindh. The main objective of the research is to examine women's empowerment through women's participation in small and medium enterprises in upper Sindh. Data were collected from 500 respondents in Kandhkot District using a simple random technique. A structured questionnaire was designed as the instrument for measuring the impact of SME business on women's empowerment in rural Sindh. It was revealed that rural women are less confident and that their husbands give them a hard time once they venture outside the boundaries of the house. It was also revealed that rural women make a major contribution to social, economic, and political development, yet women receive low wages, and due to the non-availability of market facilities they are paid below market rates. The negative impacts of the husband's income and of having children aged 0-6 years are also significant: high income of other household members raises a mother's reservation wage and thus lowers the probability of participation when the objective of working is to help with the family's financial needs. The impact of childcare on mothers' labor force participation is significant, but not as theory predicts. The probability of labor force participation is significantly higher for women living in urban areas, where job opportunities are greater than in rural areas.
Keywords: empowerment, violence against women, SMEs, rural
Procedia PDF Downloads 333
1098 Merging Appeal to Ignorance, Composition, and Division Argument Schemes with Bayesian Networks
Authors: Kong Ngai Pei
Abstract:
The argument scheme approach to argumentation has two components. One is to identify the recurrent patterns of inference used in everyday discourse. The second is to devise critical questions to evaluate the inferences in these patterns. Although this approach is intuitive and contains many insightful ideas, it is not free of problems. One is that, because it disavows the probability calculus, it cannot give the exact strength of an inference. To tackle this problem, and thereby pave the way to a more complete normative account of argument strength, it has been proposed that the most promising route is to combine the scheme-based approach with Bayesian networks (BNs). This paper pursues this line of thought, attempting to combine three common schemes, Appeal to Ignorance, Composition, and Division, with BNs. In the first part, it is argued that most (if not all) formulations of the critical questions corresponding to these schemes in the current argumentation literature are incomplete and not very informative; to remedy these flaws, more thorough and precise formulations of the questions are provided. In the second part, it is shown how to use graphical idioms (e.g., measurement and synthesis idioms) to translate the schemes and their corresponding critical questions into the graphical structure of BNs, and how to define the probability tables of the nodes using functions of various sorts. In the final part, it is argued that many misuses of these schemes, traditionally called fallacies with the same names as the schemes, can be adequately accounted for by the BN models proposed in this paper.
Keywords: appeal to ignorance, argument schemes, Bayesian networks, composition, division
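The appeal-to-ignorance scheme maps naturally onto a small Bayesian computation: "no evidence of harm was found" supports "safe" only insofar as the search would likely have found harm if it existed. The prior and conditional probabilities below are illustrative, not taken from the paper's probability tables.

```python
def posterior_safe(p_safe=0.5, sensitivity=0.9, false_alarm=0.05):
    # P(safe | no harm found) by Bayes' rule on a two-node network:
    # sensitivity = P(harm found | unsafe); false_alarm = P(harm "found" | safe)
    p_no_find = p_safe * (1 - false_alarm) + (1 - p_safe) * (1 - sensitivity)
    return p_safe * (1 - false_alarm) / p_no_find

print(round(posterior_safe(sensitivity=0.9), 3))   # thorough search: strong support
print(round(posterior_safe(sensitivity=0.2), 3))   # superficial search: prior barely moves
```

This is exactly the distinction between a legitimate appeal to ignorance (high search sensitivity) and the fallacious version (low sensitivity), which the full BN models generalize.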
Procedia PDF Downloads 288
1097 Feasibility Study of Wind Energy Potential in Turkey: Case Study of Catalca District in Istanbul
Authors: Mohammed Wadi, Bedri Kekezoglu, Mustafa Baysal, Mehmet Rida Tur, Abdulfetah Shobole
Abstract:
This paper investigates the technical evaluation of the wind potential for present and future investments in Turkey, taking into account the feasibility of sites, installation, operation, and maintenance. The evaluation is based on hourly wind speed data measured at 30 m height over the three years 2008-2010 for the Çatalca district. These data, obtained from the national meteorology station in Istanbul, Republic of Turkey, are analyzed in order to evaluate the feasibility of wind power potential and to ensure optimal selection of wind turbines for the area of interest. Furthermore, the data are extrapolated to 60 m and 80 m, accounting for the variability of the roughness factor. The Weibull bi-parameter probability function is used to approximate monthly and annual wind potential and power density, based on three calculation methods: the approximated method, the graphical method, and the energy pattern factor method. The annual mean wind power densities were found to be 400.31, 540.08, and 611.02 W/m² at 30, 60, and 80 m heights, respectively. Simulation results show that the analyzed area is an appropriate place for constructing large-scale wind farms.
Keywords: wind potential in Turkey, Weibull bi-parameter probability function, the approximated method, the graphical method, the energy pattern factor method, capacity factor
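The energy pattern factor method mentioned above can be sketched as follows: estimate the Weibull shape k and scale c from the ratio E[v³]/E[v]³, then compute the mean power density. The synthetic speed series is drawn from an assumed Weibull(k=2, c=8) distribution rather than the Çatalca measurements.

```python
import math, random

def weibull_epf(speeds):
    # energy pattern factor method: Epf = E[v^3] / E[v]^3,
    # k ~= 1 + 3.69 / Epf^2, c = mean / Gamma(1 + 1/k)
    mean = sum(speeds) / len(speeds)
    epf = (sum(v ** 3 for v in speeds) / len(speeds)) / mean ** 3
    k = 1 + 3.69 / epf ** 2
    c = mean / math.gamma(1 + 1 / k)
    return k, c

def power_density(speeds, rho=1.225):
    # mean wind power density in W/m^2: 0.5 * rho * E[v^3]
    return 0.5 * rho * sum(v ** 3 for v in speeds) / len(speeds)

random.seed(3)
# one year of hourly speeds sampled from Weibull(k=2, c=8) by inverse transform
speeds = [8 * (-math.log(1 - random.random())) ** (1 / 2) for _ in range(8760)]
k, c = weibull_epf(speeds)
pd = power_density(speeds)
print(round(k, 2), round(c, 2), round(pd))
```

The recovered (k, c) should sit close to the generating values, and the power density close to 0.5·ρ·c³·Γ(1 + 3/k).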
Procedia PDF Downloads 259
1096 Quality of Service of Transportation Networks: A Hybrid Measurement of Travel Time and Reliability
Authors: Chin-Chia Jane
Abstract:
In a transportation network, travel time refers to the transmission time from source node to destination node, whereas reliability refers to the probability of a successful connection from source node to destination node. With an increasing emphasis on quality of service (QoS), both performance indexes are significant in the design and analysis of transportation systems. In this work, we extend the well-known flow network model for transportation networks so that travel time and reliability are integrated into the QoS measurement simultaneously. In the extended model, in addition to the general arc capacities, each intermediate node has a time weight, which is the travel time per unit of commodity going through the node. Meanwhile, arcs and nodes are treated as binary random variables that switch between operation and failure with associated probabilities. For a pre-specified travel time limitation and demand requirement, the QoS of a transportation network is the probability that the source can successfully transport the demand requirement to the destination while the total transmission time stays under the travel time limitation. This work is pioneering: whereas existing literature evaluates travel time reliability via a single optimal path, the proposed QoS measure captures the performance of the whole network system. To compute the QoS of transportation networks, we first transform the extended network model into an equivalent min-cost max-flow network model. In the transformed network, each original arc is given a new travel time weight of 0. Each intermediate node is replaced by two nodes, u and v, joined by an arc directed from u to v; the newly generated nodes u and v are perfect nodes, and the new direct arc carries three weights: travel time, capacity, and operation probability. Then the universal set of state vectors is recursively decomposed into disjoint subsets of reliable, unreliable, and stochastic vectors until no stochastic vector is left.
The decomposition is made possible by applying an existing efficient min-cost max-flow algorithm. Because the reliable subsets are disjoint, the QoS can be obtained directly by summing the probabilities of these reliable subsets. Computational experiments were conducted on a benchmark network with 11 nodes and 21 arcs. Five travel time limitations and five demand requirements were set to compute the QoS value. For comparison, we tested the exhaustive complete enumeration method; computational results reveal that the proposed algorithm is much more efficient. In summary, a transportation network is analyzed by an extended flow network model where each arc has a fixed capacity, each intermediate node has a time weight, and both arcs and nodes are independent binary random variables. The quality of service of the transportation network integrates customer demands, travel time, and the probability of connection. We present a decomposition algorithm to compute the QoS efficiently; experiments on a prototype network show that it is superior to existing complete enumeration methods.
Keywords: quality of service, reliability, transportation network, travel time
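The state-space idea, summing the probabilities of the states in which the network still carries the demand, can be illustrated by brute-force enumeration on a toy four-node network with unreliable arcs. The paper's decomposition avoids exactly this exponential enumeration; the arc data below are invented, and node failures and travel times are omitted for brevity.

```python
from itertools import product

def max_flow(n, arcs, s, t):
    # tiny Ford-Fulkerson (BFS augmenting paths) on an adjacency matrix
    cap = [[0] * n for _ in range(n)]
    for u, v, c in arcs:
        cap[u][v] += c
    flow = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        queue = [s]
        while queue:
            u = queue.pop(0)
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    queue.append(v)
        if parent[t] == -1:
            return flow
        push, v = float("inf"), t          # bottleneck along the found path
        while v != s:
            push = min(push, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:                      # augment along the path
            cap[parent[v]][v] -= push
            cap[v][parent[v]] += push
            v = parent[v]
        flow += push

# toy network: (tail, head, capacity, operation probability); node 0 = source, 3 = sink
arcs = [(0, 1, 3, 0.9), (0, 2, 2, 0.9), (1, 3, 2, 0.9), (2, 3, 3, 0.9)]
demand = 2
qos = 0.0
for state in product([0, 1], repeat=len(arcs)):  # enumerate all arc up/down states
    prob, up = 1.0, []
    for bit, (u, v, c, pr) in zip(state, arcs):
        prob *= pr if bit else 1 - pr
        if bit:
            up.append((u, v, c))
    if max_flow(4, up, 0, 3) >= demand:          # demand met in this state?
        qos += prob
print(round(qos, 4))
```

Here the demand is met whenever either the upper path (arcs 0-1, 1-3) or the lower path (0-2, 2-3) is fully up, so the QoS equals 0.81 + 0.81 - 0.81², about 0.9639.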
Procedia PDF Downloads 222
1095 Statistical Analysis of Extreme Flow (Regions of Chlef)
Authors: Bouthiba Amina
Abstract:
The estimation of statistics related to precipitation is a vast domain that poses numerous challenges to meteorologists and hydrologists. It is sometimes necessary to estimate extreme events, and their return periods, for sites with little or no data. The search for a frequency model of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods by estimating the amount of precipitation in past years. The best-known and most common approach is the statistical one, which consists of looking for the probability law that best fits the observed values of the random variable "daily maximum rainfall", after comparing various probability laws and estimation methods by means of goodness-of-fit tests. A frequency analysis of the annual series of daily maximum rainfall was therefore carried out on data from 54 rain-gauge stations of the high and middle Chlef basin. The study considered five laws usually applied to the analysis of maximum daily rainfall; the chosen period is 1970 to 2013, and the fitted laws were used to forecast quantiles. The laws used are the generalized extreme value law with three parameters, the two extreme-value laws (Gumbel and log-normal) with two parameters, and the Pearson type III and log-Pearson III laws with three parameters. In Algeria, Gumbel's law has long been used to estimate the quantiles of maximum flows; here, we check the candidate laws and choose the most reliable one.
Keywords: return period, extreme flow, statistics laws, Gumbel, estimation
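A Gumbel fit of the kind discussed can be sketched with a method-of-moments estimate and the return-period quantile formula. The annual-maximum series below is hypothetical, not the Chlef station data.

```python
import math, statistics

def gumbel_quantile(sample, T):
    # method-of-moments Gumbel fit: beta = s*sqrt(6)/pi, mu = mean - 0.5772*beta,
    # then the T-year quantile is x_T = mu - beta * ln(-ln(1 - 1/T))
    mean, s = statistics.mean(sample), statistics.stdev(sample)
    beta = s * math.sqrt(6) / math.pi
    mu = mean - 0.5772 * beta
    return mu - beta * math.log(-math.log(1 - 1 / T))

# hypothetical annual maxima of daily rainfall (mm)
annual_max = [42, 55, 38, 61, 47, 52, 44, 58, 49, 66, 40, 53]
q100 = gumbel_quantile(annual_max, 100)
print(round(q100, 1))
```

The quantile grows with the return period T, which is why the choice of law matters so much for rare events.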
Procedia PDF Downloads 79
1094 Performance Evaluation of a Prioritized, Limited Multi-Server Processor-Sharing System that Includes Servers with Various Capacities
Authors: Yoshiaki Shikata, Nobutane Hanayama
Abstract:
We present a prioritized, limited multi-server processor-sharing (PS) system in which each server has a different capacity and N (≥ 2) priority classes are allowed in each PS server. In each prioritized, limited server, a different service ratio is assigned to each class of request, and the number of requests processed is limited to below a certain number. Routing strategies for such systems that take the capacity of each server into account are also presented, and a performance evaluation procedure for these strategies is discussed. Practical performance measures of these strategies, such as the loss probability, mean waiting time, and mean sojourn time, are evaluated via simulation. In a PS server, at the arrival (or departure) of a request, the extension (or shortening) of the remaining sojourn time of each request in service can be calculated from the number of requests of each class and the priority ratio. Using a simulation program that executes these events and calculations, the performance of the proposed prioritized, limited multi-server PS rule can be analyzed. From the evaluation results, the most suitable routing strategy for a loss or waiting system is identified.
Keywords: processor sharing, multi-server, various capacity, N-priority classes, routing strategy, loss probability, mean sojourn time, mean waiting time, simulation
Procedia PDF Downloads 332
1093 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment
Authors: Isabela Moreira Queiroz
Abstract:
Slope stability analyses are largely carried out by deterministic methods and evaluated through a single safety factor. Although the geotechnical parameters are known to show great dispersion, such analyses treat them as fixed and known. Probabilistic methods, in turn, incorporate the variability of key input parameters (random variables), resulting in a range of safety factor values and thus enabling determination of the probability of failure, an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (First-Order, Second-Moment), Rosenblueth (point estimates), and Monte Carlo. This paper presents a comparison between the results of deterministic and probabilistic analyses (the FOSM method, Monte Carlo, and Rosenblueth) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and the consequent risk analysis, which is used to calculate the risk and analyze mitigation and control solutions. The results obtained by the three probabilistic methods were quite close. It should be noted that the risk calculation makes it possible to prioritize the implementation of mitigation measures. It is therefore recommended to make a good assessment of the geological-geotechnical model, incorporating uncertainty in viability, design, construction, operation, and closure by means of risk management.
Keywords: probabilistic methods, risk assessment, risk management, slope stability
Procedia PDF Downloads 392
1092 Hermeneutical Understanding of 2 Cor. 7:1 in the Light of Igbo Cultural Concept of Purification
Authors: H. E. Amolo
Abstract:
The concepts of pollution or contamination and of purification or ritual cleansing are very important among traditional Africans. In relation to human behaviors and attitudes, they constitute, on the one hand, what could be referred to as moral demands and, on the other, what results from the breach of such demands. The many taboos a man has to observe are not to be regarded as mechanical things that do not touch the heart; rather, their avoidance is a sacred law respected by the community, and in breaking it one offends the divine power. Research has shown that Africans tenaciously hold the belief that moral values are based upon recognition of the divine will and that sin in the community must be expelled if perfect peace is to be enjoyed. Sadly, these moral values are gradually eroding in contemporary times. Thus, this study proposal calls for a survey of the passage from an African cultural context: how it can enhance understanding of the text and complement its scholarly interpretation, with a view to institutionalizing the concept of holiness as a means of bringing the people closer to God and instilling ethical purity and righteousness.
Keywords: cultural practices, Igbo ideology, purification, rituals
Procedia PDF Downloads 309
1091 Timing and Probability of Presurgical Teledermatology: Survival Analysis
Authors: Felipa de Mello-Sampayo
Abstract:
The aim of this study is to examine, from the patient's perspective, the timing and probability of using teledermatology, compared with a conventional referral system. The dynamic stochastic model's main added value is its concrete application to patients waiting for dermatological surgical intervention. Patients with low uncertainty about their health level should use teledermatology as soon as possible, which is precisely when teledermatology is least valuable. The model's results were then tested empirically with the teledermatology network covering the area served by the Hospital Garcia da Horta, Portugal, which links the primary care centers of 24 health districts with the hospital's dermatology department via the corporate intranet of the Portuguese healthcare system. Health level volatility can be understood as the hazard of developing skin cancer, and the trend of health level as the tendency to develop skin lesions. The results of the survival analysis suggest that the theoretical model can explain the use of teledermatology: it depends negatively on the volatility of patients' health and positively on the trend of health, i.e., the lower the risk of developing skin cancer and the younger the patient, the more presurgical teledermatology one expects to occur. Presurgical teledermatology also depends positively on out-of-pocket expenses and negatively on the opportunity cost of teledermatology, i.e., the lower the benefit missed by using teledermatology, the more presurgical teledermatology one expects to occur.
Keywords: teledermatology, wait time, uncertainty, opportunity cost, survival analysis
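The survival analysis referred to above can be sketched with a Kaplan-Meier estimator of the probability of still waiting at time t. The waiting times and censoring flags below are hypothetical, not the hospital's data.

```python
def kaplan_meier(times, events):
    # Kaplan-Meier survival estimate; events[i] = 1 if the event occurred at
    # times[i] (e.g. intervention took place), 0 if the observation was censored
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1
    return curve

# hypothetical waiting times (days) until intervention; 0 marks censoring
times = [30, 45, 45, 60, 80, 90, 120, 150]
events = [1, 1, 0, 1, 1, 0, 1, 1]
curve = kaplan_meier(times, events)
for t, s in curve:
    print(t, round(s, 3))
```

Covariates such as health volatility or out-of-pocket expenses would then enter through a regression model (e.g. Cox proportional hazards) on top of this nonparametric baseline.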
Procedia PDF Downloads 129
1090 A Multi-Objective Programming Model to Supplier Selection and Order Allocation Problem in Stochastic Environment
Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh
Abstract:
This paper develops a multi-objective model for the supplier selection and order allocation problem in a stochastic environment, where the purchasing cost, the percentage of items delivered late, and the percentage of rejected items provided by each supplier are stochastic parameters following any arbitrary probability distribution. In this regard, dependent chance programming is used, which maximizes the probability of the event that the total purchasing cost, total late-delivered items, and total rejected items are less than or equal to pre-determined values given by the decision maker. The above-mentioned stochastic multi-objective programming problem is then transformed into a stochastic single-objective programming problem using the minimum deviation method. In the next step, the resulting problem is solved by a genetic algorithm, which performs a simulation process to calculate the stochastic objective function as its fitness function. Finally, the impact of the stochastic parameters on the given solution is examined via a sensitivity analysis exploiting the coefficient of variation. The results show that the greater the coefficients of variation of the stochastic parameters, the worse the value of the objective function in the stochastic single-objective programming problem.
Keywords: supplier selection, order allocation, dependent chance programming, genetic algorithm
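The simulation step that the genetic algorithm uses as its fitness function can be sketched as a Monte Carlo estimate of the dependent-chance objective: the probability that all three totals stay within the decision maker's limits for a candidate order allocation. Everything below is illustrative only — the supplier parameters, the distribution choices (normal prices, beta delay/reject rates), and the limits are assumptions, not the paper's data.

```python
import random

def fitness(orders, suppliers, budget, d_max, r_max, n_samples=5000):
    """Monte Carlo estimate of P(cost <= budget, delayed <= d_max, rejected <= r_max)."""
    hits = 0
    for _ in range(n_samples):
        cost = delayed = rejected = 0.0
        for q, s in zip(orders, suppliers):
            # stochastic unit cost, delay rate, and reject rate per supplier
            cost += q * random.gauss(s["price_mu"], s["price_sd"])
            delayed += q * random.betavariate(*s["delay_ab"])
            rejected += q * random.betavariate(*s["reject_ab"])
        if cost <= budget and delayed <= d_max and rejected <= r_max:
            hits += 1
    return hits / n_samples  # probability that all chance constraints hold

# hypothetical two-supplier instance
suppliers = [
    {"price_mu": 10.0, "price_sd": 1.0, "delay_ab": (2, 18), "reject_ab": (1, 24)},
    {"price_mu": 12.0, "price_sd": 0.5, "delay_ab": (1, 24), "reject_ab": (2, 18)},
]
random.seed(7)
p = fitness(orders=[60, 40], suppliers=suppliers, budget=1150, d_max=10, r_max=8)
```

A genetic algorithm would evaluate this fitness for each candidate `orders` vector and evolve toward allocations that maximize the estimated probability.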
Procedia PDF Downloads 313
1089 The Effect of Training and Development Practice on Employees’ Performance
Authors: Sifen Abreham
Abstract:
Employees are resources in organizations; as such, they need to be trained and developed properly to achieve an organization's goals and expectations. The initial development of the human resource management concept is based on the effective utilization of people, treating them as resources, leading to the realization of business strategies and organizational objectives. The study aimed to assess the effect of training and development practices on employee performance. The researcher used an explanatory research design, which helps to explain, understand, and predict the relationship between variables. To collect the data from the respondents, the study used probability sampling, specifically stratified random sampling, which divides the entire population into homogeneous groups. The results were analyzed and presented using the Statistical Package for the Social Sciences (SPSS) version 26. The major finding of the study was that training has an impact on employees' job performance in achieving organizational objectives. The district has a training and development policy and procedure, but it is not actively applied and is not suitable; the district is advised to reform this policy and procedure and apply it actively. The district provides training for the majority of its employees, but most of the time the training is theoretical; the district is advised to use practical training methods to see positive change. The district evaluates employees after training and development, but the results are not encouraging; the district is advised to assess employees' skill gaps and fill them. The district has a budget, but it is not adequate; the district is advised to strengthen its financial ground.
Keywords: training, development, employees, performance, policy
Procedia PDF Downloads 62
1088 A Mathematical Analysis of a Model in Capillary Formation: The Roles of Endothelial, Pericyte and Macrophages in the Initiation of Angiogenesis
Authors: Serdal Pamuk, Irem Cay
Abstract:
Our model is based on the theory of reinforced random walks coupled with Michaelis-Menten mechanisms, which view endothelial cell receptors as the catalysts for transforming both tumor- and macrophage-derived tumor angiogenesis factor (TAF) into proteolytic enzyme, which in turn degrades the basal lamina. The model consists of two main parts. The first part has seven differential equations (DEs) in one space dimension over the capillary, whereas the second part has the same number of DEs in two space dimensions in the extracellular matrix (ECM). We connect these two parts via boundary conditions that move the cells into the ECM in order to initiate capillary formation. But when does this movement begin? To address this question, we estimate the thresholds that activate the transport equations in the capillary, using steady-state analysis of the TAF equation under some assumptions. Once these equations are activated, endothelial, pericyte, and macrophage cells begin to move into the ECM for the initiation of angiogenesis. We believe that our results play an important role in understanding the mechanisms of cell migration, which are crucial for tumor angiogenesis. Furthermore, we estimate the long-time tendency of these three cell types and find that they tend to the transition probability functions as time evolves. We provide numerical solutions which are in good agreement with our theoretical results.
Keywords: angiogenesis, capillary formation, mathematical analysis, steady-state, transition probability function
Procedia PDF Downloads 157
1087 The Integrated Strategy of Maintenance with a Scientific Analysis
Authors: Mahmoud Meckawey
Abstract:
This research deals with one of the most important aspects of the maintenance field: maintenance strategy. This is the branch concerned with the concepts and the schematic thinking behind how to manage maintenance and how to deal with defects in engineering products (buildings, machines, etc.) in general. The paper addresses the following: i) The engineering product and its technical systems: when we approach the maintenance process from a strategic view, we deal with an engineering product that consists of multiple integrated systems; in fact, there is no engineering product with only one system. We discuss and explain this topic, and from it derive a developed definition of the maintenance process. ii) The factors, or basis, of functional efficiency: the main factors affecting the functional efficiency of systems and engineering products, from which we give a technical definition of defects and how they occur. iii) The legality of the occurrence of defects (legal and illegal defects): here we assume that all the factors of functional efficiency have been applied and then discuss the results. iv) The guarantee, functional span age, and technical surplus concepts: complementing the above topic, and in association with reliability theorems, one works with the probability of failure, which mostly concerns the design stages, i.e., checking and adapting the design of the elements. In maintainability, however, we act differently, dealing with the actual state of the systems. We therefore work with the rest of the story, namely the complementary part of the probability-of-failure term, which refers to the actual surplus of functionality of the systems.
Keywords: engineering product and technical systems, functional span age, legal and illegal defects, technical and functional surplus
Procedia PDF Downloads 475
1086 The Probability of Smallholder Broiler Chicken Farmers' Participation in the Mainstream Market within Maseru District in Lesotho
Authors: L. E. Mphahama, A. Mushunje, A. Taruvinga
Abstract:
Although broiler production does not generate large incomes among the smallholder community, it represents the main source of livelihood and part of the nutritional requirement. As a result, the market for broiler meat is growing faster than that of any other meat product and is projected to continue growing in the coming decades. The implication, however, is that a multitude of factors shapes whether smallholder broiler farmers participate in the mainstream markets. Socio-economic and institutional factors in broiler farming from 217 smallholder broiler farmers were incorporated into a binary regression model to estimate the probability of broiler farmers’ participation in the mainstream markets within the Maseru district in Lesotho. Of the thirteen (13) predictor variables fitted into the model, six (6) variables (household size, number of years in the broiler business, stock size, access to transport, access to extension services, and access to market information) had significant coefficients, while seven (7) variables (level of education, marital status, price of broilers, poultry association, access to contracts, access to credit, and access to storage) did not have a significant impact. It is recommended that smallholder broiler farmers organize themselves into cooperatives, which will act as a vehicle through which they can access contracts and formal markets. These cooperatives will also enable easy training and workshops for broiler rearing and marketing through extension visits.
Keywords: broiler chicken, mainstream market, Maseru district, participation, smallholder farmers
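The binary-choice modelling described above can be sketched as a small logistic regression fitted by stochastic gradient ascent. The sketch is illustrative only: the two predictors (stock size, access to market information), the assumed data-generating coefficients, and the synthetic sample are stand-ins, not the study's 217-farmer dataset or its thirteen predictors.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(X, y, lr=0.05, epochs=1000):
    """Fit P(participate=1 | x) = sigmoid(w0 + w.x) by stochastic gradient ascent."""
    w = [0.0] * (len(X[0]) + 1)  # intercept + one weight per predictor
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = yi - p  # gradient of the log-likelihood w.r.t. the linear term
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict_prob(w, xi):
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# hypothetical sample: [stock size (hundreds of birds), market-info access (0/1)]
random.seed(42)
X, y = [], []
for _ in range(200):
    xi = [random.uniform(0.0, 5.0), float(random.randint(0, 1))]
    p_true = sigmoid(-2.0 + 0.8 * xi[0] + 1.5 * xi[1])  # assumed true process
    X.append(xi)
    y.append(1 if random.random() < p_true else 0)

w = fit_logit(X, y)
p_big = predict_prob(w, [4.0, 1.0])    # large stock, has market information
p_small = predict_prob(w, [0.5, 0.0])  # small stock, no market information
```

Positive fitted coefficients on stock size and market-information access would mirror the abstract's finding that these factors significantly raise the probability of mainstream-market participation.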
Procedia PDF Downloads 152
1085 Structural and Functional Correlates of Reaction Time Variability in a Large Sample of Healthy Adolescents and Adolescents with ADHD Symptoms
Authors: Laura O’Halloran, Zhipeng Cao, Clare M. Kelly, Hugh Garavan, Robert Whelan
Abstract:
Reaction time (RT) variability on cognitive tasks provides an index of the efficiency of executive control processes (e.g., attention and inhibitory control) and is considered a hallmark of clinical disorders, such as attention-deficit hyperactivity disorder (ADHD). Increased RT variability is associated with structural and functional brain differences in children and adults with various clinical disorders, as well as poorer task performance accuracy. Furthermore, the strength of functional connectivity across various brain networks, such as the negative relationship between the task-negative default mode network and task-positive attentional networks, has been found to reflect differences in RT variability. Although RT variability may provide an index of attentional efficiency, as well as being a useful indicator of neurological impairment, the brain substrates associated with RT variability remain relatively poorly defined, particularly in healthy samples. Method: First, we used the intra-individual coefficient of variation (ICV) as an index of RT variability from “Go” responses on the Stop Signal Task. We then examined the functional and structural neural correlates of ICV in a large sample of 14-year-old healthy adolescents (n=1719). Of these, a subset had elevated symptoms of ADHD (n=80) and was compared to a matched non-symptomatic control group (n=80). The relationships of brain activity during successful and unsuccessful inhibitions, and of gray matter volume, with the ICV were examined. A mediation analysis was conducted to examine whether specific brain regions mediated the relationship between ADHD symptoms and ICV. Lastly, we looked at functional connectivity across various brain networks and quantified both positive and negative correlations during “Go” responses on the Stop Signal Task. 
Results: The brain data revealed that higher ICV was associated with increased structural and functional brain activation in the precentral gyrus in the whole sample and in adolescents with ADHD symptoms. Lower ICV was associated with lower activation in the anterior cingulate cortex (ACC) and medial frontal gyrus in the whole sample and in the control group. Furthermore, our results indicated that activation in the precentral gyrus (Brodmann Area 4) mediated the relationship between ADHD symptoms and behavioural ICV. Conclusion: This is the first study to investigate the functional and structural correlates of ICV collectively in a large adolescent sample. Our findings demonstrate a concurrent increase in brain structure and function within task-active prefrontal networks as a function of increased RT variability. Furthermore, structural and functional brain activation patterns in the ACC and medial frontal gyrus play a role in optimizing top-down control in order to maintain task performance. Our results also evidenced clear differences in brain morphometry between adolescents with symptoms of ADHD but without a clinical diagnosis and typically developing controls. Our findings shed light on specific functional and structural brain regions that are implicated in ICV and yield insights into effective cognitive control in healthy individuals and in clinical groups.
Keywords: ADHD, fMRI, reaction-time variability, default mode, functional connectivity
Procedia PDF Downloads 257
1084 Electro-Fenton Degradation of Erythrosine B Using Carbon Felt as a Cathode: Doehlert Design as an Optimization Technique
Authors: Sourour Chaabane, Davide Clematis, Marco Panizza
Abstract:
This study investigates the oxidation of the Erythrosine B (EB) food dye by a homogeneous electro-Fenton process using iron (II) sulfate heptahydrate as a catalyst, carbon felt as the cathode, and Ti/RuO2 as the anode. The treated synthetic wastewater contains 100 mg L⁻¹ of EB at pH = 3. The effects of three independent variables were considered for process optimization: applied current intensity (0.1–0.5 A), iron concentration (1–10 mM), and stirring rate (100–1000 rpm). Their interactions were investigated using response surface methodology (RSM) based on a Doehlert design as the optimization method. EB removal efficiency and energy consumption were considered the model responses after 30 minutes of electrolysis. Analysis of variance (ANOVA) revealed that the quadratic model was adequately fitted to the experimental data, with R² (0.9819), adj-R² (0.9276), and a low Fisher probability (< 0.0181) for the EB removal model, and R² (0.9968), adj-R² (0.9872), and a low Fisher probability (< 0.0014) for the energy consumption model, reflecting robust statistical significance. The energy consumption model depends significantly on current density, as expected. The foregoing RSM results led to the following optimal conditions for EB degradation: a current intensity of 0.2 A, an iron concentration of 9.397 mM, and a stirring rate of 500 rpm, which gave a maximum decolorization rate of 98.15% with a minimum energy consumption of 0.74 kWh m⁻³ at 30 min of electrolysis.
Keywords: electro-Fenton, Erythrosine B, dye, response surface methodology, carbon felt
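The Doehlert-design/RSM step can be sketched as ordinary least squares on a full second-order model over the seven coded design points of a two-factor Doehlert design (centre point plus hexagon). The response values below are generated from an assumed quadratic surface chosen for illustration, not from the paper's measurements.

```python
def design_row(x1, x2):
    # full second-order model: 1, x1, x2, x1^2, x2^2, x1*x2
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

def solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fit_rsm(points, y):
    """Ordinary least squares via the normal equations X'X b = X'y."""
    X = [design_row(x1, x2) for x1, x2 in points]
    n = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    Xty = [sum(r[i] * yk for r, yk in zip(X, y)) for i in range(n)]
    return solve(XtX, Xty)

# two-factor Doehlert design in coded units: a centre point plus a hexagon
doehlert = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866), (-0.5, 0.866),
            (-1.0, 0.0), (-0.5, -0.866), (0.5, -0.866)]
# assumed quadratic surface (e.g. % dye removal) used to generate responses
true = [95.0, 3.0, 1.5, -6.0, -4.0, 1.0]
y = [sum(c * v for c, v in zip(true, design_row(x1, x2))) for x1, x2 in doehlert]
coef = fit_rsm(doehlert, y)  # recovers the assumed coefficients
```

With the fitted coefficients in hand, the optimum operating conditions are found by maximizing the quadratic surface over the experimental region, as done in the study for current, iron concentration, and stirring rate.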
Procedia PDF Downloads 74
1083 Learning a Bayesian Network for Situation-Aware Smart Home Service: A Case Study with a Robot Vacuum Cleaner
Authors: Eu Tteum Ha, Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
The smart home environment, backed by IoT (Internet of Things) technologies, enables intelligent services based on awareness of the situation a user is currently in. One of the convenient sensors for recognizing situations within a home is the smart meter, which can monitor the status of each electrical appliance in real time. This paper aims at learning a Bayesian network that models the causal relationship between user situations and the status of the electrical appliances. Using such a network, we can infer the current situation based on the observed status of the appliances. However, learning the conditional probability tables (CPTs) of the network requires many training examples, which cannot be obtained unless the user situations are closely monitored by some means. This paper proposes a method for learning the CPT entries of the network relying only on user feedback generated occasionally. In our case study with a robot vacuum cleaner, feedback comes in whenever the user gives the robot an order that runs contrary to its preprogrammed setting. Given a network with randomly initialized CPT entries, our proposed method uses this feedback information to adjust the relevant CPT entries in the direction of increasing the probability of recognizing the desired situations. Simulation experiments show that our method can rapidly improve the recognition performance of the Bayesian network using a relatively small number of feedbacks.
Keywords: Bayesian network, IoT, learning, situation-awareness, smart home
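A minimal sketch of the feedback-driven CPT adjustment: starting from random entries, each user correction nudges P(observed appliance status | desired situation) upward and renormalizes the conditional distribution. The situation and status labels, and the additive update with step `lr`, are assumptions made for illustration; the paper's actual update rule may differ.

```python
import random

def init_cpt(situations, statuses):
    """Randomly initialized CPT: P(appliance status | situation)."""
    cpt = {}
    for s in situations:
        raw = [random.random() for _ in statuses]
        total = sum(raw)
        cpt[s] = {st: r / total for st, r in zip(statuses, raw)}
    return cpt

def feedback_update(cpt, situation, observed_status, lr=0.2):
    """Bump P(observed_status | situation) by lr, then renormalize the row."""
    row = cpt[situation]
    row[observed_status] += lr
    total = sum(row.values())
    for st in row:
        row[st] /= total

random.seed(0)
cpt = init_cpt(["cleaning_ok", "do_not_disturb"], ["tv_on", "tv_off"])
before = cpt["do_not_disturb"]["tv_on"]
# the user repeatedly overrides the robot while the TV is on,
# signalling that "tv_on" evidence should favor the "do_not_disturb" situation
for _ in range(10):
    feedback_update(cpt, "do_not_disturb", "tv_on")
after = cpt["do_not_disturb"]["tv_on"]
```

After enough such feedbacks, inference over the network assigns higher posterior probability to the situation the user actually intended, which is the mechanism the simulation experiments evaluate.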
Procedia PDF Downloads 524
1082 Constructions of Linear and Robust Codes Based on Wavelet Decompositions
Authors: Alla Levina, Sergey Taranov
Abstract:
The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but they concentrate their detection and correction abilities on certain error configurations. Robust codes, by contrast, can protect against any configuration of errors with a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; some of its applications are cleaning signals of noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling-function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust codes: the first is based on the multiplicative inverse in a finite field; in the second, the redundancy part is the cube of the information part. The paper also investigates the characteristics of the proposed robust and linear codes.
Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability
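The second robust-code construction — redundancy equal to the cube of the information part — can be sketched over a finite field. The field GF(2^8) with the AES reduction polynomial 0x11B is an illustrative choice made here, not necessarily the field the paper uses; the point is only that the nonlinear map x ↦ x³ prevents any fixed error pattern from being masked for all codewords.

```python
def gf_mul(a, b, poly=0x11B):
    """Multiply two elements of GF(2^8): carry-less product reduced mod poly."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly  # reduce whenever the degree reaches 8
        b >>= 1
    return r

def encode(x):
    """Robust codeword (x, x^3): information symbol plus nonlinear redundancy."""
    return (x, gf_mul(x, gf_mul(x, x)))

def check(word):
    x, red = word
    return red == gf_mul(x, gf_mul(x, x))

w = encode(0x57)
ok = check(w)                              # a valid codeword passes the check
detected = not check((w[0] ^ 0x01, w[1]))  # this flipped information bit is caught
```

Note that cubing is not injective over GF(2^8) (since gcd(3, 255) = 3), so a deployed construction would choose the field, or the nonlinear function, with the desired error-masking probability in mind.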
Procedia PDF Downloads 491
1081 Effectiveness of Variable Speed Limit Signs in Reducing Crash Rates on Roadway Construction Work Zones in Alaska
Authors: Osama Abaza, Tanay Datta Chowdhury
Abstract:
As a driver's speed increases, so does the probability of an incident and the likelihood of injury. The presence of equipment, personnel, and a changing landscape in construction zones creates greater potential for incidents. This is especially concerning in Alaska, where summer construction activity, coinciding with the peak annual traffic volumes, cannot be avoided. In order to reduce vehicular speeding in work zones, and therefore the probability of crash and incident occurrence, variable speed limit (VSL) systems can be implemented in the form of radar speed display trailers, which have been shown to be effective at reducing vehicular speed in construction zones. Allocation of VSL not only helps reduce the 85th percentile speed but also predominantly reduces the mean speed. A total of 2,147 incidents, along with 385 crashes, occurred in just one month around construction zones in Alaska, a situation that seriously requires proper attention. This research provides a thorough crash analysis to better understand the causes and propose proper countermeasures. Crashes were predominantly recorded as vehicle-object collisions and sideswipes, and thus a significant share of crashes falls into the no-injury to minor-injury severity classes. Still, 35 major crashes, including 7 fatal ones, in a one-month period call for immediate action, such as implementation of the VSL system, as it has proved to be a speed reducer in construction zones on Alaskan roadways.
Keywords: speed, construction zone, crash, severity
Procedia PDF Downloads 253
1080 Supplier Selection and Order Allocation Using a Stochastic Multi-Objective Programming Model and Genetic Algorithm
Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh
Abstract:
In this paper, we develop a multi-objective supplier selection and order allocation model in a stochastic environment, in which the purchasing cost, the percentage of items delivered late, and the percentage of rejected items provided by each supplier are stochastic parameters following any arbitrary probability distribution. To do so, we use dependent chance programming (DCP), which maximizes the probability of the event that the total purchasing cost, total late-delivered items, and total rejected items are less than or equal to pre-determined values given by the decision maker. After transforming the above-mentioned stochastic multi-objective programming problem into a stochastic single-objective problem using the minimum deviation method, we apply a genetic algorithm to solve the latter single-objective problem. The employed genetic algorithm performs a simulation process in order to calculate the stochastic objective function as its fitness function. At the end, we explore the impact of the stochastic parameters on the given solution via a sensitivity analysis exploiting the coefficient of variation. The results show that as the stochastic parameters have greater coefficients of variation, the value of the objective function in the stochastic single-objective programming problem worsens.
Keywords: dependent chance programming, genetic algorithm, minimum deviation method, order allocation, supplier selection
Procedia PDF Downloads 256
1079 A Framework Based on Dempster-Shafer Theory of Evidence Algorithm for the Analysis of the TV-Viewers’ Behaviors
Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi
Abstract:
In this paper, we propose an approach for detecting the behavior of viewers of a TV program in a non-controlled environment. The experiment we propose is based on the use of three types of connected objects (a smartphone, a smart watch, and a connected remote control). 23 participants were observed while watching their TV programs during three phases: before, during, and after watching a TV program. Their behaviors were detected using an approach based on Dempster-Shafer theory (DST) in two phases. The first phase dynamically approximates the mass functions using an approach based on the correlation coefficient; the second phase calculates the approximate mass functions. To approximate the mass functions, two approaches were tested. The first divides each feature's data space into cells, each with a specific probability distribution over the behaviors; the probability distributions were computed statistically (estimated by the empirical distribution). The second predicts the TV-viewing behaviors through the use of classifier algorithms and adds uncertainty to the prediction based on the uncertainty of the model. Results showed that mixing the fusion rule with the computation of the initial approximate mass functions using a classifier led to overall success rates of 96%, 95%, and 96% for the first, second, and third TV-viewing phases, respectively. The results were also compared to those found in the literature. This study aims to anticipate certain actions in order to maintain the attention of TV viewers towards the proposed TV programs with usual connected objects, taking into account the various uncertainties that can be generated.
Keywords: IoT, TV-viewing behaviors identification, automatic classification, unconstrained environment
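The fusion rule at the heart of a DST framework is Dempster's rule of combination, which merges mass functions from independent evidence sources. A minimal sketch over a two-element frame of discernment follows; the behavior labels and the mass values from the two devices are illustrative assumptions, not figures from the paper.

```python
from itertools import product

def combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # normalize by 1 - conflict so the combined masses sum to 1
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

watching = frozenset({"watching"})
away = frozenset({"away"})
either = watching | away  # the full frame: ignorance mass

m_phone = {watching: 0.6, either: 0.4}               # evidence from the smartphone
m_remote = {watching: 0.7, away: 0.2, either: 0.1}   # evidence from the remote

fused = combine(m_phone, m_remote)
```

Because both sources lean toward "watching", the fused belief in that behavior exceeds either source's alone, which is how the framework sharpens a classifier's uncertain prediction with a second device's evidence.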
Procedia PDF Downloads 229
1078 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life
Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi
Abstract:
Pipelines are extensively used engineering structures that convey fluid from one place to another. Most of the time, pipelines are placed underground and are encumbered by soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion, as a time-dependent deterioration process, into the structural and failure analysis of this type of pipe. Then three probabilistic time-dependent reliability analysis methods, including first passage probability theory, the gamma distributed degradation model, and the Monte Carlo simulation technique, are discussed and developed. Sensitivity analysis indexes which can be used to identify the most important parameters affecting pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines. The results can be used to obtain a cost-effective strategy for the management of the sewer system.
Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model
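Two of the methods named above — a gamma-process degradation model and Monte Carlo simulation of the first passage time — can be combined in a short sketch: corrosion depth grows by independent gamma-distributed increments, and failure probability at a service time is the fraction of simulated histories whose first passage past a critical wall loss occurs by then. The process parameters and the critical depth below are illustrative assumptions, not values from the paper.

```python
import random

def first_passage_time(shape_rate, scale, critical_depth, dt=1.0, horizon=100.0):
    """Corrosion depth grows by stationary Gamma(shape_rate*dt, scale) increments;
    return the first time the loss reaches the critical depth (first passage)."""
    depth, t = 0.0, 0.0
    while t < horizon:
        depth += random.gammavariate(shape_rate * dt, scale)
        t += dt
        if depth >= critical_depth:
            return t
    return float("inf")  # pipe survives the planning horizon

def failure_probability(t_service, n=5000, **process):
    """Monte Carlo estimate of P(first passage time <= t_service)."""
    fails = sum(first_passage_time(**process) <= t_service for _ in range(n))
    return fails / n

random.seed(1)
process = dict(shape_rate=0.5, scale=0.2, critical_depth=8.0)  # ~0.1 units/yr mean loss
p50 = failure_probability(50, **process)    # early in life: small failure probability
p100 = failure_probability(100, **process)  # later: probability has grown substantially
```

Plotting the estimated probability against service time gives the time-dependent reliability curve from which a remaining-service-life or rehabilitation decision can be read off.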
Procedia PDF Downloads 457