Search results for: probability of default
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1342

652 Factors Affecting Students' Performance in the Examination

Authors: Amylyn F. Labasano

Abstract:

A significant number of empirical studies have been carried out to investigate factors affecting college students' performance in academic examinations. Against this wide array of literature- and study-supported findings, this study is limited to students' probability of passing periodical exams, which is associated with students' gender, absences from class, use of a reference book, and hours of study. Binary logistic regression was the technique used in the analysis. The research is based on student records and data collected through a survey. The results reveal that gender, use of a reference book, and hours of study are significant predictors of passing an examination, while absenteeism is an insignificant predictor. Females are 45% more likely to pass the exam than their male classmates. Students who use and read their reference book are 38 times more likely to pass the exam than those who do not. Those who spend more than 3 hours studying are four times more likely to pass the exam than those who spend 3 hours or less.
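
The modeling step described above can be reproduced with standard tools. Below is a minimal sketch of a binary logistic regression on pass/fail outcomes; the column names and the synthetic data are hypothetical stand-ins for the study's actual records.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical student records; the real study used survey data and records.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "absences": rng.poisson(3, n),
    "uses_reference_book": rng.integers(0, 2, n),
    "studies_over_3h": rng.integers(0, 2, n),
})
logit = (0.4 * df.female - 0.1 * df.absences
         + 1.5 * df.uses_reference_book + 1.2 * df.studies_over_3h - 1.0)
df["passed"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["female", "absences", "uses_reference_book", "studies_over_3h"]])
model = sm.Logit(df["passed"], X).fit(disp=0)
print(np.exp(model.params))  # odds ratios: "38 times more likely" corresponds to OR = 38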

Keywords: absences, binary logistic regression, gender, hours of study, prediction-causation method, periodical exams, random sampling, reference book

Procedia PDF Downloads 288
651 A Mixture Vine Copula Structures Model for Dependence Wind Speed among Wind Farms and Its Application in Reactive Power Optimization

Authors: Yibin Qiu, Yubo Ouyang, Shihan Li, Guorui Zhang, Qi Li, Weirong Chen

Abstract:

This paper explores the impacts of high-dimensional dependencies of wind speed among wind farms on probabilistic optimal power flow. To obtain the reactive power optimization faster and more accurately, a mixture vine copula structure model combining K-means clustering, C-vine copulas, and D-vine copulas is proposed, through which a more accurate correlation model can be obtained. Moreover, a Modified Backtracking Search Algorithm (MBSA) and the three-point estimate method are applied to probabilistic optimal power flow. The validity of the mixture vine copula structure model and the MBSA is tested on the IEEE 30-node system with measured data from 3 adjacent wind farms in a certain area, and the results indicate the effectiveness of these methods.
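
The probability integral transform listed in the keywords is the step that maps each wind-speed series to uniform margins before copula fitting. A minimal sketch follows, with synthetic data and a Weibull marginal as an illustrative assumption rather than the authors' choice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
speeds = stats.weibull_min.rvs(2.0, scale=8.0, size=1000, random_state=rng)  # synthetic wind speeds

# Probability integral transform: push each observation through an estimated CDF
# so the transformed sample is approximately Uniform(0, 1), as copula fitting requires.
shape, loc, scale = stats.weibull_min.fit(speeds, floc=0)
u = stats.weibull_min.cdf(speeds, shape, loc=loc, scale=scale)
print(stats.kstest(u, "uniform"))  # should not reject uniformity
```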

Keywords: mixture vine copula structure model, three-point estimate method, probability integral transform, modified backtracking search algorithm, reactive power optimization

Procedia PDF Downloads 237
650 DTI Connectome Changes in the Acute Phase of Aneurysmal Subarachnoid Hemorrhage Improve Outcome Classification

Authors: Sarah E. Nelson, Casey Weiner, Alexander Sigmon, Jun Hua, Haris I. Sair, Jose I. Suarez, Robert D. Stevens

Abstract:

Graph-theoretical information from structural connectomes indicated significant connectivity changes and improved acute prognostication in a Random Forest (RF) model in aneurysmal subarachnoid hemorrhage (aSAH), a condition that can lead to significant morbidity and mortality and whose outcome has traditionally been difficult to predict. This study's hypothesis was that structural connectivity changes occur in canonical brain networks of acute aSAH patients and that these changes are associated with functional outcome at six months. In a prospective cohort of patients admitted to a single institution for management of acute aSAH, patients underwent diffusion tensor imaging (DTI) as part of a multimodal MRI scan. A weighted undirected structural connectome was created from each patient's images using Constant Solid Angle (CSA) tractography, with 176 regions of interest (ROIs) defined by the Johns Hopkins Eve atlas. ROIs were sorted into four networks: Default Mode Network, Executive Control Network, Salience Network, and Whole Brain. The resulting nodes and edges were characterized using graph-theoretic features, including Node Strength (NS), Betweenness Centrality (BC), Network Degree (ND), and Connectedness (C). Clinical features (including demographics and the World Federation of Neurologic Surgeons scale) and graph features were used separately and in combination to train RF and Logistic Regression classifiers to predict two outcomes: dichotomized modified Rankin Score (mRS) at discharge and at six months after discharge (favorable outcome mRS 0-2, unfavorable outcome mRS 3-6). A total of 56 aSAH patients underwent DTI a median of 7 (IQR 8.5) days after admission. The best-performing model (RF), combining clinical and DTI graph features, had a mean Area Under the Receiver Operating Characteristic Curve (AUROC) of 0.88 ± 0.00 and Area Under the Precision-Recall Curve (AUPRC) of 0.95 ± 0.00 over 500 trials. The combined model performed better than the clinical model alone (AUROC 0.81 ± 0.01, AUPRC 0.91 ± 0.00). The highest-ranked graph features for prediction were NS, BC, and ND. These results indicate reorganization of the connectome early after aSAH. The performance of clinical prognostic models was increased significantly by the inclusion of DTI-derived graph connectivity metrics. This methodology could significantly improve prognostication of aSAH.
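
The graph-theoretic features named above are all standard network measures. A minimal sketch of how they might be computed from a weighted undirected connectome with networkx follows; the random adjacency matrix is a stand-in for actual tractography output, and treating edge weights as distances in the betweenness computation is a simplifying assumption:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
n_rois = 176  # Eve-atlas regions of interest
A = rng.random((n_rois, n_rois))
A = (A + A.T) / 2            # symmetric, undirected
np.fill_diagonal(A, 0)

G = nx.from_numpy_array(A)   # weighted undirected connectome
node_strength = dict(G.degree(weight="weight"))              # NS: sum of edge weights per node
betweenness = nx.betweenness_centrality(G, weight="weight")  # BC (weights read as distances here)
network_degree = dict(G.degree())                            # ND: edge count per node
connected = nx.is_connected(G)                               # C: connectedness
print(max(node_strength, key=node_strength.get), connected)
```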

Keywords: connectomics, diffusion tensor imaging, graph theory, machine learning, subarachnoid hemorrhage

Procedia PDF Downloads 167
649 Evaluation of the Standard Practice of Availability of Anti-Tuberculosis Drugs in Community Pharmacies

Authors: Udaykumar R., M. S. Ganachari

Abstract:

In order to engage community pharmacies in tuberculosis care, a survey was conducted in Belgaum city, Karnataka state, India. After the survey, pharmacies were divided into two groups, a control group and an intervention group, corresponding to those dispensing anti-tuberculosis drugs and those where anti-tuberculosis drugs were not available. Community pharmacists could volunteer to become DOTS (Directly Observed Treatment, Short-course) providers and to support RNTCP (Revised National Tuberculosis Control Programme) objectives. Structured training was conducted for community pharmacists who dispense anti-tuberculosis drugs; the training module includes record maintenance, reporting to the RNTCP, medication adherence, etc. Where anti-tuberculosis drugs were not available, the district RNTCP trained community pharmacists and supplied drugs to the pharmacies free of charge, so that these pharmacies could dispense anti-tuberculosis drugs to patients. The target of this study was private community pharmacies. A simple random sampling method was used, and 550 private community pharmacy shops in Belgaum city of Karnataka state, India were involved. Significance of the Study: This study mainly focused on DOTS training for private community pharmacists. The Indian government considers private providers to be assets for TB control and care in achieving the National Strategic Plan for TB Elimination 2017-2025, yet it has not fully tapped the potential of private pharmacies to fight TB. Providing DOTS at the patient's convenience through community DOTS providers, with periodic monitoring, may reduce treatment default. We explore RNTCP-objective interventions that can be directly managed by private community pharmacy shops. Objectives: To survey the availability of anti-tuberculosis drugs in community pharmacy shops in Belgaum city, and to identify community pharmacists willing to become DOTS providers. Major Findings: Most community pharmacists dispense anti-tuberculosis drugs without knowledge of DOTS therapy or RNTCP objectives. No community pharmacist was aware of the RNTCP or of the tuberculosis burden in India. Most pharmacists agreed to attend the RNTCP training module for community pharmacists. Some community pharmacists do not dispense anti-tuberculosis drugs, and they agreed to become official DOTS providers. Concluding Statement: Awareness of the role of community pharmacists in tuberculosis control and care has been neglected. More than 50% of tuberculosis patients seek treatment from the private sector. This study finds a major gap between the government and the private sector on tuberculosis treatment.

Keywords: community pharmacist, directly observed treatment short course (DOTS), revised national tuberculosis control programme (RNTCP), private pharmacies, anti-tuberculosis drugs

Procedia PDF Downloads 106
648 Life Cycle Cost Evaluation of Structures Retrofitted with Damped Cable System

Authors: Asad Naeem, Mohamed Nour Eldin, Jinkoo Kim

Abstract:

In this study, the seismic performance and life cycle cost (LCC) of a structure retrofitted with the damped cable system (DCS) are evaluated. The DCS is a seismic retrofit system composed of a high-strength steel cable and pressurized viscous dampers. The analysis model of the system is first derived using various link elements in SAP2000, and fragility curves of the structure retrofitted with the DCS and with viscous dampers are obtained using incremental dynamic analyses. The analysis results show that the residual displacements of the structure equipped with the DCS are smaller than those of the structure retrofitted with only conventional viscous dampers, due to the enhanced stiffness/strength and self-centering capability of the damped cable system. The fragility analysis shows that the structure retrofitted with the DCS has the lowest probability of reaching the specified limit states, compared to the bare structure and the structure with viscous dampers. It is also observed that the initial cost of the DCS required for the seismic retrofit is smaller than that of the viscous-damper retrofit, and that the LCC of the structure equipped with the DCS is smaller than that of the structure with viscous dampers.
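
A fragility curve derived from incremental dynamic analysis is commonly fit as a lognormal CDF of the intensity measure. A minimal sketch under that common assumption follows; the capacity values are synthetic, not the study's IDA results:

```python
import numpy as np
from scipy import stats

# Synthetic IDA output: spectral acceleration (g) at which each ground-motion
# record drives the structure past a given limit state.
capacities = np.array([0.42, 0.55, 0.61, 0.48, 0.73, 0.39, 0.66, 0.58, 0.51, 0.69])

# Standard lognormal fragility fit: median theta and log-standard deviation beta.
theta = np.exp(np.mean(np.log(capacities)))
beta = np.std(np.log(capacities), ddof=1)

im = np.linspace(0.1, 1.2, 12)
p_exceed = stats.norm.cdf(np.log(im / theta) / beta)  # P(limit state | IM = im)
for x, p in zip(im, p_exceed):
    print(f"IM={x:.2f} g  P={p:.3f}")
```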

Keywords: damped cable system, fragility curve, life cycle cost, seismic retrofit, self-centering

Procedia PDF Downloads 529
647 Evaluation and Fault Classification for Healthcare Robot during Sit-To-Stand Performance through Center of Pressure

Authors: Tianyi Wang, Hieyong Jeong, An Guo, Yuko Ohno

Abstract:

Healthcare robots for assisting sit-to-stand (STS) performance have attracted numerous research interests. To the authors' best knowledge, however, how to evaluate such a robot remains an open question. A robot should be labeled as faulty if users find the assisted STS demanding. In this research, we aim to propose a method to evaluate a sit-to-stand assist robot through the center of pressure (CoP) and then classify different STS performances. Experiments were executed five times with ten healthy subjects under four conditions: two self-performed STSs with chair heights of 62 cm and 43 cm, and two robot-assisted STSs with a chair height of 43 cm and robot end-effector speeds of 2 s and 5 s. CoP was measured using a Wii Balance Board (WBB). Bayesian classification was utilized to classify STS performance. The results showed that faults occurred when the chair height was decreased and the robot assist speed was slowed. The proposed method for fault classification showed a high probability of separating fault classes from others. It was concluded that faults of an STS assist robot can be detected by inspecting the center of pressure and classified through the proposed classification algorithm.
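
As one way to realize the Bayesian classification step, a Gaussian naive Bayes classifier over simple CoP trajectory features is sketched below; the feature choice and the synthetic trajectories are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)

def cop_features(traj):
    # Simple summary features of a CoP trajectory (e.g. anterior-posterior axis).
    return [np.ptp(traj), traj.std(), np.abs(np.diff(traj)).mean()]

# Synthetic trials: label 1 = faulty (demanding) STS, 0 = normal.
labels = rng.integers(0, 2, 80)
X = [cop_features(rng.normal(0, 1.0 + lbl, 500)) for lbl in labels]

clf = GaussianNB().fit(X, labels)
print(clf.predict_proba([cop_features(rng.normal(0, 2.0, 500))]))  # posterior per class
```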

Keywords: center of pressure, fault classification, healthcare robot, sit-to-stand movement

Procedia PDF Downloads 176
646 Simulation as a Problem-Solving Spotter for System Reliability

Authors: Wheyming Tina Song, Chi-Hao Hong, Peisyuan Lin

Abstract:

An important performance measure for stochastic manufacturing networks is the system reliability, defined as the probability that the production output meets or exceeds a specified demand. The system parameters include the capacity of each workstation and the numbers of conforming parts produced in each workstation. We establish that eighteen archival publications, containing twenty-one examples, provide incorrect values of the system reliability. The author recently published the Song Rule, which provides the correct analytical system-reliability value; it is, however, computationally inefficient for large networks. In this paper, we use Monte Carlo simulation (implemented in C and Flexsim) to provide estimates for the above-mentioned twenty-one examples. The simulation estimates are consistent with the analytical solutions for small networks and remain computationally efficient for large networks. We argue here for three advantages of Monte Carlo simulation: (1) understanding stochastic systems, (2) validating analytical results, and (3) providing estimates even when analytical and numerical approaches are overly expensive in computation. Monte Carlo simulation could have detected the published analysis errors.
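
The system reliability defined above lends itself directly to Monte Carlo estimation. Here is a minimal sketch for a hypothetical two-station serial line; the capacities, yields, and demand are invented for illustration (the paper's examples used C and Flexsim):

```python
import numpy as np

rng = np.random.default_rng(4)
demand, n_trials = 80, 100_000

# Hypothetical serial line: each workstation produces conforming parts as
# Binomial(capacity, yield); line output is limited by the weakest stage.
capacities = [100, 120]
yields = [0.9, 0.8]

output = np.min([rng.binomial(c, p, n_trials) for c, p in zip(capacities, yields)], axis=0)
reliability = (output >= demand).mean()
se = np.sqrt(reliability * (1 - reliability) / n_trials)  # standard error of the estimate
print(f"P(output >= {demand}) ~= {reliability:.4f} +/- {se:.4f}")
```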

Keywords: Monte Carlo simulation, analytical results, leading digit rule, standard error

Procedia PDF Downloads 337
645 Performance Evaluation of MIMO-OFDM Communication Systems

Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany

Abstract:

This paper evaluates the bit error rate (BER) performance of a MIMO-OFDM communication system. A MIMO system uses multiple transmitting and receiving antennas with different coding techniques to enhance either the transmission diversity or the spatial multiplexing gain. Utilizing the Alamouti algorithm, the same information is transmitted over multiple antennas at different time intervals and then collected again at the receivers to minimize the probability of error, combat fading, and thus improve the received signal-to-noise ratio. Utilizing the V-BLAST algorithm, the transmitted signals are divided into different transmitting channels and transferred over the channel to be received by different receiving antennas, increasing the transmitted data rate and achieving higher throughput. The paper provides a study of different diversity-gain coding schemes and spatial multiplexing coding for MIMO systems. A comparison of various channel estimation and equalization techniques is given. The simulation is implemented using MATLAB, and the results show the performance of the transmission models under different channel environments.
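
To make the Alamouti scheme concrete, a minimal 2x1 BPSK simulation over flat Rayleigh fading is sketched below in Python (the paper used MATLAB); the SNR normalization is a simplifying assumption:

```python
import numpy as np

rng = np.random.default_rng(5)
n, snr_db = 100_000, 10
noise_std = np.sqrt(0.5 / 10 ** (snr_db / 10))

# BPSK symbol pairs; channel constant over each two-slot Alamouti block.
s1 = 2 * rng.integers(0, 2, n) - 1.0
s2 = 2 * rng.integers(0, 2, n) - 1.0
h1 = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
h2 = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
n1 = noise_std * (rng.normal(size=n) + 1j * rng.normal(size=n))
n2 = noise_std * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Slot 1 sends (s1, s2); slot 2 sends (-s2*, s1*).
r1 = h1 * s1 + h2 * s2 + n1
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + n2

# Alamouti combining turns the 2x1 channel into two independent scalar channels.
s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)

ber = 0.5 * (np.mean(np.sign(s1_hat.real) != s1) + np.mean(np.sign(s2_hat.real) != s2))
print(f"BER at {snr_db} dB: {ber:.4f}")
```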

Keywords: MIMO communication, BER, space codes, channels, alamouti, V-BLAST

Procedia PDF Downloads 158
644 The Analysis of Application of Green Bonds in New Energy Vehicles in China: From Evolutionary Game Theory

Authors: Jing Zhang

Abstract:

Sustainable development in the new energy vehicles field is a requirement of the net-zero aim. Green bonds are accepted as a practical financial tool to boost the transformation of the relevant enterprises. This paper analyzes the interactions among governments, new energy vehicle enterprises, and financial institutions with an evolutionary game theory model and offers advice to stakeholders in China. The decision-making subjects of green behavior are affected by experience, interests, perception ability, and risk preference, so it is difficult for them to be completely rational. Based on the bounded rationality hypothesis, this paper applies prospect theory within the evolutionary game analysis framework and analyses the costs of government regulation of enterprises adopting green bonds. The influence of the perceived value of the revenue prospect, and of the probability and risk-transfer coefficient of the government's active regulation, on the decision-making agents' strategies is verified by numerical simulation. Finally, according to the research conclusions, policy suggestions are given to promote green bonds.
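
Evolutionary game analyses of this kind typically iterate replicator dynamics until the strategy shares stabilize. A generic two-population sketch is given below; the payoff numbers are invented placeholders, not the paper's calibration, and the prospect-theory weighting is omitted:

```python
import numpy as np

# Replicator dynamics for the share x of enterprises issuing green bonds and
# the share y of governments actively regulating. Payoff entries are illustrative.
def step(x, y, dt=0.01):
    ent_adopt = 3.0 * y + 1.0 * (1 - y)   # enterprise payoff if it adopts green bonds
    ent_skip = 2.0 * y + 2.5 * (1 - y)    # ... if it does not
    gov_act = 1.5 * x - 0.5 * (1 - x)     # government payoff under active regulation
    gov_idle = 0.5 * x                    # ... under passive regulation
    x += dt * x * (1 - x) * (ent_adopt - ent_skip)
    y += dt * y * (1 - y) * (gov_act - gov_idle)
    return x, y

x, y = 0.2, 0.3
for _ in range(20_000):
    x, y = step(x, y)
print(f"equilibrium shares: enterprises {x:.3f}, government {y:.3f}")
```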

Keywords: green bonds, new energy vehicles, sustainable development, evolutionary game theory model

Procedia PDF Downloads 58
643 Mikrophonie I (1964) by Karlheinz Stockhausen - Between Idea and Auditory Image

Authors: Justyna Humięcka-Jakubowska

Abstract:

1. Background in music analysis. Traditionally, when we think about a composer's sketches, the chances are that we are thinking in terms of the working out of detail rather than the evolution of an overall concept. Since music is a "time art", it follows that questions of form cannot be entirely detached from considerations of time. One could say that composers tend to regard time either as a place gradually and partially intuitively filled, or they can look for a specific strategy to occupy it. In my opinion, one thing that sheds light on Stockhausen's compositional thinking is his frequent use of "form schemas", that is, often a single-page representation of the entire structure of a piece. 2. Background in music technology. Sonic Visualiser is a program used to study a musical recording. It is an open source application for viewing, analysing, and annotating music audio files. It contains a number of visualisation tools designed with useful default parameters for musical analysis. Additionally, the Vamp plugin format of SV supports analyses such as structural segmentation. 3. Aims. The aim of my paper is to show how SV may be used to obtain a better understanding of a specific musical work, and how the compositional strategy impacts musical structures and musical surfaces. I want to show that "traditional" music-analytic methods do not allow one to indicate the interrelationships between the musical surface (which is perceived) and the underlying musical/acoustical structure. 4. Main Contribution. Stockhausen dealt with the most diverse musical problems by the most varied methods. A characteristic he never ceased to place at the center of his thought and works was the quest for a new balance founded upon an acute connection between speculation and intuition. In the case of Mikrophonie I (1964) for tam-tam and 6 players, Stockhausen makes a distinction between the "connection scheme", which indicates the ground rules underlying all versions, and the form scheme, which is associated with a particular version. The preface to the published score includes both the connection scheme and a single instance of a "form scheme", which is what one can hear on the CD recording. In the current study, the insight into the compositional strategy chosen by Stockhausen was compared with the auditory image, that is, with the perceived musical surface. Stockhausen's musical work is analyzed both in terms of melodic/voice and timbre evolution. 5. Implications. The current study shows how musical structures determine the musical surface. My general assumption is that, while listening to music, we can extract basic kinds of musical information from musical surfaces. It is shown that interactive strategies of musical structure analysis can offer a very fruitful way of looking directly into certain structural features of music.

Keywords: automated analysis, composer's strategy, mikrophonie I, musical surface, stockhausen

Procedia PDF Downloads 280
642 Weighted Risk Scores Method Proposal for Occupational Safety Risk Assessment

Authors: Ulas Cinar, Omer Faruk Ugurlu, Selcuk Cebi

Abstract:

Occupational safety risk management is the most important element of a safe working environment. Effective risk management is only possible with accurate analysis and evaluation. Scoring-based risk assessment methods offer considerable ease of application, as they convert linguistic expressions into numerical results, and they can be easily adapted to any field. Despite all these advantages, important problems with scoring-based methods are frequently discussed, and effective measurability is one of the most critical. Existing methods allow experts to choose a score for each parameter, so experts prefer the score of the most likely outcome of a risk, and all other possible consequences are neglected. Assessments made with the existing methods therefore express the most probable level of risk, not the real risk of the enterprise. This study aims to develop a method that presents a more comprehensive evaluation than the existing methods by weighting the probability and severity scores over all sub-parameters and potential results; a new scoring-based method is accordingly proposed.
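
The gap described here, scoring only the most likely outcome versus weighting all possible outcomes, can be illustrated with a small sketch; the probabilities and severities below are invented placeholders:

```python
# A hazard with several possible consequences. Classic scoring keeps only the
# most likely outcome; a weighted score is the expectation over all outcomes.
outcomes = [  # (probability, severity score on a 1-10 scale)
    (0.70, 2),   # minor injury
    (0.25, 6),   # lost-time injury
    (0.05, 10),  # fatality
]

classic = max(outcomes, key=lambda o: o[0])[1]   # severity of the most likely outcome
weighted = sum(p * s for p, s in outcomes)       # expected severity over all outcomes
print(f"classic score: {classic}, weighted score: {weighted:.2f}")
```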

Keywords: occupational health and safety, risk assessment, scoring based risk assessment method, underground mining, weighted risk scores

Procedia PDF Downloads 120
641 Local Spectrum Feature Extraction for Face Recognition

Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh

Abstract:

This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using GMM, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved by applying a rectangular mask to the spectrum of the facial image and discarding the high-frequency coefficients. The low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested using the FERET data sets and achieves a 92% recognition rate.
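
A minimal sketch of this pipeline (overlapping block DFT, low-frequency mask, GMM likelihood scoring) with NumPy and scikit-learn follows; the block size, mask size, and the random stand-in images are illustrative assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)

def block_spectrum_features(img, block=16, keep=4):
    """Overlapping blocks -> 2-D DFT -> keep a low-frequency corner as features."""
    feats = []
    for i in range(0, img.shape[0] - block + 1, block // 2):    # 50% overlap
        for j in range(0, img.shape[1] - block + 1, block // 2):
            spec = np.fft.fft2(img[i:i + block, j:j + block])
            feats.append(np.abs(spec[:keep, :keep]).ravel())    # rectangular low-pass mask
    return np.array(feats)

face = rng.random((64, 64))   # stand-in for an enrolled FERET face image
gmm = GaussianMixture(n_components=3, covariance_type="diag",
                      random_state=0).fit(block_spectrum_features(face))

probe = rng.random((64, 64))
score = gmm.score(block_spectrum_features(probe))  # mean log-likelihood under this identity
print(score)
```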

Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET

Procedia PDF Downloads 638
640 Theta-Phase Gamma-Amplitude Coupling as a Neurophysiological Marker in Neuroleptic-Naive Schizophrenia

Authors: Jun Won Kim

Abstract:

Objective: Theta-phase gamma-amplitude coupling (TGC) was used as a novel evidence-based tool to reflect the dysfunctional cortico-thalamic interaction in patients with schizophrenia. However, to our best knowledge, no studies have reported the diagnostic utility of TGC in the resting-state electroencephalogram (EEG) of neuroleptic-naive patients with schizophrenia compared to healthy controls. Thus, the purpose of this EEG study was to understand the underlying mechanisms in patients with schizophrenia by comparing the resting-state TGC between the two groups and to evaluate the diagnostic utility of TGC. Method: The subjects included 90 patients with schizophrenia and 90 healthy controls. All patients were diagnosed with schizophrenia according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) by two independent psychiatrists using semi-structured clinical interviews. Because patients were either drug-naive (first episode) or had not been taking psychoactive drugs for one month before the study, we could exclude the influence of medications. Six frequency bands were defined for spectral analyses: delta (1–4 Hz), theta (4–8 Hz), slow alpha (8–10 Hz), fast alpha (10–13.5 Hz), beta (13.5–30 Hz), and gamma (30–80 Hz). The spectral power of the EEG data was calculated with the fast Fourier transform using the 'spectrogram.m' function of the signal processing toolbox in Matlab. An analysis of covariance (ANCOVA) was performed to compare the TGC results between the groups, adjusted using a Bonferroni correction (P < 0.05/19 = 0.0026). Receiver operating characteristic (ROC) analysis was conducted to examine the discriminating ability of the TGC data for schizophrenia diagnosis. Results: The patients with schizophrenia showed a significant increase in the resting-state TGC at all electrodes. The delta, theta, slow alpha, fast alpha, and beta powers showed low accuracies of 62.2%, 58.4%, 56.9%, 60.9%, and 59.0%, respectively, in discriminating the patients with schizophrenia from the healthy controls. The ROC analysis performed on the TGC data generated the most accurate result among the EEG measures, displaying an overall classification accuracy of 92.5%. Conclusion: As TGC includes phase, which carries information about neuronal interactions in the EEG recording, TGC is expected to be useful for understanding the mechanisms of the dysfunctional cortico-thalamic interaction in patients with schizophrenia. The resting-state TGC value was increased in the patients with schizophrenia compared to the healthy controls and had a higher discriminating ability than the other parameters. These findings may be related to the compensatory hyper-arousal patterns of the dysfunctional default-mode network (DMN) in schizophrenia. Further research exploring the association between TGC and medical or psychiatric conditions that may confound EEG signals will help clarify the potential utility of TGC.
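
Phase-amplitude coupling of this kind is commonly quantified by band-pass filtering into the two bands, taking the Hilbert transform, and measuring how gamma amplitude depends on theta phase. A minimal mean-vector-length sketch on synthetic data follows; the estimator choice is an assumption, since the abstract does not name the exact TGC formula:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250
rng = np.random.default_rng(7)
t = np.arange(0, 60, 1 / fs)

# Synthetic EEG: gamma bursts locked to theta phase, plus noise.
theta = np.sin(2 * np.pi * 6 * t)
gamma = (1 + theta) * np.sin(2 * np.pi * 40 * t)
eeg = theta + 0.3 * gamma + 0.5 * rng.normal(size=t.size)

def bandpass(x, lo, hi):
    b, a = butter(4, [lo, hi], btype="band", fs=fs)
    return filtfilt(b, a, x)

phase = np.angle(hilbert(bandpass(eeg, 4, 8)))    # theta phase
amp = np.abs(hilbert(bandpass(eeg, 30, 80)))      # gamma amplitude envelope
tgc = np.abs(np.mean(amp * np.exp(1j * phase)))   # mean vector length (raw coupling)
print(f"TGC = {tgc:.4f}")
```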

Keywords: quantitative electroencephalography (QEEG), theta-phase gamma-amplitude coupling (TGC), schizophrenia, diagnostic utility

Procedia PDF Downloads 119
639 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types

Authors: Chaghoub Soraya, Zhang Xiaoyan

Abstract:

This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. The problem had previously been addressed only with a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant approximation algorithm for the p-median network design problem with multiple cable types. The addressed problem is a combination of two well-studied problems: the p-median problem and the network design problem. The introduced algorithm is a random sampling approximation algorithm of constant factor, conceived by using random sampling techniques from the literature. It is based on a redistribution lemma from the literature and uses a Steiner tree problem as a subproblem. The algorithm is simple, and it relies on the notions of random sampling and probability. The proposed approach gives an approximate solution with one constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.
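
To illustrate the random-sampling flavor of such algorithms, here is a generic sampling heuristic for the plain p-median objective on a toy instance; this is an illustration of the idea only, not the paper's constant-factor routine:

```python
import random

random.seed(0)
# Toy p-median instance: choose p medians among the points, minimizing total distance.
pts = [(random.random(), random.random()) for _ in range(40)]
p = 3
dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def cost(medians):
    # Each point is served by its nearest chosen median.
    return sum(min(dist(x, m) for m in medians) for x in pts)

# Random sampling: draw many random p-subsets and keep the best one found.
best = min((random.sample(pts, p) for _ in range(2000)), key=cost)
print(f"best sampled cost: {cost(best):.3f}")
```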

Keywords: approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median

Procedia PDF Downloads 172
638 Probabilistic Modeling of Post-Liquefaction Ground Deformation

Authors: Javad Sadoghi Yazdi, Robb Eric S. Moss

Abstract:

This paper utilizes a probabilistic liquefaction triggering method for modeling post-liquefaction ground deformation. The cone penetration test (CPT)-based liquefaction triggering is employed to estimate the factor of safety against liquefaction (FSL) and compute the maximum cyclic shear strain (γmax). The study identifies a maximum PL value of 90% across various relative densities, which challenges the decrease from 90% to 70% as relative density decreases. It reveals that PL ranges from 5% to 50% for volumetric strains (εvol) less than 1%, while for εvol values between 1% and 3.2%, PL spans from 50% to 90%. CPT-based simplified liquefaction triggering procedures have been employed in previous research to estimate liquefaction ground-failure indices, such as the Liquefaction Potential Index (LPI) and the Liquefaction Severity Number (LSN). However, several studies have highlighted the variability in liquefaction probability calculations, suggesting that a more accurate depiction of liquefaction likelihood is needed; consequently, these simplified methods may not offer practical efficiency. This paper further investigates the efficacy of established liquefaction vulnerability parameters, including LPI and LSN, in explaining the observed liquefaction-induced damage within residential zones of Christchurch, New Zealand, using results from a CPT database.
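
For context, the Liquefaction Potential Index mentioned above is conventionally computed by depth-weighting the severity of liquefaction over the top 20 m (Iwasaki's formulation). A minimal sketch follows, assuming FSL values are already available per depth increment; the profile values are hypothetical:

```python
import numpy as np

# Iwasaki-style LPI: integrate F(z) * w(z) over 0-20 m, where
# F = 1 - FSL when FSL < 1 (else 0) and w(z) = 10 - 0.5 z.
depths = np.arange(0.5, 20.0, 1.0)                               # midpoints of 1 m layers
fs_liq = np.array([0.8, 0.9, 1.1, 0.7, 1.3] * 4)[:depths.size]   # hypothetical FSL profile

F = np.clip(1.0 - fs_liq, 0.0, None)
w = 10.0 - 0.5 * depths
lpi = np.sum(F * w * 1.0)          # dz = 1 m layer thickness
print(f"LPI = {lpi:.2f}")          # values above ~15 are commonly read as severe
```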

Keywords: cone penetration test (CPT), liquefaction, postliquefaction, ground failure

Procedia PDF Downloads 41
637 Customers' Perception towards the Service Marketing Mix and Frequency of Use of Mercedes Benz Automobile Service, Thailand

Authors: Pranee Tridhoskul

Abstract:

This research paper examines the relationship between the service marketing mix and customers' frequency of use of service at Mercedes-Benz auto repair centres under Thonburi Group, Thailand. From a population of 2,267 customers who used the service of Thonburi Group's auto repair centres, a sample of 340 was drawn using a probability sampling technique. Systematic random sampling was applied, with questionnaires used to collect the data at the centres. Means and Pearson correlations were used to analyze the data. The study discovered a medium level of customers' perception towards the product and service of Thonburi Group's auto repair centres, price, place or distribution channel, and promotion. The people who provided the service were also perceived at a medium level, whereas the physical evidence and service process were perceived at a high level. Furthermore, there was a correlation between the physical evidence and service process and customers' frequency of use of automobile service per year.

Keywords: service marketing mix, behavior, Mercedes Auto Service Centre, frequency of use

Procedia PDF Downloads 302
636 Factors Leading to Teenage Pregnancy in the Selected Villages of Mopani District, in Limpopo Province

Authors: Z. N. Salim, R. T. Lebese, M. S. Maputle

Abstract:

Background: The international community has been concerned about population growth for more than a century. Teenagers in sub-Saharan Africa continue to be at high risk of HIV infection, and this is exacerbated by poverty: many teenagers in Africa come from disadvantaged families or backgrounds, which leads them to engage in sexual activities at an early age for survival, in turn increasing the rate of teenage pregnancy. Purpose: The study sought to explore, describe, and identify the factors that lead to teenage pregnancy in the selected villages of Mopani District. Design: The study was conducted using a qualitative, explorative, descriptive, and contextual approach. A non-probability purposive sampling approach was used. The researcher collected the data with the assistance of a research assistant. Participants were interviewed, and the information was captured on a tape recorder, analysed using open coding, and thereafter organised into main themes, themes, and sub-themes. The researcher conducted four focus groups with participants aged between 10 and 19 years. Results: The findings of the study revealed several factors that contribute to teenagers falling pregnant; personal, institutional, and cultural factors were identified as leading to teenage pregnancy.

Keywords: factors, leading, pregnancy, teenage

Procedia PDF Downloads 179
635 Assessment of Radiological Dose for Th-232 Laboratory Accumulated in Tropical Freshwater Fish

Authors: Zal U’yun Wan Mahmood, Norfaizal Mohamed, Nita Salina Abu Bakar, Yii Mei Wo, Abdul Kadir Ishak, Mohamad Noh Sawon, Mohd Tarmizi Ishak, Khairul Nizam Razali

Abstract:

The bioaccumulation of a thorium radiotracer in the whole body of a tropical freshwater fish (Anabas testudineus; climbing perch) was studied. The objective of this study was to evaluate the effect of different Th-232 activity concentrations and the radiological dose to Anabas testudineus under laboratory bioaccumulation conditions. Adult Anabas testudineus were exposed to different waterborne Th-232 levels for 30 days: 0 Bq L-1 (control), 50 Bq L-1, 100 Bq L-1, 150 Bq L-1, and 200 Bq L-1. Radionuclide concentration ratios between whole-body levels and water levels were calculated, and total dose rates and risk quotients were estimated using the ERICA Assessment Tool. The results showed that increasing waterborne Th-232 concentration corresponded to a progressive decrease in the Th concentration ratio. Meanwhile, the total dose rate (internal and external) in the whole body of Anabas testudineus was less than the ERICA dose-rate screening value of 10 µGy h-1, and the risk quotient was less than one. Thus, it can be concluded that a radiological dose of Th-232 to Anabas testudineus has a very low probability and the situation may be considered of negligible radiological concern.
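
The screening logic summarized above is simple arithmetic. A small sketch of the concentration ratio and risk quotient computations follows; the numbers are hypothetical, not the study's measurements:

```python
# ERICA-style screening quantities.
c_fish = 12.0     # hypothetical whole-body activity concentration, Bq/kg
c_water = 100.0   # waterborne Th-232, Bq/L
cr = c_fish / c_water            # concentration ratio (organism / water)

dose_rate = 0.8                  # hypothetical total dose rate, uGy/h
screening = 10.0                 # ERICA dose-rate screening value, uGy/h
rq = dose_rate / screening       # risk quotient < 1 => negligible radiological concern
print(f"CR = {cr:.3f}, RQ = {rq:.2f}")
```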

Keywords: Anabas testudineus, bioaccumulation, radiological dose, Th-232

Procedia PDF Downloads 299
634 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation

Authors: Sopheak Sorn, Kwok Yip Szeto

Abstract:

Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem, which is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction, with each cell containing exactly one point of the pattern, and compute the reliability of the Voronoi construction's dual, i.e., the Delaunay graph. We further investigate the communicability of the Delaunay network. We find a positive correlation between the homogeneity of a Delaunay network's degree distribution and its reliability, and a negative correlation between that homogeneity and its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network towards the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
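
Building the Delaunay graph of a random point pattern and measuring its communicability are both routine with SciPy and networkx; a minimal sketch follows (the reliability computation itself, the probability of staying connected under random link failures, is the expensive part and is omitted here):

```python
import numpy as np
import networkx as nx
from scipy.spatial import Delaunay

rng = np.random.default_rng(8)
pts = rng.random((50, 2))        # random 2-D point pattern

# Build the Delaunay graph: nodes are points, links are triangulation edges.
tri = Delaunay(pts)
G = nx.Graph()
for simplex in tri.simplices:
    for k in range(3):
        G.add_edge(simplex[k], simplex[(k + 1) % 3])

# Network communicability (Estrada-Hatano): weighted sum over walks of all lengths.
comm = nx.communicability_exp(G)
total_comm = sum(sum(row.values()) for row in comm.values())
degrees = [d for _, d in G.degree()]
print(f"edges={G.number_of_edges()}, mean degree={np.mean(degrees):.2f}, "
      f"total communicability={total_comm:.1f}")
```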

Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi

Procedia PDF Downloads 395
633 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment

Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai

Abstract:

Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, their quantification is important for assessing the hazard. The time and place of a future earthquake are both uncertain, and since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that damage to life and property is minimized. Seismic hazard analysis plays an important role in the earthquake-resistant design of structures by providing rational values of the input parameters. In this paper, both the mathematical and the computational methods adopted by researchers globally in the past five years will be discussed. Mathematical approaches involving the concepts of Poisson's ratio, convex set theory, the empirical Green's function, Bayesian probability estimation applied to seismic hazard, and the FOSM (first-order second-moment) algorithm will be discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study the dynamic soil-structure interaction problem, are also discussed, as is the GIS-based tool predominantly used in the assessment of seismic hazards.
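
One recurring building block in probabilistic seismic hazard assessment is the Poisson occurrence model, under which the probability of at least one exceedance in a design life follows directly from the annual rate. A small worked sketch (rates chosen for illustration):

```python
import numpy as np

# Poisson occurrence model: P(at least one exceedance in t years) = 1 - exp(-rate * t).
annual_rate = 1 / 475    # exceedance rate of the "475-year" ground motion
t = 50                   # design life in years

p_exceed = 1 - np.exp(-annual_rate * t)
print(f"P(exceedance in {t} yr) = {p_exceed:.3f}")  # ~0.10, the familiar 10%-in-50-years
```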

Keywords: computational methods, MATLAB, seismic hazard, seismic measurements

Procedia PDF Downloads 314
632 Quantum Decision Making with Small Sample for Network Monitoring and Control

Authors: Tatsuya Otoshi, Masayuki Murata

Abstract:

With the development and diversification of applications on the Internet, applications that require high responsiveness, such as video streaming, are becoming mainstream. Application responsiveness is not only a matter of communication delay but also of the time required to grasp changes in network conditions. The tradeoff between accuracy and measurement time is a challenge in network control. People make countless decisions all the time, and these decisions seem to resolve tradeoffs between time and accuracy: when making decisions, people are known to make appropriate choices based on relatively small samples. Although there have been various studies on models of human decision-making, a model that integrates various cognitive biases, called "quantum decision-making", has recently attracted much attention. However, the modeling of small samples has not been examined much so far. In this paper, we extend the quantum decision-making model to decision-making with a small sample. In the proposed model, the state is updated by value-based probability amplitude amplification. By analytically obtaining a lower bound on the number of samples required for decision-making, we show that decision-making with a small number of samples is feasible.
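
Amplitude amplification, the mechanism behind Grover's algorithm in the keywords, rotates probability mass toward a target state quadratically fast. A small classical simulation of the success probability after k iterations follows; this is the generic Grover formula, not the authors' value-based variant:

```python
import numpy as np

# Grover amplitude amplification over N options with one marked target:
# success probability after k iterations is sin^2((2k + 1) * theta),
# where sin(theta) = 1 / sqrt(N).
N = 64
theta = np.arcsin(1 / np.sqrt(N))

for k in range(8):
    p = np.sin((2 * k + 1) * theta) ** 2
    print(f"k={k}: P(correct) = {p:.3f}")
# The optimum sits near k ~ (pi/4) * sqrt(N), far fewer evaluations than the
# N a naive search would need -- the sense in which a small sample can suffice.
```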

Keywords: quantum decision making, small sample, MPEG-DASH, Grover's algorithm

Procedia PDF Downloads 54
631 On Estimating the Low Income Proportion with Several Auxiliary Variables

Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández

Abstract:

Poverty measurement is a very important topic in many social science studies. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of people in a population classified as poor. This indicator is generally unknown, and for this reason it is estimated using survey data obtained from official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain several additional variables, named auxiliary variables, related to the variable of interest, and if so, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. In this simulation study, we considered real data sets obtained from the 2011 European Union Survey on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
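
A minimal sketch of the idea, comparing the naive sample proportion with a simple ratio-type estimator that exploits one auxiliary variable whose population mean is known; the data are synthetic, whereas the study used real EU-SILC variables:

```python
import numpy as np

rng = np.random.default_rng(9)

# Finite population: income plus a correlated auxiliary variable (e.g. register income).
N, n, reps = 50_000, 500, 2_000
aux = rng.lognormal(10, 0.5, N)
income = aux * rng.lognormal(0, 0.2, N)
poverty_line = 0.6 * np.median(income)
poor = income < poverty_line                      # low income indicator
aux_poor = aux < np.quantile(aux, poor.mean())    # auxiliary indicator, population mean known

naive_err, ratio_err = [], []
for _ in range(reps):
    s = rng.choice(N, n, replace=False)           # simple random sample
    naive = poor[s].mean()
    ratio = poor[s].mean() / max(aux_poor[s].mean(), 1e-9) * aux_poor.mean()
    naive_err.append((naive - poor.mean()) ** 2)
    ratio_err.append((ratio - poor.mean()) ** 2)

print(f"MSE naive: {np.mean(naive_err):.2e}, MSE ratio: {np.mean(ratio_err):.2e}")
```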

Keywords: inclusion probability, poverty, poverty line, survey sampling

Procedia PDF Downloads 428
630 Optimal Allocation of Multiple Emergency Resources for a Single Potential Accident Node: A Mixed Integer Linear Program

Authors: Yongjian Du, Jinhua Sun, Kim M. Liew, Huahua Xiao

Abstract:

Optimal allocation of emergency resources before a disaster is of great importance for emergency response. In reality, pre-protection for a single critical node where accidents may occur is common. In this study, a model is developed to determine the location and inventory decisions for multiple emergency resources among a set of candidate stations, minimizing the total cost under budgetary and capacity constraints. The total cost includes the economic accident loss, which follows a probability distribution over time, and the warehousing cost of the resources, which increases over time. A ratio, which grows as the distance between a storage station and the target node decreases, is set to measure the degree to which the station serves only that node. For the formulation to remain a linear program, it is assumed that the travel time of emergency resources to the accident scene has a linear relationship with the economic accident loss. A computational experiment is conducted to illustrate how the proposed model works, and the results indicate its effectiveness and practicability.
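
A toy version of such a location-inventory MILP, written with the PuLP modeling library, is sketched below; the stations, costs, and capacities are hypothetical, and the formulation is a simplified stand-in for the authors' model (it omits the distance ratio and the time-dependent loss):

```python
import pulp

stations = ["s1", "s2", "s3"]
resources = ["pump", "foam"]
open_cost = {"s1": 40, "s2": 55, "s3": 30}
unit_cost = {"pump": 5, "foam": 3}
capacity = {"s1": 20, "s2": 30, "s3": 15}
demand = {"pump": 18, "foam": 12}    # needs of the single potential accident node
budget = 250

m = pulp.LpProblem("pre_allocation", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", stations, cat="Binary")
x = pulp.LpVariable.dicts("stock", [(s, r) for s in stations for r in resources],
                          lowBound=0, cat="Integer")

m += (pulp.lpSum(open_cost[s] * y[s] for s in stations)
      + pulp.lpSum(unit_cost[r] * x[s, r] for s in stations for r in resources))
for r in resources:                                   # cover the accident node's demand
    m += pulp.lpSum(x[s, r] for s in stations) >= demand[r]
for s in stations:                                    # station capacity, only if opened
    m += pulp.lpSum(x[s, r] for r in resources) <= capacity[s] * y[s]
m += pulp.lpSum(open_cost[s] * y[s] for s in stations) <= budget   # budget constraint

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[m.status], {s: y[s].value() for s in stations})
```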

Keywords: emergency response, integer linear program, multiple emergency resources, pre-allocation decisions, single potential accident node

Procedia PDF Downloads 134
629 High Performance Field Programmable Gate Array-Based Stochastic Low-Density Parity-Check Decoder Design for IEEE 802.3an Standard

Authors: Ghania Zerari, Abderrezak Guessoum, Rachid Beguenane

Abstract:

This paper introduces a high-performance architecture for a fully parallel stochastic Low-Density Parity-Check (LDPC) decoder on a field programmable gate array (FPGA). The new approach is designed to decrease the decoding latency and to reduce FPGA logic utilisation. To accomplish the targeted logic utilisation reduction, the routing of the proposed sub-variable node (VN) internal memory is designed to utilise one slice of distributed RAM. Furthermore, VN initialization using the channel input probability is performed to enhance decoder convergence, without extra resources and without integrating output saturated counters. The Xilinx FPGA implementation of an IEEE 802.3an standard LDPC code shows that the proposed decoding approach attains high performance along with reduced FPGA logic utilisation.
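
The core trick in stochastic decoding is representing each probability as a Bernoulli bit-stream, so that arithmetic on probabilities reduces to single logic gates. A tiny software illustration of the principle follows (this shows the encoding idea only, not the paper's decoder):

```python
import numpy as np

rng = np.random.default_rng(10)
L = 10_000   # bit-stream length

def to_stream(p):
    # Encode probability p as a Bernoulli(p) bit-stream.
    return rng.random(L) < p

a, b = 0.8, 0.6
# In stochastic computing, multiplying two independent probabilities
# is just a bitwise AND of their streams -- one gate in hardware.
prod_stream = to_stream(a) & to_stream(b)
print(f"estimated a*b = {prod_stream.mean():.3f} (exact {a * b})")
```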

Keywords: low-density parity-check (LDPC) decoder, stochastic decoding, field programmable gate array (FPGA), IEEE 802.3an standard

Procedia PDF Downloads 276
628 Use of Multistage Transition Regression Models for Credit Card Income Prediction

Authors: Denys Osipenko, Jonathan Crook

Abstract:

Because of the variety of card holders' behaviour types and income sources, each consumer account can move through a variety of states. Each consumer account can be inactive, transactor, revolver, delinquent, or defaulted, and each requires an individual model for income prediction. Estimating transition probabilities between statuses at the account level helps to avoid the memorylessness of the Markov chain approach. This paper investigates transition probability estimation approaches for credit card income prediction at the account level. The key question of the empirical research is which approach gives more accurate results: multinomial logistic regression or multistage conditional logistic regression with a binary target. Both models have shown moderate predictive power. Prediction accuracy for conditional logistic regression depends on the order of the stages of the conditional binary logistic regressions. On the other hand, multinomial logistic regression is easier to use and gives integrated estimates for all states without priorities. Further investigations can therefore concentrate on alternative modeling approaches such as discrete choice models.
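
A minimal sketch of the multinomial approach, estimating account-level transition probabilities from hypothetical behavioural features; the state set matches the abstract, but the features and synthetic labels are placeholders:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
states = ["inactive", "transactor", "revolver", "delinquent", "defaulted"]

# Hypothetical account features: utilisation, months on book, payment ratio.
n = 5_000
X = np.column_stack([rng.random(n), rng.integers(1, 120, n), rng.random(n)])
next_state = rng.integers(0, len(states), n)   # placeholder next-month state labels

clf = LogisticRegression(max_iter=1000).fit(X, next_state)   # multinomial softmax model
probs = clf.predict_proba([[0.95, 24, 0.02]])[0]             # high utilisation, low payments
for s, p in zip(states, probs):
    print(f"P(next = {s}) = {p:.3f}")
```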

Keywords: multinomial regression, conditional logistic regression, credit account state, transition probability

Procedia PDF Downloads 465
627 Real-Time Network Anomaly Detection Systems Based on Machine-Learning Algorithms

Authors: Zahra Ramezanpanah, Joachim Carvallo, Aurelien Rodriguez

Abstract:

This paper aims to detect anomalies in streaming data using machine learning algorithms. To this end, we designed two separate pipelines and evaluated the effectiveness of each. The first pipeline, based on supervised machine learning methods, consists of two phases. In the first phase, we trained several supervised models using the UNSW-NB15 data-set, measured the efficiency of each using different performance metrics, and selected the best model for the second phase. At the beginning of the second phase, we sniffed a local area network using Argus Server. Several types of attacks were simulated, and the sniffed data were then sent to the running algorithm at short intervals. This algorithm can display the results for each packet of received data in real time using the trained model. The second pipeline presented in this paper is based on unsupervised algorithms, in which a Temporal Graph Network (TGN) is used to monitor a local network. The TGN is trained to predict the probability of future states of the network based on its past behavior. Our contribution in this section is the introduction of an indicator that identifies anomalies from these predicted probabilities.
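
The training phase of the first pipeline can be sketched with scikit-learn; the file path and column handling below are assumptions about the published UNSW-NB15 CSVs, and the classifier choice is illustrative rather than the paper's selected model:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Assumed layout: numeric feature columns plus a binary 'label' (0 = normal, 1 = attack).
df = pd.read_csv("UNSW_NB15_training-set.csv")
X = df.select_dtypes("number").drop(columns=["label"])
y = df["label"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```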

Keywords: temporal graph network, anomaly detection, cyber security, IDS

Procedia PDF Downloads 83
626 Risk Based Building Information Modeling (BIM) for Urban Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

Building Information Modeling (BIM) is a holistic documentation process for operational visualization, design coordination, estimation, and project scheduling. BIM software defines objects parametrically and is a tool for virtual reality. The primary advantage of implementing BIM is the visual coordination of the building structure and systems, such as mechanical, electrical, and plumbing (MEP), together with the identification of possible conflicts between the building systems. This paper is an attempt to develop a risk-based BIM model that highlights the primary advantages of applying BIM to an urban infrastructure transportation project. It has been observed that about 40% of Architecture, Engineering and Construction (AEC) companies use BIM, but primarily for their outsourced projects. Also, 65% of the respondents agree that BIM would be used quite strongly for future construction projects in India. The 3D models developed with Revit 2015 software would reduce coordination problems amongst the architects, structural engineers, contractors, and building service providers (MEP). Integrating risk management with BIM would provide enhanced coordination and collaboration and a high probability of successful completion of the complex infrastructure transportation project within the stipulated time and cost frame.

Keywords: building information modeling (BIM), infrastructure transportation, project risk management, underground metro rail

Procedia PDF Downloads 286
625 Surgical Site Infections Post Ventriculoperitoneal (VP) Shunting: A Matched Healthcare Cost and Length of Stay Study

Authors: Issa M. Hweidi, Saba W. Al-Ibraheem

Abstract:

This study aimed to assess the increased hospital length of stay and healthcare costs associated with surgical site infections (SSIs) among ventriculoperitoneal (VP) shunting surgery patients in Jordan. The study adopted a retrospective, nested, 1:1 matched case-control design. A non-probability convenience sample of 48 VP shunt patients was recruited for the purpose of the study. The matching of the targeted groups was used to cross-match the investigated variables and minimize the risk of confounding. Information was extracted from the text of patients' electronic health records. Compared to the non-SSI group, the SSI group incurred an extra mean healthcare cost of $13,696.53 (p=0.001) and a longer hospital length of stay (a mean of 22.64 additional days). Furthermore, Acinetobacter baumannii and Klebsiella pneumoniae were identified as the most predominant causative agents of SSIs. The results of this study may provide baseline data for national and regional benchmarking to evaluate the quality of care provided to similar patients. Adherence to infection control strategies and protocols, considering new surveillance methods for SSIs, is encouraged.

Keywords: ventriculoperitoneal shunt, health care cost, length of stay, neurosurgery, surgical site infections

Procedia PDF Downloads 47
624 Hybridized Simulated Annealing with Chemical Reaction Optimization for Solving to Sequence Alignment Problem

Authors: Ernesto Linan, Linda Cruz, Lucero Becerra

Abstract:

In this paper, a new hybridized algorithm based on Chemical Reaction Optimization and Simulated Annealing is proposed to solve the sequence alignment problem. Chemical Reaction Optimization is a population-based meta-heuristic algorithm based on the principles of a chemical reaction. Simulated Annealing is a general-purpose method applied to a large number of combinatorial optimization problems. We propose a hybridization of the Chemical Reaction Optimization algorithm and Simulated Annealing in order to solve the sequence alignment problem. An initial population of molecules is defined at the beginning of the proposed algorithm, where each molecule represents a candidate solution to the sequence alignment problem. In order to simulate inter-molecule collisions, the chemical reaction process is placed inside the Metropolis cycle at certain values of the temperature. Inside this cycle, molecules change due to collisions, and some molecules are accepted by applying the Boltzmann probability. The results of the hybrid scheme are better than the results obtained by each method separately.
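
The Boltzmann acceptance rule at the heart of the Metropolis cycle is compact; here is a generic sketch of the acceptance step with a geometric cooling schedule (the energy function for an alignment is left abstract):

```python
import math
import random

def metropolis_accept(delta_energy, temperature):
    """Accept a worse candidate with Boltzmann probability exp(-dE/T)."""
    if delta_energy <= 0:   # better (lower-energy) alignments are always accepted
        return True
    return random.random() < math.exp(-delta_energy / temperature)

# Collisions perturb molecules; acceptance gets stricter as the temperature drops.
random.seed(0)
T = 10.0
for step in range(5):
    print(f"T={T:.2f}  accept worse move: {metropolis_accept(2.0, T)}")
    T *= 0.8    # geometric cooling
```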

Keywords: chemical reaction optimization, sequence alignment problem, simulated annealing algorithm, metaheuristics

Procedia PDF Downloads 188
623 Reexamining Contrarian Trades as a Proxy of Informed Trades: Evidence from China's Stock Market

Authors: Dongqi Sun, Juan Tao, Yingying Wu

Abstract:

This paper reexamines the appropriateness of contrarian trades as a proxy for informed trades, using high-frequency Chinese stock data. Employing this measure over 5-minute intervals, a U-shaped intraday pattern in the probability of informed trades (PIN) is found for the CSI300 stocks, which is consistent with previous findings for other markets. However, when dividing the trades into different sizes, a reversed U-shaped PIN is observed for large-sized trades, as opposed to the U-shaped pattern for small- and medium-sized trades. Drawing from this mixed evidence across trade sizes, the price impact of trades is further investigated. By examining the relationship between trade imbalances and unexpected returns, large-sized trades are found to have a significant price impact. This implies that in intervals with large trades, it is non-contrarian trades that are more likely to be informed. Taking account of the price impact of large-sized trades, non-contrarian trades are used to proxy for informed trading in intervals with large trades, while contrarian trades are still used to measure informed trading in other intervals. A stronger U-shaped PIN is demonstrated after this modification. Autocorrelation and information advantage tests for robustness also support the modified informed trading measure.
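
The price-impact check described above amounts to regressing unexpected returns on signed trade imbalance per interval. A compact sketch on synthetic interval data follows; the column names and the data-generating process are assumptions for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(12)

# Synthetic 5-minute intervals: signed large-trade volume imbalance and returns.
n = 1_000
large_imbalance = rng.normal(0, 1, n)                # buy minus sell volume, large trades
ret = 0.3 * large_imbalance + rng.normal(0, 1, n)    # returns respond to large-trade flow
df = pd.DataFrame({"large_imbalance": large_imbalance, "unexpected_return": ret})

ols = sm.OLS(df["unexpected_return"], sm.add_constant(df["large_imbalance"])).fit()
print(ols.params, ols.pvalues)   # a significant slope -> large trades carry information
```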

Keywords: contrarian trades, informed trading, price impact, trade imbalance

Procedia PDF Downloads 144