Search results for: probability to pass the exam
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1843


1513 Semantic Analysis of the Change in Awareness of Korean College Admission Policy

Authors: Sujin Hwang, Hyerang Park, Hyunchul Kim

Abstract:

The purpose of this study is to assess the effectiveness of the admission simplification policy. Online news articles about ‘high school record’ were collected and semantically analyzed to identify social awareness of the policy during 2014 to 2015. The main results of the study are as follows: First, contrary to the expectation announced by KCUE that the burden on examinees would decrease, there was still a strain on the university entrance exam after the enforcement of the policy. Second, private tutoring is expanding in different forms rather than shrinking under the policy, contradicting the prediction that examinees could prepare for university admission without private tutoring. Thus, the college admission rules currently enforced need to be improved, and reasonable changes to the college admission system are discussed.

Keywords: education policy, private tutoring, shadow education, education admission policy

Procedia PDF Downloads 227
1512 QSAR Modeling of Germination Activity of a Series of 5-(4-Substituent-Phenoxy)-3-Methylfuran-2(5H)-One Derivatives with Potential of Strigolactone Mimics toward Striga hermonthica

Authors: Strahinja Kovačević, Sanja Podunavac-Kuzmanović, Lidija Jevrić, Cristina Prandi, Piermichele Kobauri

Abstract:

The present study is based on molecular modeling of a series of twelve 5-(4-substituent-phenoxy)-3-methylfuran-2(5H)-one derivatives with potential as strigolactone mimics toward Striga hermonthica. The first step of the analysis was the calculation of molecular descriptors that numerically describe the structures of the analyzed compounds. The descriptors ALOGP (lipophilicity), AClogS (water solubility) and BBB (blood-brain barrier penetration) served as the input variables in multiple linear regression (MLR) modeling of germination activity toward S. hermonthica. Two MLR models were obtained. The first MLR model contains the ALOGP and AClogS descriptors, while the second is based on these two descriptors plus the BBB descriptor. Despite breaking the Topliss-Costello rule, the second MLR model has much better statistical and cross-validation characteristics than the first. The ALOGP and AClogS descriptors are often very suitable predictors of the biological activity of many compounds: they carry important information about the behavior and availability of a compound in any biological system (i.e., its ability to pass through cell membranes). The BBB descriptor defines the ability of a molecule to pass through the blood-brain barrier. Besides the lipophilicity of a compound, this descriptor also reflects molecular bulkiness, since its value strongly depends on it. According to the results of the MLR modeling, these three descriptors are considered very good predictors of the germination activity of the analyzed compounds toward S. hermonthica seeds. This article is based upon work from COST Action FA1206, supported by COST (European Cooperation in Science and Technology).
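The MLR step described above can be sketched in a few lines. The descriptor values and activities below are hypothetical placeholders (the paper's dataset is not reproduced), and ordinary least squares is used as the estimator underlying MLR:

```python
import numpy as np

# Hypothetical (ALOGP, AClogS) descriptor values and germination
# activities for a small set of compounds; illustrative only.
X = np.array([
    [2.1, -3.5],
    [2.8, -4.1],
    [1.6, -2.9],
    [3.2, -4.8],
    [2.4, -3.7],
])
y = np.array([0.62, 0.71, 0.55, 0.78, 0.66])  # germination activity

# Add an intercept column and fit by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predicted activities and coefficient of determination R^2,
# the basic statistical characteristic of an MLR model.
y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(coef, r2)
```

Adding the BBB descriptor would simply append a third column to `X`; cross-validation statistics would then decide whether the extra descriptor is justified.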

Keywords: chemometrics, germination activity, molecular modeling, QSAR analysis, strigolactones

Procedia PDF Downloads 287
1511 Human Resource Utilization Models for Graceful Ageing

Authors: Chuang-Chun Chiou

Abstract:

In this study, a systematic framework of graceful ageing is used to explore possible human resource utilization models for graceful ageing. The framework is based on Chinese culture and is called the ‘Nine-old’ target: ageing gracefully with feeding, accomplishment, usefulness, learning, entertainment, care, protection, dignity, and termination. This study focuses on two of these areas: accomplishment and usefulness. We examine the current initiatives and laws promoting labor participation, that is, how to increase the labor force participation rate of the middle-aged as well as the elderly and thereby help the elderly achieve graceful ageing. We then present possible models that support graceful ageing.

Keywords: human resource utilization model, labor participation, graceful ageing, employment

Procedia PDF Downloads 390
1510 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by mean and variance and therefore can be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values. The table of critical values was constructed from simulations. The approach was compared with other techniques for the univariate case; it differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong sides of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At present, the extension to the two-dimensional case is complete, allowing up to five parameters to be tested jointly.
Therefore, the derived technique is equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
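The core quantity described in the abstract, the absolute difference in probabilities across the domain of two normal distributions, can be sketched numerically. The grid width, the lack of normalization, and the idea of comparing the result to simulated critical values are assumptions here, not the author's exact construction:

```python
import numpy as np

def _normal_pdf(x, mu, sd):
    """Density of N(mu, sd^2), written out to avoid external dependencies."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def abs_density_difference(mu1, sd1, mu2, sd2, n=20001):
    """Integral of |f1 - f2| over a wide grid: a simple stand-in for the
    'absolute difference in probabilities at each point' measure.
    0 means identical distributions; the value approaches 2 as the
    distributions become disjoint."""
    lo = min(mu1 - 8 * sd1, mu2 - 8 * sd2)
    hi = max(mu1 + 8 * sd1, mu2 + 8 * sd2)
    x = np.linspace(lo, hi, n)
    diff = np.abs(_normal_pdf(x, mu1, sd1) - _normal_pdf(x, mu2, sd2))
    dx = x[1] - x[0]
    return float(np.sum(0.5 * (diff[:-1] + diff[1:])) * dx)  # trapezoid rule

print(abs_density_difference(0, 1, 0, 1))   # identical parameters
print(abs_density_difference(0, 1, 10, 1))  # essentially disjoint
```

In a test, this statistic would be compared against a critical value obtained by simulation under the null hypothesis of equal parameters, as the abstract describes.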

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 175
1509 Comparison of Quality of Life One Year after Bariatric Intervention: Systematic Review of the Literature with Bayesian Network Meta-Analysis

Authors: Piotr Tylec, Alicja Dudek, Grzegorz Torbicz, Magdalena Mizera, Natalia Gajewska, Michael Su, Tanawat Vongsurbchart, Tomasz Stefura, Magdalena Pisarska, Mateusz Rubinkiewicz, Piotr Malczak, Piotr Major, Michal Pedziwiatr

Abstract:

Introduction: Quality of life after bariatric surgery is an important factor when evaluating the final result of treatment. Considering the vast surgical options, we compared the available methods globally in terms of quality of life following surgery. The aim of the study is to compare quality of life one year after bariatric intervention using network meta-analysis methods. Material and Methods: We performed a systematic review according to PRISMA guidelines with Bayesian network meta-analysis. Inclusion criteria were: studies comparing at least two methods of weight loss treatment, of which at least one was surgical, with quality of life assessed one year after surgery by validated questionnaires. The primary outcome was quality of life one year after the bariatric procedure. The following aspects of quality of life were analyzed: physical, emotional, general health, vitality, role physical, social, mental, and bodily pain. All questionnaires were standardized and pooled to a single scale. Lifestyle intervention was taken as the reference point. Results: An initial reference search yielded 5636 articles, of which 18 studies were evaluated. In the comparison of the total quality of life score, laparoscopic sleeve gastrectomy (LSG) (median (M): 3.606, Credible Interval 97.5% (CrI): 1.039; 6.191), laparoscopic Roux-en-Y gastric bypass (LRYGB) (M: 4.973, CrI: 2.627; 7.317) and open Roux-en-Y gastric bypass (RYGB) (M: 9.735, CrI: 6.708; 12.760) had better results than the other bariatric interventions relative to lifestyle intervention. In the analysis of the physical aspects of quality of life, we noticed better results than the control intervention for LSG (M: 3.348, CrI: 0.548; 6.147) and the LRYGB procedure (M: 5.070, CrI: 2.896; 7.208), and the worst results for open RYGB (M: -9.212, CrI: -11.610; -6.844). Analyzing the emotional aspects, we found better results than the control intervention for LSG, LRYGB, open RYGB, and laparoscopic gastric plication.
In general health, better results were found for LSG (M: 9.144, CrI: 4.704; 13.470), LRYGB (M: 6.451, CrI: 10.240; 13.830) and single-anastomosis gastric bypass (M: 8.671, CrI: 1.986; 15.310), and the worst results for open RYGB (M: -4.048, CrI: -7.984; -0.305). In the social and vitality aspects of quality of life, better results than the control intervention were observed for LSG and LRYGB. We did not find any differences between the bariatric interventions in the role physical, mental, and bodily pain aspects of quality of life. Conclusion: The network meta-analysis revealed that the best total quality of life scores one year after bariatric intervention were achieved after LSG, LRYGB, and open RYGB. In the physical and general health aspects, the worst quality of life was found after the open RYGB procedure. The other interventions did not significantly affect quality of life after one year compared to dietary intervention.

Keywords: bariatric surgery, network meta-analysis, quality of life, one year follow-up

Procedia PDF Downloads 159
1508 Nebulized Magnesium Sulfate in Acute Moderate to Severe Asthma in Pediatric Patients

Authors: Lubna M. Zakaryia Mahmoud, Mohammed A. Dawood, Doaa A. Heiba

Abstract:

A prospective double-blind placebo-controlled trial was carried out on 60 children known to be asthmatic who presented to the emergency department at Alexandria University Children’s Hospital at El-Shatby with acute asthma exacerbations, to assess the efficacy of adding inhaled magnesium sulfate to a β-agonist, compared with the β-agonist in saline, in the management of acute asthma exacerbations in children. The participants were divided into two groups. Group A (study group) received inhaled salbutamol solution (0.15 ml/kg) plus 2 ml of isotonic magnesium sulfate in a nebulizer chamber. Group B (control group) received nebulized salbutamol solution (0.15 ml/kg) diluted with placebo (2 ml normal saline). Both groups received the inhaled solution every 20 minutes, repeated for three doses. Patients were evaluated using the Pediatric Asthma Severity Score (PASS), oxygen saturation using portable pulse oximetry, and peak expiratory flow rate (PEFR) using a portable peak expiratory flow meter, initially (recorded as the zero-minute assessment) and every 20 minutes from the end of each nebulization (each lasting 5-10 minutes), recorded as the 20, 40 and 60-minute assessments. Regarding PASS, the comparison showed non-significant differences, with p-values of 0.463, 0.472 and 0.0766 at 20, 40 and 60 minutes. Regarding oxygen saturation, improvement was more significant in group A starting from 40 minutes (p=0.000 at both 40 and 60 minutes). Although mean PEFR improved significantly from the zero-minute assessment in both groups, the improvement was more significant in group A (p=0.015, 0.001 and 0.001 at 20, 40 and 60 minutes, respectively). This study suggests that inhaled magnesium sulfate is an efficient add-on drug to the standard β-agonist inhalation used in the treatment of moderate to severe asthma exacerbations.

Keywords: nebulized, magnesium sulfate, acute asthma, pediatric

Procedia PDF Downloads 183
1507 Organizational Innovations of the 20th Century as High Tech of the 21st: Evidence from Patent Data

Authors: Valery Yakubovich, Shuping Wu

Abstract:

Organization theorists have long claimed that organizational innovations are nontechnological, in part because they are unpatentable. The claim rests on the assumption that organizational innovations are abstract ideas embodied in persons and contexts rather than in context-free practical tools. However, over the last three decades, organizational knowledge has been increasingly embodied in digital tools which, in principle, can be patented. To provide the first empirical evidence regarding the patentability of organizational innovations, we trained two machine learning algorithms to identify a population of 205,434 patent applications for organizational technologies (OrgTech) and, among them, 141,285 applications that use organizational innovations accumulated over the 20th century. Our event history analysis of the probability of patenting an OrgTech invention shows that ideas from organizational innovations decrease the probability of patent allowance unless they describe a practical tool. We conclude that the present-day digital transformation places organizational innovations in the realm of high tech and turns the debate about organizational technologies into the challenge of designing practical organizational tools that embody big ideas about organizing. We outline an agenda for patent-based research on OrgTech as an emerging phenomenon.

Keywords: organizational innovation, organizational technology, high tech, patents, machine learning

Procedia PDF Downloads 122
1506 Determinants of Income Diversification among Support Zone Communities of National Parks in Nigeria

Authors: Daniel Etim Jacob, Samuel Onadeko, Edem A. Eniang, Imaobong Ufot Nelson

Abstract:

This paper examined the determinants of income diversification among households in the support zone communities of national parks in Nigeria, using household data collected through questionnaires administered randomly to 1009 household heads in the study area. The data were analyzed using probability and non-probability statistical methods such as regression and analysis of variance to test for mean differences between parks. The results indicate that the majority of household heads were male (92.57%), in the 21-40 year age class (44.90%), had non-formal education (38.16%), were farmers (65.21%), owned land (95.44%), had a household size of 1-5 (36.67%) and had an annual income in the range ₦401,000-₦600,000 (24.58%). The mean Simpson index of diversity showed a generally low (0.375) level of income diversification among the households. Income, age, off-farm dependence, education, household size and occupation were significant (p<0.01) factors affecting households’ income diversification. The study recommends improving the existing infrastructure and social capital in the communities as avenues to improve livelihoods and ensure positive conservation behaviors in the study area.
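As a minimal illustration of the diversity measure reported above, the Simpson index of diversity for a household's income can be computed as one minus the sum of squared income shares; the household figures below are hypothetical:

```python
def simpson_diversity(income_by_source):
    """Simpson index of diversity, 1 - sum(p_i^2), where p_i is the
    share of household income from source i. 0 means a single income
    source; values near 1 mean highly diversified income."""
    total = sum(income_by_source)
    if total <= 0:
        raise ValueError("total income must be positive")
    return 1 - sum((s / total) ** 2 for s in income_by_source)

# Hypothetical household with farming, trading, and wage income.
d = simpson_diversity([60000, 25000, 15000])
print(round(d, 3))  # -> 0.555
```

A mean index of 0.375 across households, as reported in the abstract, therefore corresponds to income concentrated in one or two dominant sources.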

Keywords: income diversification, protected area, livelihood, poverty, Nigeria

Procedia PDF Downloads 143
1505 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks

Authors: Ahmed Abdullah Ahmed

Abstract:

The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writings. Although great effort has been made in previous studies to devise various methods, their performance, especially in terms of accuracy, falls short, and room for improvement is still wide open. The proposed technique employs optimal-codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks, beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study is based on fragmenting particular areas of the writing, namely the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to divide the beginning and ending zones of the handwriting into small fragments. Similar fragments of beginning strokes are grouped together to create the beginning cluster, and similarly the ending strokes are grouped to create the ending cluster. These two clusters lead to the development of the two codebooks (beginning and ending) by choosing the center of every group of similar fragments. The writings under study are then represented by computing the probability of occurrence of the codebook patterns, and this probability distribution is used to characterize each writer. Two writings are compared by computing distances between their respective probability distributions. The evaluations were carried out on the standard ICFHR dataset of 206 writers, using the beginning and ending codebooks separately.
The ending codebook achieved the highest identification rate, 98.23%, which is the best result so far on the ICFHR dataset.
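The final comparison step, representing each writing by a distribution of codebook-pattern occurrences and computing a distance between distributions, can be roughly sketched as follows. The chi-square distance and the pattern counts are illustrative assumptions, since the abstract does not specify its distance measure:

```python
import numpy as np

def pattern_distribution(counts):
    """Probability of occurrence of each codebook pattern in a writing
    sample: raw occurrence counts normalized to a distribution."""
    c = np.asarray(counts, dtype=float)
    return c / c.sum()

def chi2_distance(p, q, eps=1e-12):
    """Chi-square distance, a common choice for comparing occurrence
    histograms; smaller values mean more similar writers."""
    return 0.5 * float(np.sum((p - q) ** 2 / (p + q + eps)))

# Hypothetical pattern counts for three writings over a 5-pattern codebook.
w1 = pattern_distribution([12, 3, 7, 0, 8])
w2 = pattern_distribution([11, 4, 6, 1, 8])   # same writer, similar habits
w3 = pattern_distribution([1, 15, 2, 9, 3])   # different writer
print(chi2_distance(w1, w2), chi2_distance(w1, w3))
```

Identification then amounts to assigning a query writing to the enrolled writer whose distribution lies at the smallest distance.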

Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments

Procedia PDF Downloads 512
1504 Real-World Comparison of Adherence to and Persistence with Dulaglutide and Liraglutide in UAE e-Claims Database

Authors: Ibrahim Turfanda, Soniya Rai, Karan Vadher

Abstract:

Objectives— The study aims to compare real-world adherence to and persistence with dulaglutide and liraglutide in patients with type 2 diabetes (T2D) initiating treatment in UAE. Methods— This was a retrospective, non-interventional study (observation period: 01 March 2017–31 August 2019) using the UAE Dubai e-Claims database. Included: adult patients initiating dulaglutide/liraglutide 01 September 2017–31 August 2018 (index period) with: ≥1 claim for T2D in the 6 months before index date (ID); ≥1 claim for dulaglutide/liraglutide during index period; and continuous medical enrolment for ≥6 months before and ≥12 months after ID. Key endpoints, assessed 3/6/12 months after ID: adherence to treatment (proportion of days covered [PDC; PDC ≥80% considered ‘adherent’], per-group mean±standard deviation [SD] PDC); and persistence (number of continuous therapy days from ID until discontinuation [i.e., >45 days gap] or end of observation period). Patients initiating dulaglutide/liraglutide were propensity score matched (1:1) based on baseline characteristics. Between-group comparison of adherence was analysed using the McNemar test (α=0.025). Persistence was analysed using Kaplan–Meier estimates with log-rank tests (α=0.025) for between-group comparisons. This study presents 12-month outcomes. Results— Following propensity score matching, 263 patients were included in each group. Mean±SD PDC for all patients at 12 months was significantly higher in the dulaglutide versus the liraglutide group (dulaglutide=0.48±0.30, liraglutide=0.39±0.28, p=0.0002). The proportion of adherent patients favored dulaglutide (dulaglutide=20.2%, liraglutide=12.9%, p=0.0302), as did the probability of being adherent to treatment (odds ratio [97.5% CI]: 1.70 [0.99, 2.91]; p=0.03). The proportion of persistent patients also favored dulaglutide (dulaglutide=15.2%, liraglutide=9.1%, p=0.0528), as did the probability of discontinuing treatment 12 months after ID (p=0.027).
Conclusions— Based on the UAE Dubai e-Claims database data, dulaglutide initiators exhibited significantly greater adherence in terms of mean PDC versus liraglutide initiators. The proportion of adherent patients and the probability of being adherent favored the dulaglutide group, as did treatment persistence.
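The adherence measure used above, the proportion of days covered (PDC), can be sketched as follows. The fill dates and days' supply are hypothetical, and real claims analyses handle overlapping fills and stockpiling in study-specific ways:

```python
from datetime import date

def proportion_of_days_covered(fills, days_supply, start, end):
    """PDC: fraction of days in [start, end] on which the patient had
    drug on hand, given fill dates and a fixed days' supply per fill.
    Patients with PDC >= 0.80 are conventionally labeled 'adherent'."""
    total_days = (end - start).days + 1
    covered = set()
    for fill in fills:
        for offset in range(days_supply):
            day = fill.toordinal() + offset
            if start.toordinal() <= day <= end.toordinal():
                covered.add(day)
    return len(covered) / total_days

# Hypothetical patient: 30-day supplies over a 120-day window,
# with one refill skipped in March.
fills = [date(2018, 1, 1), date(2018, 2, 1), date(2018, 4, 1)]
pdc = proportion_of_days_covered(fills, 30, date(2018, 1, 1), date(2018, 4, 30))
print(round(pdc, 2))  # -> 0.75, below the 80% adherence cutoff
```

Persistence, by contrast, would be measured as the number of continuous therapy days until a gap longer than 45 days appears between fills.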

Keywords: adherence, dulaglutide, effectiveness, liraglutide, persistence

Procedia PDF Downloads 126
1503 Physical Exam-Indicated Cerclage with Mesh Cap Prolonged Gestation on Average for 9 Weeks and 4 Days: 11 Years of Experience

Authors: M. Keršič, M. Lužnik, J. Lužnik

Abstract:

Cervical dilatation and membrane herniation before the 26th week of gestation pose a very high risk of extremely and very premature childbirth. Cerclage with a mesh cap (mesh cerclage, MC) can greatly diminish this risk and provide additional positive effects. Between 2005 and 2014, MC was performed in 9 patients with singleton pregnancies who had membranes prolapsed beyond the external cervical/uterine os before the 25th week of pregnancy (one in the 29th). With the patient under general anaesthesia in the lithotomy and Trendelenburg position (about 25°), the prolapsed membranes were repositioned into the uterine cavity using a tampon soaked in antiseptic solution (Skinsept mucosa). A circular purse-string suture (the main band) of double-strand Ethilon 1 was applied about 1 to 1.5 cm from the border of the external uterine os; 6 to 8 stitches were placed so that the whole external uterine os was encircled (modified McDonald). In the next step, additional Ethilon 0 sutures were placed around all exposed parts of the main double circular suture and loosely tightened. Onto these sutures, a round tailored mesh (diameter around 6 cm; Prolene® or Gynemesh* PS) was attached. In all 9 cases, gestation was prolonged on average by 9 weeks and 4 days (67 days). In four cases maturity was achieved. The mesh was removed in the 37th–38th week of pregnancy, or earlier if spontaneous labour began. In two cases, a caesarean section was performed because of breech presentation. One newborn, delivered in the 22nd week, died of immaturity in the first week after birth (premature birth had been threatening since the 18th week, when the MC was placed). Ten years after the first MC, 8 of the 9 women with singleton pregnancies and MC performed have 8 healthy children from these pregnancies. Mesh cerclage successfully closed the opened cervical canal or uterine orifice and prevented further membrane herniation and membrane rupture.
MC also provides an effect similar to occluding the external os by suturing, but without blocking the excretion of abundant cervical mucus. The mesh also pulls the main circular band outwards, lowering the chance of the suture cutting through the remaining cervix. MC prolonged gestation very successfully (by a mean of 9 weeks and 4 days) and thus increased the possibility of survival and diminished the risk of complications in very early preterm survivors in cases of cervical dilatation and membrane herniation before the 26th week of gestation. Without intervention, the possibility of reaching at least the 28th or 32nd week of gestation would be poor.

Keywords: cervical insufficiency, mesh cerclage, membrane protrusion, premature birth prevention, physical exam-indicated cerclage, rescue cerclage

Procedia PDF Downloads 190
1502 Numerical Studies on Bypass Thrust Augmentation Using Convective Heat Transfer in Turbofan Engine

Authors: R. Adwaith, J. Gopinath, Vasantha Kohila B., R. Chandru, Arul Prakash R.

Abstract:

The turbofan is an air-breathing engine, widely used in aircraft propulsion, that produces thrust mainly from the mass flow of air bypassing the engine core. The present research numerically develops an effective method for increasing the thrust generated from the bypass air. The thrust increase is brought about by heating the walls of the bypass valve via convective heat transfer; computationally, external heat is applied to enhance the velocity of the bypass air of the turbofan engine. The bypass valves are heated either externally, using a multi-cell tube resistor that converts electricity generated by dynamos into heat, or by heat transferred from the combustion chamber. This raises the temperature of the flow in the valves and thereby increases the velocity of the flow entering the nozzle of the engine. As a result, the mass flow of air passing through the engine core to produce thrust can be significantly reduced, saving a considerable amount of jet fuel. Numerical analysis was carried out on a scaled-down version of a typical turbofan bypass valve, with the valve wall temperature increased to 700 K. The analysis shows that the exit velocity contributing to thrust increased significantly, by 10%, due to the heating of the bypass valve. The optimum temperature increase and the corresponding gain in jet velocity are calculated to determine the operating temperature range for an efficient increase in velocity. The technique increases thrust using heated bypass air without extracting much additional work from the fuel, and thus improves the efficiency of existing turbofan engines. Dimensional analysis was carried out to verify the accuracy of the numerical results.
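The physical intuition behind the reported velocity gain, that isentropic nozzle exit velocity grows with the square root of the stagnation temperature, can be sketched as follows. The geometry, pressures and temperatures below are illustrative, not the paper's CFD values:

```python
import math

def nozzle_exit_velocity(T0, p0, pe, gamma=1.4, cp=1005.0):
    """Isentropic nozzle exit velocity [m/s] for air, given stagnation
    temperature T0 [K], stagnation pressure p0 [Pa] and exit (ambient)
    pressure pe [Pa]: v = sqrt(2*cp*T0*(1 - (pe/p0)^((gamma-1)/gamma)))."""
    return math.sqrt(2 * cp * T0 * (1 - (pe / p0) ** ((gamma - 1) / gamma)))

# Illustrative bypass nozzle: same pressure ratio, heated vs unheated flow.
v_cold = nozzle_exit_velocity(T0=300.0, p0=150e3, pe=101e3)
v_hot = nozzle_exit_velocity(T0=363.0, p0=150e3, pe=101e3)  # heated bypass air
print(v_cold, v_hot, v_hot / v_cold)
```

Because v scales with sqrt(T0), a 21% rise in stagnation temperature yields roughly a 10% gain in exit velocity, consistent in spirit with the 10% velocity increase the abstract reports for a heated valve wall.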

Keywords: turbofan engine, bypass valve, multi-cell tube, convective heat transfer, thrust

Procedia PDF Downloads 358
1501 Formulation of Famotidine Solid Lipid Nanoparticles (SLN): Preparation, Evaluation and Release Study

Authors: Rachmat Mauludin, Nurmazidah

Abstract:

Background and purpose: Famotidine is an H2 receptor blocker. Its oral absorption is rapid enough, but famotidine can be degraded by stomach acid, reducing the dose by up to 35.8% after 50 minutes. The drug also undergoes first-pass metabolism, which reduces its bioavailability to only 40-50%. To overcome these problems, solid lipid nanoparticles (SLNs) can be formulated as an alternative delivery system. SLN is a lipid-based drug delivery technology with a particle size of 50-1000 nm, in which the drug is incorporated into biocompatible lipids and the lipid particles are stabilized using appropriate stabilizers. When the particle size is 200 nm or below, the lipid containing famotidine can be absorbed through the lymphatic vessels to the subclavian vein, so first-pass metabolism can be avoided. Method: Famotidine SLNs with various stabilizer compositions were prepared using a high-speed homogenization and sonication method. The particle size distribution, zeta potential, entrapment efficiency, particle morphology and in vitro release profiles were then evaluated, and the sonication time was also optimized. Results: The particle size of the SLNs, measured by a particle size analyzer, ranged from 114.6 to 455.267 nm. SLNs ultrasonicated for 5 minutes had a smaller particle size than SLNs ultrasonicated for 10 or 15 minutes. The entrapment efficiency of the SLNs ranged from 74.17 to 79.45%. The particle morphology was spherical and the particles were individually distributed. The release study revealed that in acid medium, 28.89 to 80.55% of the famotidine was released after 2 hours, whereas in basic medium, 40.5 to 86.88% was released in the same period. Conclusion: The best formula was the SLNs stabilized by 4% Poloxamer 188 and 1% Span 20, which had a particle size of 114.6 nm, entrapped 77.14% of the famotidine, and showed spherical, individually distributed particles.
The SLNs with the best drug release profile were those stabilized by 4% Eudragit L 100-55 and 1% Tween 80, which released 36.34% in pH 1.2 solution and 74.13% in pH 7.4 solution after 2 hours. The optimum sonication time was 5 minutes.

Keywords: famotidine, SLN, high-speed homogenization, particle size, release study

Procedia PDF Downloads 860
1500 Complementing Assessment Processes with Standardized Tests: A Work in Progress

Authors: Amparo Camacho

Abstract:

ABET accredited programs must assess the development of student learning outcomes (SOs) in engineering programs. Different institutions implement different strategies for this assessment, and they are usually designed “in house.” This paper presents a proposal for including standardized tests to complement the ABET assessment model in an engineering college made up of six distinct engineering programs. The engineering college formulated a model of quality assurance in education to be implemented throughout the six engineering programs to regularly assess and evaluate the achievement of SOs in each program offered. The model uses diverse techniques and sources of data to assess student performance and to implement actions of improvement based on the results of this assessment. The model is called the “Assessment Process Model” and it includes SOs A through K, as defined by ABET. SOs can be divided into two categories: “hard skills” and “professional skills” (soft skills). The first includes abilities such as applying knowledge of mathematics, science, and engineering and designing and conducting experiments, as well as analyzing and interpreting data. The second category, “professional skills”, includes communicating effectively and understanding professional and ethical responsibility. Within the Assessment Process Model, various tools were used to assess SOs related to both “hard” and “soft” skills. The assessment tools designed included rubrics, surveys, questionnaires, and portfolios. In addition to these instruments, the Engineering College decided to use tools that systematically gather consistent quantitative data. For this reason, an in-house exam was designed and implemented, based on the curriculum of each program. Even though this exam was administered during various academic periods, it is not currently considered standardized.
In 2017, the Engineering College added three standardized tests: one to assess mathematical and scientific reasoning, and two more to assess reading and writing abilities. With these exams, the college hopes to obtain complementary information that can help better measure the development of both the hard and soft skills of students in the different engineering programs. In the first semester of 2017, the three exams were given to three sample groups of students from the six different engineering programs. Students in the sample groups were from the first, fifth, or tenth semester cohorts. At the time of submission of this paper, the engineering college has descriptive statistical data and is working with statisticians on a more in-depth and detailed analysis of the sample students’ achievement on the three exams. The overall objective of including standardized exams in the assessment model is to identify more precisely the least developed SOs in order to define and implement the educational strategies necessary for students to achieve them in each engineering program.

Keywords: assessment, hard skills, soft skills, standardized tests

Procedia PDF Downloads 284
1499 Modal Approach for Decoupling Damage Cost Dependencies in Building Stories

Authors: Haj Najafi Leila, Tehranizadeh Mohsen

Abstract:

Dependencies between the diverse factors involved in probabilistic seismic loss evaluation are recognized to be an important issue in acquiring accurate loss estimates. Dependencies among component damage costs can be taken into account by considering two distinct limiting states, independent or perfectly dependent, for the component damage states; however, to the best of our knowledge, no procedure is available to take account of loss dependencies at the story level. This paper presents a method called the “modal cost superposition method” for decoupling the story damage costs of buildings subjected to earthquake ground motions. The method deals with closed-form differential equations between damage cost and engineering demand parameters, which are solved as a coupled system comprising all stories’ cost equations by means of the introduced “substituted matrices of mass and stiffness”. Costs are treated as probabilistic variables with definite statistical factors, median and standard deviation, and a presumed probability distribution. To supplement the proposed procedure and to demonstrate the straightforwardness of its application, one benchmark study has been conducted. Acceptable compatibility has been shown between the damage costs estimated by the newly proposed modal approach and the frequently used stochastic approach for the entire building; at the story level, however, the insufficiency of employing a single modification factor to incorporate occurrence-probability dependencies between stories has been revealed, owing to the discrepant amounts of dependency between the damage costs of different stories. A greater contribution of dependency to the occurrence probability of loss can also be concluded from the greater compatibility of the loss results in the higher stories than in the lower ones, whereas reducing the number of incorporated cost modes still provides an acceptable level of accuracy and avoids the time-consuming calculations that including a large number of cost modes would require.

Keywords: dependency, story-cost, cost modes, engineering demand parameter

Procedia PDF Downloads 180
1498 A Theoretical Approach on Electoral Competition, Lobby Formation and Equilibrium Policy Platforms

Authors: Deepti Kohli, Meeta Keswani Mehra

Abstract:

The paper develops a theoretical model of electoral competition with purely opportunistic candidates and a uni-dimensional policy using the probabilistic voting approach, focusing on the aspect of lobby formation to analyze the inherent complex interactions between centripetal and centrifugal forces and their effects on equilibrium policy platforms. There exist three types of agents, namely Left-wing, Moderate and Right-wing, who together comprise the total voting population. It is assumed that the Left and Right agents are free to initiate a lobby of their choice. If initiated, these lobbies generate donations which in turn can be contributed to one (or both) electoral candidates in order to influence them to implement the lobby’s preferred policy. Four different lobby formation scenarios have been considered: no lobby formation, only Left, only Right, and both Left and Right. The equilibrium policy platforms, the amounts of individual donations by agents to their respective lobbies, and the contributions offered to the electoral candidates have been solved for under each of the above four cases. Since it is assumed that the agents cannot coordinate each other’s actions during the lobby formation stage, there exists a probability with which a lobby would be formed, which is also solved for in the model. The results indicate that the policy platforms of the two electoral candidates converge completely in the cases of no lobby and of both (extreme) lobbies being formed, but diverge in the cases of only one (Left or Right) lobby being formed. This is because in the case of no lobby being formed, only the centripetal forces (emerging from the election-winning aspect) are present, while in the case of both extreme (Left-wing and Right-wing) lobbies being formed, centrifugal forces (emerging from the lobby formation aspect) also arise but cancel each other out, again resulting in a pure policy convergence phenomenon.
In contrast, in the case of only one lobby being formed, both centripetal and centrifugal forces interact strategically, leading the two electoral candidates to choose completely different policy platforms in equilibrium. Additionally, it is found that in equilibrium, while the donation by a specific agent type increases when both lobbies form compared to when only one lobby is formed, the probability of implementation of the policy advocated by that lobby group falls.

Keywords: electoral competition, equilibrium policy platforms, lobby formation, opportunistic candidates

Procedia PDF Downloads 333
1497 Reliability Analysis of Glass Epoxy Composite Plate under Low Velocity Impact

Authors: Shivdayal Patel, Suhail Ahmad

Abstract:

Safety assurance and failure prediction of composite material components of offshore structures under low velocity impact are essential for the associated risk assessment. It is important to incorporate the uncertainties associated with the material properties and the impact load. The likelihood of this hazard causing a chain of failure events plays an important role in risk assessment. The material properties of composites mostly exhibit scatter due to their inhomogeneity, anisotropic characteristics, brittleness of the matrix and fiber, and manufacturing defects. In fact, the probability of occurrence of such a scenario stems from the large uncertainties arising in the system. Probabilistic finite element analysis of composite plates under low-velocity impact is carried out considering uncertainties in the material properties and the initial impact velocity. Impact-induced damage of a composite plate is a probabilistic phenomenon due to the wide range of uncertainties in material and loading behavior. A typical failure crack initiates and propagates into the interface, causing delamination between dissimilar plies. Since individual cracks in a ply are difficult to track, a progressive damage model is implemented in the FE code through a user-defined material subroutine (VUMAT). The limit state function g(x) is established such that the stresses in the lamina correspond to a safe state when g(x) > 0. The Gaussian process response surface method is adopted to determine the probability of failure. A comparative study is also carried out for different combinations of impactor masses and velocities. A sensitivity-based probabilistic design optimization procedure is investigated to achieve higher strength and lighter weight of composite structures. The chain of failure events due to different modes of failure is considered to estimate the consequences of a failure scenario. The frequencies of occurrence of specific impact hazards then yield the expected risk in terms of economic loss.
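The reliability step described above can be sketched as a plain Monte Carlo estimate of the failure probability from a limit state g(x), with failure when g(x) ≤ 0; the strength and stress distributions here are illustrative assumptions, not the paper's VUMAT-based damage model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative limit state g = strength - stress, with an assumed lognormal
# material strength and a normal impact-induced stress (values hypothetical).
strength = rng.lognormal(mean=np.log(400.0), sigma=0.1, size=n)  # MPa
stress = rng.normal(loc=300.0, scale=40.0, size=n)               # MPa

g = strength - stress
pf = np.mean(g <= 0)  # probability of failure: fraction of samples with g <= 0
print(pf)
```

Response-surface methods such as the Gaussian process approach in the paper replace the expensive FE evaluation of g(x) with a cheap surrogate before this sampling step.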

Keywords: composites, damage propagation, low velocity impact, probability of failure, uncertainty modeling

Procedia PDF Downloads 279
1496 Automatic Method for Classification of Informative and Noninformative Images in Colonoscopy Video

Authors: Nidhal K. Azawi, John M. Gauch

Abstract:

Colorectal cancer is one of the leading causes of cancer death in the US and worldwide, which is why millions of colonoscopy examinations are performed annually. Unfortunately, noise, specular highlights, and motion artifacts corrupt many images in a typical colonoscopy exam. The goal of our research is to produce automated techniques to detect and correct or remove these noninformative images from colonoscopy videos, so physicians can focus their attention on informative images. In this research, we first automatically extract features from images. We then use machine learning and deep neural networks to classify colonoscopy images as either informative or noninformative. Our results show that we achieve image classification accuracy between 92% and 98%. We also show how the removal of noninformative images, together with image alignment, can aid in the creation of image panoramas and other visualizations of colonoscopy images.
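As a toy stand-in for the feature-extraction step, even a simple sharpness measure separates a textured (informative-like) frame from a flat (noninformative-like) one; the learned features and classifiers in the paper are of course far richer:

```python
import numpy as np

def sharpness(img):
    """Mean squared 4-neighbour Laplacian: low values suggest a blurry or
    featureless frame (a crude stand-in for the paper's learned features)."""
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(np.mean(lap ** 2))

rng = np.random.default_rng(1)
texture = rng.random((64, 64))               # high-frequency content
blurred = np.full((64, 64), texture.mean())  # flat, featureless frame

print(sharpness(texture) > sharpness(blurred))
```

A classifier would then threshold or combine several such features rather than rely on one.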

Keywords: colonoscopy classification, feature extraction, image alignment, machine learning

Procedia PDF Downloads 253
1495 Detecting Logical Errors in Haskell

Authors: Vanessa Vasconcelos, Mariza A. S. Bigonha

Abstract:

In order to facilitate both processes, this paper presents HaskellFL, a tool that uses fault localization techniques to locate logical errors in Haskell code. The Haskell subset used in this work is sufficiently expressive for those studying functional programming to get immediate help debugging their code and to answer questions about key concepts associated with the functional paradigm. HaskellFL was tested against functional programming assignments submitted by students enrolled in the functional programming class at the Federal University of Minas Gerais and against exercises from the Exercism Haskell track that are publicly available on GitHub. Furthermore, the EXAM score was chosen to evaluate the tool’s effectiveness, and the results showed that HaskellFL reduced the effort needed to locate an error in all tested scenarios. The results also showed that the Ochiai method was more effective than Tarantula.
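The two suspiciousness formulas compared in the paper can be stated compactly as follows; the function names and example counts are illustrative, not HaskellFL's implementation:

```python
import math

def tarantula(ef, ep, total_f, total_p):
    """Tarantula suspiciousness of a code element, where ef/ep are the
    numbers of failing/passing tests that execute it."""
    fail_ratio = ef / total_f
    pass_ratio = ep / total_p if total_p else 0.0
    denom = fail_ratio + pass_ratio
    return fail_ratio / denom if denom else 0.0

def ochiai(ef, ep, total_f):
    """Ochiai suspiciousness: ef / sqrt(total_f * (ef + ep))."""
    denom = math.sqrt(total_f * (ef + ep))
    return ef / denom if denom else 0.0

# A line executed by 3 of 3 failing tests but only 1 of 7 passing tests
# scores higher than a line executed by every test.
print(tarantula(3, 1, 3, 7), ochiai(3, 1, 3))
print(tarantula(3, 7, 3, 7), ochiai(3, 7, 3))
```

Ranking program elements by these scores is what lets a student inspect the most suspicious lines first.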

Keywords: debug, fault localization, functional programming, Haskell

Procedia PDF Downloads 298
1494 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models

Authors: Ramin Vafadary, Maryam Khanbaghi

Abstract:

Forecasting electricity load is important for purposes such as planning, operation, and control. Accurate forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria of mean absolute error and root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
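The bagging step can be sketched independently of any particular model: fit an ensemble on resampled data and average the member forecasts. The synthetic daily load curve and noise level below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)
horizon = 24

# A synthetic "true" 24-hour load profile (illustrative).
true_load = 100 + 20 * np.sin(2 * np.pi * np.arange(horizon) / 24)

# Hypothetical forecasts from 50 models fitted on bootstrap resamples of
# the training data, each deviating from the truth by independent noise.
forecasts = np.stack([
    true_load + rng.normal(0, 5, horizon)
    for _ in range(50)
])

bagged = forecasts.mean(axis=0)  # bagging: average the ensemble

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# The averaged forecast is typically less noisy than any single member.
print(rmse(bagged, true_load), rmse(forecasts[0], true_load))
```

Averaging cancels the independent component of the members' errors, which is the sense in which bagging makes the 24-hour-ahead forecasts more robust.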

Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series

Procedia PDF Downloads 95
1493 Numerical Simulation of a Single Cell Passing through a Narrow Slit

Authors: Lanlan Xiao, Yang Liu, Shuo Chen, Bingmei Fu

Abstract:

Most cancer-related deaths are due to metastasis. Metastasis is a complex, multistep process including the detachment of cancer cells from the primary tumor and their migration to distant target organs through the blood and/or lymphatic circulation. During hematogenous metastasis, the emigration of tumor cells from the blood stream through the vascular wall into the tissue involves arrest in the microvasculature, adhesion to the endothelial cells forming the microvessel wall, and transmigration into the tissue through the endothelial barrier, termed extravasation. The narrow slit between the endothelial cells that line the microvessel wall is the principal pathway for tumor cell extravasation into the surrounding tissue. To understand this crucial step of tumor hematogenous metastasis, we numerically investigated an individual cell passing through a narrow slit using the dissipative particle dynamics method. The cell membrane was simulated by a spring-based network model which separates the internal cytoplasm from the surrounding fluid. The effects of cell elasticity, cell shape, cell surface area increase, and slit size on the cell's transmigration through the slit were investigated. Under a fixed driving force, a cell with higher elasticity can be elongated more and passes through the slit faster. When the slit width decreases to 2/3 of the cell diameter, the spherical cell becomes jammed even when its elastic modulus is reduced tenfold. However, transforming the cell from a spherical to an ellipsoidal shape and increasing the cell surface area by only 3% enables the cell to pass through the narrow slit. Therefore, the cell shape and the surface area increase play a more important role than the cell elasticity in passage through the narrow slit. In addition, the simulation results indicate that the cell migration velocity decreases during entry into the slit but increases during exit, which is qualitatively in agreement with experimental observation.

Keywords: dissipative particle dynamics, deformability, surface area increase, cell migration

Procedia PDF Downloads 334
1492 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately available in the IoT. Through internal communication and interaction, meaningful objects provide real-time services to users. Therefore, providing services with appropriate decision-making has become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for inference. Statistical analysis was conducted to achieve the following objectives: First, define user behaviors and predict user behavior routes with the environment model in order to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework and establish the sequential intensity among behaviors, so as to characterize the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between objects and the probability of their being used; this indicator can describe possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.
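A minimal, flat (non-hierarchical) version of the probabilistic inference involved is the HMM forward pass below; the activities, sensed locations, and all probabilities are illustrative stand-ins for the paper's learned hierarchical model:

```python
import numpy as np

# Two hidden activities (e.g. "resting", "cooking") and three sensed
# locations; every probability here is an illustrative assumption.
A = np.array([[0.8, 0.2],      # activity transition probabilities
              [0.3, 0.7]])
B = np.array([[0.7, 0.2, 0.1],  # P(location | activity)
              [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])       # initial activity distribution

obs = [0, 2, 2]  # observed location sequence from the sensing network

# Forward algorithm: accumulate P(observations so far, current activity).
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

likelihood = alpha.sum()
posterior = alpha / likelihood  # belief over the current activity
print(posterior.argmax())
```

A hierarchical HMM stacks such chains, with a higher-level chain emitting the lower-level one, but the forward recursion is the shared core.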

Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object

Procedia PDF Downloads 233
1491 The End Justifies the Means: Using Programmed Mastery Drill to Teach Spoken English to Spanish Youngsters, without Relying on Homework

Authors: Robert Pocklington

Abstract:

Most current language courses expect students to be ‘vocational’, sacrificing their free time in order to learn. However, pupils with a full-time job, or bringing up children, hardly have a spare moment. Others just need the language as a tool or a qualification, as if it were book-keeping or a driving license. Then there are children in unstructured families whose stressful life makes private study almost impossible. And the countless parents whose evenings and weekends have become a nightmare, trying to get the children to do their homework. There are many arguments against homework being a necessity (rather than an optional extra for more ambitious or dedicated students), making a clear case for teaching methods which facilitate full learning of the key content within the classroom. A methodology which could be described as Programmed Mastery Learning has been used at Fluency Language Academy (Spain) since 1992, to teach English to over 4000 pupils yearly, with a staff of around 100 teachers, barely requiring homework. The course is structured according to the tenets of Programmed Learning: small manageable teaching steps, immediate feedback, and constant successful activity. For the Mastery component (not stopping until everyone has learned), the memorisation and practice are entrusted to flashcard-based drilling in the classroom, leading all students to progress together and develop a permanently growing knowledge base. Vocabulary and expressions are memorised using flashcards as stimuli, obliging the brain to constantly recover words from the long-term memory and converting them into reflex knowledge, before they are deployed in sentence building. The use of grammar rules is practised with ‘cue’ flashcards: the brain refers consciously to the grammar rule each time it produces a phrase until it comes easily. This automation of lexicon and correct grammar use greatly facilitates all other language and conversational activities. 
The full B2 course consists of 48 units, each of which takes a class an average of 17.5 hours to complete, allowing the vast majority of students to reach B2 level in 840 class hours, which is corroborated by an 85% pass rate in the Cambridge University B2 exam (First Certificate). In the past, studying for qualifications was just one of many different options open to young people. Nowadays, youngsters need to stay at school and obtain qualifications in order to get any kind of job. There are many students in our classes who have little intrinsic interest in what they are studying; they just need the certificate. In these circumstances, and with increasing government pressure to minimise failure, teachers can no longer think ‘If they don’t study, and fail, it’s their problem’. It is now becoming the teacher’s problem. Teachers are ever more in need of methods which make their pupils successful learners; this means assuring learning in the classroom. Furthermore, homework is arguably the main divider between successful middle-class schoolchildren and failing working-class children who drop out: if everything important is learned at school, the latter will have a much better chance, favouring inclusiveness in the language classroom.

Keywords: flashcard drilling, fluency method, mastery learning, programmed learning, teaching English as a foreign language

Procedia PDF Downloads 110
1490 Risk Assessment of Flood Defences by Utilising a Condition Grade Based Probabilistic Approach

Authors: M. Bahari Mehrabani, Hua-Peng Chen

Abstract:

Management and maintenance of coastal defence structures during the expected life cycle have become a real challenge for decision makers and engineers. Accurate evaluation of the current condition and future performance of flood defence structures is essential for effective practical maintenance strategies on the basis of available field inspection data. Moreover, as coastal defence structures age, it becomes more challenging to implement maintenance and management plans that avoid structural failure. Therefore, condition inspection data are essential for assessing damage and forecasting the deterioration of ageing flood defence structures in order to keep them in an acceptable condition. Inspection data for flood defence structures are often collected using discrete visual condition rating schemes. In order to evaluate the future condition of a structure, a probabilistic deterioration model needs to be utilised. However, existing deterioration models may not provide reliable predictions of performance deterioration over long periods due to uncertainties. To tackle this limitation, a time-dependent condition-based model associated with transition probabilities needs to be developed on the basis of a condition grade scheme for flood defences. This paper presents a probabilistic method for predicting future performance deterioration of coastal flood defence structures based on condition grading inspection data and deterioration curves estimated by expert judgement. In condition-based deterioration modelling, the main task is to estimate the transition probability matrices. The deterioration process of the structure over the transition states is modelled as a Markov chain, and a reliability-based approach is used to estimate the probability of structural failure. Visual inspection data according to the United Kingdom Condition Assessment Manual are used to obtain the initial condition grade curves of the coastal flood defences.
The initial curves are then modified in order to derive transition probabilities through non-linear regression based optimisation algorithms. Monte Carlo simulations are then used to evaluate the future performance of the structure on the basis of the estimated transition probabilities. Finally, a case study is given to demonstrate the applicability of the proposed method under no-maintenance and medium-maintenance scenarios. Results show that the proposed method can provide an effective predictive model for various situations in terms of available condition grading data. The proposed model also provides useful information on the time-dependent probability of failure of coastal flood defences.
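The condition-grade Markov chain at the core of the method can be sketched as follows; the five-grade annual transition matrix is an illustrative assumption, not the matrix calibrated from the UK inspection data:

```python
import numpy as np

# Assumed annual transition matrix over condition grades 1 (good) .. 5
# (failed); each row gives the probabilities of staying or degrading.
P = np.array([
    [0.90, 0.10, 0.00, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00, 0.00],
    [0.00, 0.00, 0.80, 0.20, 0.00],
    [0.00, 0.00, 0.00, 0.75, 0.25],
    [0.00, 0.00, 0.00, 0.00, 1.00],  # grade 5 is absorbing (failure)
])

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # new structure, grade 1
for _ in range(30):                          # propagate 30 years
    state = state @ P

p_failure_30yr = state[-1]  # probability of reaching the failed grade
print(round(p_failure_30yr, 3))
```

Maintenance scenarios can be represented by periodically resetting part of the state vector to better grades before continuing the propagation.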

Keywords: condition grading, flood defence, performance assessment, stochastic deterioration modelling

Procedia PDF Downloads 234
1489 Implicit Transaction Costs and the Fundamental Theorems of Asset Pricing

Authors: Erindi Allaj

Abstract:

This paper studies arbitrage pricing theory in financial markets with transaction costs. We extend the existing theory to include the more realistic possibility that the price at which investors trade depends on the traded volume. The investors in the market always buy at the ask and sell at the bid price. Transaction costs are composed of two terms: one captures the implicit transaction costs and the other the price impact. Moreover, a new definition of a self-financing portfolio is obtained. The self-financing condition suggests that continuous trading is possible, but is restricted to predictable trading strategies which have left and right limits and finite quadratic variation. That is, predictable trading strategies of infinite variation but finite quadratic variation are allowed in our setting. Within this framework, the existence of an equivalent probability measure is equivalent to the absence of arbitrage opportunities, so that the first fundamental theorem of asset pricing (FFTAP) holds. It is also proved that, when this probability measure is unique, any contingent claim in the market is hedgeable in an L2-sense. The price of any contingent claim is then equal to its risk-neutral price. To better understand how to apply the proposed theory, we provide an example with linear transaction costs.

Keywords: arbitrage pricing theory, transaction costs, fundamental theorems of arbitrage, financial markets

Procedia PDF Downloads 360
1488 Disaggregation of the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rainfall data are very important as input to hydrological models. Among the models for high-resolution rainfall data generation, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly and 10-minute) from daily data covering a record period of around 20 years. The process was carried out with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rainfall datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov test (K-S), the usual statistics, and exceedance probability. The tool worked well at preserving daily rainfall values on wet days; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passed the K-S test criteria, while for the hourly and 10-minute datasets the p-value had to be employed to show that their differences were reasonable. The results are encouraging considering the overestimation in the generated high-resolution rainfall data.
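The two-sample Kolmogorov-Smirnov statistic used for this comparison can be computed directly as the maximum distance between empirical CDFs; the gamma-distributed samples below merely stand in for the observed and disaggregated rainfall amounts:

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    distance between the empirical CDFs of the two samples."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

rng = np.random.default_rng(7)
observed = rng.gamma(shape=2.0, scale=3.0, size=500)   # stand-in: observed 4-hourly rain
simulated = rng.gamma(shape=2.0, scale=3.0, size=500)  # stand-in: disaggregated rain

d = ks_statistic(observed, simulated)
crit = 1.36 * np.sqrt(2 / 500)  # approximate 5% critical value, equal n
print(d, crit)
```

When the statistic exceeds the critical value, one falls back on the p-value to judge whether the discrepancy is still acceptable, as the paper does for the hourly and 10-minute datasets.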

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 201
1487 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process. It finds application whenever one needs to make an estimate, forecast or decision under significant uncertainty. First, the project focuses on performing Monte Carlo simulation on a given dataset using the U.S. Department of Energy’s MonteCarlo software, a freeware E&P tool. Further, a simulation algorithm has been developed in MATLAB; the program performs the simulation by prompting the user for the input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, min, max, most likely, etc.). It also prompts the user for the desired probability at which reserves are to be calculated. The algorithm so developed and tested in MATLAB was further implemented in Python, where existing libraries for statistics and graph plotting were imported to generate better output. With Qt Designer (PyQt), code for a simple graphical user interface has also been written. The plotted results are then validated against those from the U.S. DOE MonteCarlo software.
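A typical volumetric version of this Monte Carlo workflow looks like the sketch below; the triangular input distributions and the 7758 acre-ft-to-barrel conversion are standard in volumetric estimation, but every parameter value here is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Illustrative triangular input distributions (min, mode, max); real
# parameters would come from the prospect's geological data.
area = rng.triangular(400, 600, 900, n)        # acres
thickness = rng.triangular(20, 35, 60, n)      # ft
porosity = rng.triangular(0.10, 0.18, 0.25, n)
sw = rng.triangular(0.20, 0.30, 0.45, n)       # water saturation
bo = rng.triangular(1.1, 1.2, 1.4, n)          # formation volume factor, rb/stb

# Original oil in place (stb); 7758 converts acre-ft to barrels.
ooip = 7758 * area * thickness * porosity * (1 - sw) / bo

# Reserves at chosen confidence levels: P90 (conservative) to P10 (optimistic).
p90, p50, p10 = np.percentile(ooip, [10, 50, 90])
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e}")
```

Reporting the P90/P50/P10 triplet instead of a single deterministic number is precisely the stochastic definition of reserves the abstract refers to.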

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 382
1486 Assortative Education and Working Arrangement among Married Couples in Indonesia

Authors: Ratu Khabiba, Qisha Quarina

Abstract:

This study aims to analyse the effect of married couples’ assortative educational attainment on the division of economic activities within the household. It contributes to the literature on women’s employment participation, especially among married women, by examining whether traditional values about gender roles in the household continue to shape employment participation among married women in Indonesia, despite increases in women’s human capital through education. The study utilizes the Indonesian National Socioeconomic Survey (SUSENAS) 2016 and estimates the results using a multinomial logit model. Our results show that, compared to high-educated homogamous couples, couples with educational heterogamy, especially hypergamy, have a higher probability of being a single-worker household. Moreover, high-educated homogamous couples have the highest probability of being a dual-worker household. Thus, we find evidence that traditional values of gender role segregation still play a significant role in married women’s employment decisions in Indonesia, particularly for couples with educational heterogamy and for low-educated homogamous couples.

Keywords: assortative education, dual-worker, hypergamy, homogamy, traditional values, women labor participation

Procedia PDF Downloads 118
1485 How to Guide Students from Surface to Deep Learning: Applied Philosophy in Management Education

Authors: Lihong Wu, Raymond Young

Abstract:

The ability to learn is one of the most critical skills in the information age. However, many students do not have a clear understanding of what learning is, what they are learning, and why they are learning. Many students study simply to pass rather than to learn something useful for their career and their life. They have misconceptions about learning and the wrong attitude towards it. This research explores student attitudes to study in management education and how to intervene in order to lead students from surface to deeper modes of learning.

Keywords: knowledge, surface learning, deep learning, education

Procedia PDF Downloads 501
1484 Low-Power Digital Filters Design Using a Bypassing Technique

Authors: Thiago Brito Bezerra

Abstract:

This paper presents a novel approach to reducing the power consumption of digital filters, based on dynamic bypassing of partial products in their multipliers. The bypassing elements incorporated into the multiplier hardware eliminate redundant signal transitions that would otherwise appear within the carry-save adders when a partial product is zero. This technique reduces power consumption by around 20%. The circuit was implemented in AMS 0.18 µm technology. The bypassing technique applied to the circuits is outlined.
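The bypassing idea can be modelled at a high level as a shift-and-add multiplier that skips the adder whenever the partial product is zero; this behavioural sketch only counts the avoided additions and does not model the carry-save hardware or the AMS circuit:

```python
def multiply_with_bypass(a, b, width=8):
    """Shift-and-add multiplication that bypasses the adder for zero
    partial products, modelling the power-saving idea behaviourally."""
    product, additions, bypassed = 0, 0, 0
    for i in range(width):
        if (b >> i) & 1:           # non-zero partial product: use the adder
            product += a << i
            additions += 1
        else:                      # zero partial product: bypass the adder
            bypassed += 1
    return product, additions, bypassed

p, adds, skipped = multiply_with_bypass(13, 10)  # 10 = 0b00001010
print(p, adds, skipped)  # 130 2 6
```

In hardware, each bypassed stage avoids the signal transitions in its carry-save adder, which is where the reported power saving of around 20% comes from.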

Keywords: digital filter, low-power, bypassing technique, low-pass filter

Procedia PDF Downloads 382