Search results for: counting with uncertainties
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 619


379 Development of an Index for Asset Class in Ex-Ante Portfolio Management

Authors: Miang Hong Ngerng, Noor Diyana Jasme, May Jin Theong

Abstract:

Volatile market environments are inevitable. Fund managers struggle to choose the right strategy to survive and overcome uncertainties and adverse market movements. Therefore, finding certainty in the midst of an uncertain future is one of the key performance objectives for fund managers. Currently available theoretical results are not practical due to their strong reliance on the investment assumptions made. This paper identifies the components that can be forecasted in an ex-ante setting, which is the realistic situation facing a fund manager in the actual execution of asset allocation in portfolio management. The partial least squares method was used to generate an index from 10 years of accounting data from 191 companies listed on the KLSE. The result shows that the index reflects the inner nature of the business and that up to 30% of the stock return can be explained by the index.
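As a hedged illustration of the method named in the abstract, the sketch below fits a one-component partial least squares regression and uses the first component score as an index; the feature set, data, and shapes are placeholders, not the paper's KLSE dataset.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(191, 8))   # e.g., 8 accounting ratios for 191 companies
y = rng.normal(size=191)        # subsequent stock returns (placeholder data)

pls = PLSRegression(n_components=1)
pls.fit(X, y)

index = pls.transform(X).ravel()   # first component score = the "index"
r2 = pls.score(X, y)               # share of return variance explained
print(f"Return variance explained by the index: {r2:.2%}")
```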

Keywords: active portfolio management, asset allocation, ex-ante investment, asset class, partial least squares

Procedia PDF Downloads 243
378 MITOS-RCNN: Mitotic Figure Detection in Breast Cancer Histopathology Images Using Region Based Convolutional Neural Networks

Authors: Siddhant Rao

Abstract:

Studies estimate that there will be 266,120 new cases of invasive breast cancer and 40,920 breast cancer induced deaths in 2018 alone. Despite the pervasiveness of this affliction, the current process of obtaining an accurate breast cancer prognosis is tedious and time-consuming. It usually requires a trained pathologist to manually examine histopathological images and identify the features that characterize various cancer severity levels. We propose MITOS-RCNN: a region based convolutional neural network (RCNN) geared towards small object detection, to accurately grade one of the three factors that characterize tumor belligerence described by the Nottingham Grading System: mitotic count. Other computational approaches to mitotic figure counting and detection do not demonstrate ample recall or precision to be clinically viable. Our model outperformed all previous participants in the ICPR 2012 challenge, the AMIDA 2013 challenge and the MITOS-ATYPIA-14 challenge, along with recently published works. It achieved an F-measure score of 0.955, a 6.11% improvement in accuracy over the most accurate of the previously proposed models.
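For clarity, the following is a minimal sketch of how a detection F-measure is conventionally computed from true positives, false positives, and false negatives; the counts are illustrative, not the paper's results.

```python
def f_measure(tp: int, fp: int, fn: int) -> float:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# With these illustrative counts the score is ~0.955, matching the figure
# reported in the abstract only by construction.
print(f_measure(tp=191, fp=9, fn=9))
```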

Keywords: breast cancer, mitotic count, machine learning, convolutional neural networks

Procedia PDF Downloads 187
377 A Concept Analysis of Control over Nursing Practice

Authors: Oznur Ispir, S. Duygulu

Abstract:

Health institutions are places where fast and efficient decisions are required and where mistakes and uncertainties are not tolerated, due to the urgency of the services provided within them. Thus, in institutions where patient care services are intended to be provided with quality and safety, nurses who participate in decisions, create solutions to problems, take the initiative and bear responsibility for the results (in brief, who have control over practices) are needed. Control over nursing practice is defined as affecting employment and the work environment at the unit level of the institution, perceived freedom to organize and evaluate nursing practices, the ability to make independent decisions about patient care, and accountability for the results of such decisions. This study scrutinizes the concept of control over nursing practice (organizational autonomy), which is frequently confused with other concepts (autonomy) in the literature, by reviewing the literature and making suggestions to improve nurses’ control over nursing practices.

Keywords: control over nursing practice, nurse, nursing, organizational autonomy

Procedia PDF Downloads 239
376 Construction Time-Cost Trade-Off Analysis Using Fuzzy Set Theory

Authors: V. S. S. Kumar, B. Vikram, G. C. S. Reddy

Abstract:

Time and cost are the two critical objectives of construction project management; they are not independent but intricately related. Trade-offs between project duration and cost are extensively discussed during project scheduling because of their practical relevance. Generally, when the project duration is compressed, the project calls for an increase in labor and more productive equipment, which increases the cost. Thus, construction time-cost optimization is defined as a process of identifying suitable construction activities to speed up in order to attain the best possible savings in both time and cost. As there is a hidden trade-off relationship between project time and cost, it may be difficult to predict whether the total cost will increase or decrease as a result of compressing the schedule. Different combinations of duration and cost for the activities associated with the project determine the best set in the time-cost optimization. Therefore, contractors need to select the best combination of time and cost for performing each activity, all of which will ultimately determine the project duration and cost. In this paper, fuzzy set theory is used to model the uncertainties in the project environment for time-cost trade-off analysis.
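A minimal sketch of one common way fuzzy set theory models such uncertainty, assuming triangular fuzzy numbers for activity duration and crash cost; the representation and numbers are illustrative, not the paper's model.

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    low: float
    mode: float
    high: float

    def alpha_cut(self, alpha: float) -> tuple[float, float]:
        # Interval of values whose membership degree is at least alpha.
        return (self.low + alpha * (self.mode - self.low),
                self.high - alpha * (self.high - self.mode))

duration = TriangularFuzzyNumber(10, 12, 16)          # activity duration, days
crash_cost = TriangularFuzzyNumber(4000, 5000, 7500)  # crash cost, currency

print(duration.alpha_cut(0.5))     # (11.0, 14.0): plausible range at alpha=0.5
print(crash_cost.alpha_cut(0.5))
```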

Keywords: fuzzy sets, uncertainty, qualitative factors, decision making

Procedia PDF Downloads 615
375 An Introduction to Critical Chain Project Management Methodology

Authors: Ranjini Ramanath, Nanjunda P. Swamy

Abstract:

Construction has existed in our lives since time immemorial. However, unlike any other industry, construction projects have their own unique challenges: project type, purpose and end use of the project, geographical conditions, logistic arrangements, largely unorganized manpower and the requirement of diverse skill sets, etc. These unique characteristics bring their own level of risk and uncertainty to the project, which causes the project to deviate from its planned objectives of time, cost, quality, etc. Over the many years, there have been significant developments in the way construction projects are conceptualized, planned, and managed. With the rapid increase in population and the growing rate of urbanization, there is a rising demand for infrastructure development, and it is required that projects are delivered in a timely and efficient manner. In an age where 'Time is Money,' the implementation of new project management techniques is required to lead to successful projects. This paper proposes a different approach to project management which, if applied in construction projects, can help accomplish the project objectives in a faster manner.

Keywords: critical chain project management methodology, critical chain, project management, construction management

Procedia PDF Downloads 390
374 Building a Lean Construction Body of Knowledge

Authors: Jyoti Singh, Ahmed Stifi, Sascha Gentes

Abstract:

The process of construction significantly contributes to high levels of risk, complexity and uncertainty, leading to cost and time overruns, customer dissatisfaction, etc. Lean construction is important as it is a comprehensive system of tools and concepts focused on moving closer to customer satisfaction by understanding the process, identifying the waste and eliminating it. The proposed work includes the identification of knowledge areas from a lean perspective and of the lean tools/concepts used in lean construction, and the establishment of a relationship matrix between knowledge areas and lean tools/concepts, thus developing and building up a lean construction body of knowledge (LCBOK), i.e. a guide to lean construction, aiming to provide guidelines for managing individual projects and also to help the construction industry minimise waste and maximise value to the customer. In this study, we identified 8 knowledge areas and 62 lean tools/concepts from the lean perspective and also found that one tool can help to manage two or more knowledge areas.
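As a rough illustration (not the paper's actual matrix), a relationship matrix of this kind can be sketched as a mapping from lean tools to the knowledge areas they serve; all tool and area names below are invented placeholders.

```python
# All tool and area names below are invented placeholders.
relationship = {
    "last_planner_system": {"schedule_management", "stakeholder_management"},
    "value_stream_mapping": {"process_management", "waste_management"},
    "5S": {"site_management"},
}

# The paper's observation that one tool can serve two or more knowledge areas:
multi_area_tools = [t for t, areas in relationship.items() if len(areas) >= 2]
print(multi_area_tools)
```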

Keywords: knowledge areas, lean body matrix, lean construction, lean tools

Procedia PDF Downloads 402
373 The Enhancement of Training of Military Pilots Using Psychophysiological Methods

Authors: G. Kloudova, M. Stehlik

Abstract:

Optimal human performance is a key goal in the professional setting of military pilots, which is a highly challenging atmosphere. The aviation environment requires substantial cognitive effort and is rich in potential stressors. Therefore, it is important to analyze variables such as mental workload to ensure safe conditions. Pilot mental workload can be measured using several tools, but most of them are very subjective. This paper details research conducted with military pilots using psychophysiological methods such as electroencephalography (EEG) and heart rate (HR) monitoring. The data were measured in a simulator as well as under real flight conditions. All of the pilots were exposed to highly demanding flight tasks and showed large individual differences in response. On that basis, an individual pattern for each pilot was created, taking into account different EEG features and heart rate variations. Later on, it was possible to distinguish the most difficult flight tasks for each pilot, which should be trained more extensively. For training purposes, an application was developed for the instructors to decide which specific tasks to focus on during follow-up training. This complex system can help instructors detect the mentally demanding parts of the flight and enhance the training of military pilots to achieve optimal performance.
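A minimal sketch of one plausible psychophysiological feature pipeline, assuming a theta/alpha band-power workload index computed with Welch's method; the sampling rate, channel, and choice of metric are assumptions for illustration, not the study's actual processing.

```python
import numpy as np
from scipy.signal import welch

fs = 256                            # Hz, assumed EEG sampling rate
eeg = np.random.randn(60 * fs)      # one minute of a single channel (stand-in)

f, psd = welch(eeg, fs=fs, nperseg=2 * fs)
theta = psd[(f >= 4) & (f < 8)].sum()    # theta band power
alpha = psd[(f >= 8) & (f < 13)].sum()   # alpha band power
workload_index = theta / alpha           # often taken to rise with workload
print(f"theta/alpha workload index: {workload_index:.2f}")
```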

Keywords: cognitive effort, human performance, military pilots, psychophysiological methods

Procedia PDF Downloads 202
372 A Reinforcement Learning Approach for Evaluation of Real-Time Disaster Relief Demand and Network Condition

Authors: Ali Nadi, Ali Edrissi

Abstract:

Relief demand and the availability of transportation links are the essential information needed for every natural disaster operation. This information is not at hand once a disaster strikes. In related works, relief demand and network condition have been evaluated based on prediction methods. Nevertheless, predictions tend to be over- or underestimated due to uncertainties and may lead to a failed operation. Therefore, in this paper a stochastic programming model is proposed to evaluate real-time relief demand and network condition at the onset of a natural disaster. To address the time sensitivity of the emergency response, the proposed model uses reinforcement learning to optimize the total relief assessment time. The proposed model is tested on a real-size network problem. The simulation results indicate that the proposed model performs well in collecting real-time information.
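As a hedged illustration of the reinforcement learning component, the sketch below runs tabular Q-learning on a toy environment; the states, actions, and reward (a stand-in assessment-time cost) are invented, since the paper's actual state space is not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 5, 2          # toy assessment problem: links and choices
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration

def step(state, action):
    # Invented dynamics: reward is the negative of an assessment-time cost.
    return (state + action + 1) % n_states, -rng.uniform(1.0, 3.0)

state = 0
for _ in range(10_000):
    if rng.random() < eps:
        action = int(rng.integers(n_actions))      # explore
    else:
        action = int(Q[state].argmax())            # exploit
    nxt, reward = step(state, action)
    Q[state, action] += alpha * (reward + gamma * Q[nxt].max() - Q[state, action])
    state = nxt
print(Q)
```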

Keywords: disaster management, real-time demand, reinforcement learning, relief demand

Procedia PDF Downloads 269
371 Software Assessment Using Ant Colony Optimization Algorithm

Authors: Saad M. Darwish

Abstract:

Recently, software quality issues have come to be seen as an important subject, as we see enormous growth in the number of agencies involved in the software industry. However, these agencies cannot guarantee the quality of their products, thus leaving users in uncertainty. Software certification is an extension of quality assurance in the sense that quality needs to be measured prior to the certification-granting process. This research participates in solving the problem of software assessment by proposing a model for the assessment and certification of software products that uses a fuzzy inference engine to integrate both process-driven and application-driven quality assurance strategies. The key idea of the model at hand is to improve the compactness and the interpretability of the model’s fuzzy rules by employing an ant colony optimization (ACO) algorithm, which tries to find good rule descriptions by means of compound rules initially expressed as traditional single rules. The model has been tested in a case study, and the results have demonstrated the feasibility and practicability of the model in a real environment.
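A generic, hedged skeleton of the pheromone-guided search loop that characterizes ACO; the candidate rules, quality scores, and fitness function below are placeholders, not the paper's rule-compaction procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n_rules, n_ants, rho = 20, 10, 0.1        # candidate rules, ants, evaporation
tau = np.ones(n_rules)                    # pheromone per candidate rule
quality = rng.uniform(0, 1, n_rules)      # stand-in rule quality scores

def fitness(subset):
    # Placeholder objective: reward quality, penalize rule-set size.
    return quality[subset].sum() - 0.5 * len(subset)

for _ in range(100):
    best_subset, best_fit = np.array([], dtype=int), -np.inf
    for _ in range(n_ants):
        # Each ant picks rule i with probability proportional to pheromone.
        p = tau / tau.sum()
        subset = np.flatnonzero(rng.random(n_rules) < p)
        fit = fitness(subset)
        if fit > best_fit:
            best_subset, best_fit = subset, fit
    tau *= 1 - rho                        # evaporation
    tau[best_subset] += max(best_fit, 0)  # reinforce the iteration's best
print(np.argsort(tau)[-5:])               # most reinforced candidate rules
```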

Keywords: optimization technique, quality assurance, software certification model, software assessment

Procedia PDF Downloads 458
370 Prevalence of Rabbit Coccidia in Medea Province, Algeria

Authors: Mohamed Sadek Bachene, Soraya Temim, Hassina Ainbaziz, Asma Bachene

Abstract:

Coccidiosis has an economic impact on poultry and livestock. The current study examined the prevalence of Eimeria infections in domestic rabbits in Medea province, North of Algeria. A total of 414 faecal samples were collected from 50 farms in six regions of the province. Each faecal sample was subjected to oocyst counting and isolation. The Eimeria species from samples containing isolated and sporulated oocysts were identified morphologically under the microscope. The overall prevalence of coccidial infections was 47.6% (197/414). Weaners had the highest prevalence (77%, 77/100, p < 0.0001), followed by growing rabbits (46.8%, 30/64), while adult rabbits showed the lowest prevalence (36%, 18/50). In breeding rabbits, females were more infected, with a prevalence of 40% (p < 0.0001). Eleven rabbit Eimeria species were present and identified from oocyst-positive samples. Eimeria magna and Eimeria media were the most prevalent species (47.6% and 47.3%). Sulfonamides showed better protection against rabbit coccidiosis than the colistin and trimethoprim association (p < 0.0001; prevalence of 23.3% vs. 65.3%, respectively). These results indicate that the prevalence of coccidiosis is high among the rabbit population in Medea province, North of Algeria. In conclusion, it seems that the epidemiological situation of rabbit coccidiosis in Medea province must be taken into consideration in order to minimize the economic losses caused by this parasitosis.

Keywords: eimeria, oryctolagus cuniculus, rabbit, sulfonamides

Procedia PDF Downloads 67
369 Uncertainty of the Brazilian Earth System Model for Solar Radiation

Authors: Elison Eduardo Jardim Bierhals, Claudineia Brazil, Deivid Pires, Rafael Haag, Elton Gimenez Rossini

Abstract:

This study evaluated the uncertainties involved in the solar radiation projections generated by the Brazilian Earth System Model (BESM) of the Weather and Climate Prediction Center (CPTEC), belonging to the Coupled Model Intercomparison Project Phase 5 (CMIP5), with the aim of identifying the efficiency of the model's projections for solar radiation and, in this way, establishing the viability of its use. Two different scenarios elaborated by the Intergovernmental Panel on Climate Change (IPCC) were evaluated: RCP 4.5 (with more optimistic boundary conditions) and RCP 8.5 (with more pessimistic initial conditions). The methods used to verify the accuracy of the model were the Nash-Sutcliffe coefficient and the statistical bias, as they better represent these atmospheric patterns. BESM showed a tendency to overestimate the solar radiation projections in most regions of the state of Rio Grande do Sul, and through the validation methods adopted by this study, BESM did not present satisfactory accuracy.
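For reference, a minimal sketch of the two validation metrics mentioned, the Nash-Sutcliffe efficiency coefficient and the statistical bias; the sample values are placeholders, not the study's radiation data.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def bias(obs, sim):
    return float(np.mean(np.asarray(sim) - np.asarray(obs)))  # > 0: overestimate

obs = [18.2, 20.1, 22.5, 19.8]   # observed solar radiation (placeholder units)
sim = [19.0, 21.4, 23.9, 20.6]   # model projection for the same period
print(nash_sutcliffe(obs, sim), bias(obs, sim))
```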

Keywords: climate changes, projections, solar radiation, uncertainty

Procedia PDF Downloads 215
368 Evolving Software Assessment and Certification Models Using Ant Colony Optimization Algorithm

Authors: Saad M. Darwish

Abstract:

Recently, software quality issues have come to be seen as an important subject, as we see enormous growth in the number of agencies involved in the software industry. However, these agencies cannot guarantee the quality of their products, thus leaving users in uncertainty. Software certification is an extension of quality assurance in the sense that quality needs to be measured prior to the certification-granting process. This research participates in solving the problem of software assessment by proposing a model for the assessment and certification of software products that uses a fuzzy inference engine to integrate both process-driven and application-driven quality assurance strategies. The key idea of the model at hand is to improve the compactness and the interpretability of the model’s fuzzy rules by employing an ant colony optimization (ACO) algorithm, which tries to find good rule descriptions by means of compound rules initially expressed as traditional single rules. The model has been tested in a case study, and the results have demonstrated the feasibility and practicability of the model in a real environment.

Keywords: software quality, quality assurance, software certification model, software assessment

Procedia PDF Downloads 489
367 Morphometry of Cervical Spinal Cord in Rabbit Using Design-Based Stereology

Authors: Hamed Chavoshi Pour, Javad Sadeghinejad

Abstract:

The spinal cord is a long structure that starts at the end of the medulla oblongata and is located within the vertebral canal. Physiologically, the spinal cord connects the brain with the peripheral nervous system for sensory and motor activities. The cervical spinal cord is an area of particular interest in medicine and veterinary medicine due to the high prevalence of diseases in this region. This study describes the morphometric features of the cervical spinal cord in rabbits using design-based, unbiased stereology. The cervical spinal cords of five male rabbits were dissected, and slabs were taken according to systematic uniform random sampling. Each slab was embedded in paraffin, cut into 6-µm-thick sections, and stained with 0.1% cresyl violet for stereological estimation. The total spinal cord volume, the volume fractions of grey and white matter, and those of the dorsal and ventral horns were estimated using point counting and Cavalieri's estimator. The total cervical spinal cord volume was 0.98 ± 0.07 cm³. The relative volumes of white matter and grey matter were 70.6 ± 1.7% and 29.31 ± 1.67%, respectively. The dorsal horn and ventral horn volumes were 13.86 ± 1.36% and 14.9 ± 0.62% of the whole cervical spinal cord, respectively. These rabbit spinal cord findings may serve as a foundation for a translational model in spinal cord experimental research and provide basic data for the diagnosis and treatment of spinal cord disorders.
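A minimal sketch of Cavalieri's estimator with point counting, V = t * (a/p) * sum(P_i); the section spacing, grid constant, and point counts below are illustrative, not the study's measurements.

```python
# V = t * (a/p) * sum(P_i)
section_spacing_cm = 0.4                 # t: distance between sampled sections
area_per_point_cm2 = 0.05                # a/p: grid area per test point
points_per_section = [9, 14, 15, 8, 5]   # P_i: points hitting the structure

volume_cm3 = section_spacing_cm * area_per_point_cm2 * sum(points_per_section)
print(f"Estimated volume: {volume_cm3:.2f} cm^3")   # 1.02 with these counts
```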

Keywords: stereology, spinal cord, rabbit, cervical

Procedia PDF Downloads 46
366 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts

Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig

Abstract:

This study focuses on the evaluation of snow avalanche simulations, based on a survey carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety regulations such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that could previously be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing the prediction of certain quantities of the avalanche flow (e.g. pressure, velocities, flow heights, runout lengths, etc.). Because of the highly variable regimes of flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies to the fluid-dynamical laws of other materials are drawn. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there exist high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow-model equations in an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility; hence, model development is compelled to introduce further simplifications and the related uncertainties. In light of these issues, many questions arise about avalanche simulations: their strengths and drawbacks, their potential for improvement, and their application in practice. To address these questions, a survey was conducted among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries. In the questionnaire, special attention is drawn to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty and the reliability of the results. Furthermore, it was tested to which degree a simulation result influences decision making for a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the relatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations is the result of a rather thorough simulation study, in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations and silent witnesses, inter alia, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on the manner of modeling could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.

Keywords: expert interview, hazard management, modeling, simulation, snow avalanche

Procedia PDF Downloads 293
365 Indoor Localization Algorithm and Appropriate Implementation Using Wireless Sensor Networks

Authors: Adeniran K. Ademuwagun, Alastair Allen

Abstract:

The dependence between RSS (received signal strength) and distance in an enclosed environment is an important consideration because it is a factor that can influence the reliability of any localization algorithm founded on RSS. Several algorithms effectively reduce the variance of RSS to improve localization or accuracy performance. Our proposed algorithm essentially avoids this pitfall and is consequently highly adaptable in the face of erratic radio signals. Using 3 anchors in close proximity to each other, we are able to establish that RSS can be used as a reliable indicator for localization with an acceptable degree of accuracy. Inherent in this concept is the ability of each prospective anchor to validate (guarantee) the position or proximity of the other 2 anchors involved in the localization, and vice versa. This procedure ensures that the uncertainties of radio signals due to multipath effects in enclosed environments are minimized. A major driver of this idea is the implicit topological relationship among sensors due to raw radio signal strength. The algorithm is an area-based algorithm; however, it does not trade accuracy for precision (i.e., the size of the returned area).
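As a hedged illustration of an RSS-and-anchor scheme in this spirit (not necessarily the authors' exact algorithm), the sketch below computes an RSS-weighted centroid of three anchors; the coordinates and RSS values are invented.

```python
import numpy as np

anchors = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])   # anchor positions
rss_dbm = np.array([-52.0, -60.0, -55.0])                  # measured RSS

weights = 10 ** (rss_dbm / 10)          # dBm -> mW: stronger signal, larger weight
weights /= weights.sum()
estimate = weights @ anchors            # weighted centroid of the anchors
print(f"Estimated position: {estimate}")
```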

Keywords: anchor nodes, centroid algorithm, communication graph, radio signal strength

Procedia PDF Downloads 472
364 An Optimized Association Rule Mining Algorithm

Authors: Archana Singh, Jyoti Agarwal, Ajay Rana

Abstract:

Data mining is an efficient technology for discovering patterns in large databases. Association rule mining techniques are used to find the correlation between the various item sets in a database, and this correlation between various item sets is used in decision making and pattern analysis. In recent years, the problem of finding association rules from large datasets has been addressed by many researchers. Various research papers on association rule mining (ARM) were first studied and analyzed to understand the existing algorithms. The Apriori algorithm is the basic ARM algorithm, but it requires many database scans. In the DIC algorithm, fewer database scans are needed, but a complex lattice data structure is used. The main focus of this paper is to propose a new optimized algorithm (Friendly Algorithm) and compare its performance with the existing algorithms. A data set is used to find frequent itemsets and association rules with the help of the existing and proposed (Friendly) algorithms, and it has been observed that the proposed algorithm finds all the frequent itemsets and essential association rules from databases with fewer database scans than the existing algorithms. In the proposed algorithm, optimized data structures are used, i.e., a graph and an adjacency matrix.
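A minimal sketch of the support-counting step at the core of Apriori-style ARM, which is the work that each database scan performs; the transactions and threshold are illustrative.

```python
from itertools import combinations
from collections import Counter

transactions = [{"bread", "milk"}, {"bread", "beer"},
                {"bread", "milk", "beer"}, {"milk", "beer"}]
min_support = 2

counts = Counter()
for t in transactions:                        # one scan counts all 2-itemsets
    for pair in combinations(sorted(t), 2):
        counts[pair] += 1

frequent = {p: c for p, c in counts.items() if c >= min_support}
print(frequent)   # e.g., ('bread', 'milk'): 2, ('beer', 'milk'): 2, ...
```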

Keywords: association rules, data mining, dynamic item set counting, FP-growth, friendly algorithm, graph

Procedia PDF Downloads 385
363 Long Short-Term Memory (LSTM) Matters: A Sequential Brief-Text Assistive Approach to Text Summarization

Authors: Sharun Akter Khushbu

Abstract:

‘SOS’ addresses text summarization, such as for feasibility studies, and allows more comprehensive methods on texts from language resources. Resource languages have been exploited given the importance of documental text processing. This key idea results in a machine interpreter called SOS, built on an LSTM-CNN (long short-term memory convolutional neural network) model. Summarization of Bengali text is formulated from the information in the latent structure rather than from counting over the brief input string as text. Text summarization is the proper utilization of optimal solutions, offering time reduction and easy interpretation, whenever the human-generated summary and the machine-targeted summary remain similar and without degrading the semantic summarization quality. Following the problem statement, the key idea is advanced into an algorithm based on an encoder-decoder method, describing a sequential structure that is rigorously connected with the actual, predicted and meaningful output. The seq2seq approach aims, in future work, at high semantic summarization similarity on the large data samples that are also enlisted by the method. Thus, the SOS method assigns a discriminator over Bengali text documents, where input sequences such as the summary are encoded and the targeted gist summary is decoded, so that the machine output is error-free.
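A hedged skeleton of an LSTM encoder-decoder (seq2seq) summarizer of the kind described; the vocabulary size, dimensions, and the omission of the CNN stage are simplifying assumptions, not the paper's exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

vocab, emb, units = 20_000, 128, 256          # assumed sizes

enc_in = layers.Input(shape=(None,))          # source token ids
enc_emb = layers.Embedding(vocab, emb)(enc_in)
_, h, c = layers.LSTM(units, return_state=True)(enc_emb)

dec_in = layers.Input(shape=(None,))          # shifted summary token ids
dec_emb = layers.Embedding(vocab, emb)(dec_in)
dec_seq = layers.LSTM(units, return_sequences=True)(dec_emb,
                                                    initial_state=[h, c])
logits = layers.Dense(vocab)(dec_seq)         # next-token scores

model = Model([enc_in, dec_in], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(
                  from_logits=True))
```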

Keywords: LSTM-CNN, NN, SOS, text summarization

Procedia PDF Downloads 36
362 Towards Establishing a Universal Theory of Project Management

Authors: Divine Kwaku Ahadzie

Abstract:

Project management (PM) as a concept has evolved from the early 20th century into a recognized academic and professional discipline, and indications are that it is here to stay in the 21st century as a world-wide paradigm for managing successful construction projects. However, notwithstanding the strong inroads that PM has made in legitimizing its academic and professional status in construction management practice, the underlying philosophies are still based on cases and conventional practices. An important theoretical issue yet to be addressed is the lack of a universal theory that offers philosophical legitimacy for the PM concept as a uniquely specialized management concept. Here, it is hypothesized that the law of entropy, the theory of uncertainties and the theory of risk management offer plausible explanations for addressing the lacuna of what constitutes PM theory. The theoretical bases of these plausible underlying theories are argued, and attempts are made to establish the functional relationships that exist between these theories and the PM concept. The paper then draws on data related to the success and/or failure of a number of construction projects to validate the theory.

Keywords: concepts, construction, project management, universal theory

Procedia PDF Downloads 300
361 New Insight into Fluid Mechanics of Lorenz Equations

Authors: Yu-Kai Ting, Jia-Ying Tu, Chung-Chun Hsiao

Abstract:

New physical insights into the nonlinear Lorenz equations related to flow resistance are discussed in this work. The chaotic dynamics of the Lorenz equations have been studied in many papers, owing to the sensitivity of the Lorenz equations to initial conditions and parameter uncertainties. However, the physical implications of the Lorenz equations for convectional motion have attracted little attention in the relevant literature. Therefore, as a first step towards understanding the related fluid mechanics of convectional motion, this paper derives the Lorenz equations again with different forcing conditions in the model. Simulation work on the modified Lorenz equations without the viscosity or buoyancy force is discussed. The time-domain simulation results imply that the states of the Lorenz equations are related to certain flow speeds and flow resistances: the flow speed of the underlying fluid system increases as the flow resistance is reduced. This observation will be helpful for analyzing the coupling effects of different fluid parameters in a convectional model in future work.
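For concreteness, a minimal sketch integrating the classical Lorenz equations; the parameter values are the standard chaotic ones and are not necessarily those used in this study.

```python
import numpy as np
from scipy.integrate import solve_ivp

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # standard chaotic parameters

def lorenz(t, state):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

sol = solve_ivp(lorenz, (0.0, 40.0), [1.0, 1.0, 1.0], max_step=0.01)
print(sol.y[:, -1])   # final state of the trajectory
```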

Keywords: Galerkin method, Lorenz equations, Navier-Stokes equations, convectional motion

Procedia PDF Downloads 352
360 NENU2PHAR: PHA-Based Materials from Micro-Algae for High-Volume Consumer Products

Authors: Enrique Moliner, Alba Lafarga, Isaac Herraiz, Evelina Castellana, Mihaela Mirea

Abstract:

NENU2PHAR (GA 887474) is an EU-funded project aimed at the development of polyhydroxyalkanoates (PHAs) from micro-algae. These biobased and biodegradable polymers are being tested and validated in different high-volume market applications, including food packaging, cosmetic packaging, 3D printing filaments, agro-textiles and medical devices, counting on the support of key players like Danone, BEL Group, Sofradim and IFG. So far, the project has succeeded in producing PHAs from micro-algae with a cumulative yield of around 17%, i.e. 1 kg of PHAs produced from 5.8 kg of micro-algae biomass, which in turn captures 11 kg of CO₂ while growing. These algae-based plastics can therefore offer the same environmental benefits as current bio-based plastics (reduction of greenhouse gas emissions and fossil resource depletion), using a 3rd-generation biomass feedstock that avoids competition with food and the environmental impacts of agricultural practices. The project also deals with other sustainability aspects, like the ecodesign and life cycle assessment of the targeted plastic products, considering not only the use of biobased plastics but also many other ecodesign strategies. This paper will present the main progress and results achieved to date in the project.

Keywords: NENU2PHAR, Polyhydroxyalkanoates, micro-algae, biopolymer, ecodesign, life cycle assessment

Procedia PDF Downloads 47
359 Uncertainty Estimation in Neural Networks through Transfer Learning

Authors: Ashish James, Anusha James

Abstract:

The impressive predictive performance of deep learning techniques on a wide range of tasks has led to their widespread use. Estimating the confidence of these predictions is paramount for improving the safety and reliability of such systems. However, the uncertainty estimates provided by neural networks (NNs) tend to be overconfident and unreasonable. Ensembles of NNs typically produce good predictions, but their uncertainty estimates tend to be inconsistent. Inspired by these observations, this paper presents a framework that can quantitatively estimate uncertainties by leveraging advances in transfer learning, through a slight modification to existing training pipelines. This promising algorithm is developed with the intention of deployment in real-world problems that already boast good predictive performance, by reusing those pretrained models. The idea is to capture the behavior of the NNs trained for the base task by augmenting them with uncertainty estimates from a supplementary network. A series of experiments with known and unknown distributions shows that the proposed approach produces well-calibrated uncertainty estimates with high-quality predictions.
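A hedged sketch of the general idea as described: a frozen base network provides the prediction, while a small supplementary head is trained to emit an uncertainty (log-variance) estimate. The architecture and loss are illustrative assumptions, not the paper's exact method.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Stand-in for a pretrained regression network; frozen so only the
# supplementary uncertainty head is trained.
base = tf.keras.Sequential([layers.Dense(64, activation="relu"),
                            layers.Dense(1)])
base.trainable = False

inp = layers.Input(shape=(8,))
mean = base(inp)                              # reused base-task prediction
hidden = layers.Dense(32, activation="relu")(inp)
log_var = layers.Dense(1)(hidden)             # supplementary uncertainty head
model = Model(inp, [mean, log_var])

def gaussian_nll(y, mean, log_var):
    # Heteroscedastic Gaussian negative log-likelihood (illustrative loss).
    return tf.reduce_mean(0.5 * (log_var + (y - mean) ** 2 / tf.exp(log_var)))
```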

Keywords: uncertainty estimation, neural networks, transfer learning, regression

Procedia PDF Downloads 95
358 Establishment of the Regression Uncertainty of the Critical Heat Flux Power Correlation for an Advanced Fuel Bundle

Authors: L. Q. Yuan, J. Yang, A. Siddiqui

Abstract:

A new regression uncertainty analysis methodology was applied to determine the uncertainties of the critical heat flux (CHF) power correlation for an advanced 43-element bundle design, which was developed by Canadian Nuclear Laboratories (CNL) to achieve improved economics, resource utilization and energy sustainability. The new methodology is considered more appropriate than the traditional methodology in the assessment of the experimental uncertainty associated with regressions. The methodology was first assessed using both the Monte Carlo Method (MCM) and the Taylor Series Method (TSM) for a simple linear regression model, and then extended successfully to a non-linear CHF power regression model (CHF power as a function of inlet temperature, outlet pressure and mass flow rate). The regression uncertainty assessed by MCM agrees well with that by TSM. An equation to evaluate the CHF power regression uncertainty was developed and expressed as a function of independent variables that determine the CHF power.
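A minimal sketch of the Monte Carlo Method applied to regression uncertainty for the simple linear case on which the methodology was first assessed: perturb the measured data within its uncertainty, refit, and take the spread of the fitted predictions. The model and noise levels are placeholders, not the CHF experiment's data.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)   # "measured" data
sigma_y = 0.5                                    # measurement uncertainty

preds = []
for _ in range(5000):                            # MCM replicates
    y_mc = y + rng.normal(0, sigma_y, y.size)    # perturb within uncertainty
    coef = np.polyfit(x, y_mc, deg=1)            # refit the regression
    preds.append(np.polyval(coef, 5.0))          # prediction at x = 5

print(f"regression uncertainty at x=5: {np.std(preds):.3f}")
```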

Keywords: CHF experiment, CHF correlation, regression uncertainty, Monte Carlo Method, Taylor Series Method

Procedia PDF Downloads 387
357 Seismic Response and Sensitivity Analysis of Circular Shallow Tunnels

Authors: Siti Khadijah Che Osmi, Mohammed Ahmad Syed

Abstract:

Underground tunnels are among the most popular public facilities for various applications such as transportation, water transfer and network utilities. Experience from past earthquakes reveals that underground tunnels are also vulnerable components and may be damaged to varying degrees depending on the level of ground shaking and induced phenomena. In this paper, a numerical analysis is conducted to evaluate the sensitivity of two types of circular shallow tunnel lining models to wide-ranging changes in the geotechnical design parameters. A critical analysis is presented of the current methods of analysis, structural typology, ground motion characteristics, the effect of soil conditions and the associated uncertainties on tunnel integrity. The response of the tunnel is evaluated through 2D non-linear finite element analysis, which critically assesses the impact of increasing levels of seismic loads. The findings from this study offer significant information for improving methods to assess the vulnerability of underground structures.

Keywords: geotechnical design parameter, seismic response, sensitivity analysis, shallow tunnel

Procedia PDF Downloads 409
356 Comparative Effects of Homoplastic and Synthetic Pituitary Extracts on Induced Breeding of Heterobranchus longifilis (Valenciennes, 1840) in Indoor Hatchery Tanks in Owerri South East Nigeria

Authors: I. R. Keke, C. S. Nwigwe, O. S. Nwanjo, A. S. Egeruoh

Abstract:

An experiment was carried out at Urban Farm and Fisheries Nigeria Ltd, Owerri, Imo State, South East Nigeria, between February and June 2014 to induce broodstock of Heterobranchus longifilis (mean weight 1.3 kg) in concrete tanks (1.0 × 2.0 × 1.5 m) using a synthetic hormone (Ovaprim) and pituitary extract from Heterobranchus longifilis. Broodstock males were selected as pituitary donors, and their weights matched those of the females to be injected at 1 ml/kg body weight of fish. Ovaprim was injected at 0.5 ml/kg body weight of female fish. A latency period of 12 hours was allowed after injection of the broodstock females before stripping the eggs and incubating them at 23 °C. While incubating the eggs, samples were drawn and the rate of fertilization was determined. Hatching occurred within 33 hours, and the hatchability rate (%) was determined by counting the active hatchlings. The results showed that the eggs of Ovaprim-injected broodstock fertilized at up to 80%, while the pituitary from Heterobranchus longifilis gave a low fertilization and hatching success of 20%. Ovaprim is imported and costly, so more effort is required to enhance the procedures for homoplastic hypophysation.

Keywords: heterobranchus longifilis, ovaprim, hypophysation, latency period, pituitary

Procedia PDF Downloads 185
355 The Characteristics of Quantity Operation for 2nd and 3rd Grade Mathematics Slow Learners

Authors: Pi-Hsia Hung

Abstract:

The development of mathematical competency has individual benefits as well as benefits to the wider society. Children who begin school behind their peers in their understanding of number, counting, and simple arithmetic are at high risk of staying behind throughout their schooling. The development of effective strategies for improving the educational trajectory of these individuals will be contingent on identifying the areas of early quantitative knowledge that influence later mathematics achievement. A computer-based quantity assessment was developed in this study to investigate the characteristics of 2nd and 3rd grade slow learners in quantity. The concept of quantification involves understanding measurements, counts, magnitudes, units, indicators, relative size, and numerical trends and patterns. Fifty-five tasks of quantitative reasoning, such as number sense, mental calculation, estimation and assessment of the reasonableness of results, are included as quantity problem solving. Thus, quantity is defined in this study as applying knowledge of number and number operations in a wide variety of authentic settings. Around 1000 students were tested and categorized into 4 different performance levels. Students’ quantity ability correlated more highly with their school math grades than with their grades in other subjects. Around 20% of students are below the basic level. The intervention design implications of the preliminary item map constructed are discussed.

Keywords: mathematics assessment, mathematical cognition, quantity, number sense, validity

Procedia PDF Downloads 207
354 An Investigation on Interface Shear Resistance of Twinwall Units for Tank Structures

Authors: Jaylina Rana, Chanakya Arya, John Stehle

Abstract:

Hybrid precast twinwall concrete units, mainly used in basement, core and crosswall construction, are now being adopted in water-retaining tank structures. Their use offers many advantages compared with conventional in-situ concrete alternatives; however, the design could be optimised further via a deeper understanding of the unique load-transfer mechanisms in the system. In the tank application, twinwall units, which consist of two precast concrete biscuits connected by steel lattices and an in-situ concrete core, are subject to bending. Uncertainties about the degree of composite action between the precast biscuits, and hence the flexural performance of the units, necessitated laboratory tests to investigate the interface shear resistance. Testing was also required to assess both the leakage performance and the buildability of a variety of joint details. This paper describes some aspects of this novel approach to the design and construction of tank structures, as well as selected results from some of the tests that were carried out.

Keywords: hybrid construction, twinwall, precast construction, composite action

Procedia PDF Downloads 435
353 Traffic Analysis and Prediction Using Closed-Circuit Television Systems

Authors: Aragorn Joaquin Pineda Dela Cruz

Abstract:

Road traffic congestion is continually deteriorating in Hong Kong. The largest contributing factor is the increase in vehicle fleet size, resulting in higher competition over the utilisation of road space. This study proposes a project that can process closed-circuit television images and videos to provide real-time traffic detection and prediction capabilities. Specifically, a deep-learning model involving computer vision techniques is used for video- and image-based vehicle counting, followed by a separate model to detect and predict traffic congestion levels based on said data. State-of-the-art object detection models such as You Only Look Once and Faster Region-based Convolutional Neural Networks are tested and compared on closed-circuit television data from various major roads in Hong Kong. The resulting counts are then used to train long short-term memory networks to predict traffic conditions in the near future, in an effort to provide more precise and quicker overviews of current and future traffic conditions relative to current solutions such as navigation apps.
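A hedged sketch of the prediction stage, assuming per-interval vehicle counts (as a detector would produce) are fed to a small LSTM that forecasts the next interval; the synthetic series, window, and hyperparameters are illustrative.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Synthetic stand-in for per-interval vehicle counts from the detector stage.
counts = np.abs(np.sin(np.arange(500) / 24.0)) * 100.0
window = 12                                   # look-back intervals

X = np.stack([counts[i:i + window] for i in range(len(counts) - window)])
y = counts[window:]                           # next-interval count to predict

model = tf.keras.Sequential([
    layers.LSTM(32, input_shape=(window, 1)),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[..., None], y, epochs=2, verbose=0)
print(model.predict(X[-1:, :, None], verbose=0))   # next-interval forecast
```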

Keywords: intelligent transportation system, vehicle detection, traffic analysis, deep learning, machine learning, computer vision, traffic prediction

Procedia PDF Downloads 67
352 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground

Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane

Abstract:

Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to take into account the uncertainties related to the properties of the materials used and the applied loads. However, the use of these safety factors in the design process does not assure an optimal and reliable solution and can sometimes lead to a lack of robustness of the structure. Reliability theory, based on a probabilistic formulation of construction safety, can respond in a suitable manner. It allows the construction of a model in which uncertain data are represented by random variables, and therefore allows a better appreciation of safety margins with confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical method of Monte Carlo simulation is used to evaluate the failure probability of the concrete tank by considering the seismic acceleration as a random variable.
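A minimal sketch of the Monte Carlo estimate of failure probability with seismic acceleration as the random variable; the limit-state comparison and distribution parameters are placeholders, not the paper's tank model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# Assumed lognormal demand: seismic acceleration in units of g.
accel = rng.lognormal(mean=np.log(0.15), sigma=0.4, size=n)

capacity = 0.35                       # assumed resisting acceleration (g)
failures = accel > capacity           # limit state: demand exceeds capacity
print(f"P_f ~ {failures.mean():.2e}") # fraction of failing samples
```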

Keywords: reliability approach, storage tanks, monte carlo simulation, seismic acceleration

Procedia PDF Downloads 271
351 Significant Factors in Agile Manufacturing and the Role of Product Architecture

Authors: Mehrnoosh Askarizadeh

Abstract:

The agile manufacturing concept was first coined by the Iacocca Institute in 1991 as a new manufacturing paradigm intended to provide and ensure competitiveness in the emerging global manufacturing order. Since then, a considerable number of studies have been conducted in this area. Reviewing these studies reveals that they mostly focus on agile manufacturing drivers, definitions and characteristics, but few of them propose practical solutions to achieve it. Agile manufacturing is recommended as a successful paradigm, after lean, for 21st-century manufacturing firms. This competitive concept has been developed in response to the continuous changes and uncertainties in today’s business environment. In order to become an agile competitor, a manufacturing firm should focus on enriching its agility capabilities. These agility capabilities can be categorized into seven groups: proactiveness, customer focus, responsiveness, quickness, flexibility, basic competence and partnership. A manufacturing firm aiming to achieve agility should first develop its own appropriate agility strategy, which prioritizes the required agility capabilities.

Keywords: agile manufacturing, product architecture, customer focus, responsiveness, quickness, flexibility, basic competence

Procedia PDF Downloads 486
350 Readout Development of an LGAD-Based Hybrid Detector for Microdosimetry (HDM)

Authors: Pierobon Enrico, Missiaggia Marta, Castelluzzo Michele, Tommasino Francesco, Ricci Leonardo, Scifoni Emanuele, Vincenzo Monaco, Boscardin Maurizio, La Tessa Chiara

Abstract:

Clinical outcomes collected over the past three decades have suggested that ion therapy has the potential to be a treatment modality superior to conventional radiation for several types of cancer, including recurrences, as well as for other diseases. Although the results have been encouraging, numerous treatment uncertainties remain a major obstacle to the full exploitation of particle radiotherapy. To overcome therapy uncertainties and optimize treatment outcomes, the best possible description of radiation quality is of paramount importance, linking the physical radiation dose to biological effects. Microdosimetry was developed as a tool to improve the description of radiation quality. By recording the energy deposition at the micrometric scale (the typical size of a cell nucleus), this approach takes into account the non-deterministic nature of atomic and nuclear processes and creates a direct link between the dose deposited by radiation and the biological effect induced. Microdosimeters measure the spectrum of lineal energy y, defined as the energy deposition in the detector divided by the most probable track length travelled by the radiation. The latter is provided by the so-called "Mean Chord Length" (MCL) approximation and is related to the detector geometry. To improve the characterization of radiation field quality, we define a new quantity, replacing the MCL with the actual particle track length inside the microdosimeter. In order to measure this new quantity, we propose a two-stage detector consisting of a commercial Tissue Equivalent Proportional Counter (TEPC) and 4 layers of Low Gain Avalanche Detector (LGAD) strips. The TEPC records the energy deposition in a region equivalent to 2 um of tissue, while the LGADs are very suitable for particle tracking because their thickness can be thinned down to tens of micrometers and they respond quickly to ionizing radiation. The concept of HDM has been investigated and validated with Monte Carlo simulations. Currently, a dedicated readout is under development. This two-stage detector requires two different systems whose complementary information is joined for each event: the energy deposition in the TEPC and the respective track length recorded by the LGAD tracker. This challenge is being addressed by implementing SoC (System on Chip) technology, relying on Field Programmable Gate Arrays (FPGAs) based on the Zynq architecture. The TEPC readout consists of three different signal amplification legs and is carried out by 3 ADCs mounted on an FPGA board. The LGAD activated-strip signals are processed by dedicated chips, and finally the activated strips are stored, again relying on FPGA-based solutions. In this work, we will provide a detailed description of the HDM geometry and of the SoC solutions that we are implementing for the readout.
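For illustration, a minimal sketch of the proposed quantity: per-event lineal energy computed with the LGAD-reconstructed track length instead of the mean chord length (assuming a spherical 2 um site, for which MCL = 2d/3); all event values below are invented placeholders.

```python
import numpy as np

energy_keV = np.array([2.1, 0.8, 5.4, 1.3])   # TEPC energy depositions
track_um = np.array([1.9, 1.2, 2.0, 0.7])     # LGAD-reconstructed track lengths

# Proposed per-event quantity: energy over the actual track length.
y_track = energy_keV / track_um               # keV/um

# Conventional definition uses the mean chord length of the site instead;
# MCL = 2d/3 assumes a spherical 2 um site (a geometry assumption).
mcl_um = 2.0 * 2.0 / 3.0
y_mcl = energy_keV / mcl_um

print(y_track)
print(y_mcl)
```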

Keywords: particle tracking, ion therapy, low gain avalanche diode, tissue equivalent proportional counter, microdosimetry

Procedia PDF Downloads 133