Search results for: time domain reflectometry (TDR)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19375

14575 Neighbourhood Walkability and Quality of Life: The Mediating Role of Place Adherence and Social Interaction

Authors: Michał Jaśkiewicz

Abstract:

The relation between walkability, place adherence, social relations and quality of life was explored in a Polish context. A considerable number of studies have suggested that environmental factors may influence the quality of life through indirect pathways. The list of possible psychological mediators includes social relations and identity-related variables. Based on the results of Study 1, local identity is a significant mediator in the relationship between neighbourhood walkability and quality of life. It was assumed that pedestrian-oriented neighbourhoods enable residents to interact and that these spontaneous interactions can help to strengthen a sense of local identity, thus influencing the quality of life. We, therefore, conducted further studies, testing the relationship experimentally in studies 2a and 2b. Participants were exposed to (2a) photos of walkable/non-walkable neighbourhoods or (2b) descriptions of high/low-walkable neighbourhoods. They were then asked to assess the walkability of the neighbourhoods and to evaluate their potential social relations and quality of life in these places. In both studies, social relations with neighbours turned out to be a significant mediator between walkability and quality of life. In Study 3, we implemented the measure of overlapping individual and communal identity (fusion with the neighbourhood) and willingness to collective action as mediators. Living in a walkable neighbourhood was associated with identity fusion with that neighbourhood. Participants who felt more fused expressed greater willingness to engage in collective action with other neighbours. Finally, this willingness was positively related to the quality of life in the city. In Study 4, we used commuting time (an aspect of walkability related to the time that people spend travelling to work) as the independent variable. The results showed that a shorter average daily commuting time was linked to more frequent social interactions in the neighbourhood. 
Individuals who assessed their social interactions as more frequent expressed a stronger city identification, which was in turn related to quality of life. To sum up, our research replicated and extended previous findings on the association between walkability and well-being measures. We introduced potential mediators of this relationship: social interactions in the neighbourhood and identity-related variables.
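
The mediation analyses reported across Studies 1-4 follow the product-of-coefficients logic: the effect of walkability on quality of life is routed through a mediator such as social interaction. A minimal sketch of that estimator on synthetic data (all variable names and numbers are illustrative, not drawn from the actual studies):

```python
import numpy as np

def ols_coefs(X, y):
    """OLS coefficients for y ~ X (X already includes an intercept column)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate of the indirect (mediated) effect.

    a: effect of X (walkability) on M (social interaction);
    b: effect of M on Y (quality of life), controlling for X.
    """
    ones = np.ones_like(x)
    a = ols_coefs(np.column_stack([ones, x]), m)[1]
    b = ols_coefs(np.column_stack([ones, x, m]), y)[2]
    return a * b

rng = np.random.default_rng(0)
walkability = rng.normal(size=500)
interaction = 0.6 * walkability + rng.normal(size=500)   # M depends on X
qol = 0.5 * interaction + rng.normal(size=500)           # Y depends on M
print(round(indirect_effect(walkability, interaction, qol), 2))
```

With these assumed effect sizes, the estimated indirect effect should land near 0.6 × 0.5 = 0.30; in a real analysis, its significance would be assessed, e.g., with bootstrap confidence intervals.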

Keywords: walkability, quality of life, social relations, analysis of mediation

Procedia PDF Downloads 327
14574 Graphic Calculator Effectiveness in Biology Teaching and Learning

Authors: Nik Azmah Nik Yusuff, Faridah Hassan Basri, Rosnidar Mansor

Abstract:

The purpose of this study is to determine the effectiveness of using graphic calculators (GC) with Calculator-Based Laboratory 2 (CBL2) in the teaching and learning of Form Four biology for three topics: Nutrition, Respiration, and Dynamic Ecosystem. Sixty Form Four science-stream students participated in the study. The participants were divided equally into treatment and control groups. The treatment group used GC with CBL2 during experiments, while the control group used conventional laboratory apparatus without GC with CBL2. The instruments in this study were a set of pre- and post-tests and a questionnaire. A t-test was used to compare the students' biology achievement, while descriptive statistics were used to analyze the questionnaire outcomes. The findings indicated that the use of GC with CBL2 in biology had a significant positive effect. The highest mean was 4.43, for the item stating that GC with CBL2 saved time in collecting experimental results. The second highest mean was 4.10, for the item stating that GC with CBL2 saved time in drawing and labelling graphs. The questionnaire outcomes also showed that GC with CBL2 was easy to use and saved time. Thus, teachers should use GC with CBL2 in support of efforts by the Malaysian Ministry of Education to encourage technology-enhanced lessons.
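
The achievement comparison described above rests on a two-sample t-test between the treatment and control groups. A minimal sketch with synthetic scores (the means, spreads, and group sizes below are illustrative assumptions, not the study's data):

```python
import numpy as np

def welch_t(a, b):
    """Welch's two-sample t statistic (robust to unequal variances)."""
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (a.mean() - b.mean()) / se

rng = np.random.default_rng(1)
# hypothetical post-test scores; illustrative only
gc_group = rng.normal(75, 8, 30)   # taught with GC + CBL2
control = rng.normal(68, 8, 30)    # conventional apparatus

t = welch_t(gc_group, control)
print(f"t = {t:.2f}")
```

A t statistic well above ~2 with 30 students per group would indicate a significant difference; the study itself would report the exact p-value.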

Keywords: biology experiments, Calculator-Based Laboratory 2 (CBL2), graphic calculators, Malaysia Secondary School, teaching/learning

Procedia PDF Downloads 403
14573 Plotting of an Ideal Logic versus Resource Outflow Graph through Response Analysis on a Strategic Management Case Study Based Questionnaire

Authors: Vinay A. Sharma, Shiva Prasad H. C.

Abstract:

The initial stages of any project are often observed to be in a mixed set of conditions. Setting up the project is a tough task, but taking the initial decisions is rather not complex, as some of the critical factors are yet to be introduced into the scenario. These simple initial decisions potentially shape the timeline and the subsequent events that might later be plotted on it. Proceeding towards a solution for a problem is the primary objective in the initial stages. Optimization of the solutions can come later, and hence the resources deployed towards attaining the solution are higher than they would be in the optimized versions. A 'logic' that counters the problem is essentially the core of the desired solution. Thus, if the problem is solved, the deployment of resources has led to the required logic being attained. As the project proceeds, the individuals working on it face fresh challenges as a team and become better accustomed to their surroundings. The developed, optimized solutions are then considered for implementation, as the individuals are now experienced, better understand the causes and consequences of possible failure, and thus integrate adequate tolerances wherever required. Furthermore, as the team grows in strength, acquires substantial knowledge, and begins to transfer it efficiently, the individuals in charge of the project, along with the managers, focus more on the optimized solutions rather than the traditional ones to minimize the required resources. Hence, as time progresses, the authorities prioritize attainment of the required logic at a lower amount of dedicated resources. For empirical analysis of the stated theory, leaders and key figures in organizations are surveyed for their ideas on the appropriate logic required for tackling a problem. Key pointers spotted in successfully implemented solutions are noted from the analysis of the responses, and a metric for measuring logic is developed.
A graph is plotted with the quantifiable logic on the Y-axis and the resources dedicated to the solutions of various problems on the X-axis. The dedicated resources are plotted over time, and hence the X-axis is also a measure of time. In the initial stages of the project, the graph is rather linear, as the required logic is attained but the consumed resources are also high. With time, the authorities begin focusing on optimized solutions, since the logic attained through them is higher while the resources deployed are comparatively lower. Hence, the difference between consecutive plotted 'resources' reduces and, as a result, the slope of the graph gradually increases. Overall, the graph takes a parabolic shape (beginning at the origin), as with each resource investment, ideally, the difference keeps decreasing and the logic attained through the solution keeps increasing. Even if the resource investment is higher, the managers and authorities ideally ensure that the investment is made on a proportionally high logic for a larger problem; that is, ideally, the slope of the graph increases with the plotting of each point.
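
The claimed shape of the curve -- steady gains in logic bought with shrinking resource outlays, hence an increasing slope -- can be checked numerically. A small sketch with hypothetical per-step figures (the numbers are illustrative, not survey-derived):

```python
import numpy as np

# hypothetical project timeline: each step attains one more unit of logic,
# while the resources needed per step shrink as the team optimizes
logic = np.arange(1, 11).astype(float)      # Y-axis: quantified logic
resource_steps = 10.0 / np.arange(1, 11)    # decreasing per-step outflow
resources = np.cumsum(resource_steps)       # X-axis: cumulative resources

slopes = np.diff(logic) / np.diff(resources)
# the slope between consecutive points rises monotonically,
# giving the parabola-like profile described in the abstract
assert np.all(np.diff(slopes) > 0)
```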

Keywords: decision-making, leadership, logic, strategic management

Procedia PDF Downloads 108
14572 Monte Carlo and Biophysics Analysis in a Criminal Trial

Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano

Abstract:

In this paper, a real court case, held in Italy at the Court of Nola, is considered, in which a correct physical description, conducted with both a Monte Carlo and a biophysical analysis, would have been sufficient to arrive at conclusions confirmed by documentary evidence. This is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him some damages. The damages would have been caused by a piece of external plaster that would have detached from the neighbor's property and hit Mr. DS while he was in his garden, more than a meter away from the facade of the building from which the plaster piece would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never showed the plaster that had hit him, nor was he able to tell from where the plaster would have arrived. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images of Mr. OP's security cameras do not show any movement in the garden of Mr. DS in a long interval of time (about 2 hours) around the time of the alleged accident, nor do they show any people entering or coming out of the house of Mr. DS in the same interval. The biophysical analysis shows that both the diagnosis in the medical certificate and the wound declared by the defendant, already in conflict with each other, are not compatible with the fall of external plaster pieces too small to be found. The wind was at level 1 on the Beaufort scale, that is, unable even to raise dust (which requires level 4 on the Beaufort scale).
Therefore, the motion of the plaster pieces can be described as projectile motion, whereas collisions with the building cornice can be treated using Newton's law of restitution. Numerous Monte Carlo simulations show that the pieces of plaster could not have reached even the garden of Mr. DS, let alone a distance of over 1.30 meters. The results agree with the documentary evidence (the images of Mr. OP's security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP's property.
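
The simulation approach described above -- projectile motion of small fragments, with restitution at the cornice -- can be sketched in a few lines. All physical parameters here (detachment height, wind-imparted horizontal speed, restitution coefficient) are illustrative assumptions, not the values used in the actual trial analysis:

```python
import numpy as np

G = 9.81                              # m/s^2
rng = np.random.default_rng(42)
N = 100_000                           # Monte Carlo trials

# assumed ranges: detachment height 3-6 m, and the small horizontal
# speeds a Beaufort-1 breeze could plausibly impart to a light fragment
h = rng.uniform(3.0, 6.0, N)          # m
v0 = rng.uniform(0.0, 0.4, N)         # m/s
e = 0.3                               # coefficient of restitution (assumed)

t_fall = np.sqrt(2 * h / G)           # free-fall time
reach = v0 * t_fall                   # projectile range, no bounce
reach_bounced = reach * (1 + e)       # crude upper bound with one bounce

print(f"max reach: {reach_bounced.max():.2f} m")
```

Even this deliberately generous upper bound stays well short of 1.30 m, consistent with the paper's conclusion that the fragments could not have reached the defendant.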

Keywords: biophysics analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion

Procedia PDF Downloads 131
14571 Conceptual Study on 4PL and Activities in Turkey

Authors: Berna Kalkan, Kenan Aydin

Abstract:

Companies attach importance to customer satisfaction in order to compete in a developing and changing market. This is possible when the customer receives the right product with the right quality, at the right place and time, and at the right cost. In this regard, the extension of logistics services has played an active role in the formation and development of different logistics service concepts. The concept of logistics services plays an important role in improving economic indicators today. Companies that use logistics providers can gain a competitive advantage through lower costs, reduced time, and greater flexibility. In recent years, Fourth-Party Logistics (4PL) has emerged as a new concept that encompasses the relationship between suppliers and firms in outsourcing. A 4PL provider is an integrator that offers comprehensive supply chain solutions with the technology, resources, and capabilities that it possesses. 4PL has also attracted attention as a popular research topic in the recent past. In this paper, the concepts of logistics outsourcing and 4PL are analyzed, and a literature review on 4PL activities is given. The previous studies in the literature and the approaches they use are presented through an analysis of 4PL activities. In this context, a field study will be applied to 4PL providers and service buyers in Turkey. Where appropriate, results related to this study will be shared in scientific venues.

Keywords: fourth party logistics, literature review, outsourcing, supply chain management

Procedia PDF Downloads 178
14570 Analysis of Brain Specific Creatine Kinase of Postmortem Cerebrospinal Fluid and Serum in Blunt Head Trauma Cases

Authors: Rika Susanti, Eryati Darwin, Dedi Afandi, Yanwirasti, Syahruddin Said, Noverika Windasari, Zelly Dia Rofinda

Abstract:

Introduction: Blunt head trauma is one of the leading causes of death associated with murders and other deaths involved in criminal acts. Brain-specific creatine kinase (CKBB) levels have been used as a biomarker for blunt head trauma and are therefore now used as an adjunct to autopsy. The aim of this study is to investigate CKBB levels in cerebrospinal fluid (CSF) and post-mortem serum in order to deduce the cause and time of death. Method: This investigation was conducted through a post-test-only group design involving deaths caused by blunt head trauma, which were compared to deaths caused by ketamine poisoning. There were eight treatment groups, each consisting of six adult rats (Rattus norvegicus) of the Sprague-Dawley strain. Examinations were done at 0, 1, 2, and 3 hours post-mortem, followed by observation of brain tissue. Data were then analyzed statistically with a repeated-measures general linear model. Results and Conclusion: There were increases in the level of CKBB in CSF and post-mortem serum in both the blunt head trauma and ketamine poisoning groups. However, there were no significant differences between these two groups.

Keywords: blunt head trauma, CKBB, the cause of death, estimated time of death

Procedia PDF Downloads 192
14569 Evaluating the Success of an Intervention Course in a South African Engineering Programme

Authors: Alessandra Chiara Maraschin, Estelle Trengove

Abstract:

In South Africa, only 23% of engineering students attain their degrees in the minimum time of four years. This begs the question: why is the four-year throughput rate so low? Improving the throughput rate is crucial in guiding students along the shortest possible path to completion. The Electrical Engineering programme has a fixed curriculum, and students must pass all courses in order to graduate. In South Africa, as in several other countries, many students rely on external funding such as bursaries from companies in industry. If students fail a course, they often lose their bursaries, and most might not be able to fund their 'repeating year' fees. It is thus important to improve the throughput rate, since for many students, graduating from university is a way out of poverty for an entire family. In Electrical Engineering, Software Development I (an introduction to C++ programming) has been found to be a significant hurdle course with a low pass rate. It is well documented that students struggle with this type of course, as it introduces a number of new threshold concepts that can be challenging to grasp in a short time frame. In an attempt to mitigate this situation, a part-time night school for Software Development I was introduced in 2015 as an intervention measure. The night-school course covers all the material of the Software Development I module and gives students who failed the course in the first semester a second chance by repeating it. The purpose of this study is to determine whether the introduction of this intervention course can be considered a success. The success of the intervention is assessed in two ways. The study first looks at whether the night-school course contributed to improving the pass rate of the Software Development I course.
Secondly, the study will examine whether the intervention contributed to improving the overall throughput from the 2nd year to the 3rd year of study at a South African University. Second year academic results for a sample of 1216 students have been collected from 2010-2017. Preliminary results show that the lowest pass rate for Software Development I was found to be in 2017 with a pass rate of 34.9%. Since the intervention course's inception, the pass rate for Software Development I has increased each year from 2015-2017 by 13.75%, 25.53% and 25.81% respectively. To conclude, the preliminary results show that the intervention course is a success in improving the pass rate of Software Development I.

Keywords: academic performance, electrical engineering, engineering education, intervention course, low pass rate, software development course, throughput

Procedia PDF Downloads 164
14568 Remediation of Oil and Gas Exploration and Production (O&G E&P) Wastes Using Soil-Poultry Dropping Amendment

Authors: Ofonime U. M. John, Justina I. R. Udotong, Victor O. Nwaugo, Ime R. Udotong

Abstract:

Oily wastes from oil and gas exploration and production (O&G E&P) activities were remediated for twelve weeks using a soil-poultry dropping amendment. Culture-dependent microbiological, chemical, and enzymatic techniques were employed to assess the efficacy of the remediation process. Microbiological analyses of the remediated wastes showed increased hydrocarbonoclastic microbial populations with increased remediation time: from 2.7 ± 0.1 × 10⁵ cfu/g to 8.3 ± 0.04 × 10⁶ cfu/g for hydrocarbon-utilizing bacteria, from 1.7 ± 0.2 × 10³ cfu/g to 6.0 ± 0.01 × 10⁴ cfu/g for hydrocarbon-utilizing fungi, and from 2.2 ± 0.1 × 10² cfu/g to 6.7 ± 0.1 × 10³ cfu/g for hydrocarbon-utilizing actinomycetes. Bacteria associated with the remediated wastes after the remediation period included the genera Bacillus, Pseudomonas, Beijerinckia, Acinetobacter, Alcaligenes, and Serratia. Fungal isolates included species of Penicillium, Aspergillus, and Cladosporium, while the actinomycetes included species of Rhodococcus, Nocardia, and Streptomyces. Slight fluctuations in pH between 6.5 ± 0.2 and 7.1 ± 0.08 were recorded throughout the process, while the total petroleum hydrocarbon (TPH) content decreased from 89,900 ± 0.03 mg/kg to 425 ± 0.1 mg/kg after twelve weeks of remediation. Polycyclic aromatic hydrocarbon (PAH) levels also decreased with remediation time; naphthalene, fluorene, phenanthrene, anthracene, pyrene, chrysene, and benzo(b)fluoranthene decreased to values < 0.01 after twelve weeks of remediation. Enzyme assays revealed increased dehydrogenase and urease activities with increased remediation time and decreased phenol oxidase activity over the same period. There was a positive linear correlation between the densities of hydrocarbonoclastic microbes and dehydrogenase activity. On the contrary, phenol oxidase and urease activities showed negative correlation with microbial population.
Results of this study confirmed that remediation of oily wastes using soil-poultry dropping amendment can result in eco-friendly O&G E&P wastes. It also indicates that urease and phenol oxidase activities can be reliable indices/tools to monitor PAH levels and rates of petroleum hydrocarbon degradation.
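
The correlation analysis mentioned above pairs microbial densities with enzyme activities over the remediation period. A minimal sketch with fabricated illustrative time series (not the study's measurements):

```python
import numpy as np

# hypothetical weekly measurements over the 12-week remediation
weeks = np.arange(12)
microbes = 1e5 * np.exp(0.35 * weeks)    # hydrocarbonoclastic counts grow
dehydrogenase = 2.0 + 0.8 * weeks        # activity rises with time
phenol_oxidase = 10.0 - 0.6 * weeks      # activity falls with time

# Pearson correlation of microbial density with each enzyme activity
r_dehyd = np.corrcoef(microbes, dehydrogenase)[0, 1]
r_phenol = np.corrcoef(microbes, phenol_oxidase)[0, 1]
print(f"dehydrogenase r = {r_dehyd:.2f}, phenol oxidase r = {r_phenol:.2f}")
```

The positive and negative coefficients mirror the qualitative pattern reported: dehydrogenase tracks microbial growth, while phenol oxidase moves against it.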

Keywords: dehydrogenase activity, oily wastes, remediation, soil-poultry dropping amendment

Procedia PDF Downloads 322
14567 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning

Authors: Tanvi P. Patel, Warish D. Patel

Abstract:

Association rule mining forms the core of data mining and is termed one of its well-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations, or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is imperative to mine patterns and then generate rules from the patterns obtained. For efficient targeted query processing, finding frequent patterns, and itemset mining, an efficient itemset tree structure, the Memory Efficient Itemset Tree (MEIT), can be generated. The memory-efficient itemset tree is efficient for storing itemsets but takes more time compared to the traditional itemset tree. The proposed strategy generates maximal frequent itemsets from the memory-efficient itemset tree by using levelwise pruning. First, pre-pruning of items based on a minimum support count is carried out, followed by itemset tree reconstruction. By keeping only maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced memory-efficient itemset tree approach proposed here helps to optimize main-memory overhead as well as reduce processing time.
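
The levelwise pruning idea -- pre-pruning items below the minimum support count, then retaining only maximal frequent itemsets -- can be illustrated with a plain Apriori-style sketch (it demonstrates the pruning logic only, not the MEIT data structure itself):

```python
def maximal_frequent_itemsets(transactions, min_support):
    """Levelwise mining that returns only maximal frequent itemsets."""
    transactions = [frozenset(t) for t in transactions]

    def support(itemset):
        return sum(itemset <= t for t in transactions)

    # pre-prune: single items below the minimum support count are discarded
    items = sorted({i for t in transactions for i in t})
    level = [frozenset([i]) for i in items
             if support(frozenset([i])) >= min_support]
    all_frequent = set(level)

    # grow itemsets level by level, keeping only frequent candidates
    while level:
        next_level = set()
        for a in level:
            for b in level:
                cand = a | b
                if len(cand) == len(a) + 1 and support(cand) >= min_support:
                    next_level.add(cand)
        all_frequent |= next_level
        level = list(next_level)

    # maximal: no frequent proper superset exists
    return [s for s in all_frequent
            if not any(s < t for t in all_frequent)]

txns = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
print(maximal_frequent_itemsets(txns, min_support=3))
```

Here {a,b,c} has support 2 and is pruned, so only the three frequent pairs survive as maximal itemsets -- far fewer patterns than the full frequent-itemset lattice.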

Keywords: association rule mining, itemset mining, itemset tree, meit, maximal frequent pattern

Procedia PDF Downloads 371
14566 Socio-Technical Systems: Transforming Theory into Practice

Authors: L. Ngowi, N. H. Mvungi

Abstract:

This paper critically examines the evolution of socio-technical systems theory, its practices, and the challenges it faces in system design and development. It examines concepts put forward by researchers focusing on the application of the theory in software engineering. Various methods have been developed that use socio-technical concepts based on systems engineering, without remarkable success. The main constraint is the large amount of data and the inefficient techniques used when applying the concepts in systems engineering to develop time-bound systems within a limited or controlled budget. This paper critically examines each of these methods, highlights bottlenecks, and suggests a way forward. Since socio-technical systems theory only explains what to do, but not how to do it, engineers are not using the concept to save time and costs or to reduce the risks associated with new frameworks. Hence, a new framework, which can be considered a practical approach, is proposed; it borrows concepts from the soft systems method, agile systems development, and object-oriented analysis and design to bridge the gap between theory and practice. The approach will enable the development of systems using socio-technical systems theory and encourage system engineers and software developers to apply it in building worthwhile information systems, avoiding fragilities and hostilities in the work environment.

Keywords: socio-technical systems, human centered design, software engineering, cognitive engineering, soft systems, systems engineering

Procedia PDF Downloads 286
14565 Tourism Potentials of Ikogosi Warm Spring in Nigeria

Authors: A.I. Adeyemo

Abstract:

The Ikogosi warm spring results from complex mechanical and chemical forces that generate internal heat in the rocks, forming warm and cold water at the same geographical location at the same time. From time immemorial, the local community had thought it to be the work of a deity, and they worshipped the spring. This complex phenomenon has been a source of attraction to both local and international tourists over the years. A total of 450 copies of a structured questionnaire were given out, and 500 respondents were interviewed. The results showed that the Ikogosi warm spring impacts the community positively by providing employment to the teeming youths and income to traders: 66% of the respondents confirmed that it increased their income, and transportation business increased by more than 73%. The level of enlightenment and socialization also increased greatly in the community. However, the spring also impacted the community negatively, as it increased crime rates such as stealing, kidnapping, prostitution, and unwanted pregnancy among secondary school girls and other teenagers. Overall, 50% of the respondents reported that tourism at the warm spring results in insecurity in the community. It also increased environmental problems such as noise and waste pollution; the continuous movement on the land results in soil compaction, leading to erosion and leaching, which in turn results in loss of soil fertility. It was concluded that if the potentials of the spring are fully tapped, it will be a good avenue for income generation for the country.

Keywords: community, Ikogosi, revenue, warm spring

Procedia PDF Downloads 159
14564 An Automatic Large Classroom Attendance Conceptual Model Using Face Counting

Authors: Sirajdin Olagoke Adeshina, Haidi Ibrahim, Akeem Salawu

Abstract:

Large lecture theatres cannot be covered by a single camera because of their size, shape, and seating arrangements, although an ordinary classroom can be captured with one camera. Therefore, the design and implementation of a multicamera setup for a large lecture hall were considered. Researchers have emphasized the impact of class attendance on the academic performance of students. However, the traditional method of taking attendance is inadequate, especially for large lecture theatres, because of the student population, the time required, its exhaustiveness, and its susceptibility to manipulation. An automated large-classroom attendance system is, therefore, imperative. The common approach in such systems is face detection and recognition, where known student faces are captured and stored for recognition purposes. This approach requires constant face-database updates due to continual changes in facial features. Alternatively, face counting can be performed by cropping the localized faces in the video or image into a folder and then counting them. This research aims to develop a face-localization-based approach to detect student faces in classroom images captured using a multicamera setup. A selected Haar-like feature cascade face detector, trained with an asymmetric goal to minimize the False Rejection Rate (FRR) relative to the False Acceptance Rate (FAR), was applied on a Raspberry Pi 4B. A relationship between the two factors (FRR and FAR) was established using a constant (λ) as a trade-off between them, for automatic adjustment during training. An evaluation of the proposed approach against conventional AdaBoost on classroom datasets shows an improvement of 8% in TPR (a result of the low FRR) and a 7% reduction in FRR. The average processing speed of the proposed approach was also improved, with an execution time of 1.19 s per image compared to 2.38 s for the improved AdaBoost.
Consequently, the proposed approach achieved 97% TPR with an overhead constraint time of 22.9 s, compared to 46.7 s for the improved AdaBoost, when evaluated on images obtained from a large lecture hall (DK5) at USM.
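
The λ trade-off between FRR and FAR described above can be illustrated as a threshold-selection problem: choose the detector threshold minimizing FRR + λ·FAR. A sketch on synthetic detector scores (the score distributions and λ values are assumptions for illustration, not the trained cascade):

```python
import numpy as np

def pick_threshold(face_scores, nonface_scores, lam):
    """Choose a detector threshold minimizing FRR + lam * FAR.

    lam > 1 penalizes false acceptances; lam < 1 favours a low FRR,
    mirroring the asymmetric goal described above (a sketch of the
    trade-off, not the cascade-training procedure itself).
    """
    best_t, best_cost = None, np.inf
    for t in np.linspace(0.0, 1.0, 101):
        frr = np.mean(face_scores < t)       # true faces rejected
        far = np.mean(nonface_scores >= t)   # non-faces accepted
        cost = frr + lam * far
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

rng = np.random.default_rng(0)
faces = rng.normal(0.7, 0.1, 1000)      # synthetic detector scores
nonfaces = rng.normal(0.3, 0.1, 1000)

t_low_frr = pick_threshold(faces, nonfaces, lam=0.2)  # FRR-focused
t_low_far = pick_threshold(faces, nonfaces, lam=5.0)  # FAR-focused
print(t_low_frr, t_low_far)
```

Lowering λ pushes the chosen threshold down, accepting more false positives in exchange for fewer rejected faces, which is the asymmetry the paper exploits.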

Keywords: automatic attendance, face detection, haar-like cascade, manual attendance

Procedia PDF Downloads 72
14563 Filtering Momentum Life Cycles, Price Acceleration Signals and Trend Reversals for Stocks, Credit Derivatives and Bonds

Authors: Periklis Brakatsoulas

Abstract:

Recent empirical research shows a growing interest in investment decision-making under market anomalies that contradict the rational paradigm. Momentum is undoubtedly one of the most robust anomalies in empirical asset pricing research and has remained surprisingly lucrative ever since it was first documented. Although predominantly identified in equities, momentum premia are now evident across various asset classes. Yet few attempts have been made so far to provide traders with a diversified portfolio of strategies across different assets and markets. Moreover, the literature focuses on patterns in past returns rather than on mechanisms that signal future price directions prior to momentum runs. The aim of this paper is to develop a diversified portfolio approach to price-distortion signals using daily position data on stocks, credit derivatives, and bonds. An algorithm allocates assets periodically, and new investment tactics take over upon price momentum signals and across different ranking groups. We focus on momentum life cycles, trend reversals, and price acceleration signals. The main effort concentrates on the density, time span, and maturity of momentum phenomena in order to identify consistent patterns over time and to measure the predictive power of the buy-sell signals generated by these anomalies. To tackle this, we propose a two-stage modelling process. First, we generate forecasts of the core macroeconomic drivers. Secondly, satellite models generate market risk forecasts using the core driver projections from the first stage as input. Moreover, using a combination of ARFIMA and FIGARCH models, we examine the dependence of consecutive observations across time and portfolio assets, since long-memory behavior in the volatilities of one market appears to trigger persistent volatility patterns across other markets.
We believe that this is the first work that employs evidence of volatility transmissions among derivatives, equities, and bonds to identify momentum life cycle patterns.
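
The cross-sectional ranking that underlies such momentum strategies can be sketched compactly: rank assets by past return over a lookback window, skipping the most recent period (the short-term reversal). The synthetic prices and parameter choices below are illustrative assumptions, not the paper's models:

```python
import numpy as np

def momentum_signal(prices, lookback=12, skip=1):
    """Classic momentum signal: past return over `lookback` periods,
    skipping the most recent `skip` periods (short-term reversal)."""
    past = prices[-(lookback + skip)]
    recent = prices[-(skip + 1)]
    return recent / past - 1.0

rng = np.random.default_rng(3)
# synthetic price paths for a small cross-asset universe
n_assets, n_periods = 10, 24
drifts = rng.normal(0.0, 0.01, n_assets)
rets = drifts[:, None] + rng.normal(0, 0.02, (n_assets, n_periods))
prices = 100 * np.cumprod(1 + rets, axis=1)

signals = np.array([momentum_signal(p) for p in prices])
ranks = np.argsort(signals)
long_leg, short_leg = ranks[-3:], ranks[:3]   # long winners, short losers
print(long_leg, short_leg)
```

Ranking-group strategies of the kind the paper studies then rebalance these legs periodically as fresh signals arrive.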

Keywords: forecasting, long memory, momentum, returns

Procedia PDF Downloads 102
14562 The Droplet Generation and Flow in the T-Shape Microchannel with the Side Wall Fluctuation

Authors: Yan Pang, Xiang Wang, Zhaomiao Liu

Abstract:

Droplet microfluidics, in which nanoliter to picoliter droplets act as individual compartments, is common to a diverse array of applications such as analytical chemistry, tissue engineering, microbiology, and drug discovery. Droplet generation in a simplified two-dimensional T-shape microchannel, with a main channel width of 50 μm and a side channel width of 25 μm, is simulated to investigate the effects of forced fluctuation of the side wall on droplet generation and flow. Periodic fluctuations are applied to a length of the side wall in the main channel of the T-junction, with the deformation shape of a double-clamped beam under a uniform load, varying with the flow time and with the fluctuation period, form, and position. Under most conditions, the fluctuations expand the distribution range of the droplet size but have little effect on the average size, while the shape of the fixed side wall chiefly changes the average droplet size. Droplet sizes show a periodic pattern over relative time when the fluctuation is forced on the side wall near the T-junction. The droplet emergence frequency is not altered by the fluctuation of the side wall under the same flow rate and geometry conditions. When the fluctuation period is similar to the droplet emergence period, the droplet size shows stability comparable to the no-fluctuation case.
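
The imposed wall shape is that of a double-clamped (fixed-fixed) beam under a uniform load, whose standard deflection profile is w(x) = q·x²·(L−x)²/(24·EI). A small sketch of that profile (the load and flexural rigidity values are arbitrary placeholders, not the simulation's parameters):

```python
import numpy as np

def clamped_beam_deflection(x, L, q, EI):
    """Deflection of a beam clamped at both ends under a uniform load q:
        w(x) = q * x**2 * (L - x)**2 / (24 * EI)
    -- the wall-deformation shape imposed in the simulations."""
    return q * x**2 * (L - x) ** 2 / (24 * EI)

L = 50e-6            # m, length of the fluctuating wall segment (assumed)
q, EI = 1.0, 1e-12   # load and flexural rigidity (placeholders)
x = np.linspace(0.0, L, 201)
w = clamped_beam_deflection(x, L, q, EI)

# deflection vanishes at both clamped ends and peaks at midspan,
# where w_max = q * L**4 / (384 * EI)
assert np.isclose(w[0], 0.0) and np.isclose(w[-1], 0.0)
assert np.isclose(w.max(), q * L**4 / (384 * EI))
```

Scaling this profile sinusoidally in time reproduces the periodic wall fluctuation applied in the study.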

Keywords: droplet generation, droplet size, flow flied, forced fluctuation

Procedia PDF Downloads 282
14561 Exploring Fertility Dynamics in the MENA Region: Distribution, Determinants, and Temporal Trends

Authors: Dena Alhaloul

Abstract:

The Middle East and North Africa (MENA) region is characterized by diverse cultures, economies, and social structures. Fertility rates in MENA have seen significant changes over time, with variations among countries and subregions. Understanding fertility patterns in this region is essential because of their impact on demographic dynamics, healthcare, labor markets, and social policies. Rising or declining fertility rates have far-reaching consequences for the region's socioeconomic development. The main thrust of this study is to comprehensively examine fertility rates in the MENA region: it aims to understand their distribution, determinants, and temporal trends, to provide insights into the factors influencing fertility decisions, to assess how fertility rates have evolved over time, and potentially to develop statistical models characterizing these trends. Methodologically, the study uses descriptive statistics to summarize and visualize fertility rate data, regression analyses to identify determinants of fertility rates, and statistical modeling to characterize temporal trends. The research will contribute to a deeper understanding of fertility dynamics in the MENA region, shedding light on the distribution of fertility rates, their determinants, and historical trends.
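
The regression step of the methodology can be sketched with ordinary least squares on hypothetical covariates (the variable names, ranges, and coefficients below are illustrative assumptions, not MENA data):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
# hypothetical country-year covariates
female_educ = rng.uniform(2, 14, n)   # years of schooling
gdp_pc = rng.uniform(1, 60, n)        # GDP per capita, thousands USD
# assumed data-generating process: fertility falls with both covariates
tfr = 6.0 - 0.25 * female_educ - 0.02 * gdp_pc + rng.normal(0, 0.3, n)

# OLS: total fertility rate regressed on the two determinants
X = np.column_stack([np.ones(n), female_educ, gdp_pc])
beta, *_ = np.linalg.lstsq(X, tfr, rcond=None)
print(beta)
```

With real data, the signs and magnitudes of the fitted coefficients would quantify which determinants drive fertility differences across the region.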

Keywords: fertility, distribution, modeling, regression

Procedia PDF Downloads 81
14560 Effect of Nitriding and Shot Peening on Corrosion Behavior and Surface Properties of Austenite Stainless Steel 316L

Authors: Khiaira S. Hassan, Abbas S. Alwan, Muna K. Abbass

Abstract:

This research aims to study the effect of liquid nitriding and shot peening on the hardness, surface roughness, residual stress, microstructure and corrosion behavior of austenitic stainless steel 316L. Chemical surface heat treatment by liquid nitriding was carried out at 500 °C for 1 h and followed by shot peening with steel balls of 1.25 mm diameter for exposure times of 10 and 20 min. An electrochemical corrosion test was applied in sea water (3.5% NaCl solution) using a potentiostat. The results showed that the nitrided layer consists of a compound layer (white layer) and a diffusion zone immediately below it. The mechanical treatment (shot peening) led to the formation of compressive residual stresses in the surface layer, which increased the hardness of the stainless steel surface. All surface treatments (nitriding and shot peening) led to the formation of chromium nitride (CrN) in a hard surface layer. Both processes caused an increase in surface hardness and roughness, which rises with shot peening time. The corrosion results also showed that the liquid nitriding and shot peening processes increase the corrosion rate to values higher than that of untreated stainless steel.

Keywords: stainless steel 316L, shot peening, nitriding, corrosion, hardness

Procedia PDF Downloads 468
14559 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler

Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury

Abstract:

An early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is delayed result reporting: typically a few days. This issue is the main objective of the HAMRAD project, which gave rise to a prototype of an autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed via a 50% HPGe detector inside 8.5 cm lead shielding. The spectrometer output signal is then analyzed using DSP electronics and Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters so that the device can run autonomously for several months, depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, where the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be remotely adjusted through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is natural background subtraction. As detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide.
The prototyped device proved able to detect atmospheric contamination at the level of mBq/m³ per 8 h of sampling.
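The idea of an optimal partial spectral sum can be illustrated with a toy search over the sample decay time: a Currie-style detection-limit proxy is minimized over how many early, radon-dominated sub-spectra to discard before summing the rest. The figure of merit and all numbers are assumptions for illustration, not the HAMRAD implementation:

```python
import math

def detection_limit(signal_eff, background):
    """Currie-style detection-limit proxy: smaller is better."""
    return (2.71 + 4.65 * math.sqrt(background)) / signal_eff

def best_start(efficiencies, backgrounds):
    """Choose the delay (start index) after which all remaining sequential
    spectra are summed, minimizing the detection limit; early spectra carry
    a heavy radon-progeny background that decays away over time."""
    n = len(efficiencies)
    best_i, best_ld = 0, float("inf")
    for i in range(n):
        eff = sum(efficiencies[i:])
        bkg = sum(backgrounds[i:])
        ld = detection_limit(eff, bkg)
        if ld < best_ld:
            best_i, best_ld = i, ld
    return best_i, best_ld
```

With a rapidly decaying background, the optimum typically discards the first one or two sub-spectra: the lost signal efficiency is outweighed by the background reduction.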

Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler

Procedia PDF Downloads 150
14558 Optimization of Pregelatinized Taro Boloso-I Starch as a Direct Compression Tablet Excipient

Authors: Tamrat Balcha Balla

Abstract:

Background: Tablets are still the most preferred means of drug delivery, and the search for new and improved direct compression tablet excipients is an area of research focus. Taro Boloso-I is a variety of Colocasia esculenta (L.) Schott yielding 67% more than the other varieties (Godare) in Ethiopia. This study aimed to enhance the flowability of pregelatinized Taro Boloso-I starch while keeping its compressibility and compactibility. Methods: A central composite design was used to optimize two factors, the temperature and duration of pregelatinization, against five responses: angle of repose, Hausner ratio, Kawakita compressibility index, mean yield pressure and tablet breaking force. Results and Discussion: An increase in both temperature and time resulted in a decrease in the angle of repose. Increasing temperature was shown to decrease both the Hausner ratio and the Kawakita compressibility index. The mean yield pressure increased with increasing levels of both temperature and time. The optimized pregelatinized Taro Boloso-I starch showed the desired flow properties and compressibility. Conclusions: Pregelatinized Taro Boloso-I starch can be regarded as a potential direct compression excipient in terms of flowability, compressibility and compactibility.
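A two-factor central composite design of the kind used here can be sketched as follows; the coded levels and the example decoding to real factor values are generic illustrations, not the study's actual temperature and time ranges:

```python
import math

def central_composite_design(alpha=None):
    """Two-factor central composite design in coded units
    (factor 1: pregelatinization temperature, factor 2: duration).
    Factorial corners, axial (star) points at +/- alpha, and a center point."""
    if alpha is None:
        alpha = math.sqrt(2)  # rotatable design for two factors
    corners = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
    axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
    center = [(0, 0)]
    return corners + axial + center

def decode(coded, low, high):
    """Map a coded level back to a real factor value."""
    mid, half = (high + low) / 2, (high - low) / 2
    return mid + coded * half
```

Each design point defines one pregelatinization run; the five responses measured at these points are then fitted with a quadratic response-surface model to locate the optimum.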

Keywords: starch, compression, pregelatinization, Taro Boloso-I

Procedia PDF Downloads 113
14557 Improving the Performance of Road Salt on Anti-Icing

Authors: Mohsen Abotalebi Esfahani, Amin Rahimi

Abstract:

Maintenance and management of road infrastructure is one of the most important and fundamental responsibilities of governments. Several methods have been investigated for many years as preventive measures for the maintenance of asphalt pavements. Using a mixture of salt, sand and gravel is the most common method of deicing, which can have numerous harmful consequences. Icy or snow-covered roads are a major cause of accidents in wet seasons, leading to substantial damage: loss of time and energy, environmental pollution, destruction of buildings, traffic congestion and a higher likelihood of accidents. Accordingly, every year governments incur enormous costs to keep routes safe. In this study, asphalt pavements were examined in terms of compressive strength, tensile strength and resilient modulus of asphalt samples under the influence of magnesium chloride, calcium chloride, sodium chloride, urea and pure water. The results showed that de-icing with the calcium chloride solution and urea had the least negative effect on the laboratory specimens, while de-icing with pure water had the most negative effect. Hence, some simple techniques, new equipment and less use of sand and salt can significantly reduce the risks and harmful effects of excessive use of salt, sand and gravel, while at the same time keeping roads safer.

Keywords: maintenance, sodium chloride, icy road, calcium chloride

Procedia PDF Downloads 284
14556 Persian Words in the Quran and the Reasons for Their Presence

Authors: Fateme Mazbanpoor, Sayed Mohammad Amiri

Abstract:

In this article, we examine the Persian words in the Quran and study the reasons for their presence in this holy book. The writers of this paper extracted about 70 Persian words from the Quran by referring to sources (Al-Alfaz ol-Moarab ol-Farsieh by Edi Shir; Al-Moarab by Al-Javaliqi; Al-Mahzab and Etghan by Suyuti; The Foreign Vocabulary of the Quran by Arthur Jeffery; etc.). Some of these words are: 'Abarigh', 'Estabragh', 'Barzakh', 'Din', 'Zamharir', 'Sondos', 'Sejil', 'Namaregh', 'Fil', etc. These Persian words entered Arabic, and finally the Quran, in two ways: (1) directly from the Persian language, and (2) via other languages. The first way: because of Iranian dominance over Hira, Yemen, and the whole of Oman and Bahrain in the Sasanian period, there were political, religious, linguistic, literary, and trade ties between these Arab territories and Iran, causing the influence of Persian on Arabic and giving way to many Persian loanwords entering Arabic in this period. The second way: since the geographical and trade conditions of the area were dominated by Iran, Hejaz had many deals and trades with Mesopotamia and Yemen. On the other hand, Arabic, which was a relatively young language at that time, was influenced by other Semitic languages in order to expand its vocabulary (Syriac and Aramaic were themselves influenced by the languages of Iran). Consequently, due to the long relationship between Iranians and Arabs, some Persian words took the longer way through Aramaic and Syriac into the Quran.

Keywords: Quran, Persian word, Arabic language, Persian

Procedia PDF Downloads 462
14555 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators

Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros

Abstract:

Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes almost as much time to data quality processes, while a data project without such awareness suffers negative impacts on financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ according to the purpose of each data project. This is especially true for big data projects, which may involve many datasets and stakeholders and take a long time to discuss and define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe overall data quality for each dataset, to allow quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we defined standard data quality expectations. Second, we found indicators that can directly measure the data within datasets. Third, the indicators were aggregated into dimensions using factor analysis. Next, the indicators and dimensions were weighted by the effort required for the data preparation process and by usability. Finally, the dimensions were aggregated into the composite indicator. The results of these analyses showed that: (1) the developed indicators and measurements comprised ten useful indicators;
(2) for the data quality dimensions based on statistical characteristics, the ten indicators could be reduced to four dimensions; and (3) the developed composite indicator, the SDQI, can describe the overall quality of each dataset and can be separated into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall, meaningful description of data quality within datasets. The SDQI can be used to assess all data in a data project, for effort estimation, and for prioritization. The SDQI also works well with Agile methods, by using the SDQI for assessment in the first sprint; after passing the initial evaluation, more specific data quality indicators can be added in the next sprint.
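The final aggregation step can be sketched as a weighted mean of dimension scores followed by banding into the three quality levels. The dimension names, weights, and thresholds below are illustrative assumptions, not the SDQI's actual values:

```python
def composite_index(dimension_scores, weights):
    """Weighted aggregation of dimension scores (each in [0, 1]) into a
    single index; weights need not sum to one."""
    total_w = sum(weights.values())
    return sum(dimension_scores[d] * w for d, w in weights.items()) / total_w

def quality_level(index, good=0.8, acceptable=0.5):
    """Map the index to the three SDQI bands (thresholds are illustrative)."""
    if index >= good:
        return "Good Quality"
    if index >= acceptable:
        return "Acceptable Quality"
    return "Poor Quality"
```

In practice the weights would come from the effort-for-data-preparation and usability weighting described above, rather than being chosen by hand.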

Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis

Procedia PDF Downloads 139
14554 Advances in Machine Learning and Deep Learning Techniques for Image Classification and Clustering

Authors: R. Nandhini, Gaurab Mudbhari

Abstract:

Ranging from health care to self-driving cars, machine learning and deep learning algorithms have revolutionized their fields through the proper utilization of images and visual-oriented data. Segmentation, regression, classification, clustering, dimensionality reduction, etc., are some of the tasks that have helped machine learning and deep learning models become state of the art in fields where images are key datasets. Among these tasks, classification and clustering are essential but difficult because of the intricate, high-dimensional characteristics of image data. This work examines and assesses advanced techniques in supervised classification and unsupervised clustering for image datasets, emphasizing the relative efficiency of Convolutional Neural Networks (CNNs), Vision Transformers (ViTs), Deep Embedded Clustering (DEC), and self-supervised learning approaches. Because of the distinctive structural attributes of images, conventional methods often fail to capture spatial patterns effectively, which has led to models with more advanced architectures and attention mechanisms. For image classification, we investigated both CNNs and ViTs. The CNN, well known for its ability to detect spatial hierarchies, serves as one core model in our study. The ViT serves as the other core model, reflecting a modern classification approach that uses a self-attention mechanism; this makes ViTs more robust, as self-attention allows them to learn global dependencies in images without relying on convolutional layers. This paper evaluates the performance of these two architectures in terms of accuracy, precision, recall, and F1-score across different image datasets, analyzing their suitability for various categories of images.
In the domain of clustering, we assess DEC, Variational Autoencoders (VAEs), and conventional clustering techniques such as k-means applied to embeddings derived from CNN models. DEC, a prominent clustering model, has gained the attention of many ML engineers because of its ability to combine feature learning and clustering into a single framework, with the main goal of improving clustering quality through better feature representation. VAEs, in turn, are well known for using latent embeddings to group similar images without requiring prior labels, via a probabilistic clustering approach.
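The conventional baseline, k-means on CNN-derived embeddings, can be sketched in a few lines. This minimal pure-Python version (plain Lloyd iterations on toy 2-D "embeddings") only illustrates the idea; in practice the embeddings would come from a trained CNN and a library implementation would be used:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal Lloyd's k-means on embedding vectors (lists of floats)."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(points, k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared Euclidean distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # recompute centers; keep the old center if a cluster went empty
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters
```

DEC differs from this baseline precisely in that the embedding is not fixed: the encoder is refined jointly with the cluster assignments.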

Keywords: machine learning, deep learning, image classification, image clustering

Procedia PDF Downloads 10
14553 Synchronous Versus Asynchronous Telecollaboration in Intercultural Communication

Authors: Vita Kalnberzina, Lauren Miller Anderson

Abstract:

The aim of the paper is to report the results of the telecollaboration project carried out between students of the University of Latvia, National Louis University in the US, and Austral University in Chile during an Intercultural Communication course. The objectives of the study are (1) to compare different forms of student telecollaboration and virtual exchange, (2) to collect and analyse student feedback on the telecollaboration project, and (3) to evaluate the products (films) produced during the project. The research methods are as follows: a survey of student feedback after the project, video text analysis of the films produced by the students, and interviews of the students participating in the project. We compare the results of a three-year collaboration project in which we tried out both synchronous and asynchronous collaboration. The variables observed were the impact of different time zones, different language proficiency levels of students, and different curricula developed for collaboration. The main findings suggest that the effort students spend organizing meetings across time zones and getting to know each other diminishes the quality of the product developed and thus reduces the students' feeling of accomplishment. Therefore, we propose that asynchronous collaboration, in which the national teams work on a film project specifically developed by the students of one university for the students of another, ends up with a better quality film, which in turn appeals more to the students of the other university and creates a deeper intercultural bond between the collaborating students.

Keywords: telecollaboration, intercultural communication, synchronous collaboration, asynchronous collaboration

Procedia PDF Downloads 101
14552 Forecasting Future Demand for Energy Efficient Vehicles: A Review of Methodological Approaches

Authors: Dimitrios I. Tselentis, Simon P. Washington

Abstract:

Considerable literature has focused over the last few decades on forecasting consumer demand for Energy Efficient Vehicles (EEVs). The methodological issues range from how to capture recent purchase decisions in revealed preference (RP) studies, to how to set up experiments in stated preference (SP) studies, to the choice of method for analyzing such data. This paper reviews the plethora of published studies on forecasting demand for EEVs since 1980 and provides a review and annotated bibliography of that literature as it pertains to this particular demand forecasting problem. This detailed review addresses not only the transportation literature but specifically the problem of, and methodologies for, forecasting over the time horizons of planning studies, which may represent 10 to 20 year forecasts. The objectives of the paper are to identify gaps in the existing literature and to articulate which promising methodologies might guide longer-term forecasting. One key finding of this review is that many techniques are common to both the field of new product demand forecasting and the field of predicting future demand for EEVs. Apart from SP and RP methods, some of the newer techniques that have emerged in the literature in the last few decades are survey-related approaches, product diffusion models, time-series modelling, computational intelligence models and other holistic approaches.
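Among the product diffusion models mentioned, the Bass model is the standard example. A discrete-time sketch of cumulative adoption, with illustrative (not estimated) coefficients for a hypothetical EEV market:

```python
def bass_forecast(m, p, q, periods):
    """Discrete-time Bass diffusion: m = market potential, p = coefficient of
    innovation, q = coefficient of imitation.  Returns cumulative adopters,
    one entry per period, starting from zero."""
    cumulative = [0.0]
    for _ in range(periods):
        n = cumulative[-1]
        adopters = (p + q * n / m) * (m - n)  # new adopters this period
        cumulative.append(n + adopters)
    return cumulative

# Illustrative run: market of 1000 vehicles, typical textbook-scale p and q.
curve = bass_forecast(1000.0, 0.03, 0.4, 30)
```

The resulting S-shaped curve (slow start, imitation-driven acceleration, saturation near the market potential) is what makes diffusion models attractive for the 10 to 20 year horizons discussed above.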

Keywords: demand forecasting, Energy Efficient Vehicles (EEVs), forecasting methodologies review, methodological approaches

Procedia PDF Downloads 489
14551 Behavioral Responses of Coccinella septempunctata and Diaeretiella rapae toward Semiochemicals and Plant Extract

Authors: Muhammad Tariq, Bushra Siddique, Muhammad Naeem, Asim Gulzar

Abstract:

The chemical ecology of natural enemies can play a pivotal role in any Integrated Pest Management (IPM) program. Different chemical cues help to mediate the diversity of associations between prey and host plant species. Coccinella septempunctata and Diaeretiella rapae are able to exploit several chemical cues released by plants under herbivore attack, which may enhance their foraging efficiency. In this study, the behavioral responses of C. septempunctata and D. rapae were examined under the application of two semiochemicals and a plant extract, and their combinations, using a four-arm olfactometer. The bioassay consisted of pairwise treatment comparisons. Data on the preference of C. septempunctata and D. rapae after treatment application were recorded and analyzed statistically. The mean number of entries and the time spent by C. septempunctata and D. rapae were greater in arms treated with E-β-farnesene. However, the efficacy of E-β-farnesene was enhanced when combined with β-pinene: the mean number of entries and time spent by C. septempunctata and D. rapae were highest in arms treated with the combination of E-β-farnesene and β-pinene compared with the other treatments. The current work has demonstrated that insect-derived semiochemicals may enhance the efficacy of natural enemies when applied in combination.

Keywords: olfactometer, parasitoid, predator, preference

Procedia PDF Downloads 145
14550 Verification of Simulated Accumulated Precipitation

Authors: Nato Kutaladze, George Mikuchadze, Giorgi Sokhadze

Abstract:

Precipitation forecasts are one of the most demanding applications of numerical weather prediction (NWP). Georgia, like the whole Caucasus region, is characterized by very complex topography. As the country's territory is prone to flash floods and mudflows, quantitative precipitation estimation (QPE) and quantitative precipitation forecasting (QPF) at any lead time are very important for Georgia. In this study, the QPF skill of the Advanced Research version of the Weather Research and Forecasting (WRF) model is investigated over Georgia's territory. We have analyzed several combinations of convection parameterizations and microphysical schemes for different rainy episodes and heavy rain events. We estimate errors and biases in accumulated 6 h precipitation at different spatial resolutions by verifying model performance for 12-hour and 24-hour lead times against corresponding rain gauge observations and satellite data. Various statistical parameters have been calculated for the 8-month comparison period, and the skill of the model simulations has been evaluated. Our focus is on the formation and organization of convective precipitation systems in a low-mountain region. Several problems in connection with QPF have been identified for mountain regions, including the overestimation and underestimation of precipitation on the windward and lee sides of the mountains, respectively, and a phase error in the diurnal cycle of precipitation leading to the onset of convective precipitation in model forecasts several hours too early.
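Dichotomous verification scores of the kind used in such studies (frequency bias, probability of detection, false alarm ratio) come from a 2x2 contingency table over a precipitation threshold; the sample values in the check below are invented:

```python
def contingency_scores(forecast, observed, threshold):
    """2x2 contingency-table verification of accumulated precipitation against
    a threshold: returns (frequency bias, probability of detection POD,
    false alarm ratio FAR)."""
    hits = misses = false_alarms = 0
    for f, o in zip(forecast, observed):
        fe, oe = f >= threshold, o >= threshold  # forecast event, observed event
        if fe and oe:
            hits += 1
        elif oe:
            misses += 1
        elif fe:
            false_alarms += 1
    events = hits + misses
    bias = (hits + false_alarms) / events if events else float("nan")
    pod = hits / events if events else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return bias, pod, far
```

A bias above one indicates the model forecasts the event too often, which is how windward-side overestimation would show up in these scores.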

Keywords: extremal dependence index, false alarm, numerical weather prediction, quantitative precipitation forecasting

Procedia PDF Downloads 147
14549 Research Regarding Resistance Characteristics of Biscuits Assortment Using Cone Penetrometer

Authors: G.–A. Constantin, G. Voicu, E.–M. Stefan, P. Tudor, G. Paraschiv, M.–G. Munteanu

Abstract:

In the handling and transport of food products, the products may be subjected to mechanical stresses that can lead to their deterioration by deformation, breaking, or crushing. This is the case for biscuits, regardless of their type (gluten-free or sugary), the added ingredients or the flour from which they are made. However, gluten-free biscuits have a higher mechanical resistance to breakage or crushing than easily shattered sugary biscuits (especially those for children). The paper presents the results of an experimental evaluation of the texture of four varieties of commercial biscuits, using a penetrometer equipped with a needle cone at five different additional weights on the cone rod. The biscuit assortments tested in the laboratory were Petit Beurre, Picnic, and Maia (all three manufactured by RoStar, Romania) and Sultani diet biscuits, manufactured by Eti Burcak Sultani (Turkey, in packs of 138 g). For the four varieties of biscuits and the five additional weights (50, 77, 100, 150 and 177 g), the experimental data obtained were subjected to regression analysis in MS Office Excel, using Velon's relationship (h = a∙ln(t) + b). The regression curves were analysed comparatively to identify possible differences and to highlight the variation of the penetration depth h in relation to the time t. Based on the penetration depth between successive time intervals (every 5 seconds), curves of the variation of penetration speed in relation to time were then drawn. It was found that Velon's law fits the experimental data for all biscuit assortments and all five additional weights. The coefficient of determination R² exceeded 0.850 in most of the analysed cases. For Petit Beurre biscuits, the recorded penetration depths generally fell within 45-55 p.u. (penetrometric units) at an additional mass of 50 g and between 155-168 p.u. at an additional mass of 177 g.
For Sultani diet biscuits, the penetration depths were within 32-35 p.u. at an additional weight of 50 g and between 80-114 p.u. at an additional weight of 177 g. The data presented in the paper can be used both by operators on the manufacturing technology flow and by traders of these food products, in order to establish the most efficient parameters of the working regimes (for packaging and handling).
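The regression against Velon's relationship h = a·ln(t) + b reduces to ordinary least squares on ln(t); a sketch, checked below against synthetic (not experimental) data:

```python
import math

def fit_log_law(times, depths):
    """Least-squares fit of Velon's relationship h = a*ln(t) + b.
    Returns (a, b)."""
    x = [math.log(t) for t in times]
    n = len(x)
    mx, my = sum(x) / n, sum(depths) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, depths))
    a = sxy / sxx
    b = my - a * mx
    return a, b
```

Differentiating the fitted law also gives the penetration speed dh/dt = a/t, consistent with the speed-versus-time curves described above.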

Keywords: biscuits resistance/texture, penetration depth, penetration velocity, sharp pin penetrometer

Procedia PDF Downloads 130
14548 Comparison of Seismic Response for Two RC Curved Bridges with Different Column Shapes

Authors: Nina N. Serdar, Jelena R. Pejović

Abstract:

This paper presents a seismic risk assessment of two bridge structures, based on the probabilistic performance-based seismic assessment methodology. Both investigated bridges are three-span continuous RC curved bridges differing in column shape. The first bridge (type A) has a wall-type pier, and the second (type B) has a two-column bent with circular columns. The bridges are designed according to the European standards EN 1991-2, EN 1992-1-1 and EN 1998-2. The aim of the analysis is to compare the seismic behavior of these two structures and to detect the influence of column shape on the seismic response. The seismic risk assessment is carried out by obtaining demand fragility curves. A non-linear model was constructed, and time-history analysis was performed using thirty-five pairs of horizontal ground motions selected to match the site-specific hazard. In the performance-based analysis, the peak column drift ratio (CDR) was selected as the engineering demand parameter (EDP), and spectral displacement was selected as the seismic intensity measure (IM). Demand fragility curves, which give the probability of exceeding a certain value of the chosen EDP, were constructed, and conclusions were drawn from them.
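A demand fragility curve is commonly represented as a lognormal CDF in the intensity measure; a sketch, where the median and dispersion are placeholder values rather than the study's fitted parameters:

```python
import math

def lognormal_fragility(im, median, beta):
    """Probability that the demand (e.g. peak column drift ratio) exceeds a
    given limit at intensity measure `im`, with lognormal median `median`
    and logarithmic dispersion `beta`: P = Phi(ln(im / median) / beta)."""
    z = math.log(im / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

In a study like this, the median and dispersion would be estimated from the thirty-five pairs of time-history results, one fragility curve per CDR limit, allowing the type A and type B bridges to be compared curve against curve.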

Keywords: RC curved bridge, demand fragility curve, wall type column, nonlinear time-history analysis, circular column

Procedia PDF Downloads 341
14547 Predictability of Kiremt Rainfall Variability over the Northern Highlands of Ethiopia on Dekadal and Monthly Time Scales Using Global Sea Surface Temperature

Authors: Kibrom Hadush

Abstract:

Countries like Ethiopia, whose economy is mainly rain-fed, agriculture-dependent, are highly vulnerable to climate variability and weather extremes. Sub-seasonal (monthly) and dekadal forecasts are hence critical for crop production and water resource management. This paper, therefore, studies the predictability and variability of Kiremt rainfall over the northern half of Ethiopia on monthly and dekadal time scales in association with global sea surface temperature (SST) at different lag times. Trends in rainfall have been analyzed on annual, seasonal (Kiremt), monthly, and dekadal (June-September) time scales based on rainfall records of 36 meteorological stations distributed across four homogeneous zones of the northern half of Ethiopia for the period 1992-2017. The results from the progressive Mann-Kendall trend test and Sen's slope method show that there is no significant trend in the annual, Kiremt, monthly and dekadal rainfall totals at most of the stations studied. Moreover, rainfall in the study area varies spatially and temporally, and the rainfall amount increases from the northeastern Rift Valley to the northwestern highlands. Graphical correlation and a multiple linear regression model are employed to investigate the association between global SSTs and Kiremt rainfall over the homogeneous rainfall zones and to predict monthly and dekadal (June-September) rainfall using SST predictors. The results show that, in general, SST in the equatorial Pacific Ocean is the main source of predictive skill for Kiremt rainfall variability over the northern half of Ethiopia; the regional SSTs in the Atlantic and Indian Oceans contribute as well.
Moreover, the correlation analysis showed that the decline of monthly and dekadal Kiremt rainfall over most of the homogeneous zones of the study area is linked to the corresponding persistent warming of the SST in the eastern and central equatorial Pacific Ocean during 1992-2017. It is also found that monthly and dekadal Kiremt rainfall over the northern and northwestern highlands and the northeastern lowlands of Ethiopia is positively correlated with SST in the western equatorial Pacific and the eastern tropical North Atlantic. Furthermore, the SSTs in the western equatorial Pacific and Indian Oceans are positively correlated with Kiremt season rainfall in the northeastern highlands. Overall, the prediction models using combined SSTs from various ocean regions (equatorial and tropical) performed reasonably well (with R² ranging from 30% to 65%) in predicting monthly and dekadal rainfall, and they are recommended for efficient prediction of Kiremt rainfall over the study area to aid systematic and informed decision-making within the agricultural sector.
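The multiple linear regression step can be sketched with a small normal-equations solver; the SST "indices" in the check below are synthetic numbers, not the study's predictors:

```python
def fit_mlr(X, y):
    """Multiple linear regression via the normal equations, solved with
    Gaussian elimination and partial pivoting.  X holds rows of predictor
    values (e.g. SST anomaly indices for two ocean regions); the returned
    coefficients are [intercept, b1, b2, ...]."""
    rows = [[1.0] + list(r) for r in X]          # prepend intercept column
    p = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for i in range(p):                            # forward elimination
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * p
    for i in reversed(range(p)):                  # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, p))) / A[i][i]
    return coef
```

Once fitted on lagged SST indices, the same coefficients predict the upcoming monthly or dekadal Kiremt rainfall from the most recent SST observations.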

Keywords: dekadal, Kiremt rainfall, monthly, Northern Ethiopia, sea surface temperature

Procedia PDF Downloads 141
14546 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution

Authors: Saleem Z. Ramadan

Abstract:

This paper discusses the effects of using progressive Type-I right censoring on the design of a simple step-stress accelerated life test, using a Bayesian approach for Weibull life products under the assumption of the cumulative exposure model. The optimization criterion is to minimize the expected pre-posterior variance of the Pth percentile of the time to failure. The model variables are the stress changing time and the stress value of the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results show that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test without greatly affecting its precision. The results also show that the choice of direct or indirect priors affects the precision of the test.
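Under the cumulative exposure model, a simple (two-step) step-stress test with a common Weibull shape yields a piecewise CDF that is continuous at the stress-change time. A sketch, with illustrative parameter values rather than ones from the paper:

```python
import math

def step_stress_cdf(t, tau, eta1, eta2, beta):
    """CDF of time to failure in a simple step-stress test under the
    cumulative exposure model with a common Weibull shape `beta`.
    Stress changes at `tau`; the Weibull scales at the low and high stress
    levels are eta1 and eta2, respectively."""
    if t <= tau:
        return 1.0 - math.exp(-((t / eta1) ** beta))
    s = tau * eta2 / eta1  # equivalent exposure time already accumulated at step 2
    return 1.0 - math.exp(-(((t - tau + s) / eta2) ** beta))
```

The equivalent time s is chosen so that both branches agree at t = tau, which is exactly the cumulative exposure assumption: units carry over the damage accumulated at the first stress level.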

Keywords: reliability, accelerated life testing, cumulative exposure model, Bayesian estimation, progressive type-I censoring, Weibull distribution

Procedia PDF Downloads 505