Search results for: total floating time (TFT)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24699

17769 Graphic Calculator Effectiveness in Biology Teaching and Learning

Authors: Nik Azmah Nik Yusuff, Faridah Hassan Basri, Rosnidar Mansor

Abstract:

The purpose of the study is to determine the effectiveness of using graphic calculators (GC) with Calculator Based Laboratory 2 (CBL2) in the teaching and learning of form four biology for the topics Nutrition, Respiration, and Dynamic Ecosystem. Sixty form four science stream students participated and were divided equally into treatment and control groups. The treatment group used GC with CBL2 during experiments, while the control group used conventional laboratory apparatus without GC with CBL2. The instruments were a pre-test, a post-test, and a questionnaire. A t-test was used to compare the students' biology achievement, while descriptive statistics were used to analyze the questionnaire outcomes. The findings indicated that the use of GC with CBL2 in biology had a significant positive effect. The highest mean was 4.43 for the item stating that GC with CBL2 saved time in collecting experimental results; the second highest mean was 4.10 for the item stating that it saved time in drawing and labelling graphs. The questionnaire outcomes also showed that GC with CBL2 was easy to use and saved time. Thus, teachers should use GC with CBL2 in support of the Malaysian Ministry of Education's efforts to encourage technology-enhanced lessons.
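The group comparison described above can be sketched as a standard independent-samples t-test. The scores below are hypothetical illustrations, not the study's data, and the helper assumes roughly equal group variances (pooled variance):

```python
import math
from statistics import mean, stdev

def independent_t(sample_a, sample_b):
    """Student's t statistic for two independent samples (pooled variance)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    # Pooled variance assumes the two groups have similar spread.
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical post-test scores: treatment (GC with CBL2) vs. control.
treatment = [78, 82, 75, 88, 90, 73]
control = [65, 70, 62, 74, 68, 71]
t = independent_t(treatment, control)
```

In practice the resulting t statistic is compared against a t distribution with na + nb - 2 degrees of freedom to obtain a p-value.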

Keywords: biology experiments, Calculator-Based Laboratory 2 (CBL2), graphic calculators, Malaysia Secondary School, teaching/learning

Procedia PDF Downloads 388
17768 Plotting of an Ideal Logic versus Resource Outflow Graph through Response Analysis on a Strategic Management Case Study Based Questionnaire

Authors: Vinay A. Sharma, Shiva Prasad H. C.

Abstract:

The initial stages of any project are often observed under a mixed set of conditions. Setting up the project is a tough task, but taking the initial decisions is not especially complex, as some of the critical factors are yet to be introduced into the scenario. These simple initial decisions potentially shape the timeline and the subsequent events that might later be plotted on it. Reaching a solution to the problem is the primary objective in the initial stages. Optimization of the solutions can come later; hence, the resources deployed towards attaining the solution are higher than they would be in the optimized versions. A 'logic' that counters the problem is essentially the core of the desired solution. Thus, if the problem is solved, the deployment of resources has led to the required logic being attained. As the project proceeds, the individuals working on it face fresh challenges as a team and become better accustomed to their surroundings. The developed, optimized solutions are then considered for implementation, as the individuals are now experienced, know better the causes and consequences of possible failure, and thus integrate adequate tolerances wherever required. Furthermore, as the team grows in strength, acquires knowledge, and begins to transfer it efficiently, the individuals in charge of the project, along with the managers, focus more on the optimized solutions rather than the traditional ones to minimize the required resources. Hence, as time progresses, the authorities prioritize attainment of the required logic at a lower amount of dedicated resources. For empirical analysis of the stated theory, leaders and key figures in organizations are surveyed for their ideas on the appropriate logic required for tackling a problem. Key pointers spotted in successfully implemented solutions are noted from the analysis of the responses, and a metric for measuring logic is developed.
A graph is plotted with the quantifiable logic on the Y-axis and the resources dedicated to the solutions of various problems on the X-axis. The dedicated resources are plotted over time, and hence the X-axis is also a measure of time. In the initial stages of the project, the graph is rather linear: the required logic is attained, but the consumed resources are also high. With time, the authorities begin focusing on optimized solutions, since the logic attained through them is higher while the resources deployed are comparatively lower. Hence, the difference between consecutively plotted 'resources' reduces, and as a result, the slope of the graph gradually increases. Overall, the graph takes a parabolic shape (beginning at the origin): with each resource investment, ideally, the difference keeps decreasing and the logic attained through the solution keeps increasing. Even if a resource investment is higher, the managers and authorities ideally make sure that it is made on a proportionally higher logic for a larger problem; that is, ideally, the slope of the graph increases with the plotting of each point.
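The idealized curve described here can be sketched numerically: if each resource increment shrinks as solutions get optimized while logic gains stay steady, the slope between consecutive points rises. All numbers below (initial increment, decay rate, logic per step) are illustrative assumptions, not values from the study:

```python
def ideal_curve(steps, first_increment=10.0, decay=0.8, logic_per_step=5.0):
    """Generate (resources, logic) points where each resource increment
    shrinks geometrically while logic grows linearly, so the slope
    (logic gained per unit resource) increases over time."""
    points = [(0.0, 0.0)]
    resources, logic, increment = 0.0, 0.0, first_increment
    for _ in range(steps):
        resources += increment
        logic += logic_per_step
        points.append((resources, logic))
        increment *= decay  # later solutions are optimized: cheaper in resources
    return points

pts = ideal_curve(5)
# Slope between consecutive points = logic gained per unit resource.
slopes = [(y2 - y1) / (x2 - x1) for (x1, y1), (x2, y2) in zip(pts, pts[1:])]
```

The monotonically increasing slopes reproduce the paper's qualitative claim that the curve steepens as the project matures.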

Keywords: decision-making, leadership, logic, strategic management

Procedia PDF Downloads 93
17767 Awareness of 'Psychosocial Restraint': A Proper Caring Attitude and Truly Listening to People with Dementia in Hong Kong's Residential Care Homes

Authors: Kenny Chi Man Chui

Abstract:

Background: In Chinese culture, the traditional equivalent term for the English 'dementia' is chi dai zheng, which, whether translated as 'insanity' or 'idiocy', carries a sharply negative connotation. In fact, even though the traditional name for dementia has evolved, from chi dai zheng to shi zhi zheng, nao tui hua zheng or ren zhi zhang ai zheng, educating the population about more respectful terms for the condition and promoting a positive understanding of people with dementia in society have proven to be time-intensive endeavors. By extension, the use of such terms promotes the perception that people with dementia undergo a 'total loss of self' or experience a 'living death' or 'social death'. Both in Asia and elsewhere, the appropriate nomenclature for dementia remains controversial, and different medical and healthcare professionals in Hong Kong have taken various stances on how to refer to the condition. How, then, does this negative perception affect the interaction between people with dementia and those around them? Methodology: Qualitative research drawing on postmodernism, interpretivism, and Foucauldian theory was adopted as the framework, applying participatory observation, in-depth interviews, and other qualitative methods. First, ten people with dementia (one man and nine women) living in two residential care homes in Hong Kong were interviewed, as were ten members of the care staff, all of whom were women. Next, to coach the staff in understanding the feelings and self-perceptions of people with dementia, two reflective training sessions were provided. Afterward, to assess the impact of the training sessions on the staff, two focus groups were held. Findings: The findings revealed that residents with dementia did not perceive themselves as being 'demented' and were confused by not getting responses from others. The care staff, by contrast, perceived the residents as 'demented', desolate troublemakers.
They described people with dementia as 'naughty children' who should be controlled and punished, while treating them as 'psychiatric patients' who could be ignored and left unheard. 'Psychosocial restraint' arose from this discrepancy of perception between people with dementia and the care staff. People with dementia did not think that their confusion of memory was related to dementia; frankly, they did not know what dementia was. When others treated them as 'demented patients', the residents with mild to moderate dementia fiercely rejected that designation and reported a host of negative feelings, hence the fluctuations of mood and emotion noted by the care staff. Conclusion: As the findings revealed, the people with dementia were also discontent with the care arrangements in the care homes, felt abandoned by others, and worried about bothering others. Their shifting emotional states and moods were treated as Behavioral and Psychological Symptoms of Dementia (BPSD), about which, the care staff in the residential care homes reported, nothing could be done. People with dementia become socially withdrawn or isolated in daily living; social work professionals should be alert to, and act to change, the occurrence of 'psychosocial restraint' in dementia care.

Keywords: psychosocial restraint, qualitative research, social work with dementitude, voice of people with dementia

Procedia PDF Downloads 161
17766 Monte Carlo and Biophysics Analysis in a Criminal Trial

Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano

Abstract:

In this paper, a real court case, held in Italy at the Court of Nola, is considered, in which a correct physical description, conducted with both Monte Carlo and biophysical analyses, would have been sufficient to arrive at conclusions confirmed by documentary evidence. This is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him damages. The damages would have been caused by a piece of external plaster that would have detached from the neighbor's property and hit Mr. DS while he was in his garden, more than a meter away from the facade of the building from which the plaster would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never showed the plaster that had hit him, nor was he able to say from where the plaster would have arrived. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images of Mr. OP's security cameras do not show any movement in the garden of Mr. DS over a long interval of time (about 2 hours) around the time of the alleged accident, nor do they show any people entering or leaving the house of Mr. DS in the same interval. Biophysical analysis shows that both the diagnosis on the medical certificate and the wound declared by the defendant, already in conflict with each other, are not compatible with the fall of external plaster pieces too small to be found. The wind was at level 1 on the Beaufort scale, too weak even to raise dust (which requires level 4 on the Beaufort scale).
Therefore, the motion of the plaster pieces can be described as projectile motion, whereas collisions with the building cornice can be treated using Newton's law of restitution. Numerous Monte Carlo simulations show that the pieces of plaster could not have reached even the garden of Mr. DS, let alone a distance of over 1.30 meters. The results agree with the documentary evidence (the images of Mr. OP's security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP's property.
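A stripped-down version of such a Monte Carlo simulation can be sketched as plain projectile motion. The facade height (8 m) and the maximum sideways speed a light Beaufort level-1 wind could impart to a falling piece (0.5 m/s) are assumptions for illustration, not the trial's actual inputs, and the cornice collisions treated with restitution in the full analysis are omitted here:

```python
import math
import random

random.seed(42)
G = 9.81  # gravitational acceleration, m/s^2

def plaster_range(height_m, max_horizontal_speed):
    """Horizontal distance travelled by a plaster piece detaching from the
    facade at a given height with a random small horizontal speed
    (ballistic motion, air resistance neglected)."""
    v = random.uniform(0.0, max_horizontal_speed)
    t_fall = math.sqrt(2.0 * height_m / G)  # time to fall to the ground
    return v * t_fall

# Assumed scenario: 8 m facade, at most 0.5 m/s sideways speed.
distances = [plaster_range(8.0, 0.5) for _ in range(10_000)]
```

Under these assumptions, no simulated piece lands anywhere near the 1.30 m distance claimed, which is the qualitative shape of the paper's conclusion.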

Keywords: biophysics analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion

Procedia PDF Downloads 116
17765 Current Status of Nitrogen Saturation in the Upper Reaches of the Kanna River, Japan

Authors: Sakura Yoshii, Masakazu Abe, Akihiro Iijima

Abstract:

Nitrogen saturation has become one of the serious issues in the field of forest environment. The watershed protection forests located in the downwind hinterland of the Tokyo Metropolitan Area are believed to be facing nitrogen saturation. In this study, we focus on the balance of nitrogen between load and runoff. The annual nitrogen load via atmospheric deposition was estimated at 461.1 t-N/year in the upper reaches of the Kanna River. The annual nitrogen runoff to the forested headwater stream of the Kanna River was determined to be 184.9 t-N/year, corresponding to 40.1% of the total nitrogen load. A clear seasonal change in NO3-N concentration was still observed. Therefore, the watershed protection forest of the Kanna River is most likely at Stage 1 of nitrogen saturation.
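The load/runoff balance quoted above reduces to simple arithmetic, sketched here with the figures reported in the abstract (the "retained" remainder is an inferred quantity covering forest accumulation and other losses, not a value the abstract states):

```python
# Nitrogen balance figures reported for the upper Kanna River watershed.
load_t_per_year = 461.1    # atmospheric deposition (t-N/year)
runoff_t_per_year = 184.9  # runoff to the headwater stream (t-N/year)

runoff_share = 100.0 * runoff_t_per_year / load_t_per_year  # ~40.1%
retained = load_t_per_year - runoff_t_per_year              # balance of the load
```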

Keywords: atmospheric deposition, nitrogen accumulation, denitrification, forest ecosystems

Procedia PDF Downloads 260
17764 Conceptual Study on 4PL and Activities in Turkey

Authors: Berna Kalkan, Kenan Aydin

Abstract:

Companies give importance to customer satisfaction in order to compete in a developing and changing market. This is possible when the customer receives the right product, at the right quality, place, time, and cost. In this regard, the extension of logistics services has played an active role in the formation and development of different logistics service concepts. The concept of logistics services plays an important role in the improvement of economic indicators today. Companies can use logistics providers and thus gain competitive advantage through lower costs, reduced time, and greater flexibility. In recent years, Fourth Party Logistics (4PL) has emerged as a new concept that encompasses the relationship between suppliers and firms in outsourcing. A 4PL provider is an integrator that offers comprehensive supply chain solutions with the technology, resources, and capabilities that it possesses. 4PL has also attracted attention as a popular research topic in the recent past. In this paper, logistics outsourcing and 4PL concepts are analyzed and a literature review on 4PL activities is given. The previous studies in the literature, and the approaches used in them, are presented through an analysis of 4PL activities. In this context, a field study will be applied to 4PL providers and service buyers in Turkey. If necessary, results related to this study will be shared in scientific venues.

Keywords: fourth party logistics, literature review, outsourcing, supply chain management

Procedia PDF Downloads 169
17763 Environmental Impact of Gas Field Decommissioning

Authors: Muhammad Ahsan

Abstract:

The effective decommissioning of oil and gas fields and related assets is one of the most important challenges facing the oil and gas industry today and in the future. Decommissioning decisions can no longer be avoided by the operators or the industry as a whole. Decommissioning yields no return on investment and carries significant regulatory liabilities. The main objective of this paper is to provide an approach and mechanism for estimating the emissions associated with the decommissioning of oil and gas fields. The model uses a gate-to-gate approach and considers field life from the development phase up to the asset's end of life. The model incorporates decommissioning processes, including well plugging; plant dismantling; wellhead and pipeline dismantling; cutting and temporary fabrication; new manufacturing from raw material; and recycling of metals. The resulting GHG emissions during the decommissioning phase are 2.31×10⁻² kg CO2-eq per Mcf of produced natural gas. Well plugging and abandonment proved to be the most GHG-emitting activity, with 84.7% of total field decommissioning operational emissions.
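The two headline numbers combine as a simple share calculation, sketched below with the values reported in the abstract (the "other activities" remainder is inferred, not stated in the abstract):

```python
total_kg_co2e_per_mcf = 2.31e-2  # reported decommissioning-phase emissions per Mcf
well_pa_share = 0.847            # well plug & abandonment fraction of operational emissions

well_pa_kg_co2e_per_mcf = total_kg_co2e_per_mcf * well_pa_share
other_activities = total_kg_co2e_per_mcf * (1.0 - well_pa_share)
```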

Keywords: LCA (life cycle analysis), gas field, decommissioning, emissions

Procedia PDF Downloads 176
17762 Analysis of Brain Specific Creatine Kinase of Postmortem Cerebrospinal Fluid and Serum in Blunt Head Trauma Cases

Authors: Rika Susanti, Eryati Darwin, Dedi Afandi, Yanwirasti, Syahruddin Said, Noverika Windasari, Zelly Dia Rofinda

Abstract:

Introduction: Blunt head trauma is one of the leading causes of death associated with murders and other deaths involved in criminal acts. Brain-specific creatine kinase (CKBB) levels have been used as a biomarker for blunt head trauma and are therefore now used as an alternative to an autopsy. The aim of this study is to investigate CKBB levels in cerebrospinal fluid (CSF) and post-mortem serum in order to deduce the cause and time of death. Method: This investigation was conducted through a post-test-only group design involving deaths caused by blunt head trauma, which were compared to deaths caused by ketamine poisoning. There were eight treatment groups, each consisting of six adult rats (Rattus norvegicus) of the Sprague-Dawley strain. Examinations were done at 0, 1, 2, and 3 hours post-mortem, followed by brain tissue observation. Data were then analyzed statistically with a repeated-measures general linear model. Results and Conclusion: There were increases in the level of CKBB in CSF and post-mortem serum in both the blunt head trauma and ketamine poisoning treatment groups. However, there were no significant differences between these two groups.

Keywords: blunt head trauma, CKBB, the cause of death, estimated time of death

Procedia PDF Downloads 180
17761 Evaluating the Success of an Intervention Course in a South African Engineering Programme

Authors: Alessandra Chiara Maraschin, Estelle Trengove

Abstract:

In South Africa, only 23% of engineering students attain their degrees in the minimum time of four years. This raises the question: why is the four-year throughput rate so low? Improving the throughput rate is crucial in guiding students along the shortest possible path to completion. The Electrical Engineering programme has a fixed curriculum, and students must pass all courses in order to graduate. In South Africa, as in several other countries, many students rely on external funding such as bursaries from companies in industry. If students fail a course, they often lose their bursaries, and most might not be able to fund their 'repeating year' fees. It is thus important to improve the throughput rate, since for many students, graduating from university is a way out of poverty for an entire family. In Electrical Engineering, the Software Development I course (an introduction to C++ programming) has been found to be a significant hurdle course with a low pass rate. It is well documented that students struggle with this type of course, as it introduces a number of new threshold concepts that can be challenging to grasp in a short time frame. In an attempt to mitigate this situation, a part-time night school for Software Development I was introduced in 2015 as an intervention measure. The night-school course includes all the material from the Software Development I module and gives students who failed the course in the first semester a second chance to repeat it. The purpose of this study is to determine whether the introduction of this intervention course can be considered a success. The success of the intervention is assessed in two ways. The study first looks at whether the night-school course contributed to improving the pass rate of the Software Development I course.
Secondly, the study examines whether the intervention contributed to improving the overall throughput from the second to the third year of study at a South African university. Second-year academic results for a sample of 1216 students were collected from 2010 to 2017. Preliminary results show that the lowest pass rate for Software Development I was in 2017, at 34.9%. Since the intervention course's inception, the pass rate for Software Development I increased each year from 2015 to 2017, by 13.75%, 25.53% and 25.81% respectively. To conclude, the preliminary results show that the intervention course is a success in improving the pass rate of Software Development I.

Keywords: academic performance, electrical engineering, engineering education, intervention course, low pass rate, software development course, throughput

Procedia PDF Downloads 153
17760 The Association of Vitamin B12 with Body Weight-and Fat-Based Indices in Childhood Obesity

Authors: Mustafa Metin Donma, Orkide Donma

Abstract:

Vitamin deficiencies are common in obese individuals. In particular, the status of vitamin B12 and its association with vitamin B9 (folate) and vitamin D has been under investigation recently. Vitamin B12 is closely related to many vital processes in the body, and in clinical studies its involvement in fat metabolism draws attention from the obesity point of view. Obesity, in its advanced stages and in combination with metabolic syndrome (MetS) findings, may be a life-threatening health problem. Pediatric obesity is particularly important because it may predict severe chronic diseases in adulthood. Due to its role in fat metabolism, vitamin B12 deficiency may disrupt metabolic pathways of lipid and energy metabolism in the body. The association of low B12 levels with the degree of obesity is therefore an interesting topic to investigate, and obesity indices may be helpful at this point. Both weight-based and fat-based indices are available: body mass index (BMI) is in the first group, while fat mass index (FMI), fat-free mass index (FFMI), and the diagnostic obesity notation model assessment-II (D2I) index lie in the latter. The aim of this study is to clarify possible associations between vitamin B12 status and obesity indices in the pediatric population. The study comprises a total of one hundred and twenty-two children. Thirty-two children were included in the normal body mass index (N-BMI) group. Forty-six and forty-four children constitute the groups of morbidly obese children without and with MetS, respectively. Informed consent forms and the approval of the institutional ethics committee were obtained. Tables prepared by the World Health Organization for obesity classification were used. Metabolic syndrome criteria were defined. Anthropometric and blood pressure measurements were taken. BMI, FMI, FFMI, and D2I were calculated. Routine laboratory tests were performed. Vitamin B9, B12, and D concentrations were determined.
Statistical evaluation of the study data was performed. Vitamin B9 and vitamin D levels were reduced in the MetS group compared to children with N-BMI (p>0.05). Significantly lower values were observed in the vitamin B12 concentrations of the MetS group (p<0.01). Upon evaluation of blood pressure as well as triglyceride levels, significant increases were found in morbidly obese children, along with significantly decreased concentrations of high-density lipoprotein cholesterol. All of the obesity indices and the insulin resistance index exhibit an increasing tendency with the severity of obesity. Inverse correlations were calculated between vitamin D and the insulin resistance index, as well as between vitamin B12 and D2I, in the morbidly obese groups. In conclusion, a fat-based index, D2I, was the most prominent body index, showing a strong correlation with vitamin B12 concentrations in the late stage of obesity in children. The negative correlation between these two parameters confirms the association between vitamin B12 and obesity degree.
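Of the indices named above, BMI, FMI, and FFMI have standard definitions (mass divided by height squared), sketched below with a hypothetical child's measurements; the D2I formula is specific to the authors' earlier work and is not reproduced here:

```python
def bmi(weight_kg, height_m):
    """Body mass index: total weight scaled by height squared (kg/m^2)."""
    return weight_kg / height_m ** 2

def fmi(fat_mass_kg, height_m):
    """Fat mass index: fat mass scaled by height squared (kg/m^2)."""
    return fat_mass_kg / height_m ** 2

def ffmi(weight_kg, fat_mass_kg, height_m):
    """Fat-free mass index: non-fat mass scaled by height squared (kg/m^2)."""
    return (weight_kg - fat_mass_kg) / height_m ** 2

# Hypothetical child: 45 kg total weight, 13.5 kg fat mass, 1.40 m tall.
b = bmi(45.0, 1.40)
f = fmi(13.5, 1.40)
ff = ffmi(45.0, 13.5, 1.40)
```

Note that BMI decomposes exactly into FMI + FFMI, which is why the fat-based indices can separate fat accumulation from lean mass in a way BMI alone cannot.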

Keywords: body mass index, children, D2I index, fat mass index, obesity

Procedia PDF Downloads 185
17759 The Effect of Zeolite and Fertilizers on Yield and Qualitative Characteristics of Cabbage in the Southeast of Kazakhstan

Authors: Tursunay Vassilina, Aigerim Shibikeyeva, Adilet Sakhbek

Abstract:

Research has been carried out to study the influence of modified zeolite fertilizers on the quantitative and qualitative indicators of the cabbage variety Nezhenka. The use of zeolite and mineral fertilizers had a positive effect on both the yield and the quality indicators of the studied crop. The maximum increase in yield from fertilizers was 16.5 t/ha. Application of both zeolite and fertilizer increased the dry matter, sugar, and vitamin C content of cabbage heads. It was established that the cabbage contains an amount of nitrates that is safe for human health. Among vegetable crops, cabbage has both food and feed value. One of the limiting factors in the sale of vegetable crops is the degradation of soil fertility due to depletion of nutrient reserves, erosion processes, and non-compliance with fertilizer application technologies. Natural zeolites are used as additives to mineral fertilizers for application in the field, which makes it possible to reduce fertilizer doses to minimal quantities. Zeolites improve the agrophysical and agrochemical properties of the soil and the quality of plant products. The research was carried out in a field experiment with three replications on dark chestnut soil in 2023. The soil (pH = 7.2-7.3) of the experimental plot is dark chestnut; the humus content in the arable layer is 2.15%, total nitrogen 0.098%, and phosphorus and potassium 0.225% and 2.4%, respectively. The object of the study was the late cabbage variety Nezhenka. The fertilizer scheme for cabbage was: 1. Control (without fertilizers); 2. Zeolite, 2 t/ha; 3. N45P45K45; 4. N90P90K90; 5. Zeolite, 2 t/ha + N45P45K45; 6. Zeolite, 2 t/ha + N90P90K90. Yield accounting was carried out manually on a plot-by-plot basis. In plant samples, the following were determined: dry matter content by the thermostatic method (at 105ºC); sugar content by the Bertrand titration method; nitrate content by 1% diphenylamine solution; and vitamin C by the titrimetric method with acid solution.
According to the results, the yield of cabbage was high, 42.2 t/ha, in the treatment Zeolite, 2 t/ha + N90P90K90. When determining the biochemical composition of white cabbage, it was found that the dry matter content was 9.5% and increased in the fertilized treatments. The total sugar content increased slightly with the use of zeolite (5.1%) and modified zeolite fertilizer (5.5%); the vitamin C content ranged from 17.5 to 18.16%, while in the control it was 17.21%. The amount of nitrates in the products increased with increasing doses of nitrogen fertilizers and decreased with the use of zeolite and modified zeolite fertilizer, but did not exceed the maximum permissible concentration. Based on the research conducted, it can be concluded that the application of zeolite and fertilizers leads to a significant increase in yield compared to the unfertilized treatment and contributes to the production of cabbage with good to high quality indicators.

Keywords: cabbage, dry matter, nitrates, total sugar, yield, vitamin C

Procedia PDF Downloads 59
17758 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning

Authors: Tanvi P. Patel, Warish D. Patel

Abstract:

Association rule mining forms the core of data mining and is termed one of its well-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations, or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is imperative to mine patterns and then generate rules from the obtained patterns. For efficient targeted query processing, finding frequent patterns, and itemset mining, an efficient itemset tree structure named the Memory Efficient Itemset Tree (MEIT) can be generated. The MEIT is efficient for storing itemsets but takes more time compared to the traditional itemset tree. The proposed strategy generates maximal frequent itemsets from the memory-efficient itemset tree by using levelwise pruning: first, items are pre-pruned based on a minimum support count, followed by itemset tree reconstruction. With maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced memory-efficient itemset tree approach proposed here helps optimize main memory overhead as well as reduce processing time.
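The core ideas above, levelwise pruning by minimum support count and keeping only maximal frequent itemsets, can be sketched without the tree structure itself. This is a minimal Apriori-style illustration over toy transactions, not the paper's MEIT reconstruction:

```python
def frequent_itemsets(transactions, min_support):
    """Levelwise (Apriori-style) mining: an itemset can be frequent only if
    all of its subsets are, so each level is pruned by support count."""
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    # Level 1: pre-prune single items below the minimum support count.
    items = {i for t in transactions for i in t}
    level = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    frequent = set(level)
    k = 2
    while level:
        # Generate k-sized candidates from surviving (k-1)-sized itemsets.
        candidates = {a | b for a in level for b in level if len(a | b) == k}
        level = [c for c in candidates if support(c) >= min_support]
        frequent.update(level)
        k += 1
    return frequent

def maximal(frequent):
    """Keep only itemsets with no frequent proper superset."""
    return {s for s in frequent if not any(s < t for t in frequent)}

txns = [frozenset(t) for t in (["a", "b", "c"], ["a", "b"], ["a", "c"],
                               ["b", "c"], ["a", "b", "c"])]
freq = frequent_itemsets(txns, min_support=3)
maxi = maximal(freq)
```

As the abstract notes, the maximal set is much smaller than the full frequent set (here 3 itemsets instead of 6), which is the source of the memory and pattern-count savings.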

Keywords: association rule mining, itemset mining, itemset tree, MEIT, maximal frequent pattern

Procedia PDF Downloads 355
17757 A Double Epilayer PSGT Trench Power MOSFETs for Low to Medium Voltage Power Applications

Authors: Alok Kumar Kamal, Vinod Kumar

Abstract:

The trench-gate MOSFET has shown itself to be the most appropriate power device for low-to-medium-voltage power applications due to its lowest possible ON resistance among all power semiconductor devices. In this research work, a double-epilayer PSGT structure is presented, using a thin layer of N+ polysilicon as the gate material. The total ON-state resistance (RON) of the UMOSFET can be reduced by optimizing the epilayer thickness. The optimized double-epilayer structure exhibits a 25.8% reduction in ON-state resistance at Vgs = 5 V and improves the switching characteristics by reducing the reverse transfer capacitance (Cgd) by 7.4%.

Keywords: Miller capacitance, double epilayer, switching characteristics, power trench MOSFET (U-MOSFET), on-state resistance, blocking voltage

Procedia PDF Downloads 57
17756 Socio-Technical Systems: Transforming Theory into Practice

Authors: L. Ngowi, N. H. Mvungi

Abstract:

This paper critically examines the evolution of socio-technical systems theory, its practices, and its challenges in system design and development. It examines concepts put forward by researchers focusing on the application of the theory in software engineering. Various methods using socio-technical concepts based on systems engineering have been developed, without remarkable success. The main constraints are the large amount of data and the inefficient techniques used in applying the concepts in systems engineering to develop time-bound systems within a limited/controlled budget. This paper critically examines each of the methods, highlights bottlenecks, and suggests the way forward. Since socio-technical systems theory only explains what to do, not how to do it, engineers are not using the concept to save time and costs and to reduce the risks associated with new frameworks. Hence, a new framework, which can be considered a practical approach, is proposed that borrows concepts from the soft systems method, agile systems development, and object-oriented analysis and design to bridge the gap between theory and practice. The approach will enable the development of systems using socio-technical systems theory and encourage system engineers and software developers to apply it in building worthwhile information systems, avoiding fragilities and hostilities in the work environment.

Keywords: socio-technical systems, human centered design, software engineering, cognitive engineering, soft systems, systems engineering

Procedia PDF Downloads 270
17755 An Automatic Large Classroom Attendance Conceptual Model Using Face Counting

Authors: Sirajdin Olagoke Adeshina, Haidi Ibrahim, Akeem Salawu

Abstract:

Large lecture theatres cannot be covered by a single camera because of their size, shape, and seating arrangements; an ordinary classroom can be captured with one camera, but a large hall requires a multicamera setup. This study therefore considers the design and implementation of a multicamera setup for a large lecture hall. Researchers have emphasized the impact of class attendance on the academic performance of students. However, the traditional method of taking attendance falls short, especially in large lecture theatres, because of the student population, the time required, the exhaustiveness of the exercise, and its susceptibility to manipulation. An automated large-classroom attendance system is therefore imperative. The common approach in such systems is face detection and recognition, where known student faces are captured and stored for recognition purposes. This approach requires constant face-database updates because facial features change over time. Alternatively, faces can be counted by cropping the localized faces in the video or image into a folder and counting them. This research aims to develop a face-localization-based approach to detect student faces in classroom images captured using a multicamera setup. A selected Haar-like-feature cascade face detector, trained with an asymmetric goal of minimizing the False Rejection Rate (FRR) relative to the False Acceptance Rate (FAR), was deployed on a Raspberry Pi 4B. A relationship between the two factors (FRR and FAR) was established using a constant (λ) as a trade-off between them for automatic adjustment during training. An evaluation of the proposed approach and the conventional AdaBoost on classroom datasets shows an 8% improvement in TPR (the result of a low FRR) and a 7% reduction in FRR. The proposed approach also learned faster, with an execution time of 1.19 s per image compared to 2.38 s for the improved AdaBoost. Consequently, the proposed approach achieved 97% TPR with an overhead constraint time of 22.9 s, compared to 46.7 s for the improved AdaBoost, when evaluated on images obtained from a large lecture hall (DK5) at USM.
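The asymmetric FRR/FAR objective described above can be illustrated with a minimal sketch: for a set of classifier scores, pick the stage threshold that minimizes FRR + λ·FAR, so that λ controls how strongly false acceptances are penalized relative to false rejections. The scores and the value of λ below are invented for illustration, not values from the study.

```python
# Hedged sketch of asymmetric threshold selection: minimize FRR + lam * FAR.
# pos_scores: detector scores on true face windows; neg_scores: on background.
# All numbers here are illustrative assumptions, not the authors' data.

def asymmetric_threshold(pos_scores, neg_scores, lam):
    candidates = sorted(set(pos_scores) | set(neg_scores))
    best = None
    for t in candidates:
        frr = sum(s < t for s in pos_scores) / len(pos_scores)   # faces rejected
        far = sum(s >= t for s in neg_scores) / len(neg_scores)  # non-faces accepted
        cost = frr + lam * far
        if best is None or cost < best[0]:
            best = (cost, t, frr, far)
    return best

pos = [0.9, 0.8, 0.75, 0.6, 0.55]   # scores on face windows
neg = [0.7, 0.4, 0.3, 0.2, 0.1]     # scores on background windows
cost, t, frr, far = asymmetric_threshold(pos, neg, lam=0.5)
```

With a larger λ the chosen threshold rises, trading a higher FRR for a lower FAR, which is the adjustment the constant λ automates during training.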

Keywords: automatic attendance, face detection, Haar-like cascade, manual attendance

Procedia PDF Downloads 60
17754 Filtering Momentum Life Cycles, Price Acceleration Signals and Trend Reversals for Stocks, Credit Derivatives and Bonds

Authors: Periklis Brakatsoulas

Abstract:

Recent empirical research shows growing interest in investment decision-making under market anomalies that contradict the rational paradigm. Momentum is undoubtedly one of the most robust anomalies in empirical asset pricing research and has remained surprisingly lucrative ever since it was first documented. Although predominantly identified in equities, momentum premia are now evident across various asset classes. Yet few attempts have been made so far to provide traders with a diversified portfolio of strategies across different assets and markets. Moreover, the literature focuses on patterns in past returns rather than on mechanisms that signal future price direction before momentum runs. The aim of this paper is to develop a diversified portfolio approach to price-distortion signals using daily position data on stocks, credit derivatives, and bonds. An algorithm allocates assets periodically, and new investment tactics take over upon price momentum signals and across different ranking groups. We focus on momentum life cycles, trend reversals, and price acceleration signals. The main effort concentrates on the density, time span, and maturity of momentum phenomena in order to identify consistent patterns over time and measure the predictive power of the buy-sell signals these anomalies generate. To tackle this, we propose a two-stage modelling process. First, we generate forecasts of core macroeconomic drivers. Second, satellite models generate market risk forecasts using the core-driver projections from the first stage as input. Moreover, using a combination of ARFIMA and FIGARCH models, we examine the dependence of consecutive observations across time and portfolio assets, since long-memory behavior in the volatilities of one market appears to trigger persistent volatility patterns across other markets. We believe this is the first work to employ evidence of volatility transmission among derivatives, equities, and bonds to identify momentum life cycle patterns.
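The periodic ranking step such an allocation algorithm relies on can be sketched in a few lines: rank assets by trailing cumulative return, skipping the most recent period to avoid short-term reversal. This is a generic cross-sectional momentum signal, not the authors' model; the asset names and returns are invented.

```python
# Minimal cross-sectional momentum ranking (illustrative, not the paper's model).
# Signal: cumulative return over `lookback` periods, skipping the last `skip`.

def momentum_signal(returns, lookback=12, skip=1):
    window = returns[-(lookback + skip):-skip] if skip else returns[-lookback:]
    cum = 1.0
    for r in window:
        cum *= 1.0 + r
    return cum - 1.0

assets = {                       # hypothetical monthly return series
    "stock_A": [0.02] * 13,
    "stock_B": [-0.01] * 13,
    "bond_C": [0.005] * 13,
}
ranked = sorted(assets, key=lambda a: momentum_signal(assets[a]), reverse=True)
```

Ranking groups for the rotation strategy would then be formed by slicing `ranked` into quantiles at each rebalancing date.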

Keywords: forecasting, long memory, momentum, returns

Procedia PDF Downloads 90
17753 The Droplet Generation and Flow in the T-Shape Microchannel with the Side Wall Fluctuation

Authors: Yan Pang, Xiang Wang, Zhaomiao Liu

Abstract:

Droplet microfluidics, in which nanoliter to picoliter droplets act as individual compartments, is common to a diverse array of applications such as analytical chemistry, tissue engineering, microbiology, and drug discovery. Droplet generation in a simplified two-dimensional T-shape microchannel, with a main channel width of 50 μm and a side channel width of 25 μm, is simulated to investigate the effects of forced side-wall fluctuation on droplet generation and flow. Periodic fluctuations are applied to a length of the side wall of the main channel at the T-junction, with the deformation shape of a double-clamped beam under a uniform force; the fluctuation varies with flow time and with the fluctuation period, form, and position. Under most conditions the fluctuations widen the distribution of droplet sizes but have little effect on the average size, whereas the shape of the fixed side wall chiefly changes the average droplet size. Droplet sizes show a periodic pattern over relative time when the fluctuation is forced on the side wall near the T-junction. The droplet emergence frequency is not changed by the side-wall fluctuation under the same flow rate and geometry conditions. When the fluctuation period is similar to the droplet emergence period, the droplet size is as stable as in the no-fluctuation case.
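The forced wall shape can be sketched numerically under the stated assumption of a double-clamped (fixed-fixed) beam under a uniform load, whose static deflection is w(x) ∝ x²(L−x)², modulated periodically in time. The channel length L, amplitude, and period below are illustrative assumptions, not the simulation's parameters.

```python
import math

# Hedged sketch: fluctuating wall displacement with the normalized
# fixed-fixed-beam shape x^2 (L - x)^2 / (L/2)^4 (maximum 1 at x = L/2,
# zero at both clamped ends), scaled by a sinusoid in time.
# L, amplitude, and period are invented example values.

def wall_displacement(x, t, L=50e-6, amplitude=5e-6, period=1e-3):
    shape = (x ** 2) * (L - x) ** 2 / (L / 2) ** 4  # 0 at ends, 1 at midpoint
    return amplitude * shape * math.sin(2 * math.pi * t / period)
```

Sampling this function at each time step would give the moving-boundary condition that the fluctuation period, form, and position experiments vary.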

Keywords: droplet generation, droplet size, flow field, forced fluctuation

Procedia PDF Downloads 269
17752 Exploring Fertility Dynamics in the MENA Region: Distribution, Determinants, and Temporal Trends

Authors: Dena Alhaloul

Abstract:

The Middle East and North Africa (MENA) region is characterized by diverse cultures, economies, and social structures. Fertility rates in MENA have changed significantly over time, with variations among countries and subregions. Understanding fertility patterns in the region is essential because of their impact on demographic dynamics, healthcare, labor markets, and social policies; rising or declining fertility rates have far-reaching consequences for the region's socioeconomic development. The main thrust of this study is to comprehensively examine fertility rates in the MENA region: their distribution, their determinants, and their temporal trends. The study seeks to provide insights into the factors influencing fertility decisions, to assess how fertility rates have evolved over time, and to develop statistical models characterizing these trends. Methodologically, the study uses descriptive statistics to summarize and visualize fertility rate data, regression analyses to identify determinants of fertility rates, and statistical modeling to characterize temporal trends. The research will contribute to a deeper understanding of fertility dynamics in the MENA region, shedding light on the distribution of fertility rates, their determinants, and historical trends.
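The temporal-trend modeling described above can be illustrated with the simplest case: an ordinary least squares fit of total fertility rate (TFR) against year. The TFR values below are invented for illustration and are not MENA data.

```python
# Minimal OLS trend fit (illustrative): slope and intercept of TFR vs. year.

def ols_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope  # intercept, slope

years = [2000, 2005, 2010, 2015, 2020]   # hypothetical observation years
tfr = [3.5, 3.2, 3.0, 2.8, 2.6]          # hypothetical fertility rates
intercept, slope = ols_fit(years, tfr)
```

A negative slope, as here, corresponds to the declining-fertility pattern the study sets out to characterize; the full analysis would add country-level determinants as regressors.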

Keywords: fertility, distribution, modeling, regression

Procedia PDF Downloads 58
17751 Effect of Nitriding and Shot Peening on Corrosion Behavior and Surface Properties of Austenite Stainless Steel 316L

Authors: Khiaira S. Hassan, Abbas S. Alwan, Muna K. Abbass

Abstract:

This research studies the effect of liquid nitriding and shot peening on the hardness, surface roughness, residual stress, microstructure, and corrosion behavior of austenitic stainless steel 316L. Chemical surface heat treatment by liquid nitriding was carried out at 500 °C for 1 h, followed by shot peening with steel balls of 1.25 mm diameter for exposure times of 10 and 20 min. An electrochemical corrosion test was performed in sea water (3.5% NaCl solution) using a potentiostat. The results showed that the nitrided layer consists of a compound layer (white layer) and a diffusion zone immediately below it. The mechanical treatment (shot peening) produced compressive residual stresses in the surface layer, which increased the hardness of the stainless steel surface. All surface treatments (nitriding and shot peening) led to the formation of chromium nitride (CrN) in the hard surface layer. Both processes increased surface hardness and roughness, which rise with shot peening time. The corrosion results also showed that the liquid nitriding and shot peening processes increase the corrosion rate above that of the untreated stainless steel.

Keywords: stainless steel 316L, shot peening, nitriding, corrosion, hardness

Procedia PDF Downloads 454
17750 An Investigation on Engineering Students’ Perceptions Towards E-learning in the UK

Authors: Vida Razzaghifard

Abstract:

E-learning, also known as online learning, has grown considerably in recent years. One of the critical factors in the successful application of e-learning in higher education is students' perceptions of it. The main purpose of this paper is to investigate the perceptions of engineering students about e-learning in the UK. For the present study, 145 second-year engineering students were randomly selected from a total population of 1,280. The participants were asked to complete a questionnaire containing 16 items. The data collected from the questionnaire were analyzed with the Statistical Package for the Social Sciences (SPSS). The findings revealed that the majority of participants held negative perceptions of e-learning: most had trouble interacting effectively during online classes, and the majority reported negative experiences with the learning platform used. Suggestions are made on what could be done to improve students' perceptions of e-learning.

Keywords: e-learning, higher education, engineering education, online learning

Procedia PDF Downloads 84
17749 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler

Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury

Abstract:

Early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is delayed result reporting: typically a few days. This issue is the main objective of the HAMRAD project, which produced a prototype autonomous monitoring device. It is based on sequential aerosol sampling using a carousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, air is drawn through a filter positioned on the carousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed with a 50% HPGe detector inside 8.5 cm of lead shielding. The spectrometer output signal is analyzed using DSP electronics and Gamwin software with preset nuclide libraries and other analysis parameters. After counting, the filter is placed into a storage bin with a capacity of 250 filters, so the device can run autonomously for several months depending on the preset sampling frequency. The device is connected via GPRS/GSM to a central server, where the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be adjusted remotely through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is natural background subtraction. Because detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there exists an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide. The prototype proved able to detect atmospheric contamination at the level of mBq/m³ per 8 h sampling period.
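The order of magnitude of such a detection limit can be checked with the widely used Currie approximation, L_D = 2.71 + 4.65·√B counts, converted to an activity concentration. The detector efficiency, gamma yield, and background counts below are invented example values, not the instrument's specifications; only the 8 h sampling and 10 m³/h maximum flow come from the abstract.

```python
import math

# Hedged sketch: Currie a-priori detection limit in counts, converted to
# Bq/m^3. Efficiency, gamma yield, and background are assumed values.

def detection_limit_bq_per_m3(background_counts, efficiency, gamma_yield,
                              count_time_s, volume_m3):
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)  # Currie formula
    return ld_counts / (efficiency * gamma_yield * count_time_s * volume_m3)

# 8 h sampling at the 10 m^3/h maximum flow quoted above
limit = detection_limit_bq_per_m3(background_counts=400, efficiency=0.05,
                                  gamma_yield=0.85, count_time_s=8 * 3600,
                                  volume_m3=80.0)
```

With these assumed numbers the limit lands near 1 mBq/m³, consistent with the sensitivity level the abstract reports; a higher radon-induced background raises the limit, which is why the optimal decay time matters.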

Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler

Procedia PDF Downloads 134
17748 Deformed Wing Virus and Varroa Destructor in the Local Honey Bee Colonies Apis mellifera intermissa in Algeria

Authors: Noureddine Adjlane, Nizar Haddad

Abstract:

Deformed Wing Virus (DWV) is considered the most prevalent virus endangering honeybee health worldwide today. In this study, we aimed to evaluate the impact of the virus on honeybee (Apis mellifera intermissa) mortality in Algeria, using samples collected from the central area of the country. We used PCR to diagnose DWV. The results showed a high infection level in the sampled colonies, representing 42% of the total sample. We found a clear role of both the Varroa destructor mite and DWV in hive mortality in the experimental apiary. Further studies are needed in order to give solid recommendations to the beekeepers, decision makers, and stakeholders of the Algerian beekeeping sector.

Keywords: honey bee, DWV, Varroa destructor, mortality, prevalence, infestation

Procedia PDF Downloads 438
17747 Optimization of Pregelatinized Taro Boloso-I Starch as a Direct Compression Tablet Excipient

Authors: Tamrat Balcha Balla

Abstract:

Background: Tablets are still the most preferred means of drug delivery, and the search for new and improved direct compression tablet excipients is an active area of research. Taro Boloso-I is a variety of Colocasia esculenta (L.) Schott yielding 67% more than the other varieties (Godare) in Ethiopia. This study aimed to enhance the flowability of pregelatinized Taro Boloso-I starch while preserving its compressibility and compactibility. Methods: A central composite design was used to optimize two factors, the temperature and duration of pregelatinization, against five responses: angle of repose, Hausner ratio, Kawakita compressibility index, mean yield pressure, and tablet breaking force. Results and Discussion: An increase in both temperature and time decreased the angle of repose. Increasing temperature was also shown to decrease the Hausner ratio and the Kawakita compressibility index. The mean yield pressure increased with increasing levels of both temperature and time. The optimized pregelatinized Taro Boloso-I starch showed the desired flow property and compressibility. Conclusions: Pregelatinized Taro Boloso-I starch can be regarded as a potential direct compression excipient in terms of flowability, compressibility, and compactibility.
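A two-factor central composite design of the kind used here has a standard layout: four factorial points, four axial points at α = √2, and replicated center points, all in coded units (here, temperature and time). The number of center replicates below is an assumption for illustration.

```python
import math

# Sketch of a rotatable two-factor central composite design in coded units.
# n_center (number of center replicates) is an assumed example value.

def ccd_two_factor(n_center=5):
    alpha = math.sqrt(2.0)                      # axial distance for k = 2
    factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
    center = [(0.0, 0.0)] * n_center
    return factorial + axial + center

design = ccd_two_factor()   # 13 runs: 4 factorial + 4 axial + 5 center
```

Each coded point would be decoded to an actual pregelatinization temperature and duration, and the five responses fitted with a second-order model over these runs.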

Keywords: starch, compression, pregelatinization, Taro Boloso-I

Procedia PDF Downloads 91
17746 Improving the Performance of Road Salt on Anti-Icing

Authors: Mohsen Abotalebi Esfahani, Amin Rahimi

Abstract:

Maintenance and management of road infrastructure is one of the most important and fundamental responsibilities of any country. Several methods have been investigated over many years as preventive measures for the maintenance of asphalt pavements. Using a mixture of salt, sand, and gravel is the most common method of de-icing, and it can have numerous harmful consequences. Icy or snow-covered roads are a major cause of accidents in wet seasons, producing substantial damage: loss of time and energy, environmental pollution, damage to structures, traffic congestion, and a higher likelihood of accidents. Accordingly, governments incur enormous costs every year to keep routes passable. In this study, asphalt samples were evaluated, in terms of compressive strength, tensile strength, and resilient modulus, under the influence of magnesium chloride, calcium chloride, sodium chloride, urea, and pure water. The results showed that de-icing with calcium chloride solution and urea had the least negative effect on the laboratory specimens, while de-icing with pure water had the most negative effect. Hence, simple techniques, new equipment, and reduced use of sand and salt can significantly lower the risks and harmful effects of excessive salt, sand, and gravel use while keeping roads safer.

Keywords: maintenance, sodium chloride, icy road, calcium chloride

Procedia PDF Downloads 267
17745 Excess Body Fat as a Store Toxin Affecting the Glomerular Filtration and Excretory Function of the Liver in Patients after Renal Transplantation

Authors: Magdalena B. Kaziuk, Waldemar Kosiba, Marek J. Kuzniewski

Abstract:

Introduction: Adipose tissue is a typical storage site for water-insoluble toxins in the body. It is a connective tissue whose intercellular substance consists of fat; in people with low physical activity, body fat should be 18-25% for women and 13-18% for men. By the distribution of fat in the body, two types of obesity are distinguished: android (visceral, abdominal) and gynoid (gluteal-femoral, peripheral). Abdominal obesity increases the risk of complications of cardiovascular diseases and of impaired renal and liver function; through its influence on metabolic disorders, lipid metabolism, diabetes, and hypertension, it leads to the metabolic syndrome. Obesity therefore places a particular burden on kidney function in patients after transplantation. Aim: We attempted to estimate the impact of the amount of adipose tissue on transplanted kidney function and on the excretory function of the liver in patients after kidney transplantation (Ktx). Material and Methods: The study included 108 patients (50 female, 58 male; age 46.5 +/- 12.9 years) with an active kidney transplant more than 3 months after transplantation. Body composition was analyzed using bioelectrical impedance analysis (BIA) and anthropometric measurements. Basal metabolic rate (BMR), muscle mass, total body water content, and the amount of body fat were estimated. Information about physical activity was obtained during the clinical examination. Nutritional status and type of obesity were determined using the Waist-to-Height Ratio (WHtR) and Waist-to-Hip Ratio (WHR). Excretory function of the transplanted kidney was rated by the estimated glomerular filtration rate (eGFR), calculated with the MDRD formula. Liver function was rated by serum total bilirubin and alanine aminotransferase (ALT) levels. Haemolytic uremic syndrome (HUS) was excluded in our patients.
Results: 19.44% of patients were underweight, 22.37% were of normal weight, 11.11% were overweight, and the rest (49.08%) were obese. Patients with an android build had a lower eGFR than those with a gynoid build (p = 0.004). All obese patients had body fat elevated by a few to several percent. The higher the percentage of body fat, the lower the patients' eGFR (p < 0.001). Elevated ALT levels correlated significantly with high fat content (p < 0.02). Conclusion: An increased amount of body fat, particularly in android obesity, can be a predictor of kidney and liver damage. Obese patients should therefore undergo more frequent diagnostic monitoring of these organs, together with intensive dietary and pharmacological management and regular physical activity adapted to their current physical condition after transplantation.
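The MDRD estimate named in the methods can be sketched in its common four-variable form: eGFR = 175 · Scr⁻¹·¹⁵⁴ · age⁻⁰·²⁰³ · 0.742 (if female), in mL/min/1.73 m², with serum creatinine in mg/dL. The race coefficient of the full equation is omitted here for brevity, and the example inputs are invented, not patient data.

```python
# Hedged sketch of the four-variable MDRD eGFR (race coefficient omitted).
# scr_mg_dl: serum creatinine in mg/dL; result in mL/min/1.73 m^2.

def egfr_mdrd(scr_mg_dl, age_years, female):
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    return egfr

# hypothetical patient at the cohort's mean age
example = egfr_mdrd(scr_mg_dl=1.0, age_years=46.5, female=False)
```

In the study, each patient's eGFR computed this way was then correlated with the BIA-measured body fat percentage.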

Keywords: obesity, body fat, kidney transplantation, glomerular filtration rate, liver function

Procedia PDF Downloads 446
17744 Persian Words in the Quran and the Reasons for Their Presence

Authors: Fateme Mazbanpoor, Sayed Mohammad Amiri

Abstract:

In this article, we examine the Persian words in the Quran and study the reasons for their presence in this holy book. The authors extracted about 70 Persian words from the Quran by consulting sources (Alalfaz ol Moarab ol Farsieh by Edi Shir, Almoarab by Javalighi, Almahzab and Etghan by Suyuti, the vocabulary study of the Quran by Arthur Jeffery, etc.). Some of these words are: 'Abarigh', 'Estabragh', 'Barzakh', 'Din', 'Zamharir', 'Sondos', 'Sejil', 'Namaregh', 'Fil', etc. These Persian words entered Arabic, and finally the Quran, in two ways: (1) directly from Persian, and (2) via other languages. The first way: because of Iranian dominance over Hira, Yemen, Oman, and Bahrain in the Sasanian period, there were political, religious, linguistic, literary, and trade ties between these Arab territories and Iran, which allowed Persian to influence Arabic and admitted many Persian loanwords into Arabic in this period. The second way: since Iran dominated the geography and trade of the region, the Hejaz traded extensively with Mesopotamia and Yemen. Arabic, a relatively young language at the time, expanded its vocabulary under the influence of the Semitic languages Syriac and Aramaic, which were themselves influenced by the languages of Iran. Consequently, owing to the long relationship between Iranians and Arabs, some Persian words took the longer route through Aramaic and Syriac into the Quran.

Keywords: Quran, Persian word, Arabic language, Persian

Procedia PDF Downloads 449
17743 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators

Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros

Abstract:

Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness devotes almost as much time to data quality processes, while a data project without such awareness suffers in financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ with the purpose of each data project. In particular, a big data project may involve many datasets and stakeholders and require long discussions to define quality expectations and measurements. Therefore, this study aimed at developing meaningful indicators that describe overall data quality for each dataset, enabling quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that describes the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, the Dataset Quality Index (SDQI) was developed in five steps. First, we defined standard data quality expectations. Second, we found indicators that can be measured directly from the data within datasets. Third, the indicators were aggregated into dimensions using factor analysis. Next, the indicators and dimensions were weighted by the effort of the data preparation process and by usability. Finally, the dimensions were aggregated into the composite indicator. The results of these analyses showed that: (1) the developed set of useful indicators and measurements contained ten indicators; (2) in the data quality dimensions based on statistical characteristics, the ten indicators could be reduced to four dimensions; and (3) the developed composite indicator, the SDQI, describes the overall quality of each dataset and separates datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall description of data quality within datasets and a meaningful composition. The SDQI can be used to assess all data in a data project, to estimate effort, and to prioritize. It also works well with agile methods: the SDQI can be used for assessment in the first sprint, and after passing the initial evaluation, more specific data quality indicators can be added in the next sprint.
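The final aggregation step can be illustrated with a minimal sketch: weight dimension scores into a single composite and map it to the three quality levels the abstract names. The dimension names, weights, and level cut-offs below are assumptions for illustration, not the study's fitted values.

```python
# Hedged sketch of composite-indicator aggregation (not the authors' exact
# pipeline). Dimension names, weights, and thresholds are assumed examples.

def composite_quality(dim_scores, weights):
    total = sum(weights.values())
    score = sum(dim_scores[d] * w for d, w in weights.items()) / total
    if score >= 0.8:
        return score, "Good Quality"
    if score >= 0.5:
        return score, "Acceptable Quality"
    return score, "Poor Quality"

dims = {"completeness": 0.9, "validity": 0.8, "consistency": 0.7, "uniqueness": 1.0}
w = {"completeness": 2.0, "validity": 1.0, "consistency": 1.0, "uniqueness": 1.0}
score, level = composite_quality(dims, w)
```

In the study, the weights would come from the factor-analysis loadings and the effort/usability weighting rather than being set by hand as here.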

Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis

Procedia PDF Downloads 121
17742 Investigating the Relationship between Job Satisfaction, Role Identity, and Turnover Intention for Nurses in Outpatient Department

Authors: Su Hui Tsai, Weir Sen Lin, Rhay Hung Weng

Abstract:

Hospital outpatient departments are numerous and serve enormous numbers of outpatients. Although the work of outpatient nursing staff does not involve the life-threatening conditions seen in wards, emergency rooms, and critical care units, it is cumbersome and requires facing and dealing with a large number of outpatients in a short period of time. Nursing staff therefore often feel dissatisfied with their work and cannot identify with their professional role, leading to intentions to leave their job. The main purpose of this study is thus to explore the correlation of job satisfaction and role identity with turnover intention. The research was conducted using a questionnaire, and the subjects were outpatient nursing staff in three regional hospitals in Southern Taiwan. A total of 175 questionnaires were distributed, and 166 valid questionnaires were returned. After data collection, the reliability and validity of the study variables were confirmed by confirmatory factor analysis. The influence of role identity and job satisfaction on nursing staff's turnover intention was analyzed by descriptive analysis, one-way ANOVA, Pearson correlation analysis, and multiple regression analysis. Results showed that 'role identity' differed significantly across marital statuses. Job satisfaction with 'grasp of environment' differed significantly across levels of education, and satisfaction with 'professional growth' and 'shifts and days off' differed significantly across marital statuses. 'Role identity' and 'job satisfaction' were each negatively correlated with turnover intention. Satisfaction with 'salary and benefits' and 'grasp of environment' were significant predictors of role identity: the higher these satisfactions, the higher the role identity.
Satisfaction with 'patient and family interaction' was a significant predictor of turnover intention: the lower it was, the higher the turnover intention. This study found that outpatient nursing staff were least satisfied with the salary structure. It is recommended that bonuses, promotion opportunities, and other incentives be established to increase the role identity of outpatient nursing staff. Because higher satisfaction with 'salary and benefits' and 'grasp of environment' predicts higher role identity, it is recommended that regular evaluations be conducted to reward nursing staff with excellent service, and that nursing staff be invited to share their work experiences and thoughts, so as to strengthen their expectations and identification with their occupational role, while instilling the concepts of organizational service and organizational expectations of emotional display. Because lower satisfaction with 'patient and family interaction' predicts higher turnover intention, it is recommended that interpersonal communication and workplace violence prevention training courses be organized to improve the interaction of nursing staff with patients and their families.
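The Pearson correlation step in the analysis, which produced the negative correlations reported above, can be sketched directly; the paired scores below are invented for illustration, not the questionnaire data.

```python
import math

# Minimal Pearson correlation (illustrative): satisfaction vs. turnover
# intention, both on hypothetical 5-point-scale scores.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

satisfaction = [4.5, 4.0, 3.5, 3.0, 2.0]   # hypothetical satisfaction scores
turnover = [1.5, 2.0, 2.5, 3.0, 4.0]       # hypothetical turnover intention
r = pearson(satisfaction, turnover)
```

A negative r, as in this constructed example, mirrors the study's finding that higher satisfaction accompanies lower turnover intention; the multiple regression then tests which satisfaction facets predict it jointly.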

Keywords: outpatient, job satisfaction, turnover intention

Procedia PDF Downloads 138
17741 Measurement of Greenhouse Gas Emissions from Sugarcane Plantation Soil in Thailand

Authors: Wilaiwan Sornpoon, Sébastien Bonnet, Savitri Garivait

Abstract:

Continuous measurements of greenhouse gases (GHGs) emitted from soils are required to understand diurnal and seasonal variations in soil emissions and the related mechanisms. This understanding plays an important role in the appropriate quantification and assessment of the overall change in soil carbon flow and budget. This study monitors GHG emissions from soil under sugarcane cultivation in Thailand. The measurements were conducted over 379 days. The results showed that the total net amount of GHGs emitted from sugarcane plantation soil is 36 Mg CO2eq ha-1. Carbon dioxide (CO2) and nitrous oxide (N2O) were the main contributors to the emissions; for methane (CH4), the net emission was almost zero. The measurement results also confirmed that soil moisture content and GHG emissions are positively correlated.
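The CO2-equivalent aggregation behind a figure like 36 Mg CO2eq ha⁻¹ can be sketched with standard 100-year global warming potentials (IPCC AR4 values: CO2 = 1, CH4 = 25, N2O = 298). The per-gas masses below are invented examples, not the study's measured fluxes, and the GWP vintage used by the study is an assumption.

```python
# Hedged sketch: convert per-gas emissions (Mg/ha) to Mg CO2eq/ha using
# IPCC AR4 100-year GWPs. The flux values are illustrative assumptions.

GWP = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2eq(fluxes_mg_per_ha):
    return sum(GWP[g] * m for g, m in fluxes_mg_per_ha.items())

total = co2eq({"CO2": 30.0, "CH4": 0.0, "N2O": 0.02})
```

Note how a small N2O mass contributes disproportionately through its GWP of 298, consistent with N2O being a main contributor despite small fluxes, while a near-zero net CH4 flux adds essentially nothing.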

Keywords: soil, GHG emission, sugarcane, agriculture, Thailand

Procedia PDF Downloads 412
17740 An Absolute Femtosecond Rangefinder for Metrological Support in Coordinate Measurements

Authors: Denis A. Sokolov, Andrey V. Mazurkevich

Abstract:

In the modern world, there is increasing demand for highly precise measurements in fields such as aircraft manufacturing, shipbuilding, and rocket engineering. This has driven the development of measuring instruments capable of measuring the coordinates of objects within a range of up to 100 meters with an accuracy down to one micron. The calibration of such optoelectronic measuring devices (trackers and total stations) involves comparing their measurement results to a reference measurement on a linear or spatial basis. The reference can be a reference baseline, serving as a set of reference points, or a reference range finder with the capability to measure angle increments (EDM). The concept of an EDM for reproducing the unit of length has been implemented on a mobile platform that allows angular changes in the direction of the laser radiation in two planes. To determine the distance to an object, a high-precision interferometer of our own design is employed. The laser radiation travels to corner reflectors, which form a spatial reference with precisely known positions. When the femtosecond pulses from the reference arm and the measuring arm coincide, an interference signal is created, repeating at the frequency of the laser pulses. The distance between reference points determined by the interference signals is calculated in accordance with the recommendations of the International Bureau of Weights and Measures for the indirect measurement of the time of passage of light, following the definition of the meter. This distance is D/2 = c/(2nF), approximately 2.5 meters, where c is the speed of light in vacuum, n is the refractive index of the medium, and F is the femtosecond pulse repetition frequency. The achieved Type A uncertainty of the distance measurement to reflectors 64 m away (N·D/2, where N is an integer) and spaced 1 m apart from each other does not exceed 5 microns. The angular uncertainty is calculated theoretically, since standard high-precision ring encoders will be used; it is not a focus of this study. The Type B uncertainty components are not considered either, as their largest contributors do not depend on the chosen coordinate measuring method. The technology is being explored in laboratory applications under controlled environmental conditions, where an accuracy advantage can be achieved. In general, the EDM tests showed high accuracy, and theoretical calculations and experimental studies on an EDM prototype have shown that the Type A uncertainty of distance measurements to the reflectors can be below 1 micrometer. The results of this research will be used to develop a highly accurate mobile absolute range finder designed for the calibration of high-precision laser trackers, laser rangefinders, and other equipment, using a 64-meter laboratory comparator as a reference.
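The distance scale quoted above, D/2 = c/(2nF), can be checked numerically. A repetition frequency near 60 MHz with n = 1 reproduces the ~2.5 m spacing the abstract cites; that frequency is an assumed value consistent with the spacing, not a stated specification of the instrument.

```python
# Hedged numerical check of D/2 = c / (2 n F) from the abstract.
# F = 60 MHz is an assumption consistent with the ~2.5 m figure.

C = 299_792_458.0  # speed of light in vacuum, m/s (exact by definition)

def pulse_spacing_m(rep_rate_hz, refractive_index=1.0):
    return C / (2.0 * refractive_index * rep_rate_hz)

half_d = pulse_spacing_m(60e6)  # ~2.498 m in vacuum
```

The 64 m reflector distance then corresponds to an integer multiple N·D/2 with N near 26, and the air's refractive index (n slightly above 1) shortens the spacing proportionally.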

Keywords: femtosecond laser, pulse correlation, interferometer, laser absolute range finder, coordinate measurement

Procedia PDF Downloads 37