Search results for: mean square error (MSE)
659 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data
Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar
Abstract:
Making accurate predictions, especially about the future, is not easy; yet better predictions remain the foundation of all science, so the development of accurate, robust and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. Two major methods still dominate: Box-Jenkins ARIMA and Exponential Smoothing (ES), and new methods continue to be derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing remains one of the most practically relevant forecasting methods available, owing to its simplicity, robustness and accuracy as an automatic forecasting procedure, demonstrated especially in the famous M-Competitions. Despite its success and widespread use in many areas, ES models have some shortcomings that negatively affect forecast accuracy. This study therefore proposes a new forecasting method, called the ATA method, to cope with these shortcomings. The new method is obtained from traditional ES models by modifying the smoothing parameters; both methods therefore have similar structural forms, and ATA can easily be adapted to each of the individual ES models, while its innovative new weighting scheme gives it many advantages. This paper focuses on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. The ATA method is therefore extended to higher-order ES methods with additive, multiplicative, additive damped and multiplicative damped trend components. The proposed models, called ATA trended models, are compared in predictive performance to their counterpart ES models on the M3-Competition data set, since it is still the most recent and comprehensive time-series data collection available.
It is shown that the models outperform their counterparts in almost all settings, and when model selection is carried out amongst these trended models, ATA outperforms all competitors in the M3-Competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared on popular error metrics.
Keywords: accuracy, exponential smoothing, forecasting, initial value
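The abstract contrasts ATA with classical exponential smoothing. As a minimal illustration (not the authors' implementation), the sketch below shows simple exponential smoothing with a constant weight α next to a time-varying weight p/t of the kind ATA's weighting scheme introduces; the p/t form is an illustrative reading, not the full ATA specification.

```python
def ses(series, alpha):
    """Simple exponential smoothing: level updated with a constant weight alpha."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level  # one-step-ahead forecast

def ata_level(series, p):
    """ATA-style level update (illustrative): the constant alpha is
    replaced by the time-varying weight p/t at observation t."""
    level = series[0]
    for t, x in enumerate(series[1:], start=2):
        w = p / t
        level = w * x + (1 - w) * level
    return level
```

With p fixed, the p/t weight shrinks as more observations arrive, unlike the constant α of classical ES.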
Procedia PDF Downloads 177
658 Surveillance of Adverse Events Following Immunization during New Vaccines Introduction in Cameroon: A Cross-Sectional Study on the Role of Mobile Technology
Authors: Andreas Ateke Njoh, Shalom Tchokfe Ndoula, Amani Adidja, Germain Nguessan Menan, Annie Mengue, Eric Mboke, Hassan Ben Bachir, Sangwe Clovis Nchinjoh, Yauba Saidu, Laurent Cleenewerck De Kiev
Abstract:
Vaccines play a major role in protecting populations globally. Vaccine products are subject to rigorous quality control and approval before use to ensure safety. Even when all actors take the required precautions, some people may still experience adverse events following immunization (AEFI) caused by the vaccine composition or by an error in its administration. AEFI underreporting is pronounced in low-income settings like Cameroon. The country introduced electronic platforms to strengthen surveillance. With the introduction of several novel vaccines, such as COVID-19 vaccines and the novel Oral Polio Vaccine type 2 (nOPV2), there was a need to monitor AEFI in the country. A cross-sectional study was conducted from July to December 2022. Data on AEFI per region of Cameroon were reviewed for the past five years. Data were analyzed with MS Excel, and the results were presented as proportions. AEFI reporting was uncommon in Cameroon. With the introduction of novel vaccines in 2021, the health authorities adopted new tools and training to capture cases. The number of AEFI detected almost doubled using the Open Data Kit (ODK) compared with previous platforms, especially following the introduction of the nOPV2 and COVID-19 vaccines. The AEFI rate was 1.9 and 160 per 100,000 administered doses for the nOPV2 and COVID-19 vaccines, respectively. This mobile tool captured individual information on people with AEFI from all regions, and the platform helped to identify common AEFI following the use of these new vaccines. ODK mobile technology was vital in improving AEFI reporting and in providing data for monitoring the use of new vaccines in Cameroon.
Keywords: adverse events following immunization, Cameroon, COVID-19 vaccines, nOPV, ODK
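The quoted AEFI rates are straightforward to reproduce as cases per 100,000 administered doses; the case and dose counts below are hypothetical, chosen only to illustrate the calculation.

```python
def aefi_rate_per_100k(cases, doses_administered):
    """AEFI reporting rate per 100,000 administered doses."""
    return cases / doses_administered * 100_000

# Hypothetical counts: 19 AEFI over 1,000,000 doses reproduces the
# 1.9-per-100,000 rate quoted for nOPV2.
nopv2_rate = aefi_rate_per_100k(19, 1_000_000)
```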
Procedia PDF Downloads 88
657 Creating Database and Building 3D Geological Models: A Case Study on Bac Ai Pumped Storage Hydropower Project
Authors: Nguyen Chi Quang, Nguyen Duong Tri Nguyen
Abstract:
This article is a first step in researching and outlining the structure of a geotechnical database for the geological survey of a power project; in this report, the database was created for the Bac Ai pumped storage hydropower project. To provide a method of organizing and storing geological and topographic survey data and experimental results in a spatial database, the RockWorks software is used to achieve optimal efficiency in exploiting, using, and analyzing data in service of the design work in power engineering consulting. Three-dimensional (3D) geotechnical models are created from the survey data, such as stratigraphy, lithology, and porosity. For the Bac Ai pumped storage hydropower project, the 3D geotechnical model comprises six closely stacked stratigraphic formations built by the Horizons method, whereas the engineering geological parameters are modeled by geostatistical methods. Accuracy and reliability are assessed through error statistics, empirical evaluation, and expert methods. The three-dimensional model allows better visualization of volumetric calculations, excavation and backfilling of the lake area, tunneling of power pipelines, and calculation of on-site construction material reserves. In general, engineering geological modeling makes the design work more intuitive and comprehensive, helping construction designers better identify and offer the most optimal design solutions for the project. The database ensures continuous updating and synchronization, and enables 3D modeling of geological and topographic data that integrates with the design data according to building information modeling. This is also the base platform for BIM and GIS integration.
Keywords: database, engineering geology, 3D model, RockWorks, Bac Ai pumped storage hydropower project
Procedia PDF Downloads 167
656 Non-Governmental Organisations and Human Development in Bauchi State, Nigeria
Authors: Sadeeq Launi
Abstract:
NGOs, the world over, have been recognized as institutions that complement government activities in providing services to the people, particularly in respect of human development. This study examined the role played by NGOs in human development in Bauchi State, Nigeria, between 2004 and 2013. The emphasis was on the reproductive health and access-to-education roles of the selected NGOs, and all research questions, objectives and hypotheses were stated in line with these variables. The theoretical framework that guided the study was the participatory development approach. Being a survey research, data were generated from both primary and secondary sources, with questionnaires and interviews as the instruments for generating the primary data. The population of the study was made up of the staff of the selected NGOs, beneficiaries, health staff and school teachers in Bauchi State; the samples drawn from these categories were 90, 107 and 148 units, respectively. Stratified random and simple random sampling techniques were adopted for NGO staff, health staff and school teachers. Data were analyzed quantitatively and qualitatively, and hypotheses were tested using the Pearson chi-square test in the SPSS statistical package. The study revealed that, despite the challenges facing NGO operations in the study area, NGOs rendered services in the areas of health and education. This research recommends, among others, that both government and the people should cooperate more with NGOs to enable them to provide more efficient and effective services. Governments at all levels should be more dedicated to increasing the accessibility and affordability of basic education and reproductive health care facilities and services in Bauchi State by committing more resources to the health and education sectors; this would support and facilitate the complementary role of NGOs in providing teaching facilities, drugs, and other reproductive health services in the state.
More enlightenment campaigns should be carried out by governments to sensitize the public, particularly women, on the need to embrace immunization programmes for their children and the antenatal care services provided by both the government and NGOs.
Keywords: access to education, human development, NGOs, reproductive health
Procedia PDF Downloads 176
655 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper are AlexNet, VGGNet, and ResNet. By using various loss functions, including cross-entropy, center loss, cosine proximity, and hinge loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach by comparing generalization power. It is important to note that the same subset of LivDet is used across all training and testing for each model, so that performance on unseen data can be compared across all models in terms of generalization. The best CNN (AlexNet), with the appropriate loss function and optimizer, yields more than a 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' accuracy together with parameter counts and mean average error rates, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and run very fast with high anti-spoofing performance.
For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer
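Two of the loss functions compared in the abstract can be written in a few lines; this is a generic plain-Python sketch of the definitions, not the paper's training code.

```python
import math

def cross_entropy(p_true, q_pred, eps=1e-12):
    """Categorical cross-entropy between a one-hot target vector and
    predicted class probabilities."""
    return -sum(t * math.log(q + eps) for t, q in zip(p_true, q_pred))

def hinge(y_true, score):
    """Binary hinge loss; y_true in {-1, +1}, score is the raw model output."""
    return max(0.0, 1.0 - y_true * score)
```

The same prediction receives different penalties under each loss, which is the abstract's point that loss choice changes the errors a model optimizes against.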
Procedia PDF Downloads 136
654 Investigation p53 Codon 72 Polymorphism and miR-146a rs2910164 Polymorphism in Breast Cancer
Authors: Marjan Moradi Fard, Hossein Rassi, Masoud Houshmand
Abstract:
Aim: Breast cancer is one of the most common cancers affecting the morbidity and mortality of Iranian women. The disease results from collective alterations of oncogenes and tumor suppressor genes. Studies have produced conflicting results concerning the role of the p53 codon 72 polymorphism (G>C) and the miR-146a rs2910164 polymorphism (G>C) in the risk of several cancers; therefore, this research was performed to estimate the association of the p53 codon 72 and miR-146a rs2910164 polymorphisms with breast cancer. Methods and Materials: A total of 45 archival breast cancer samples from Khatam Hospital and 40 healthy samples were collected. Each reported cancer was verified through the pathology reports in the hospital records. DNA was then extracted from all samples by standard methods, and p53 codon 72 and miR-146a rs2910164 genotypes were analyzed using multiplex PCR. Tubule formation, mitotic activity, necrosis, pleomorphism and grade of breast cancer were staged by Nottingham histological grading, and immunohistochemical staining of sections from the paraffin-wax-embedded tissues for the expression of ER, PR and p53 was carried out using a standard method. Finally, data analysis was performed using version 7 of the Epi Info(TM) 2012 software and the chi-square (χ²) test for trend. Results: Successful DNA extraction was confirmed by PCR amplification of the β-actin gene (99 bp). According to the results, the p53 GG genotype and the miR-146a rs2910164 CC genotype were significantly associated with increased risk of breast cancer in the study population. We established that tumors with the p53 GG and miR-146a rs2910164 CC genotypes exhibited higher mitotic activity, higher pleomorphism, lower necrosis, fewer tubules, more ER- and PR-negatives and fewer TP53-positives than the other genotypes.
Conclusion: The present study provides preliminary evidence that the p53 GG genotype may affect breast cancer risk in the study population, interacting synergistically with the miR-146a rs2910164 CC genotype. Our results demonstrate that testing p53 codon 72 and miR-146a rs2910164 genotypes in combination with clinical parameters can serve as a major risk factor in the early identification of breast cancers.
Keywords: breast cancer, p53 codon 72 polymorphism, miR-146a rs2910164 polymorphism, genotypes
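The genotype-disease associations above rest on chi-square tests of contingency tables. A minimal Pearson chi-square statistic for a 2×2 genotype-by-case/control table can be sketched as follows (the counts in the test are hypothetical, not the study's data):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. genotype (rows) by case/control (columns)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    # Sum (observed - expected)^2 / expected over the four cells.
    for obs, r, c_ in ((a, row1, col1), (b, row1, col2),
                       (c, row2, col1), (d, row2, col2)):
        expected = r * c_ / n
        stat += (obs - expected) ** 2 / expected
    return stat
```

The statistic is compared against the chi-square distribution with one degree of freedom to obtain a p-value.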
Procedia PDF Downloads 336
653 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models
Authors: Ainouna Bouziane
Abstract:
The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in the last decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and its avoidance of the imaging artifacts related to diffraction phenomena that arise with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing - total variation minimization) algorithms to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is an important issue that has not yet been properly addressed, because a perfectly known reference is needed; the problem becomes particularly complicated for multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction and segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, which considers the influence of relevant experimental parameters such as the range of tilt angles, image noise level and object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
Keywords: electron tomography, supported catalysts, nanometrology, error assessment
Procedia PDF Downloads 86
652 In vivo Mechanical Characterization of Facial Skin Combining Digital Image Correlation and Finite Element
Authors: Huixin Wei, Shibin Wang, Linan Li, Lei Zhou, Xinhao Tu
Abstract:
Facial skin is a biomedical material with complex mechanical properties: anisotropy, viscoelasticity, and hyperelasticity. The mechanical properties of facial skin are crucial for a number of applications, including facial plastic surgery, animation, dermatology, the cosmetic industry, and impact biomechanics. Skin is a complex multi-layered material which can be broadly divided into three main layers: the epidermis, the dermis, and the hypodermis. Collagen fibers account for 75% of the dry weight of dermal tissue, and it is these fibers which are responsible for the mechanical properties of skin. Most research on the anisotropic mechanical properties of skin has concentrated on in vitro measurements, but mechanical properties differ greatly between in vivo and in vitro conditions. In this study, we present a method to measure the mechanical properties of facial skin in vivo. Digital image correlation (DIC) and indentation tests were used to obtain the experimental data, namely the deformation of the facial surface and the indentation force-displacement curve. The experiment was then simulated using a finite element (FE) model. Computed tomography (CT) and reconstruction techniques were applied to obtain the real tissue geometry, yielding a three-dimensional FE model of facial skin as a bi-layer system. As the epidermis is relatively thin, the epidermis and dermis were treated as one layer in this study, with the hypodermis below. The upper layer was modeled with a Gasser-Ogden-Holzapfel (GOH) model to describe the hyperelastic and anisotropic behavior of the dermis, and the lower layer was modeled as linear elastic. Finally, the material properties of the two layers were determined by minimizing the error between the FE results and the experimental data.
Keywords: facial skin, indentation test, finite element, digital image correlation, computed tomography
Procedia PDF Downloads 112
651 The Effect of Low Power Laser on CK and Some of Markers Delayed Onset Muscle Soreness (DOMS)
Authors: Bahareh Yazdanparast Chaharmahali
Abstract:
The study examined the effect of low power laser therapy on knee range of motion (flexion and extension), resting angle of the knee joint, knee circumference, and ratings of delayed onset muscle soreness (DOMS) pain, 24 and 48 hours after eccentric training of the knee flexor muscles (hamstrings). Twenty volunteers among female college students voluntarily participated in this research. On day 1, in order to induce delayed onset muscle soreness, subjects eccentrically trained their knee flexor muscles. On day 2, subjects were randomly divided into two groups: control and low power laser therapy. At 24 and 48 hours after eccentric training, the variables (knee flexion and extension range of motion, resting knee joint angle, and knee circumference) were measured and analyzed. Data are reported as means ± standard error (SE), and repeated measures analysis was used to assess differences within groups. The treatment (low power laser therapy) had significant effects on markers of delayed onset muscle soreness: at 24 and 48 hours after training, a significant difference was observed between the mean pain of the two groups, i.e., between the low power laser therapy and control groups, and the Bonferroni post hoc test was significant. Low power laser therapy as used in this study significantly diminished the effects of delayed onset muscle soreness on swelling and on relaxed knee extension and flexion angles.
Keywords: creatine kinase, DOMS, eccentric training, low power laser
Procedia PDF Downloads 246
650 Study on the Prediction of Serviceability of Garments Based on the Seam Efficiency and Selection of the Right Seam to Ensure Better Serviceability of Garments
Authors: Md Azizul Islam
Abstract:
A seam is the line joining two separate fabric layers for functional or aesthetic purposes. Different kinds of seams are used to assemble the different areas or parts of a garment to increase serviceability. To empirically support the importance of seam efficiency for the serviceability of garments, this study focuses on choosing the right type of seam for particular sewn parts of a garment, based on seam efficiency, to ensure better serviceability. Seam efficiency is the ratio of seam strength to fabric strength. Single jersey knitted finished fabrics of four different GSM (grams per square meter) values were used to make the test garment, a T-shirt. Three distinct types of seam (superimposed, lapped, and flat) were applied to the side seams of the T-shirts and sewn by lockstitch (stitch class 301) on a flat-bed plain sewing machine (maximum sewing speed: 5000 rpm) to make (3×4) 12 T-shirts. For experimental purposes, the needle thread count (50/3 Ne), bobbin thread count (50/2 Ne), stitch density (8-9 stitches per inch), needle size (16 in the Singer system), stitch length (31 cm), and seam allowance (2.5 cm) were kept the same for all specimens. The grab test (ASTM D5034-08) was performed on a universal tensile tester to measure seam strength and fabric strength. The produced T-shirts were given to 12 soccer players, who wore the shirts for 20 soccer matches (each of 90 minutes' duration). Serviceability of the shirts was measured by visual inspection on a 5-point scale based on seam condition. The study found that T-shirts produced with the lapped seam show better serviceability, while T-shirts made with flat seams score lowest in serviceability. From the calculated seam efficiency (seam strength / fabric strength), it was evident that the performance (in terms of strength) of the lapped and bound seams is higher than that of the superimposed seam, and the performance of the superimposed seam is far better than that of the flat seam.
So it can be predicted that, to get a garment of high serviceability, lapped seams could be used instead of superimposed or other types of seam, while less stressed garment parts can be assembled with other seams such as superimposed or flat seams.
Keywords: seam, seam efficiency, serviceability, T-shirt
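The seam efficiency used throughout the abstract is simply the ratio of seam strength to fabric strength, usually expressed as a percentage. A small sketch with hypothetical grab-test values (not the study's measurements):

```python
def seam_efficiency(seam_strength, fabric_strength):
    """Seam efficiency (%) = seam strength / fabric strength * 100,
    both measured under the same test conditions (e.g. ASTM D5034 grab test)."""
    return seam_strength / fabric_strength * 100.0

# Hypothetical grab-test results in newtons: a seam retaining 180 N
# against a 200 N fabric gives roughly 90 % efficiency.
efficiency = seam_efficiency(180.0, 200.0)
```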
Procedia PDF Downloads 201
649 Evaluation of Element Impurities in Drugs According to Pharmacopoeia by Use of the FESEM-EDS Technique
Authors: Rafid Doulab
Abstract:
Control of elemental impurities in the pharmaceutical industry is indispensable to ensure the safety of pharmaceuticals with respect to 24 elements. Although atomic absorption and inductively coupled plasma methods are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with an energy dispersive spectrometer can be applied as an alternative method for quantitative and qualitative analysis of a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by short analysis time, less contamination, no reagent consumption, generation of minimal residue or waste, limited sample preparation time, and minimal analysis error. With simple dilution for powders or direct analysis for liquids, we analyzed the usefulness of the EDS method using field emission scanning electron microscopy (FESEM, SUPRA 55, Carl Zeiss, Germany) with an X-ray energy dispersive spectrometer (XFlash 6l10, Bruker, Germany). The samples were analyzed directly, without coating, by applying 5 µL of a diluted sample of known concentration onto a carbon stub, with the accelerating voltage set according to sample thickness; the result for each spot is an atomic percentage, which is converted via an Avogadro-based factor into micrograms. Conclusion and recommendation: The conclusion of this study is that applying FESEM-EDS within the U.S. Pharmacopeia and the ICH Q3D guideline provides a high-precision, accurate method for elemental impurity analysis of drugs and bulk materials, determining the permitted daily exposure (PDE) in liquid or solid specimens. It can yield better results than other techniques, as it does not require complex methods or chemical digestion that may interfere with the final results, and the sample can be kept for reanalysis at any time. The recommendation is to adopt this technique in pharmacopeias as a standard method alongside inductively coupled plasma techniques, including ICP-AES, ICP-OES, and ICP-MS.
Keywords: pharmacopoeia, FESEM-EDS, element impurities, atomic concentration
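EDS reports composition in atomic (mole) percent, which is commonly converted to mass fractions by weighting each element's mole fraction with its molar mass. The sketch below shows that standard conversion; the composition and molar masses used in the test are illustrative, not data from the study.

```python
def atomic_to_mass_percent(atomic_percent, molar_mass):
    """Convert atomic (mole) percent to mass percent:
    w_i = a_i * M_i / sum_j(a_j * M_j) * 100,
    where a_i is the atomic percent and M_i the molar mass (g/mol)."""
    total = sum(a * molar_mass[el] for el, a in atomic_percent.items())
    return {el: a * molar_mass[el] / total * 100.0
            for el, a in atomic_percent.items()}
```

From the mass fraction and the deposited sample mass, the absolute element mass in micrograms follows directly.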
Procedia PDF Downloads 116
648 Estimating Housing Prices Using Automatic Linear Modeling in the Metropolis of Mashhad, Iran
Authors: Mohammad Rahim Rahnama
Abstract:
The market transaction price of housing is the main criterion for determining municipal taxes and is determined and announced annually. Of course, there is a discrepancy between the transaction values recorded by the Bureau of Finance (P for short) or the municipality (P´ for short) and the real market price (P˝). The present research aims to determine the real price of housing in the metropolis of Mashhad, to pinpoint the price gap with those of the aforementioned agencies, and to identify the factors affecting it. To reach this practical objective, Automatic Linear Modeling, which calls for an explanatory research design, was utilized. The population of the research consisted of all residential units in Mashhad, from which 317 residential units were randomly selected. Through cluster sampling, out of the 170 income blocks defined by the municipality, three blocks from the high-income (Kosar), middle-income (Elahieh), and low-income (Seyyedi) strata were surveyed using questionnaires during February and March of 2015, and information on the price and specifications of residential units was gathered. To estimate the effect of various factors on price, the relationship between the independent variables (8 variables) and the dependent variable, housing price, was calculated using Automatic Linear Modeling in SPSS. The results revealed that the average housing price is 788$ per square meter, compared with the Bureau of Finance's price of 10$ and the municipality's of 378$. The model fit between the dependent and independent variables was R² = 0.81. Out of the eight initial variables, three were omitted. The most influential factor affecting housing prices is the quality of construction (ordinary, full, luxury); the least important is the number of sides.
The price gap between the low-income (Seyyedi) and middle-income (Elahieh) districts was not confirmed via one-way ANOVA, but their gap with the high-income district (Kosar) was confirmed. It is suggested that the city be divided into two sections, low-income and high-income, as opposed to three, in terms of housing prices.
Keywords: automatic linear modeling, housing prices, Mashhad, Iran
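Automatic Linear Modeling in SPSS performs automated variable selection and model fitting; at its core is ordinary least squares, which for a single predictor reduces to a closed-form slope and intercept. A minimal sketch (the data, price against a construction-quality score, are hypothetical):

```python
def ols_simple(x, y):
    """Closed-form least squares fit of y ≈ a + b*x for one predictor."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    b = sxy / sxx          # slope
    a = mean_y - b * mean_x  # intercept
    return a, b

# Hypothetical data: price per square meter against a quality score 1-3.
intercept, slope = ols_simple([1.0, 2.0, 3.0], [300.0, 500.0, 700.0])
```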
Procedia PDF Downloads 255
647 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method
Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi
Abstract:
This paper presents a performance comparison of computational software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving advanced mathematical models to predict the production of oil wells in arbitrarily shaped, multiple-lease reservoirs. The limited validation of data for ensuring that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified and the BEM discretization is performed. In the second step, the 2D BEM discretization is implemented in the COMSOL Multiphysics and MATLAB programming environments. In the last step, the numerical performance indicators of both environments are analyzed against a validated Fortran implementation. The performance comparisons of the numerical analysis are investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on oil production of the multiple-lease reservoir. According to the performance comparison, structured programming in Fortran is the alternative software for implementing an accurate numerical simulation of the BEM. In conclusion, for numerical computation and numerical performance evaluation, Fortran proves well suited for capturing the visualization of the production of oil wells in arbitrarily shaped reservoirs.
Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure
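The percentage error used above as a performance indicator has a simple form; the reference value would be the result of the validated Fortran implementation (the numbers in the test are illustrative only).

```python
def percentage_error(approx, reference):
    """Relative percentage error of a computed value against a reference,
    e.g. a COMSOL or MATLAB pressure against the validated Fortran result."""
    return abs(approx - reference) / abs(reference) * 100.0
```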
Procedia PDF Downloads 491
646 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges: increasing speed and data processing capacity while keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of these facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques or perfecting existing ones would be a great advance for this industry. A matrix of temperature sensors distributed throughout the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high and expensive. Therefore, other less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first and second order derivatives of the resulting Burgers' equation after these simplifications are the key to obtaining results with greater or lesser accuracy, depending on the characteristic truncation error.
Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile
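As a minimal illustration of the backward (upwind) discretization mentioned above, the sketch below advances the 1D inviscid Burgers' equation u_t + u·u_x = 0 by one explicit time step; this is a simplified generic scheme, not the authors' server model.

```python
def burgers_upwind_step(u, dt, dx):
    """One explicit time step of the 1D inviscid Burgers' equation
    u_t + u*u_x = 0, using backward (upwind) differencing in space.
    Assumes u >= 0 everywhere, so the backward stencil is the upwind side;
    the left boundary value is held fixed."""
    new = u[:]
    for i in range(1, len(u)):
        new[i] = u[i] - dt / dx * u[i] * (u[i] - u[i - 1])
    return new
```

Stability requires the usual CFL-type restriction on dt relative to dx and max(u); forward or central differencing of the same derivative changes the truncation error, which is the point the abstract makes.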
Procedia PDF Downloads 169
645 The Effect of Kangaroo Mother Care and Swaddling Method on Venipuncture Pain in Premature Infant: Randomized Clinical Trials
Authors: Faezeh Jahanpour, Shahin Dezhdar, Saeedeh Firouz Bakht, Afshin Ostovar
Abstract:
Objective: Hospitalized premature babies often undergo painful procedures such as venous sampling. Kangaroo mother care (KMC) is one method of pain reduction, but as the mother's presence is not always possible, this research was done to compare the effect of swaddling and the KMC method on venous sampling pain in premature neonates. Methods: In this randomized clinical trial, 90 premature infants were selected and randomly allocated into three groups: group A (swaddling), group B (kangaroo care), and group C (control). From 10 minutes before blood sampling to 2 minutes after it, the infant in group A was wrapped in a thin sheet, while the infant in group B received kangaroo care. In all three groups, the heart rate and arterial oxygen saturation were measured and recorded at intervals of 30 seconds before, during, and 30, 60, 90, and 120 seconds after sampling. The infant's face was video recorded from sampling until 2 minutes afterward; the videos were checked by a researcher who was blinded to the intervention, and the Premature Infant Pain Profile (PIPP) was completed for 30-second intervals. Data were analyzed by t-test, chi-square, repeated measures ANOVA, Kruskal-Wallis, post-hoc and Bonferroni tests. Results: Findings revealed that pain was reduced to a great extent in the swaddling and kangaroo groups compared with the control group, but there was no significant difference between the kangaroo and swaddling methods (P ≥ 0.05). In addition, the findings showed that heart rate and arterial oxygen saturation were low and stable under swaddling and kangaroo care and returned to baseline faster, whereas the changes were severe in the control group and did not return to baseline even after 120 seconds. Discussion: The results of this study showed no meaningful difference between the swaddling and kangaroo care methods on physiological indices and pain in infants.
Therefore, the swaddling method can be a good substitute for the kangaroo care method in this regard.
Keywords: Kangaroo mother care, neonate, pain, premature, swaddling, venipuncture
Procedia PDF Downloads 215
644 Efficiency of Secondary Schools by ICT Intervention in Sylhet Division of Bangladesh
Authors: Azizul Baten, Kamrul Hossain, Abdullah-Al-Zabir
Abstract:
The objective of this study is to develop an appropriate stochastic frontier secondary school efficiency model with ICT intervention and to examine the impact of ICT challenges on secondary school efficiency in the Sylhet division of Bangladesh using stochastic frontier analysis. The Translog stochastic frontier model was found more appropriate than the Cobb-Douglas model for secondary school efficiency with ICT intervention. Based on the results of the Cobb-Douglas model, it is found that the coefficients of the number of teachers, the number of students, and teaching ability had a positive effect on increasing the level of efficiency, indicating that these are related to technical efficiency. In the case of inefficiency effects, for both the Cobb-Douglas and Translog models, the coefficient of the ICT lab decreased secondary school inefficiency, but online classes in school were found to increase the level of inefficiency. The coefficients of teachers’ preference for ICT tools like multimedia projectors played a contributing role in decreasing secondary school inefficiency in the Sylhet division of Bangladesh. The interaction effects of the number of teachers and the classrooms, the number of students and the number of classrooms, the number of students and teaching ability, and the classrooms and the teaching ability of the teachers were recorded with positive values, and these have a positive impact on increasing secondary school efficiency. The overall mean efficiency of urban secondary schools was found to be 84.66% for the Translog model and 83.63% for the Cobb-Douglas model. The overall mean efficiency of rural secondary schools was found to be 80.98% for the Translog model and 81.24% for the Cobb-Douglas model. So, the urban secondary schools performed better than the rural secondary schools in the Sylhet division.
It is observed from the results of the Tobit model that the teacher-student ratio had a positive influence on secondary school efficiency. Teaching experience of 1 to 5 years and above 10 years, MPO-type schools, and the conventional teaching method had a negative and significant influence on secondary school efficiency. The estimated value of σ-square (0.0625) was significantly different from zero, indicating a good fit. The value of γ (0.9872) was positive and can be interpreted as follows: 98.72 percent of the variation in secondary school outcomes is due to inefficiency rather than random noise.
Keywords: efficiency, secondary schools, ICT, stochastic frontier analysis
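The reported σ-square and γ follow the usual Battese-Coelli parameterization, in which γ is the share of total variance attributable to inefficiency rather than noise. A minimal sketch of that decomposition (the function name is illustrative, not from the paper):

```python
def variance_decomposition(sigma_sq, gamma):
    """Battese-Coelli parameterization of a stochastic frontier model:
    sigma^2 = sigma_u^2 + sigma_v^2 and gamma = sigma_u^2 / sigma^2,
    where u is the inefficiency term and v the random noise term."""
    sigma_u_sq = gamma * sigma_sq          # variance due to inefficiency
    sigma_v_sq = sigma_sq - sigma_u_sq     # variance due to random noise
    return sigma_u_sq, sigma_v_sq

# Values reported in the abstract: sigma^2 = 0.0625, gamma = 0.9872
su, sv = variance_decomposition(0.0625, 0.9872)
print(su, sv)  # almost all variation is inefficiency, not noise
```

With the abstract's estimates, σu² ≈ 0.0617 and σv² ≈ 0.0008, which is what the 98.72-percent interpretation expresses.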
Procedia PDF Downloads 151
643 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc
Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez
Abstract:
The evolution in mobile broadband technologies has allowed user download rates to increase for current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station can be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and for VoIP (Voice over IP) on those devices that have that feature. This research presents a technical analysis of the connection and channeling processes between UE and eNodeB with the TAC (Tracking Area Code) variables, and an analysis of performance variables (throughput, signal to interference and noise ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, where two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained in the transmission modes vary depending on the parameters BLER (Block Error Rate), throughput and SNR (Signal-to-Noise Ratio). For both operators, differences in transmission modes are detected, and this is reflected in the quality of the signal. In addition, because the two operators work at different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) like Operator 2, is reassigning traffic to a lower band, AWS (1700 MHz); the difference in signal quality with respect to Operator 2’s data connections and the difference in the transmission modes assigned by the eNodeB for Operator 1 are remarkable.
Keywords: BLER, LTE, network, QualiPoc, SNR
Procedia PDF Downloads 114
642 Study on the Effect of Pre-Operative Patient Education on Post-Operative Outcomes
Authors: Chaudhary Itisha, Shankar Manu
Abstract:
Patient satisfaction represents a crucial aspect in the evaluation of health care services. Preoperative teaching provides the patient with pertinent information concerning the surgical process and the intended surgical procedure, as well as anticipated patient behavior (anxiety, fear), expected sensations, and the probable outcomes. Although patient education is part of accreditation protocols, it is not uniform at most places. The aim of this study was to assess the benefit of preoperative patient education on selected post-operative outcome parameters, mainly post-operative pain scores, requirement of additional analgesia, return to activities of daily living and overall patient satisfaction, and to try to standardize a few education protocols. Dependent variables were measured before and after the treatment on a study population of 302 volunteers. Educational intervention was provided by the investigator in the preoperative period to the study group through personal counseling; an information booklet containing detailed information was also provided. Statistical analysis was done using the Chi-square test, Mann-Whitney U test and Fisher exact test on a total of 302 subjects. P value < 0.05 was considered the level of statistical significance, and p < 0.01 was considered highly significant. This study suggested that patients who are given structured, individualized and elaborate preoperative education and counseling have a better ability to cope with postoperative pain in the immediate post-operative period. However, there was not much difference once the patients had almost completely recovered. There was no difference in the requirement of additional analgesia between the two groups. There is a positive effect of preoperative counseling on expected return to the activities of daily living and normal work schedule. However, no effect was observed on the activities in the immediate post-operative period.
There was no difference in the overall satisfaction score between the two groups of patients. Thus, this study concludes that there is a positive benefit, as suggested by the results, for pre-operative patient education. Although the differences in the various parameters studied might not be significant on a long-term basis, they definitely point towards the benefits of preoperative patient education.
Keywords: patient education, post-operative pain, postoperative outcomes, patient satisfaction
Procedia PDF Downloads 339
641 Thickness-Tunable Optical, Magnetic, and Dielectric Response of Lithium Ferrite Thin Film Synthesized by Pulsed Laser Deposition
Authors: Prajna Paramita Mohapatra, Pamu Dobbidi
Abstract:
Lithium ferrite (LiFe5O8) has potential applications as a component of microwave magnetic devices such as circulators and monolithic integrated circuits. For efficient device applications, spinel ferrites in the form of thin films are highly desirable, and it is necessary to improve their magnetic and dielectric behavior by optimizing the processing parameters during deposition. The lithium ferrite thin films are deposited on a Pt/Si substrate using the pulsed laser deposition (PLD) technique. As the film thickness is the easiest parameter to vary in order to tailor the strain, we deposited thin films of different thicknesses (160 nm, 200 nm, 240 nm) at an oxygen partial pressure of 0.001 mbar. The formation of a single phase with spinel structure (space group P4₁32) is confirmed by the XRD pattern and Rietveld analysis. The optical bandgap decreases with the increase in thickness. FESEM confirmed the formation of uniform grains having well-separated grain boundaries. Further, the film growth and the roughness are analyzed by AFM. The root-mean-square (RMS) surface roughness decreases from 13.52 nm (160 nm) to 9.34 nm (240 nm). The room-temperature magnetization is measured with a maximum field of 10 kOe. The saturation magnetization is enhanced monotonically with the increase in thickness. The magnetic resonance linewidth is obtained in the range of 450 – 780 Oe. The dielectric response is measured in the frequency range of 10⁴ – 10⁶ Hz and in the temperature range of 303 – 473 K. With an increase in frequency, the dielectric constant and the loss tangent of all the samples decrease continuously, which is typical behavior of a conventional dielectric material. The real part of the dielectric constant and the dielectric loss increase with an increase in thickness. The contribution of grains and grain boundaries is also analyzed by employing an equivalent circuit model.
The highest dielectric constant is obtained for the film with a thickness of 240 nm at 10⁴ Hz. The obtained results demonstrate that the desired response can be obtained by tailoring the film thickness for microwave magnetic devices.
Keywords: PLD, optical response, thin films, magnetic response, dielectric response
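The RMS roughness values quoted from the AFM scans follow the standard definition: the root mean square of the height deviations about the mean plane. A minimal sketch with illustrative height data (not the paper's measurements):

```python
import math

def rms_roughness(heights):
    """RMS surface roughness: sqrt(mean((z - z_mean)^2)) over AFM height samples."""
    n = len(heights)
    z_mean = sum(heights) / n
    return math.sqrt(sum((z - z_mean) ** 2 for z in heights) / n)

# Illustrative height profile in nm (hypothetical values)
print(rms_roughness([12.0, 8.0, 10.0, 14.0, 6.0]))
```

A real AFM instrument computes the same quantity over the full two-dimensional height map rather than a single line profile.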
Procedia PDF Downloads 98
640 A Hybrid Block Multistep Method for Direct Numerical Integration of Fourth Order Initial Value Problems
Authors: Adamu S. Salawu, Ibrahim O. Isah
Abstract:
Direct solutions to several forms of fourth-order ordinary differential equations are not easily obtained without first reducing them to a system of first-order equations. Thus, numerical methods are being developed with the underlying techniques in the literature, which seek to approximate some classes of fourth-order initial value problems with admissible error bounds. Multistep methods present the great advantage of ease of implementation, but with the setback of several function evaluations at every stage of implementation. However, hybrid methods conventionally show a slightly higher order of truncation than any k-step linear multistep method, with the possibility of obtaining solutions at off-mesh points within the interval of solution. In the light of the foregoing, we propose the continuous form of a hybrid multistep method with the Chebyshev polynomial as a basis function for the numerical integration of fourth-order initial value problems of ordinary differential equations. The basis function is interpolated and collocated at some points on the interval [0, 2] to yield a system of equations, which is solved to obtain the unknowns of the approximating polynomial. The continuous form obtained and its first and second derivatives are evaluated at carefully chosen points to obtain the proposed block method needed to directly approximate fourth-order initial value problems. The method is analyzed for convergence. Implementation of the method is done by conducting numerical experiments on some test problems. The outcome of the implementation suggests that the method performs well on problems with oscillatory or trigonometric terms, since the approximations at several points on the solution domain did not deviate too far from the theoretical solutions.
The method also shows better performance compared with an existing hybrid method when implemented on a larger interval of solution.
Keywords: Chebyshev polynomial, collocation, hybrid multistep method, initial value problems, interpolation
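The basis construction above rests on Chebyshev polynomials of the first kind, shifted from their natural domain [-1, 1] onto the solution interval [0, 2]. A minimal sketch of that basis evaluation (the collocation system itself is not reproduced here):

```python
def chebyshev_T(n, x):
    """Chebyshev polynomial of the first kind T_n(x), evaluated via the
    three-term recurrence T_{n+1} = 2x*T_n - T_{n-1}, T_0 = 1, T_1 = x."""
    if n == 0:
        return 1.0
    t_prev, t_curr = 1.0, x
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
    return t_curr

def shifted_T(n, s):
    """Chebyshev basis member on [0, 2]: substitute x = s - 1."""
    return chebyshev_T(n, s - 1.0)
```

For example, shifted_T(2, 2.0) evaluates T_2 at the right endpoint of the interval; interpolation and collocation conditions on such basis members yield the linear system mentioned in the abstract.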
Procedia PDF Downloads 122
639 Fem Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli
Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha
Abstract:
Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This total database served to create the same number of ensembles as the number of segments in the tested beam. Statistics of these ensembles were then assigned to given segments of beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. Here, a detailed geometrical arrangement of the individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from a significant experimental error, it appears advantageous to exploit the full-scale measurements of timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin’s beam theory was established. The improved posterior distributions show that the most significant change of the Young’s modulus distribution takes place in laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
Keywords: Bayesian inference, FEM, four-point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young’s modulus
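The Bayesian step can be illustrated with the simplest conjugate case: a Gaussian prior on a modulus updated by measurements with Gaussian noise of known variance. The paper's actual update runs through the Mindlin-beam FEM response, so this stripped-down sketch (with hypothetical numbers) only shows the mechanics of prior-to-posterior tightening:

```python
def normal_update(mu_prior, var_prior, measurements, var_meas):
    """Conjugate normal-normal Bayesian update with known measurement
    variance: posterior precision = prior precision + n / var_meas."""
    n = len(measurements)
    mean_obs = sum(measurements) / n
    var_post = 1.0 / (1.0 / var_prior + n / var_meas)
    mu_post = var_post * (mu_prior / var_prior + n * mean_obs / var_meas)
    return mu_post, var_post

# Hypothetical prior on a lamina's Young's modulus (GPa) and noisy data
mu, var = normal_update(11.0, 4.0, [12.1, 12.5, 11.9], 1.0)
print(mu, var)  # posterior pulled toward the data, variance reduced
```

The posterior variance is always smaller than the prior variance, which mirrors the narrowing of the Young's modulus distributions the abstract reports in the most strained laminae.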
Procedia PDF Downloads 283
638 Heritage Preservation and Cultural Tourism; The 'Pueblos Mágicos' Program and Its Role in Preserving Traditional Architecture in Mexico
Authors: Claudia Rodríguez Espinosa, Erika Elizabeth Pérez Múzquiz
Abstract:
The Pueblos Mágicos federal program tries to preserve the traditional environment of small towns (under 20,000 inhabitants) through economic investments, legislation, and legal aid. To access the program, eight requirements must be met; one of them is the fourth, which considers ‘Promotion of symbolic and differentiated touristic attractions, such as architecture, emblematic buildings, festivities and traditions, artisan production, traditional cuisine, and touristic services that guarantee their commercialization along with assistantship and security services’. With this objective in mind, the Federal government of Mexico has developed local programs to protect emblematic public buildings in each of the 83 towns included in the Pueblos Mágicos program, involving federal and local administrations as well as local civil associations, like Adopte una Obra de Arte. In this paper, we present 3 different intervention cases. The first is the restoration project (now concluded) of the 16th century monastery of Santa María Magdalena in Cuitzeo, an enormous building which took 6 years to restore completely. The second case, the public spaces intervention in Pátzcuaro, included the Plaza Grande or Vasco de Quiroga square, and the access to the arts and crafts house known as Casa de los once patios or eleven backyards house. The third case is the recovery project of the 16th century atrium of the Tzintzuntzan monastery, which included the original olive trees brought by Franciscan monks to this town in the middle 1500’s. This paper tries to present successful preservation projects at 3 different scales: building, urban spaces and landscape, in 3 different towns, with the objective of preserving public architecture, public spaces and cultural traditions, and of learning from these experiences different ways to manage preservation projects focused on public architecture and public spaces.
Keywords: cultural tourism, heritage preservation, traditional architecture, public policies
Procedia PDF Downloads 289
637 An Assessment of Impact of Financial Statement Fraud on Profit Performance of Manufacturing Firms in Nigeria: A Study of Food and Beverage Firms in Nigeria
Authors: Wale Agbaje
Abstract:
The aim of this research study is to assess the impact of financial statement fraud on the profitability of some selected Nigerian manufacturing firms covering 2002-2016. The specific objectives were to ascertain the effect of incorrect asset valuation on return on assets (ROA) and to ascertain the relationship between improper expense recognition and return on assets (ROA). To achieve these objectives, a descriptive research design was used for the study, while secondary data were collected from the financial reports of the selected firms and the website of the securities and exchange commission. The analysis of covariance (ANCOVA) was used, and the STATA II econometric method was used in the analysis of the data. The Altman model and the operating expenses ratio were adopted in the analysis of the financial reports to create a dummy variable for the selected firms from 2002-2016, and validation of the parameters was ascertained using various statistical techniques such as the t-test, coefficient of determination (R2), F-statistics and Wald chi-square. Two hypotheses were formulated and tested using the t-statistics at a 5% level of significance. The findings of the analysis revealed that there is a significant relationship between financial statement fraud and profitability in the Nigerian manufacturing industry. It was revealed that incorrect asset valuation has a significant positive relationship with return on assets (ROA), which serves as a proxy for profitability, and so does improper expense recognition. The implication of this is that distortion of asset valuation and expense recognition leads to decreasing profit in the long run in the manufacturing industry.
The study therefore recommended that pragmatic policy options need to be taken in the manufacturing industry to effectively manage incorrect asset valuation and improper expense recognition in order to enhance manufacturing industry performance in the country, and that measures to stem financial statement fraud should be adequately built into the internal control systems of manufacturing firms for the effective running of the manufacturing industry in Nigeria.
Keywords: Altman's model, improper expense recognition, incorrect asset valuation, return on assets
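The dummy variable described above is built from the Altman model. A minimal sketch of the classic Altman (1968) Z-score for publicly traded manufacturing firms, using the standard published coefficients and zone cut-offs; the abstract does not state which Altman variant was applied, so this particular form is an assumption:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Z = 1.2*X1 + 1.4*X2 + 3.3*X3 + 0.6*X4 + 1.0*X5 (Altman, 1968),
    where the X's are the ratios of the first five inputs to total
    assets (X1, X2, X3, X5) and total liabilities (X4)."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Conventional cut-offs: distress below 1.81, safe above 2.99."""
    return "distress" if z < 1.81 else "grey" if z < 2.99 else "safe"
```

A firm scoring in the distress zone in a given year could then be flagged with a dummy value of 1, which is one plausible way the abstract's dummy variable could be constructed.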
Procedia PDF Downloads 161
636 An Intelligent Controller Augmented with Variable Zero Lag Compensation for Antilock Braking System
Authors: Benjamin Chijioke Agwah, Paulinus Chinaenye Eze
Abstract:
Antilock braking system (ABS) is one of the important contributions of the automobile industry, designed to ensure road safety in such a way that vehicles are kept steerable and stable during emergency braking. This paper presents a wheel slip-based intelligent controller with variable zero lag compensation for ABS. It is required to achieve very fast, accurate wheel slip tracking during hard braking conditions and eliminate chattering, with improved transient and steady state performance, while shortening the stopping distance using an effective braking torque less than the maximum allowable torque to bring a braking vehicle to a stop. The dynamics of a vehicle braking with a braking velocity of 30 ms⁻¹ on a straight line were determined and modelled in the MATLAB/Simulink environment to represent a conventional ABS system without a controller. Simulation results indicated that the system without a controller was not able to track the desired wheel slip, and the stopping distance was 135.2 m. Hence, an intelligent control based on a fuzzy logic controller (FLC) was designed with a variable zero lag compensator (VZLC) added to enhance the performance of the FLC control variable by eliminating steady state error and providing improved bandwidth to eliminate the effect of high frequency noise such as chattering during braking. The simulation results showed that FLC-VZLC provided fast tracking of desired wheel slip, eliminated chattering, and reduced the stopping distance by 70.5% (39.92 m), 63.3% (49.59 m), 57.6% (57.35 m) and 50% (69.13 m) on dry, wet, cobblestone and snow road surface conditions respectively. Generally, the proposed system used an effective braking torque that is less than the maximum allowable braking torque to achieve efficient wheel slip tracking and overall robust control performance on different road surfaces.
Keywords: ABS, fuzzy logic controller, variable zero lag compensator, wheel slip tracking
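The controlled variable here is longitudinal wheel slip. Assuming the standard braking-slip definition λ = (v − ωR)/v (the abstract does not spell out its slip formula), a minimal sketch:

```python
def wheel_slip(vehicle_speed, wheel_angular_speed, wheel_radius):
    """Longitudinal braking slip: lambda = (v - omega*R) / v.
    lambda = 0 is free rolling; lambda = 1 is a locked wheel."""
    if vehicle_speed <= 0.0:
        return 0.0  # vehicle stopped: slip undefined, report zero
    return (vehicle_speed - wheel_angular_speed * wheel_radius) / vehicle_speed

# Braking from 30 m/s as in the abstract; omega and R are illustrative
print(wheel_slip(30.0, 63.0, 0.33))
print(wheel_slip(30.0, 0.0, 0.33))  # locked wheel
```

An ABS controller such as the FLC-VZLC described above regulates braking torque so that this λ tracks a reference slip near the friction peak of the current road surface.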
Procedia PDF Downloads 147
635 Hindrances to Effective Delivery of Infrastructural Development Projects in Nigeria’s Built Environment
Authors: Salisu Gidado Dalibi, Sadiq Gumi Abubakar, JingChun Feng
Abstract:
Nigeria’s population is about 190 million and increasing annually, making it the seventh most populated nation in the world and the first in Africa. This population growth comes with its prospects, needs, and challenges, especially for existing and future infrastructure. Infrastructure refers to structures, systems, and facilities serving the economy of a country, city, town, businesses, industries, etc. These include roads, railway lines, bridges, tunnels, ports, stadiums, dams and water projects, power generation plants and distribution grids, information and communication technology (ICT), etc. The Nigerian government embarked on several infrastructural development projects (IDPs) to address the deficit, as the present infrastructure can neither cater to the country's needs nor sustain it. However, delivering such IDPs has not been smooth; it comes with challenges from within and outside the projects, and with frequent delays and abandonment, thus affecting all the stakeholders involved. Hence, the aim of this paper is to identify and assess the factors that are hindering the effective delivery of IDPs in Nigeria’s built environment, with the view to offering more insight into such factors and ways to address them. The methodology adopted in this study involves the use of secondary sources of data from several materials (official publications, journals, newspapers, the internet, etc.), which were reviewed within the IDP field with emphasis on Nigerian cases. The hindrance factors in this regard were identified, and these form the backbone of the questionnaire. A pilot survey was used to test its suitability, after which it was randomly administered to various project professionals in Nigeria’s construction industry using a 5-point Likert scale format to ascertain the impact of these hindrances. Cronbach’s alpha reliability test, mean item score computations, relative importance indices, the T-test and Chi-square statistics were used for data analyses.
The results outline the impact of various internal, external and project-related factors that are hindering IDPs within Nigeria’s built environment.
Keywords: built environment, development, factors, hindrances, infrastructure, Nigeria, project
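Among the statistics listed above, the relative importance index has the standard form RII = ΣW / (A × N) for a Likert item. A minimal sketch, with illustrative respondent ratings:

```python
def relative_importance_index(ratings, highest_scale=5):
    """RII = sum(W) / (A * N): W are the Likert ratings given by
    respondents, A the highest value of the scale (5 here), and N the
    number of respondents. Values lie in (0, 1]; a higher RII means the
    factor is ranked as a stronger hindrance."""
    return sum(ratings) / (highest_scale * len(ratings))

# Hypothetical ratings for one hindrance factor from five respondents
print(relative_importance_index([5, 4, 3, 4, 4]))
```

Ranking all candidate factors by their RII is the usual way such survey studies order hindrances from most to least impactful.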
Procedia PDF Downloads 177
634 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults
Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura
Abstract:
The diagnosis of sigmatism is mostly based on the observation of articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular in the oral cavity of the patient. Speech processing can help objectify the therapy and simplify the verification of its progress. In the described study, a methodology for the classification of the incorrectly pronounced phoneme [s] is proposed. The recordings come from adults. They were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bits. A database of pathological and normative speech was collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz: ASA, ESE, ISI, OSO, USU, YSY. Thirteen MFCC (mel-frequency cepstral coefficient) and RMS (root mean square) values are calculated within each frame being a part of the analyzed phoneme. Additionally, 3 fricative formants along with corresponding amplitudes are determined for the entire segment. In order to aggregate the information within the segment, the average value of each MFCC coefficient is calculated. All features of other types are aggregated by means of their 75th percentile. The proposed method of feature aggregation reduces the size of the feature vector used in the classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage. The first group consists of pathological phones, while the other of the normative ones. The proposed feature vector yields classification sensitivity and specificity measures above the 90% level in the case of individual logo phones. The employment of fricative formants-based information improves the sole-MFCC classification results by an average of 5 percentage points.
The study shows that the employment of specific parameters for the selected phones improves the efficiency of pathology detection compared with traditional methods of speech signal parameterization.
Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing
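The aggregation step described above — the mean of each MFCC coefficient, the 75th percentile of the remaining per-frame features — can be sketched as follows. Linear interpolation between closest ranks is an assumption; the paper does not state which percentile method was used:

```python
def percentile(values, q):
    """q-th percentile with linear interpolation between closest ranks."""
    s = sorted(values)
    pos = (len(s) - 1) * q / 100.0
    lo = int(pos)
    if lo + 1 >= len(s):
        return s[-1]
    return s[lo] + (pos - lo) * (s[lo + 1] - s[lo])

def aggregate_segment(mfcc_frames, other_frames):
    """One feature vector per phoneme segment: per-coefficient MFCC means,
    then the 75th percentile of each remaining per-frame feature
    (RMS, fricative formant frequencies and amplitudes)."""
    n = len(mfcc_frames)
    mfcc_means = [sum(f[i] for f in mfcc_frames) / n
                  for i in range(len(mfcc_frames[0]))]
    other_p75 = [percentile(list(col), 75) for col in zip(*other_frames)]
    return mfcc_means + other_p75
```

Collapsing each per-frame feature to a single statistic is what keeps the SVM input fixed-length regardless of how many frames the phoneme segment spans.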
Procedia PDF Downloads 283
633 Hedgerow Detection and Characterization Using Very High Spatial Resolution SAR DATA
Authors: Saeid Gharechelou, Stuart Green, Fiona Cawkwell
Abstract:
Hedgerows have an important role for a wide range of ecological habitats, landscape, agriculture management, carbon sequestration, and wood production. Accurate hedgerow detection using satellite imagery is a challenging problem in remote sensing, because spatially a hedge is very similar to a linear object such as a road, and from a spectral viewpoint it is very similar to a forest. Remote sensors with very high spatial resolution (VHR) have recently enabled the automatic detection of hedges by acquiring images with sufficient spectral and spatial resolution. Indeed, VHR remote sensing data have provided the opportunity to detect hedgerows as line features, but difficulties remain in monitoring their characterization at the landscape scale. This research uses TerraSAR-X Spotlight and Staring mode data with 3-5 m resolution, acquired in the wet and dry seasons of 2014-2015, over the Fermoy test site in Ireland to detect hedgerows. Dual-polarization (HH/VV) Spotlight data are used for hedgerow detection. Various SAR image-processing techniques are tested by trial and error, integrating classification algorithms such as texture analysis, support vector machines, k-means, and random forests, to detect hedgerows and characterize them. Shannon entropy (ShE) and single- and double-bounce backscattering analysis are applied in the polarimetric analysis to support object-oriented classification and finally extract the hedgerow network. This work is still in progress, and other methods will be applied to find the best approach for the study area; the preliminary results presented here indicate that polarimetric TSX imagery can potentially detect hedgerows.
Keywords: TerraSAR-X, hedgerow detection, high resolution SAR image, dual polarization, polarimetric analysis
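Shannon entropy in SAR polarimetry is commonly computed from a normalized distribution, e.g. the normalized eigenvalues of the coherency matrix. A generic sketch of the entropy formula itself; the exact dual-pol TerraSAR-X formulation used in this work is not specified in the abstract, so this is only the underlying definition:

```python
import math

def shannon_entropy(probabilities):
    """H = -sum(p * log2 p) over a normalized discrete distribution,
    e.g. the normalized eigenvalues of a polarimetric coherency matrix.
    Terms with p = 0 contribute nothing to the sum."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0.0)
```

Intuitively, a depolarizing target such as vegetation canopy (eigenvalues nearly equal) gives high entropy, while a dominant single- or double-bounce scatterer gives entropy near zero, which is why ShE helps separate hedgerows from bare surfaces.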
Procedia PDF Downloads 230
632 Improving Fingerprinting-Based Localization System Using Generative AI
Authors: Getaneh Berie Tarekegn
Abstract:
A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. These applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight conditions, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. It also employs a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of the site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, SRCLoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
Procedia PDF Downloads 59
631 The Communicational Behaviors of the Nurses Towards 'Crying Patient'
Authors: Hacer Kobya Bulut, Kıymet Yeşilçiçek Çalık, Birsel Canan Demirbağ, Hacer Erdöl, Songül Aktaş
Abstract:
Introduction: As an expression of an emotion which always exists in life, crying is regarded by nurses as one of the problematic behaviors of patients. Towards such patients, nurses may exhibit emotional and behavioral reactions such as helplessness, anger, indifference, defensiveness, and opposition. However, crying either meets a need, reduces tension to help cope with problems, or helps the patient gain strength. Therefore, nurses must accept that crying is a normal mechanism that reduces emotional tension and should approach a crying patient accordingly. Objective: This study was carried out to evaluate the communicational behaviors of nurses towards the ‘crying patient’. Methods: This descriptive study was conducted with the nurses working at a university hospital in a city in the Eastern Black Sea region in June-September 2015. The entire population was targeted without sampling; 90% of the population was reached, and the study was completed with 309 nurses who volunteered to participate. Data were collected through a questionnaire prepared by the researchers after reviewing the literature. Data were evaluated in the SPSS analysis program using percentages, numbers and the chi-square test, with a 95% confidence interval and p < 0.05 significance level. Findings: The findings showed that the average age of the nurses was 31.52 ± 7.96, their work experience was 10.09 ± 7.69 years, and only 22.7% had training about the ‘approach to the crying patient’ during their education. 97.1% of the nurses often faced crying patients in their professional lives, and 62.8% stated that the crying patients they faced were women. When they see crying patients, 84.8% of the nurses ‘do not want the patient to cry’, 80.9% wonder ‘why they are crying’, 79.6% ‘feel uneasiness’, 79.3% ‘feel sorry’ and 41.4% ‘feel helpless’.
The question 'Why do you think the patient is crying?' was answered by 93.5% of nurses as 'they are suffering', by 86.1% as 'they are helpless', 80.9% 'they are sad', 79.6% 'they need help', 54.4% 'because they feel inadequate', and 44.7% 'they fail to control their crying behavior'. The question 'How do you approach your patient when she/he is crying?' was answered by 82.5% of nurses as 'I would console them', 77.3% as 'I would ask the reason', and 63.1% as 'I would try to stop them from crying', all of which are actually inappropriate nursing approaches. However, 92.2% of the nurses stated 'I do not judge the crying patient', 87.1% said 'I allocate time to crying patients', and 85.8% said 'I ask patients whether they want to cry alone'. The study showed that the educational background and work experience of the nurses affected whether they approached crying patients appropriately (p < 0.05). Conclusion: The study found that nurses do not want patients to cry, that they exhibit inappropriate approaches such as consoling the patients, and that they have difficulty approaching crying patients.
Keywords: approach to patient, communication, crying patient, nurse, Turkey
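The study's statistical approach (a chi-square test of association between nurses' background and their approach to crying patients, run in SPSS at the 5% level) can be sketched in a few lines. The 2x2 counts below are hypothetical, for illustration only, not the study's data:

```python
# Hedged sketch of the chi-square test of independence the abstract describes.
# Rows: trained / not trained in "approach to crying patient";
# columns: appropriate / inappropriate approach. Counts are made up.
from scipy.stats import chi2_contingency

observed = [[50, 20],     # trained:     appropriate, inappropriate
            [120, 119]]   # not trained: appropriate, inappropriate

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("Association is significant at the 5% level")
```

For a 2x2 table, `chi2_contingency` applies Yates' continuity correction by default; the `expected` array holds the counts expected under independence.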
Procedia PDF Downloads 205
630 Experimental Study on Different Load Operation and Rapid Load-change Characteristics of Pulverized Coal Combustion with Self-preheating Technology
Authors: Hongliang Ding, Ziqu Ouyang
Abstract:
Under the basic national condition that the energy structure is dominated by coal, it is of great significance to realize deep and flexible peak shaving of boilers in pulverized coal power plants and to maximize the consumption of renewable energy in the power grid, in order to ensure China's energy security and scientifically achieve the goals of carbon peaking and carbon neutrality. Using the promising self-preheating combustion technology, which has the potential for broad-load regulation and rapid response to load changes, this study investigated the different-load operation and rapid load-change characteristics of pulverized coal combustion. Four effective load-stabilization bases were proposed according to preheating temperature, coal gas composition (calorific value), combustion temperature (spatial mean temperature and mean square temperature fluctuation coefficient), and flue gas emissions (CO and NOx concentrations); on this basis, load-change rates were calculated to assess the load-response characteristics. Owing to the improvement of the physicochemical properties of pulverized coal after preheating, stable ignition and combustion could be obtained even at a low load of 25%, with a combustion efficiency above 97.5%; NOx emission reached its lowest at 50% load, with a concentration of 50.97 mg/Nm3 (@6% O2). Additionally, the load ramp-up stage displayed higher load-change rates than the load ramp-down stage, with maximum rates of 3.30 %/min and 3.01 %/min, respectively. Furthermore, the driving force formed by a large load step was conducive to an increase in the load-change rate. The rates based on the preheating indicator attained the highest value of 3.30 %/min, while the rates based on the combustion indicator peaked at 2.71 %/min.
In comparison, the combustion indicator accurately described the system's combustion state and load changes, whereas the preheating indicator was easier to acquire and gave a higher load-change rate; hence the appropriate evaluation strategy should depend on the actual situation. This study verified a feasible method for deep and flexible peak shaving of coal-fired power units, further providing basic data and technical support for future engineering applications.
Keywords: clean coal combustion, load-change rate, peak shaving, self-preheating
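The two quantities the abstract builds its comparison on, the load-change rate (%/min between two stable operating points) and a mean-square temperature fluctuation coefficient, can be sketched as below. The exact definitions and the sample values are illustrative assumptions, not the authors' formulas:

```python
# Hedged sketch of a load-change rate and a normalized root-mean-square
# temperature fluctuation coefficient; definitions are assumptions, not the
# paper's exact ones.
from statistics import mean


def load_change_rate(load_start: float, load_end: float, minutes: float) -> float:
    """Average rate of load change in %/min (positive for a ramp-up)."""
    return (load_end - load_start) / minutes


def temp_fluctuation_coeff(temps: list[float]) -> float:
    """Root of the mean square deviation of temperatures, normalized by the mean."""
    t_mean = mean(temps)
    mean_sq = mean((t - t_mean) ** 2 for t in temps)
    return mean_sq ** 0.5 / t_mean


# Example: ramping from 50% to 75% load in about 7.6 minutes corresponds
# to the paper's maximum ramp-up rate of 3.30 %/min.
rate = load_change_rate(50, 75, 25 / 3.30)
print(f"load-change rate: {rate:.2f} %/min")
```

A small fluctuation coefficient indicates a spatially uniform furnace temperature, which is one way a "stable" operating point could be identified before timing a load change.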
Procedia PDF Downloads 68