Search results for: improved sparrow search algorithm
6665 Multilocal Youth and the Berlin Digital Industry: Productive Leisure as a Key Factor in European Migration
Authors: Stefano Pelaggi
Abstract:
The research focuses on youth labor and mobility in Berlin. Mobility has become a common denominator in our daily lives, but it is not driven primarily by monetary incentives. Labor, knowledge and leisure overlap on this point, as cities try to attract people who can contribute to the production of innovations while the new migrants experience the lifestyle of the host city. The research presents an empirical study of Italian workers in the digital industry in Berlin, seeking to underline the connection between pleasure and leisure and the choice of a life abroad. Berlin has become the epicenter of the European Internet start-up scene, yet people suited to work in the digital industries do not move to Berlin primarily to build a career; most are attracted to the city for other reasons. This is a clear exception to traditional migration flows, which have always originated either in a specific search for employment opportunities or in strong ties, usually family, in a place that could guarantee success in finding a job. Even skilled migration has always originated in a specific need: finding the right path to a successful professional life. In a society where a lack of free time in one's calendar seems to be something to be ashamed of, the actors of youth mobility incorporate categories of experiential tourism into their own life paths. The professional aspirations and lifestyle choices of the protagonists of youth mobility are geared towards meeting the desires and aspirations that define leisure. While most creative workplaces, in particular the digital industries, use fun as a primary element of corporate policy, virtually extending working time across the whole day, more and more people around the world are making life and career choices on the basis of indicators linked to self-realization, which may include factors such as a warm climate or a cultural environment. 
These are all indicators that are usually excluded from the hegemonic approach to labor. The interpretative framework commonly used seems mostly focused on a dualism between Florida's theories and those who highlight the absence of conflict in his studies. While the flexibility of the new creative industries is minimizing leisure, incorporating elements of leisure itself into work activities, more people choose their own path in life by placing great importance on basic needs, through a view of pleasure that is only partially driven by consumption. Multilocalism is the co-existence of different identities and cultures that do not conflict because they reject the bind to territory. The local loses its strength of opposition to the global, with an attenuation of the whole concept of citizenship, territory and even integration. A similar perspective could be useful in searching for a new approach in studies dedicated to the gentrification process, as well as in studying the new migration flows.
Keywords: brain drain, digital industry, leisure and gentrification, multi localism
Procedia PDF Downloads 243
6664 Service-Oriented Enterprise Architecture (SoEA) Adoption and Maturity Measurement Model: A Systematic Review
Authors: Nur Azaliah Abu Bakar, Harihodin Selamat, Mohd Nazri Kama
Abstract:
This article provides a systematic review of existing research related to the Service-oriented Enterprise Architecture (SoEA) adoption and maturity measurement model. The review’s main goals are to support research, to facilitate other researchers' search for relevant studies and to propose areas for future studies within this area. In addition, this article provides useful information on SoEA adoption issues and its related maturity model, based on research-based knowledge. The review results suggest that motives, critical success factors (CSFs), implementation status and benefits are the most frequently studied areas and that each of these areas would benefit from further exposure.
Keywords: systematic literature review, service-oriented architecture, adoption, maturity model
Procedia PDF Downloads 324
6663 Phelipanche Ramosa (L. - Pomel) Control in Field Tomato Crop
Authors: G. Disciglio, F. Lops, A. Carlucci, G. Gatta, A. Tarantino, L. Frabboni, F. Carriero, F. Cibelli, M. L. Raimondo, E. Tarantino
Abstract:
Tomato is an important crop whose cultivation in the Mediterranean basin is severely constrained by the phytoparasitic weed Phelipanche ramosa. The semiarid regions of the world are considered the main center of this parasitic weed, where heavy infestation is due to its ability to produce high numbers of seeds (up to 500,000 per plant) that remain viable for extended periods (more than 19 years). In this paper, 12 treatments for parasitic weed control, including chemical, agronomic, biological and biotechnological methods, were carried out. In 2014, a trial was performed at Foggia (southern Italy) on processing tomato (cv Docet) grown in a field infested by Phelipanche ramosa. Tomato seedlings were transplanted on May 5, 2014 on a clay-loam soil (USDA) fertilized with 100 kg ha-1 of N, 60 kg ha-1 of P2O5 and 20 kg ha-1 of S. Afterwards, top dressing was performed with 70 kg ha-1 of N. A randomized block design with 3 replicates was adopted. During the growing cycle of the tomato, at 56, 78 and 92 days after transplantation, the number of parasitic shoots emerged in each plot was recorded. At harvest, on August 18, the major quantitative and qualitative yield parameters were determined (marketable yield, mean fruit weight, dry matter, pH, soluble solids and fruit color). All data were subjected to analysis of variance (ANOVA) using the JMP software (SAS Institute Inc., Cary, NC, USA), and Tukey's test was used for comparison of means. None of the treatments studied provided complete control of Phelipanche ramosa. However, among the 12 tested methods, Fusarium, glyphosate, the Radicon biostimulant and the Red Setter tomato cv (an improved genotype obtained by TILLING technology) proved to mitigate the virulence of Phelipanche ramosa attacks. 
It is assumed that these effects can be improved by combining some of these treatments with each other, especially for a gradual and continuing reduction of the “seed bank” of the parasite in the soil.
Keywords: control methods, Phelipanche ramosa, tomato crop, Mediterranean basin
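As a sketch of the statistical step described in the abstract, the one-way ANOVA F-statistic can be computed directly; the JMP workflow mentioned performs this calculation before applying Tukey's test. The yield figures below are hypothetical placeholders, not the trial's data:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of treatment groups."""
    k = len(groups)                                    # number of treatments
    n = sum(len(g) for g in groups)                    # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-treatment sum of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-treatment (residual) sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Hypothetical marketable yields (t/ha) for three of the twelve treatments
yields = [[52.1, 55.3, 50.8], [61.0, 63.2, 59.7], [45.5, 47.9, 44.1]]
f_stat = one_way_anova_f(yields)
```

A large F relative to the F(k-1, n-k) critical value indicates that at least one treatment mean differs; Tukey's test then locates the differing pairs.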
Procedia PDF Downloads 563
6662 A Bibliometric Analysis of Research on E-learning in Physics Education: Trends, Patterns, and Future Directions
Authors: Siti Nurjanah, Supahar
Abstract:
E-learning has become an increasingly popular mode of instruction, particularly in the field of physics education, where it offers opportunities for interactive and engaging learning experiences. This research aims to analyze the trends in research investigating e-learning in physics education. Data were extracted from the Scopus database using the keywords "physics" and "e-learning". For the 380 articles obtained based on the search criteria, a trend analysis was carried out with the help of RStudio, using the biblioshiny package, and the VOSviewer software. The analysis showed that publications on this topic increased significantly from 2014 to 2021. Publication was dominated by researchers from the United States. The main journal publishing articles on this topic is the Proceedings of the Frontiers in Education Conference (FIE). The most widely cited articles generally focus on the effectiveness of Moodle for physics learning. Overall, this research provides an in-depth understanding of the trends and key findings of research related to e-learning in physics.
Keywords: bibliometric analysis, physics education, biblioshiny, E-learning
Procedia PDF Downloads 41
6661 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations
Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li
Abstract:
The research follows a systematic approach to optimize the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since the surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as a test specimen for the research. The material chosen for machining is aluminum alloy 6061 due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling and a HAAS ST-20 CNC machine is used for turning in this research. Taguchi analysis is used to optimize the surface roughness of the machined parts. An L9 orthogonal array is designed for four controllable factors with three different levels each, resulting in nine runs per process and 18 experimental runs in total. The signal-to-noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow and finish cut, and for the milling process feed rate, spindle speed, step over and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the different uncontrollable factors on the surface roughness. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and the process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10 respectively for the turning process, and from 0.87 and 0.19 to 3.85 and 2.70 respectively for the milling process. The surface roughness was brought closer to the target, from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0% for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0% for the milling process. 
The purpose of this study is to efficiently utilize the Taguchi design analysis to improve the surface roughness.
Keywords: surface roughness, Taguchi parameter design, CNC turning, CNC milling
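The S/N ratio and capability indices reported above can be reproduced with textbook formulas. The sketch below uses the nominal-the-best S/N ratio and the standard Cp/Cpk definitions against the 75 ± 15 µin specification; the roughness readings are hypothetical, not the study's measurements:

```python
import math

def sn_nominal_the_best(values):
    """Taguchi nominal-the-best S/N ratio: 10*log10(mean^2 / variance), in dB."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)

def process_capability(values, lsl, usl):
    """Return (Cp, Cpk) for a sample against lower/upper specification limits."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    cp = (usl - lsl) / (6 * sd)
    cpk = min(usl - mean, mean - lsl) / (3 * sd)
    return cp, cpk

# Hypothetical roughness readings (µin) against the 75 ± 15 µin specification
readings = [72.0, 74.5, 76.0, 73.8, 75.2]
sn = sn_nominal_the_best(readings)
cp, cpk = process_capability(readings, lsl=60.0, usl=90.0)
```

Cpk is never larger than Cp; the two coincide only when the process mean sits exactly on the specification midpoint.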
Procedia PDF Downloads 155
6660 Effect of Baffles on the Cooling of Electronic Components
Authors: O. Bendermel, C. Seladji, M. Khaouani
Abstract:
In this work, we performed a numerical study of the thermal and dynamic behaviour of air in a horizontal channel with electronic components. The influence of baffles on the velocity and temperature profiles is discussed. The finite volume method and the SIMPLE algorithm are used for solving the equations of conservation of mass, momentum and energy. The results show that baffles improve heat transfer between the cooling air and the electronic components. The velocity increases by up to three times relative to the initial velocity.
Keywords: electronic components, baffles, cooling, fluids engineering
Procedia PDF Downloads 297
6659 Numerical Investigation on the Effect of Aluminium Nanoparticles on Characteristic Velocity of Kerosene-Oxygen Combustion
Authors: Al Ameen H., Rakesh P.
Abstract:
To improve the combustion efficiency of fuels, to reduce pollutant emissions and to improve the heat transfer characteristics of fuels, both non-metallic and metallic nanoparticles can be added to them. By varying the concentration and size of the nanoparticles added to the fuel, the droplet combustion behaviour, and hence the heat generated, can be altered. In the case of solid or liquid fuels, the surface area of the fuel in contact with the gaseous oxidizer is small because of their higher density compared to gases. If the surface area of fuel exposed to the oxidizer is very small, combustion will not occur, because the combustion rate is proportional to the surface area of the fuel droplet. To avoid this, the specific surface area available for reaction can be increased: for solid additives by reducing the particle size, and for liquid fuels by suspending nanoparticles. The addition of non-metallic and metallic nanoparticles to fuels improves combustion efficiency by enhancing the thermo-physical properties. The burn rate constants and temperatures of kerosene-oxygen combustion for fuel droplet sizes of 50 μm, 75 μm, 100 μm and 125 μm under varying concentrations of 25%, 50%, 75% and 100% are studied numerically, and the characteristic velocities are determined. Aluminium nanoparticles at concentrations of 0.5%, 1.0% and 2.0% by weight are then added, and the burn rate constants are determined again. The spray combustion characteristics of such a nano-fuel show that the addition of aluminium nanoparticles improves the combustion temperature. Thus, aluminium nanoparticles improve the burn rate and characteristic velocity of kerosene-oxygen combustion. 
An increase of 40% in characteristic velocity is observed.
Keywords: burn rate, characteristic velocity, combustion, thermo-physical properties
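The characteristic velocity quoted above follows from the standard rocket-combustion relation c* = p_c · A_t / ṁ (chamber pressure times throat area over mass flow rate). A minimal sketch, with hypothetical chamber values rather than the study's operating point, is:

```python
def characteristic_velocity(p_chamber, a_throat, m_dot):
    """c* (m/s) = chamber pressure (Pa) * throat area (m^2) / mass flow rate (kg/s)."""
    return p_chamber * a_throat / m_dot

# Hypothetical operating point: 2 MPa chamber, 10 cm^2 throat, 1.2 kg/s flow
c_star_base = characteristic_velocity(2.0e6, 1.0e-3, 1.2)
# The 40% rise reported for the aluminium nano-fuel would then give:
c_star_nano = 1.4 * c_star_base
```

Since c* depends only on combustion-chamber conditions, it isolates propellant combustion performance from nozzle effects, which is why it serves as the comparison metric here.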
Procedia PDF Downloads 94
6658 Tinder, Image Merchandise and Desire: The Configuration of Social Ties in Today's Neoliberalism
Authors: Daniel Alvarado Valencia
Abstract:
Nowadays, the market offers us solutions for everything, creating the idea of the immediate availability of anything we could desire, and the Internet is the means through which to obtain all this. This paper proposes that this logic puts subjects in a situation of self-exploitation and treats the psyche as a productive force by configuring affection and desire from a neoliberal value perspective. It uses Tinder, starting from ethnographic data on Mexico City users, as an example of this. Tinder is an application created to get dates, have sexual encounters and find a partner. It works through the creation and management of a digital profile. It is an example of how futuristic and lonely the current era can be, since we have become used to interacting with other people through screens and images. At the same time, however, it provides solutions to loneliness, since technology transgresses, invades and alters social practices in different ways. Tinder fits into this contemporary context; it is a concrete example of the processes of technification in which social bonds develop through certain devices offered by neoliberalism, through consumption, and where the search for love and courtship is made possible through images and their consumption.
Keywords: desire, image, merchandise, neoliberalism
Procedia PDF Downloads 121
6657 The Impact of Social Support on Anxiety and Depression under the Context of COVID-19 Pandemic: A Scoping Review and Meta-Analysis
Authors: Meng Wu, Atif Rahman, Eng Gee Lim, Jeong Jin Yu, Rong Yan
Abstract:
Context: The COVID-19 pandemic has had a profound impact on mental health, with increased rates of anxiety and depression observed. Social support, a critical factor in mental well-being, has also undergone significant changes during the pandemic. This study aims to explore the relationship between social support, anxiety, and depression during COVID-19, taking into account various demographic and contextual factors. Research Aim: The main objective of this study is to conduct a comprehensive systematic review and meta-analysis to examine the impact of social support on anxiety and depression during the COVID-19 pandemic. The study aims to determine the consistency of these relationships across different age groups, occupations, regions, and research paradigms. Methodology: A scoping review and meta-analytic approach were employed in this study. A search was conducted across six databases from 2020 to 2022 to identify relevant studies. The selected studies were then subjected to random effects models, with pooled correlations (r and ρ) estimated. Homogeneity was assessed using Q and I² tests. Subgroup analyses were conducted to explore variations across different demographic and contextual factors. Findings: The meta-analysis of both cross-sectional and longitudinal studies revealed significant correlations between social support, anxiety, and depression during COVID-19. The pooled correlations (ρ) indicated a negative relationship between social support and anxiety (ρ = -0.30, 95% CI = [-0.333, -0.255]) as well as depression (ρ = -0.27, 95% CI = [-0.370, -0.281]). However, further investigation is required to validate these results across different age groups, occupations, and regions. Theoretical Importance: This study emphasizes the multifaceted role of social support in mental health during the COVID-19 pandemic. It highlights the need to reevaluate and expand our understanding of social support's impact on anxiety and depression. 
The findings contribute to the existing literature by shedding light on the associations and complexities involved in these relationships. Data Collection and Analysis Procedures: The data collection involved an extensive search across six databases to identify relevant studies. The selected studies were then subjected to rigorous analysis using random effects models and subgroup analyses. Pooled correlations were estimated, and homogeneity was assessed using Q and I² tests. Question Addressed: This study aimed to address the question of the impact of social support on anxiety and depression during the COVID-19 pandemic. It sought to determine the consistency of these relationships across different demographic and contextual factors. Conclusion: The findings of this study highlight the significant association between social support, anxiety, and depression during the COVID-19 pandemic. However, further research is needed to validate these findings across different age groups, occupations, and regions. The study emphasizes the need for a comprehensive understanding of social support's multifaceted role in mental health and the importance of considering various contextual and demographic factors in future investigations.
Keywords: social support, anxiety, depression, COVID-19, meta-analysis
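The random-effects pooling step described in the methodology can be sketched with the standard Fisher-z transformation and DerSimonian-Laird estimator. The study correlations and sample sizes below are hypothetical placeholders, not the review's data:

```python
import math

def fisher_z(r):
    """Fisher z-transform of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def pool_correlations(rs, ns):
    """DerSimonian-Laird random-effects pooling of correlations via Fisher z.

    Returns (pooled_r, (ci_low, ci_high)) with a 95% confidence interval.
    """
    zs = [fisher_z(r) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]           # within-study variance of z
    ws = [1.0 / v for v in vs]
    z_fixed = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - z_fixed) ** 2 for w, z in zip(ws, zs))  # heterogeneity Q
    c = sum(ws) - sum(w ** 2 for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)   # between-study variance
    ws_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    se = math.sqrt(1.0 / sum(ws_re))
    lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
    return math.tanh(z_re), (math.tanh(lo), math.tanh(hi))  # back-transform

# Hypothetical study correlations between social support and anxiety
r_pool, ci = pool_correlations([-0.25, -0.35, -0.30], [200, 150, 300])
```

When the between-study variance τ² is zero, the random-effects estimate reduces to the fixed-effect inverse-variance average, which is why the two models agree for homogeneous studies.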
Procedia PDF Downloads 62
6656 Ascidian Styela rustica Proteins’ Structural Domains Predicted to Participate in the Tunic Formation
Authors: M. I. Tyletc, O. I. Podgornya, T. G. Shaposhnikova, S. V. Shabelnikov, A. G. Mittenberg, M. A. Daugavet
Abstract:
Ascidiacea is the most numerous class of the subphylum Tunicata. A distinctive feature of the anatomical structure of these chordates is the tunic, consisting of cellulose fibrils, protein molecules and single cells. The mechanisms of tunic formation are not known in detail; tunic formation could be used as a model system for studying the interaction of cells with the extracellular matrix. Our model species is the ascidian Styela rustica, which is prevalent in benthic communities of the White Sea. As previously shown, tunic formation involves morula blood cells, which contain the major 48 kDa protein p48. P48 participation in tunic formation was proved using antibodies against the protein. The nature of the protein and its function remain unknown. The current research aims to determine the amino acid sequence of p48 and to clarify its role in tunic formation. The peptides that make up the p48 amino acid sequence were determined by mass spectrometry. A search for the peptides in protein sequence databases identified sequences homologous to p48 in Styela clava, Styela plicata and Styela canopus. Based on sequence alignment, their level of similarity was determined as 81-87%. The corresponding sequence of the ascidian Styela canopus was used for further analysis. The Styela rustica p48 sequence begins with a signal peptide, which could indicate that the protein is secretory. This is consistent with experimentally obtained data: morula cells secrete their contents into the tunic matrix. The isoelectric point of p48 is 9.77, which is consistent with the experimental results of acid electrophoresis of morula cell proteins. However, the molecular weight of the amino acid sequence of the ascidian Styela canopus homolog is 103 kDa, so p48 of Styela rustica is a shorter homolog. The search for conserved functional domains revealed the presence of two Ca-binding EGF-like domains, a thrombospondin (TSP1) domain and a tyrosinase domain. 
The p48 peptides determined by mass spectrometry fall into the region of the sequence corresponding to the last two domains and show amino acid substitutions compared to the Styela canopus homolog. The tyrosinase domain (pfam00264) is known to be part of the phenoloxidase enzyme, which participates in melanization processes and the immune response. The thrombospondin domain (smart00209) interacts with a wide range of proteins and is involved in several biological processes, including coagulation, cell adhesion, modulation of intercellular and cell-matrix interactions, angiogenesis, wound healing and tissue remodeling. It can be assumed that the tyrosinase domain in p48 plays the role of the phenoloxidase enzyme, while TSP1 provides a link between the extracellular matrix and cell surface receptors and may also be responsible for repair of the tunic. The results obtained are consistent with the experimental data on p48. The domain organization of the protein suggests that p48 is an enzyme involved in tunic tanning and is an important regulator of the organization of the extracellular matrix.
Keywords: ascidian, p48, thrombospondin, tyrosinase, tunic, tanning
Procedia PDF Downloads 115
6655 Performance Assessment of Carrier Aggregation-Based Indoor Mobile Networks
Authors: Viktor R. Stoynov, Zlatka V. Valkova-Jarvis
Abstract:
The intelligent management and optimisation of radio resource technologies will lead to a considerable improvement in the overall performance in Next Generation Networks (NGNs). Carrier Aggregation (CA) technology, also known as Spectrum Aggregation, enables more efficient use of the available spectrum by combining multiple Component Carriers (CCs) in a virtual wideband channel. LTE-A (Long Term Evolution–Advanced) CA technology can combine multiple adjacent or separate CCs in the same band or in different bands. In this way, increased data rates and dynamic load balancing can be achieved, resulting in a more reliable and efficient operation of mobile networks and the enabling of high bandwidth mobile services. In this paper, several distinct CA deployment strategies for the utilisation of spectrum bands are compared in indoor-outdoor scenarios, simulated via the recently-developed Realistic Indoor Environment Generator (RIEG). We analyse the performance of the User Equipment (UE) by integrating the average throughput, the level of fairness of radio resource allocation, and other parameters, into one summative assessment termed a Comparative Factor (CF). In addition, comparison of non-CA and CA indoor mobile networks is carried out under different load conditions: varying numbers and positions of UEs. The experimental results demonstrate that the CA technology can improve network performance, especially in the case of indoor scenarios. Additionally, we show that an increase of carrier frequency does not necessarily lead to improved CF values, due to high wall-penetration losses. The performance of users under bad-channel conditions, often located in the periphery of the cells, can be improved by intelligent CA location. 
Furthermore, a combination of such a deployment and effective radio resource allocation management with respect to user-fairness plays a crucial role in improving the performance of LTE-A networks.
Keywords: comparative factor, carrier aggregation, indoor mobile network, resource allocation
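The rate gain from combining component carriers can be illustrated with a toy Shannon-capacity model: the aggregate rate is the sum of the per-CC capacities B·log2(1+SNR). The bandwidths and SNRs below are hypothetical, and real LTE-A rates additionally depend on scheduling, modulation and coding, and protocol overhead:

```python
import math

def aggregated_throughput(carriers):
    """Toy aggregate rate (bit/s) over component carriers given (bandwidth_hz, linear_snr) pairs."""
    return sum(bw * math.log2(1 + snr) for bw, snr in carriers)

# A single 20 MHz CC at linear SNR 3, vs. aggregation with a second 10 MHz CC at SNR 1
single = aggregated_throughput([(20e6, 3.0)])
aggregated = aggregated_throughput([(20e6, 3.0), (10e6, 1.0)])
```

Because each CC may sit in a different band with different penetration loss, a per-CC SNR term also captures why indoor deployments favour intelligent CA placement over simply raising the carrier frequency.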
Procedia PDF Downloads 178
6654 Migratory Diaspora: The Media and the Human Element
Authors: Peter R. Alfieri
Abstract:
The principal aim of this research and presentation is to give a global and personal perspective on the migratory diaspora and on how it is perceived by the substantial majority that relies on the media’s portrayal of migratory movements. Since its Greek origins, the word “diaspora” has taken on several connotations, but none has surpassed its use in regard to the human element, because since before the dawn of history man has had to struggle for survival. That survival was a struggle against the elements and other natural enemies, but none as tenacious and relentless as other men. Many have used the term diaspora to describe the spread of certain ethnic groups resulting in new generations in new places; but has the human diaspora been as haphazard as that of spores? The quest for survival has spawned migrations that are not quite that simple, even though they have several similarities to plant spores or dandelion seeds flying through the atmosphere. Mankind has constantly migrated in search of food, shelter and safety. When people were able to find food and shelter, they would inform others, who would venture to the new place. Information, whether through word of mouth, written material or visual communications, has been a moving force in man’s life, and it has spurred migrants in their quest for better environments. Today we pride ourselves on being able to communicate instantly with anyone anywhere in the world, and we are privileged to see most of what is happening in the world thanks to highly developed modern media. Is the media a “wind/force” instrumental in propelling the diaspora throughout the world? The media has been a tool that has motivated many migrations, but unfortunately it is also responsible for many misconceptions regarding migrants and their hosts. Has the media presented an unbiased view of the migrant, or has it generated negative or prejudiced views of the migrant and, perhaps, the host environment? 
Some examples were easily seen in the 19th-century United States, where advertisements read, “Help needed, Irish need not apply”. How do immigrants circumvent latent barriers that are not as obvious as the ones just mentioned? Some immigrants return home and have children who decide to emigrate. It is a perpetual cycle in the search for self-improvement. The stories that are brought back may be an inspiration for the new generation of emigrants. Poverty, hunger and political turmoil spur most migrations. The majority learn from others or through the media about destinations that will provide one or several opportunities to improve their existence. Many of those migrants suffer untold hardships to succeed. When they succeed, they provide a great incentive for their children to obtain an education or skill that will ensure them a better life. Although the new environment may contribute greatly to a successful career, most immigrants do not forget their own struggle. They see the media’s portrayal of other migrants from all over the globe. Some try to communicate to others the true feelings of despair felt by immigrants, because they are all brothers and sisters in the perennial struggle for a better life. “HOPE” for a better life drives the immigrant toward the unknown, and it has helped overcome the obstacles that challenge every newcomer. Hope and perseverance strengthen the resolve of the migrant in the struggle to survive.
Keywords: media, migration, heath, education, obstacles
Procedia PDF Downloads 384
6653 Ecological Ice Hockey Butterfly Motion Assessment Using Inertial Measurement Unit Capture System
Authors: Y. Zhang, J. Perez, S. Marnier
Abstract:
To date, to the authors' best knowledge, no study of the goaltending butterfly motion has been completed in real conditions, during an ice hockey game or training practice. This motion, performed to stop shots, is unnatural, intense and repeated. The target of this research activity is to identify representative biomechanical criteria for this goaltender-specific movement pattern. Determining specific physical parameters may make it possible to identify the risk of hip and groin injuries sustained by goaltenders. Four professional or academic goalies were instrumented during ice hockey training practices with five inertial measurement units. These devices were inserted in dedicated pockets located on each thigh and shank, with the fifth on the lumbar spine. A camera was also installed close to the ice to observe and record the goaltenders' activities, especially the butterfly motions, in order to synchronize the captured data with the behavior of the goaltender. Each recording began with a calibration of the inertial units and a calibration of the fully equipped goaltender on the ice. Three butterfly motions were recorded outside the training practice to define reference individual butterfly motions. Then, a data processing algorithm based on the Madgwick filter computed the hip and knee joint ranges of motion as well as the specific angular velocities. The developed software automatically identified and analyzed all the butterfly motions executed by the four goaltenders. To date, it is still too early to show that the analyzed criteria are representative of the trauma generated by the butterfly motion, as the research is only at its beginning. However, this descriptive research activity is promising in its ecological assessment, and once the criteria are found, the tools and protocols defined will allow the prevention of as many injuries as possible. 
It will thus be possible to build a specific training program for each goalie.
Keywords: biomechanics, butterfly motion, human motion analysis, ice hockey, inertial measurement unit
Procedia PDF Downloads 125
6652 Optimal Design of Wind Turbine Blades Equipped with Flaps
Authors: I. Kade Wiratama
Abstract:
As a result of the significant growth of wind turbines in size, blade load control has become the main challenge for large wind turbines. Many advanced techniques have been investigated aiming at developing control devices to ease blade loading. Amongst them, trailing edge flaps have been proven as effective devices for load alleviation. The present study aims at investigating the potential benefits of flaps in enhancing the energy capture capabilities rather than blade load alleviation. A software tool is especially developed for the aerodynamic simulation of wind turbines utilising blades equipped with flaps. As part of the aerodynamic simulation of these wind turbines, the control system must also be simulated. The simulation of the control system is carried out by solving an optimisation problem which gives the best value for the controlling parameter at each wind turbine operating condition. By developing a genetic algorithm optimisation tool especially designed for wind turbine blades and integrating it with the aerodynamic performance evaluator, a design optimisation tool for blades equipped with flaps is constructed. The design optimisation tool is employed to carry out design case studies. The results of design case studies on the wind turbine AWT 27 reveal that, as expected, the location of the flap is a key parameter influencing the amount of improvement in the power extraction. The best location for placing a flap is at about 70% of the blade span from the root of the blade. The size of the flap also has a significant effect on the amount of enhancement in the average power. This effect, however, reduces dramatically as the size increases. For constant speed rotors, adding flaps without re-designing the topology of the blade can improve the power extraction capability by as much as about 5%. 
However, by re-designing the blade pretwist, the overall improvement can reach as high as 12%.
Keywords: flaps, design blade, optimisation, simulation, genetic algorithm, WTAero
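The optimisation loop described above can be sketched as a minimal real-coded genetic algorithm over flap location and size. The fitness function below is a made-up surrogate that peaks near 70% span, mirroring the reported best flap location; it stands in for the paper's aerodynamic performance evaluator, which is not reproduced here:

```python
import math
import random

random.seed(42)  # reproducible runs

def power_gain(location, size):
    """Toy surrogate for power improvement: peaks near 70% span, penalises large flaps."""
    return math.exp(-((location - 0.70) / 0.15) ** 2) * size * math.exp(-4.0 * size)

def genetic_search(fitness, pop_size=30, generations=60, sigma=0.05):
    """Minimal GA: truncation selection, blend crossover, Gaussian mutation."""
    pop = [(random.uniform(0.2, 0.95), random.uniform(0.05, 0.5))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[: pop_size // 2]                       # keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            loc = (a[0] + b[0]) / 2 + random.gauss(0, sigma)   # blend + mutate
            size = (a[1] + b[1]) / 2 + random.gauss(0, sigma)
            children.append((min(max(loc, 0.2), 0.95),         # clamp to bounds
                             min(max(size, 0.05), 0.5)))
        pop = parents + children
    return max(pop, key=lambda ind: fitness(*ind))

best_location, best_size = genetic_search(power_gain)
```

With this surrogate, the search settles near the analytic optimum of the toy fitness (location around 0.70); in the actual tool, each fitness evaluation would instead invoke the aerodynamic simulation.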
Procedia PDF Downloads 337
6651 Medical/Surgical Skills Day Improves Nurse Competence and Satisfaction
Authors: Betsy Hannam
Abstract:
Background: Staff nurses felt overwhelmed trying to learn new skills or complete competencies during their shifts. Med/Surg units need to provide dedicated, uninterrupted time to complete training and mandatory competencies and to practice skills. Purpose: To improve nurse satisfaction and competence by creating a Skills Day with uninterrupted time to complete competencies, brush up on skills, and evaluate skills learned through pre- and post-tests. Methods: The USL and CNL interviewed nurses to obtain input regarding skills needing reinforcement and included mandatory competencies relevant to Med/Surg to create the Skills Day agenda. Content experts from multiple disciplines were invited to educate staff to help address knowledge gaps. To increase attendance, multiple class days were offered. Results: The 2018 Skills Day was held for an inpatient unit with 95% participation (n=35 out of 37 RNs). The average pre-test score, comprised of content questions from the topics discussed, was 57%, and post-test scores averaged 80%. 94% of test scores improved or remained the same. RNs were given an evaluation at the end of the day, in which 100% of staff rated Skills Day as beneficial and 97% requested to repeat it next year. Another Med/Surg unit asked to join Skills Day in 2019. In 2019, with 89% participation (n=57 out of 64 RNs), the average pre-test score was 68% and the average post-test score was 85%. 97% of scores improved or remained the same. 98% reported the class as beneficial, and 96% requested to repeat it next year. Skills Day 2020-2022 was on hold due to COVID. We look forward to Skills Day 2023. Conclusion: Skills Day allows nurses to maintain competencies and improve knowledge in areas of interest without the stress of a patient assignment. 
Having unit leaders organize Skills Day, with the involvement of content experts from multiple disciplines, proved to be a successful and innovative team approach to supporting professional development.
Keywords: education, competency, skills day, medical/surgical
Procedia PDF Downloads 100
6650 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images
Authors: Amit Kumar Happy
Abstract:
This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visual image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, some image fusion algorithms based on multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approaches, we observed several challenges with popular image fusion methods. While the high computational cost and complex processing steps of image fusion algorithms yield accurate fused results, they also make these algorithms hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper therefore offer good results with minimum time complexity.
Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform
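The coefficient fusion rule mentioned above can be illustrated with a minimal sketch of the widely used choose-max-absolute selection step applied to one transform level. The function name and example values are hypothetical; this is not the paper's full MST pipeline, and the consistency-verification step is omitted.

```python
def fuse_coefficients(a, b):
    """Fuse two same-size coefficient maps with the choose-max-absolute rule.

    a, b: 2D lists of transform coefficients (e.g. one decomposition level
    each from the visible and the infrared image). For every position the
    coefficient with the larger magnitude is kept, since large coefficients
    tend to carry salient features such as edges.
    """
    return [[x if abs(x) >= abs(y) else y
             for x, y in zip(row_a, row_b)]
            for row_a, row_b in zip(a, b)]

# toy 2x2 coefficient maps (hypothetical values)
vis = [[0.2, -0.9], [0.1, 0.4]]
ir = [[-0.5, 0.3], [0.0, -0.4]]
fused = fuse_coefficients(vis, ir)
```

In a real MST pipeline this rule would be applied per decomposition level before the inverse transform reconstructs the fused image.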
Procedia PDF Downloads 115
6649 Enhanced Disk-Based Databases towards Improved Hybrid in-Memory Systems
Authors: Samuel Kaspi, Sitalakshmi Venkatraman
Abstract:
In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks and distributed transaction costs, disk-based data stores still serve as the primary persistence layer. In addition, with the recent growth in multi-tenancy cloud applications and associated security concerns, many organisations consider the trade-offs and continue to require fast and reliable transaction processing of disk-based database systems as an available choice. For these organisations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems, which would help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance, which we call enhanced memory access (EMA), can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes a novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems.
The promising results of this work show that enhanced disk-based systems can help improve hybrid data management within the broader context of in-memory systems.
Keywords: in-memory database, disk-based system, hybrid database, concurrency control
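The abstract does not describe EMA's mechanics in detail. Purely as a loose illustration of the underlying idea, loading a transaction's read set into memory before execution so the transaction itself never blocks on disk I/O, here is a toy prefetcher; all class and variable names are hypothetical.

```python
class ToyPrefetcher:
    """Toy sketch of prefetching a transaction's read set before execution."""

    def __init__(self, disk):
        self.disk = disk    # page_id -> data; stands in for disk storage
        self.cache = {}     # in-memory buffer pool

    def prefetch(self, read_set):
        # Bring every page the transaction will touch into memory up front.
        for page in read_set:
            if page not in self.cache:
                self.cache[page] = self.disk[page]

    def execute(self, read_set):
        # With the read set resident, execution is a pure in-memory scan.
        self.prefetch(read_set)
        return [self.cache[p] for p in read_set]

store = ToyPrefetcher({"p1": "alice", "p2": "bob", "p3": "carol"})
rows = store.execute(["p1", "p3"])
```

A real implementation would additionally coordinate with the concurrency control mechanism so that prefetched pages remain valid when the transaction runs.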
Procedia PDF Downloads 417
6648 Procedure to Optimize the Performance of Chemical Laser Using the Genetic Algorithm Optimizations
Authors: Mohammedi Ferhate
Abstract:
This work presents details of a study of the entire flow inside the facility in which the exothermic chemical reaction process in the chemical laser cavity is analyzed. We describe the principles of chemical lasers, in which population inversion is produced by chemical reactions, and explain the device that converts chemical potential energy into laser energy. We observe that the phenomenon has an explosive character. Finally, the feasibility and effectiveness of the proposed method are demonstrated by computer simulation.
Keywords: genetic, lasers, nozzle, programming
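The paper does not publish its genetic algorithm, so as a generic illustration of GA-based optimisation of a scalar performance objective, here is a minimal real-coded GA with tournament selection, blend crossover and Gaussian mutation. The objective function, operators and parameters are placeholders, not the authors' implementation.

```python
import random

def toy_ga(fitness, bounds, pop_size=20, generations=40, seed=1):
    """Minimal real-coded GA maximizing `fitness` over a 1-D interval."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # tournament selection: best of 3 random individuals
            return max(rng.sample(pop, 3), key=fitness)
        children = []
        for _ in range(pop_size):
            # blend crossover of two parents plus Gaussian mutation
            child = (pick() + pick()) / 2 + rng.gauss(0, (hi - lo) * 0.02)
            children.append(min(hi, max(lo, child)))  # clamp to bounds
        pop = children
    return max(pop, key=fitness)

# stand-in objective with its optimum at x = 3 (a real run would evaluate
# a laser-performance model from the flow simulation instead)
best = toy_ga(lambda x: -(x - 3.0) ** 2, (0.0, 10.0))
```

The key design choice in such a setup is the fitness function; here it is a trivial quadratic stand-in for the (expensive) cavity-flow evaluation.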
Procedia PDF Downloads 94
6647 Impact of Combined Heat and Power (CHP) Generation Technology on Distribution Network Development
Authors: Sreto Boljevic
Abstract:
In the absence of considerable investment in electricity generation, transmission and distribution network (DN) capacity, the demand for electrical energy will quickly strain the capacity of the existing electrical power network. The anticipated growth and proliferation of electric vehicles (EVs) and heat pumps (HPs) make it likely that the additional load from EV charging and HP operation will require capital investment in the DN. While an area-wide implementation of EVs and HPs will contribute to the decarbonization of the energy system, they represent new challenges for the existing low-voltage (LV) network. Distributed energy resources (DER), operating both as part of the DN and in off-network mode, have been offered as a means to meet growing electricity demand while maintaining and ever-improving DN reliability, resiliency and power quality. DN planning has traditionally been done by forecasting future growth in demand and estimating the peak load that the network should meet. However, new problems are arising. These problems are associated with a high proliferation of EVs and HPs as loads imposed on the DN, in addition to the promotion of electricity generation from renewable energy sources (RES). High distributed generation (DG) penetration and a large increase in load at low-voltage DNs may have numerous impacts that create issues including energy losses, voltage control, fault levels, reliability, resiliency and power quality. To mitigate the negative impacts and at the same time enhance the positive impacts of this new operational state of the DN, CHP system integration can be seen as the best action to postpone or reduce the capital investment needed to facilitate and maximize the benefits of EV, HP and RES integration in the low-voltage DN. The aim of this paper is to generate an algorithm by using an analytical approach.
Algorithm implementation will provide a way to optimally place the CHP system in the DN in order to maximize the integration of RES and the proliferation of EVs and HPs.
Keywords: combined heat and power (CHP), distribution networks, EVs, HPs, RES
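The analytical placement algorithm itself is not given in the abstract. As a loose sketch of the general idea, searching for the CHP connection point that minimizes network losses, the following toy example places a CHP unit on a radial feeder using an exhaustive search over a deliberately simplified loss proxy (sum of squared segment flows); all names and the loss model are illustrative assumptions.

```python
def feeder_losses(loads, chp_bus, chp_output):
    """Loss proxy for a radial feeder: sum of squared power flows per segment.

    Buses are indexed from the substation outward; a CHP unit at `chp_bus`
    supplies `chp_output` of the local demand, reducing upstream flows.
    """
    net = list(loads)
    net[chp_bus] -= chp_output
    losses = 0.0
    flow = 0.0
    # accumulate flow from the far end of the feeder back toward the substation
    for demand in reversed(net):
        flow += demand
        losses += flow ** 2
    return losses

def best_chp_bus(loads, chp_output):
    """Exhaustive search for the placement minimizing the loss proxy."""
    return min(range(len(loads)),
               key=lambda b: feeder_losses(loads, b, chp_output))

# uniform toy feeder: the far-end bus relieves the most line segments
best_bus = best_chp_bus([1.0, 1.0, 1.0, 1.0], chp_output=1.0)
```

A realistic study would replace the quadratic proxy with a proper power-flow calculation and add voltage and fault-level constraints.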
Procedia PDF Downloads 202
6646 Tailoring the Parameters of the Quantum MDS Codes Constructed from Constacyclic Codes
Authors: Jaskarn Singh Bhullar, Divya Taneja, Manish Gupta, Rajesh Kumar Narula
Abstract:
The existence conditions of dual-containing constacyclic codes have opened a new path for finding quantum maximum distance separable (MDS) codes. Using these conditions, the parameters of quantum MDS codes of length n=(q²+1)/2 were improved. A class of quantum MDS codes of length n=(q²+q+1)/h, where h>1 is an odd prime, has also been constructed; these codes have large minimum distance and are new in the sense that they are not available in the literature.
Keywords: Hermitian construction, constacyclic codes, cyclotomic cosets, quantum MDS codes, Singleton bound
Procedia PDF Downloads 388
6645 Cooperative Agents to Prevent and Mitigate Distributed Denial of Service Attacks of Internet of Things Devices in Transportation Systems
Authors: Borhan Marzougui
Abstract:
Road and Transport Authority (RTA) is moving ahead with the implementation of the leader's vision in exploring all avenues that may bring better security and safety services to the community. Smart transport means using smart technologies such as IoT (Internet of Things). This technology continues to affirm its important role in the context of information and transportation systems. In fact, IoT is a network of Internet-connected objects able to collect and exchange different data using embedded sensors. With the growth of IoT, Distributed Denial of Service (DDoS) attacks are also growing exponentially. DDoS attacks are a major and real threat to various transportation services. Currently, the defense mechanisms are mainly passive in nature, and there is a need to develop a smart technique to handle them. In fact, new IoT devices are being accumulated into botnets for DDoS attack purposes. The aim of this paper is to provide a relevant understanding of the dangerous types of DDoS attack related to IoT and to provide valuable guidance for future IoT security methods. Our methodology is based on the development of a distributed algorithm. This algorithm employs dedicated intelligent and cooperative agents to prevent and mitigate DDoS attacks. The proposed technique ensures a preventive action when malicious packets start to be distributed through the connected nodes (the network of IoT devices). In addition, devices such as cameras and radio frequency identification (RFID) readers are connected within the secured network, and the data they generate are analyzed in real time by intelligent and cooperative agents. The proposed security system is based on a multi-agent system. The obtained results have shown a significant reduction in the number of infected devices and enhanced the capabilities of the different security devices.
Keywords: IoT, DDoS, attacks, botnet, security, agents
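The abstract does not specify the agents' detection logic. As a hedged, minimal sketch of the cooperative-agent idea only, the following toy agents flag sources whose packet rate exceeds a threshold, and a quorum vote across agents decides what to block; all names, thresholds and the rate-based rule are illustrative assumptions, not the authors' algorithm.

```python
from collections import Counter

class GuardAgent:
    """Toy agent: counts packets per source and flags chatty sources."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.counts = Counter()

    def observe(self, src):
        self.counts[src] += 1

    def suspects(self):
        return {s for s, n in self.counts.items() if n > self.threshold}

def consensus(agents, quorum):
    """Block a source only if at least `quorum` agents flag it.

    Voting across vantage points reduces false positives from any
    single agent's limited view of the traffic.
    """
    votes = Counter()
    for agent in agents:
        for src in agent.suspects():
            votes[src] += 1
    return {s for s, v in votes.items() if v >= quorum}

# toy traffic: one chatty source ("bot-17") seen by both agents
a1, a2 = GuardAgent(threshold=3), GuardAgent(threshold=3)
for _ in range(5):
    a1.observe("bot-17")
a1.observe("camera-2")
for _ in range(4):
    a2.observe("bot-17")
blocked = consensus([a1, a2], quorum=2)
```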
Procedia PDF Downloads 143
6644 An Investigation Enhancing E-Voting Application Performance
Authors: Aditya Verma
Abstract:
E-voting using blockchain provides us with a distributed system where data are present on each node in the network and are reliable and secure due to the immutability property of the blockchain. This work compares various blockchain consensus algorithms used for e-voting applications in the past, based on performance and node scalability, chooses the optimal one, and improves on a previous implementation by proposing solutions for the loopholes of that consensus algorithm in our chosen application, e-voting.
Keywords: blockchain, parallel BFT, consensus algorithms, performance
Procedia PDF Downloads 167
6643 Impact Location From Instrumented Mouthguard Kinematic Data In Rugby
Authors: Jazim Sohail, Filipe Teixeira-Dias
Abstract:
Mild traumatic brain injury (mTBI) within non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted looking into head kinematics in non-helmeted contact sports utilizing instrumented mouthguards that allow researchers to record accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, and its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler’s and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test with the dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III Dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it will require preprocessing of live data, which is currently being done by cross-referencing data timestamps to video footage. The machine learning technique focuses on eliminating the preprocessing aspect by establishing trends within time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals established from mouthguards are converted to the frequency domain before using a clustering algorithm to cluster together similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins. 
The same Hybrid III Dummy finite element model is used to create impacts that closely replicate on-field impacts in order to create synthetic time-series datasets consisting of impacts in varying locations. These time-series datasets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish the accurate impact location of signals that have already been labeled as true impacts and filtered out of the entire time series. The machine learning technique, however, provides a method that can be implemented on long time-series signal data, though it localizes impacts only within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by sensors, saving additional time for data scientists working with instrumented mouthguard kinematic data, as validating true impacts against video footage would not be required.
Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI
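The frequency-domain clustering step described above can be sketched in miniature: convert a kinematic window to a magnitude spectrum, then assign it to the nearest predetermined location bin. The naive DFT, the idealized centroids and the bin labels below are illustrative assumptions, not the study's actual pipeline.

```python
import cmath

def dft_mag(signal):
    """Magnitude spectrum of a short kinematic window (naive DFT)."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(signal)))
            for k in range(n)]

def assign_bin(feature, centroids):
    """Nearest-centroid step of a k-means-style assignment to location bins."""
    def sq_dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(feature, centroids[label]))

# two idealized "location bin" centroids in the frequency domain (hypothetical):
# a low-frequency-dominated bin vs. a high-frequency-dominated bin
bins = {"crown": [4.0, 0.0, 0.0, 0.0], "jaw": [0.0, 0.0, 4.0, 0.0]}
label = assign_bin(dft_mag([1.0, 1.0, 1.0, 1.0]), bins)
```

In practice the centroids would be learned from the synthetic Hybrid III impact datasets rather than fixed by hand.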
Procedia PDF Downloads 217
6642 Secondary Metabolites from Turkish Marine-Derived Fungi Hypocrea nigricans
Authors: H. Heydari, B. Konuklugil, P. Proksch
Abstract:
Marine-derived fungi can produce interesting bioactive secondary metabolites that can be considered potential leads for drug development. Turkey is a peninsular country surrounded by the Black Sea to the north, the Aegean Sea to the west, and the Mediterranean Sea to the south. Despite approximately 8400 km of coastline, studies on marine secondary metabolites and their biological activity are limited. In our ongoing search for new natural products with different bioactivities produced by marine-derived fungi, we have investigated the secondary metabolites of a Turkish collection of the fungus Hypocrea nigricans, associated with the marine sea slug Peltodoris atromaculata and collected from Seferihisar in the Aegean Sea. To the authors' best knowledge, no study of the secondary metabolites of this fungal species has been published. Isolated from the ethyl acetate extract of the culture of Hypocrea nigricans were isodihydroauroglaucin, tetrahydroauroglaucin and dihydroauroglaucin. The structures of the compounds were established based on NMR and MS analyses. Structural elucidation of further isolated secondary metabolites is ongoing.
Keywords: Hypocrea nigricans, isolation, marine fungi, secondary metabolites
Procedia PDF Downloads 162
6641 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals
Authors: Christine F. Boos, Fernando M. Azevedo
Abstract:
Electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma and brain death; locating damaged areas of the brain after head injury, stroke and tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science and whose diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long-term EEG recordings at least 24 hours long and acquired by a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that the EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for the pattern classification.
One of the differences between all of these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced into the network. Five types of input stimuli have been commonly found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks that were implemented using each of these inputs. The performance when using the raw signal varied between 43 and 84% efficiency. The results of the FFT spectrum and STFT spectrograms were quite similar, with average efficiencies of 73 and 77%, respectively. The efficiency of Wavelet Transform features varied between 57 and 81%, while the descriptors presented efficiency values between 62 and 93%. After the simulations, we observed that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing
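Since morphological descriptors were among the best-performing input stimuli, the idea can be sketched with a few simple morphology features of an EEG window. The study does not list its exact descriptor set, so the features below (peak-to-peak amplitude, steepest slope, mean level) are illustrative placeholders only.

```python
def morphological_descriptors(window, fs):
    """A few simple morphology features of an EEG window.

    window: list of samples; fs: sampling rate in Hz. These are generic
    stand-ins for whatever descriptor set a given study actually used.
    """
    peak, trough = max(window), min(window)
    amplitude = peak - trough
    # steepest sample-to-sample slope, converted to units per second
    max_slope = max(abs(b - a) for a, b in zip(window, window[1:])) * fs
    mean = sum(window) / len(window)
    return {"amplitude": amplitude, "max_slope": max_slope, "mean": mean}

feats = morphological_descriptors([0.0, 2.0, -1.0, 0.0], fs=100)
```

Descriptor vectors like this one would then be the input layer of the classifier network, in place of the raw 10-second signal.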
Procedia PDF Downloads 528
6640 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion
Authors: Ali Kazemi
Abstract:
In volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning strategies have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study offers a groundbreaking method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by sizable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that correctly reflects market intricacies. We collect daily opening, closing, high and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate critical macroeconomic indicators, including interest rates, inflation rates, GDP growth, and unemployment rates, into our model. Our GCN algorithm is adept at learning the relational patterns among the financial instruments represented as nodes in a comprehensive market graph.
Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling our model to grasp the complex network of influences governing market moves. Complementing this, our LSTM algorithm is trained on sequences of the spatial-temporal representation learned by the GCN, enriched with historical price and volume records. This lets the LSTM capture and predict temporal market developments accurately. In a comprehensive assessment of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting day-by-day price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's predictive performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that predict the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to revolutionize investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting
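The core GCN propagation step that such a model builds on can be sketched as a single layer, H' = ReLU(Â · H · W), where Â is the normalized adjacency matrix, H the node feature matrix and W a learned weight matrix. The tiny two-node example below is a pure-Python illustration of that one step, not the authors' model; the adjacency, features and weights are made-up values.

```python
def matmul(a, b):
    """Plain-Python matrix multiply for small dense matrices."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def gcn_layer(adj_norm, features, weights):
    """One graph-convolution step: H' = ReLU(A_hat . H . W).

    Each node's new feature vector is a weighted mix of its neighbours'
    features (via adj_norm), linearly transformed (via weights) and
    passed through a ReLU nonlinearity.
    """
    h = matmul(matmul(adj_norm, features), weights)
    return [[max(0.0, v) for v in row] for row in h]

# two fully-connected "assets" with one feature each (hypothetical values)
h_out = gcn_layer([[0.5, 0.5], [0.5, 0.5]],  # normalized adjacency
                  [[1.0], [3.0]],            # node features
                  [[2.0]])                   # weight matrix
```

In the full architecture, stacked layers like this feed the node embeddings into the LSTM as a time-ordered sequence.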
Procedia PDF Downloads 66
6639 Effects of Self-Management Programs on Blood Pressure Control, Self-Efficacy, Medication Adherence, and Body Mass Index among Older Adult Patients with Hypertension: Meta-Analysis of Randomized Controlled Trials
Authors: Van Truong Pham
Abstract:
Background: Self-management was described as a potential strategy for blood pressure control in patients with hypertension. However, the effects of self-management interventions on blood pressure, self-efficacy, medication adherence, and body mass index (BMI) in older adults with hypertension have not been systematically evaluated. We evaluated the effects of self-management interventions on systolic blood pressure (SBP) and diastolic blood pressure (DBP), self-efficacy, medication adherence, and BMI in hypertensive older adults. Methods: We followed the recommended guidelines of preferred reporting items for systematic reviews and meta-analyses. Searches in electronic databases including CINAHL, Cochrane Library, Embase, Ovid-Medline, PubMed, Scopus, Web of Science, and other sources were performed to include all relevant studies up to April 2019. Studies selection, data extraction, and quality assessment were performed by two reviewers independently. We summarized intervention effects as Hedges' g values and 95% confidence intervals (CI) using a random-effects model. Data were analyzed using Comprehensive Meta-Analysis software 2.0. Results: Twelve randomized controlled trials met our inclusion criteria. The results revealed that self-management interventions significantly improved blood pressure control, self-efficacy, medication adherence, whereas the effect of self-management on BMI was not significant in older adult patients with hypertension. The following Hedges' g (effect size) values were obtained: SBP, -0.34 (95% CI, -0.51 to -0.17, p < 0.001); DBP, -0.18 (95% CI, -0.30 to -0.05, p < 0.001); self-efficacy, 0.93 (95%CI, 0.50 to 1.36, p < 0.001); medication adherence, 1.72 (95%CI, 0.44 to 3.00, p=0.008); and BMI, -0.57 (95%CI, -1.62 to 0.48, p = 0.286). Conclusions: Self-management interventions significantly improved blood pressure control, self-efficacy, and medication adherence. 
However, the effect of self-management on obesity control was not supported by the evidence. Healthcare providers should implement self-management interventions to strengthen patients' role in managing their health care.
Keywords: self-management, meta-analysis, blood pressure control, self-efficacy, medication adherence, body mass index
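The Hedges' g effect size reported above is the bias-corrected standardized mean difference. For readers unfamiliar with it, here is a small sketch of the standard computation (pooled standard deviation, Cohen's d, then the small-sample correction factor J = 1 - 3/(4·df - 1)); the example group statistics are hypothetical, not values from the meta-analysis.

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: bias-corrected standardized mean difference.

    Group 1 is the intervention arm, group 2 the control arm;
    m = mean, s = standard deviation, n = sample size.
    """
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    d = (m1 - m2) / s_pooled          # Cohen's d
    j = 1 - 3 / (4 * df - 1)          # small-sample correction factor
    return j * d

# hypothetical SBP example: intervention mean 120 vs. control mean 125 mmHg
g_sbp = hedges_g(m1=120.0, s1=10.0, n1=30, m2=125.0, s2=10.0, n2=30)
# negative g means the intervention group had lower blood pressure
```

A random-effects meta-analysis would then pool such per-study g values weighted by their inverse variances.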
Procedia PDF Downloads 128
6638 A Bibliographical Research on the Use of Social Media Websites by the Deaf in Brazil
Authors: Juliana Guimarães Faria
Abstract:
The article focuses on social networks and deaf people. It aims to analyze the studies on this topic published in journals, as well as those produced as dissertations and theses. It also aims to identify the thematic focus of the studies produced and how the deaf relate to social networks; more specifically, it tries to identify, based on those works, what the benefits of social networks for the deaf are, or are not, and whether there is some reflection on the way the deaf community has been organizing politically in search of bilingual education and inclusion, making use of social network software. After reading, describing and analyzing the eleven works identified about social networks and the deaf, we detected three thematic groups: four studies presented discussions about social networks and the socialization of the deaf; four presented discussions about the contribution of social networks to the linguistic and cognitive development of the deaf; and three presented discussions about the political bias of the use of social networks in favor of the deaf. We also identified that the works presented an optimistic view of social networks.
Keywords: social networks, deaf, internet, Brazil
Procedia PDF Downloads 408
6637 Design, Analysis and Obstacle Avoidance Control of an Electric Wheelchair with Sit-Sleep-Seat Elevation Functions
Authors: Waleed Ahmed, Huang Xiaohua, Wilayat Ali
Abstract:
Wheelchair users are generally exposed to physical and psychological health problems, e.g., pressure sores and pain in the hip joint, associated with seating posture or being inactive in a wheelchair for a long time. A reclining wheelchair with back, thigh, and leg adjustment helps in daily life activities and health preservation. The seat elevation function of an electric wheelchair allows a user with lower-limb amputation to reach different heights. An electric wheelchair is expected to ease the lives of elderly and disabled people by giving them mobility support and decreasing the percentage of accidents caused by users' narrow sight or joystick operation errors. Thus, this paper proposes the design, analysis and obstacle avoidance control of an electric wheelchair with sit-sleep-seat elevation functions. A 3D model of the wheelchair was designed in SolidWorks and later used for multi-body dynamic (MBD) analysis and to verify the driving control system. The control system uses a fuzzy algorithm to avoid obstacles, taking as inputs the distance from an ultrasonic sensor and the user-specified direction from the joystick's operation. The proposed fuzzy driving control system governs the direction and velocity of the wheelchair. The wheelchair model has been examined and proven in MSC Adams (Automated Dynamic Analysis of Mechanical Systems). The designed fuzzy control algorithm is implemented in the Gazebo robotic 3D simulator using the Robot Operating System (ROS) middleware. The proposed wheelchair design enhances mobility and quality of life by improving the user's functional capabilities. Simulation results verify the non-accidental behavior of the electric wheelchair.
Keywords: fuzzy logic control, joystick, multi-body dynamics, obstacle avoidance, scissor mechanism, sensor
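The fuzzy velocity control described above can be sketched with a minimal Sugeno-style rule base mapping ultrasonic distance to a normalized wheel speed. The membership functions, rule consequents and distance ranges below are hypothetical placeholders, not the paper's tuned controller.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_speed(distance_cm):
    """Sugeno-style rule base: NEAR -> stop, MEDIUM -> slow, FAR -> cruise.

    Output is a normalized wheel speed in [0, 1], computed as the
    firing-strength-weighted average of the rule consequents.
    """
    near = tri(distance_cm, -1.0, 0.0, 60.0)
    medium = tri(distance_cm, 30.0, 90.0, 150.0)
    far = tri(distance_cm, 120.0, 200.0, 1000.0)
    rules = {0.0: near, 0.4: medium, 1.0: far}   # consequent speed: strength
    total = near + medium + far
    return sum(v * w for v, w in rules.items()) / total if total else 0.0
```

A full controller would add a second input (the joystick direction) and rules for steering, but the defuzzification step is the same.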
Procedia PDF Downloads 129
6636 Expert-Based Validated Measures for Improving Quality Healthcare Services Utilization among Elderly Persons: A Cross-Section Survey
Authors: Uchenna Cosmas Ugwu, Osmond Chukwuemeka Ene
Abstract:
Globally, older adults are considered among the groups most vulnerable to age-related diseases, including diabetes mellitus, obesity, cardiovascular diseases, cancer and osteoporosis. With improved access to quality healthcare services, these complications can be prevented and their incidence rates reduced to a minimum. The aim of this study is to validate appropriate measures for improving quality healthcare services utilization among elderly persons in Nigeria and to determine significant associations with demographic variables. A cross-sectional survey research design was adopted. Using a convenience sampling technique, a total of 400 experts (150 registered nurses and 250 public health professionals) with a minimum of a doctoral degree qualification were sampled and studied. A structured instrument titled "Expert-Based Healthcare Services Utilization Questionnaire (EBHSUQ)" with a .83 reliability index was used for data collection. All statistical data analysis was completed using frequency counts, percentage scores and chi-square statistics. Results were significant at p≤0.05. It was found that quality healthcare services utilization by elderly persons in Nigeria would be improved if the services are: available (83%), affordable (82%), accessible (79%), suitable (77%), acceptable (77%), continuous (75%) and stress-free (75%). Statistically, a significant association with quality healthcare services utilization existed for gender (p=.03<.05) and age (p=.01<.05), while none was observed for work experience (p=.23>.05), marital status (p=.11>.05) or employment category (p=.09>.05). To improve quality healthcare services utilization for elderly persons in Nigeria, the adoption of appropriate measures by the Nigerian government and professionals in the healthcare sector is paramount.
Therefore, there is a need for collaborative efforts by the Nigerian government and healthcare professionals geared towards educating the general public through mass sensitization, awareness campaigns, conferences, seminars and workshops on the importance of accessing healthcare services.
Keywords: elderly persons, healthcare services, cross-sectional survey research design, utilization
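The chi-square association tests reported above follow the standard Pearson statistic for a contingency table. As a small sketch of that computation (the table values are hypothetical, not the study's data):

```python
def chi_square(table):
    """Pearson chi-square statistic for an R x C contingency table.

    For each cell, the expected count under independence is
    (row total * column total) / grand total; the statistic sums
    (observed - expected)^2 / expected over all cells.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# hypothetical 2x2 table, e.g. gender (rows) x high/low utilization (columns)
stat = chi_square([[10, 20], [20, 10]])
```

The statistic is then compared against the chi-square distribution with (R-1)(C-1) degrees of freedom to obtain the p-value.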
Procedia PDF Downloads 64