Search results for: machine learning tools and techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16695


7935 Meta-Review of Scholarly Publications on Biosensors: A Bibliometric Study

Authors: Nasrine Olson

Abstract:

With over 70,000 scholarly publications on the topic of biosensors, gaining an overview of the field has become a challenge. To facilitate such an overview, there are currently over 700 expert reviews of publications on biosensors and related topics. This study focuses on these review papers in order to provide a meta-review of the area, offering a statistical analysis and overview of biosensor-related review papers. Comprehensive searches were conducted in the Web of Science and PubMed databases, and the resulting empirical material was analyzed using bibliometric methods and tools. The study finds that biosensor-related review papers can be categorized into five related subgroups, broadly denoted by (i) properties of materials and particles, (ii) analysis and indicators, (iii) diagnostics, (iv) pollutants and analytical devices, and (v) treatment/application. For easy and clear access to the findings, visualizations of clusters and networks of connections are presented. The study includes a temporal dimension and identifies trends over the years, with an emphasis on the most recent developments. This paper provides useful insights for those who wish to form a better understanding of research trends in the area of biosensors.

Keywords: bibliometrics, biosensors, meta-review, statistical analysis, trends visualization

Procedia PDF Downloads 212
7934 Pawn or Potentates: Corporate Governance Structure in Indian Central Public Sector Enterprises

Authors: Ritika Jain, Rajnish Kumar

Abstract:

The Department of Public Enterprises has made the submission of Self Evaluation Reports, for the purpose of corporate governance, mandatory for all central-government-owned enterprises. Despite this, an alarming 40% of the enterprises did not do so. This study examines the impact of external policy tools and internal firm-specific factors on the corporate governance of central public sector enterprises (CPSEs). We use a dataset of all manufacturing and non-financial services enterprises owned by the central government of India for the year 2010-11. Using probit, ordered logit and Heckman's sample selection models, the study finds that the probability and quality of corporate governance are positively influenced by the CPSE entering into a Memorandum of Understanding (MoU) with the central government of India, and hence enjoying more autonomy in day-to-day operations. Besides these, internal factors, including larger size and lower debt, contribute significantly to better corporate governance.
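The probit stage of such an analysis can be sketched as a maximum-likelihood fit. The data below are synthetic stand-ins (firm size, debt level, and MoU dummy with invented effect sizes), not the study's CPSE dataset; this only illustrates the technique, assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic stand-in for the CPSE data (all names and effects hypothetical):
# columns = intercept, firm size, debt level, MoU dummy
n = 500
X = np.column_stack([
    np.ones(n),
    rng.normal(size=n),                        # standardized firm size
    rng.normal(size=n),                        # standardized debt level
    rng.integers(0, 2, size=n).astype(float),  # MoU signed (0/1)
])
true_beta = np.array([-0.2, 0.8, -0.5, 0.6])
# Latent-variable formulation: y = 1 when x'beta + e > 0, e ~ N(0, 1)
y = (X @ true_beta + rng.normal(size=n) > 0).astype(float)

def neg_loglik(beta):
    # Probit likelihood: P(y = 1 | x) = Phi(x'beta)
    p = np.clip(norm.cdf(X @ beta), 1e-10, 1 - 1e-10)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()

res = minimize(neg_loglik, np.zeros(X.shape[1]), method="BFGS")
print(np.round(res.x, 2))   # estimates should recover the signs of true_beta
```

A positive estimated coefficient on the MoU dummy would correspond to the paper's finding that signing an MoU raises the probability of good governance.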

Keywords: corporate governance, central public sector enterprises (CPSEs), sample selection, Memorandum of Understanding (MoU), ordered logit, disinvestment

Procedia PDF Downloads 253
7933 Realization of a Geographic Information System (GIS) for Drinking Water Supply (DWS) Drilling in the Adrar Region

Authors: Djelloul Benatiallah, Ali Benatiallah, Abdelkader Harouz

Abstract:

Geographic Information Systems (GIS) include various methods and computer techniques to model, digitally capture, store, manage, view, and analyze geographic data. Geographic information systems characteristically draw on many scientific and technical fields and many methods. In this article we present a complete and operational geographic information system, following the theoretical principles of data management and adapting them to spatial data, especially data concerning the monitoring of drinking water supply (DWS) wells in the Adrar region. The system is expected to offer, on the one hand, standard features for consulting, updating, and editing geographic data for beneficiaries and, on the other hand, specific functionality for contractors, including data entry, parameterized calculations, and statistics.

Keywords: GIS, DWS, drilling, Adrar

Procedia PDF Downloads 307
7932 Information Technology Governance Implementation and Its Determinants in the Egyptian Market

Authors: Nariman O. Kandil, Ehab K. Abou-Elkheir, Amr M. Kotb

Abstract:

Effective IT governance guarantees the strategic alignment of IT and business goals, risk mitigation control, and better IT and business performance. This study seeks to examine empirically the extent of IT governance implementation within the firms listed on the Egyptian stock exchange (EGX30) and its determinants. Accordingly, 18 semi-structured interviews (face-to-face, phone, and video-conference, the latter using tools such as WebEx, Zoom, and Microsoft Teams) were undertaken at the interviewees' offices in Egypt between the end of November 2019 and the end of August 2020. Results suggest that there are variances in the extent of IT governance (ITG) implementation within the firms listed on the EGX30, mainly caused by industry type and by internal and external triggers. The results also suggest that organization size, type of auditor, criticality of the industry, effective processes and KPIs, information intensity, and the expertise of the CIO have a significant impact on IT governance implementation within the firms.

Keywords: effective IT governance, Egyptian market, information security, risk controls

Procedia PDF Downloads 160
7931 Magnetic Activated Carbon: Preparation, Characterization, and Application for Vanadium Removal

Authors: Hakimeh Sharififard, Mansooreh Soleimani

Abstract:

In this work, a magnetic activated carbon nanocomposite (Fe-CAC) has been synthesized by anchoring iron hydr(oxide) nanoparticles onto a commercial activated carbon (CAC) surface and characterized using BET, XRF, and SEM techniques. The influence of various removal parameters, such as pH, contact time, and initial vanadium concentration, on vanadium removal was evaluated for CAC and Fe-CAC using a batch method. The sorption isotherms were studied using the Langmuir, Freundlich, and Dubinin-Radushkevich (D-R) isotherm models. The equilibrium data were well described by the Freundlich model. Results showed that CAC had a vanadium adsorption capacity of 37.87 mg/g, while Fe-CAC was able to adsorb 119.01 mg/g of vanadium. Kinetic data were found to follow the pseudo-second-order kinetic model for both adsorbents.
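Fitting equilibrium data to the Freundlich model, as reported above, can be sketched with a nonlinear least-squares fit. The concentrations and parameter values below are illustrative placeholders, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def freundlich(ce, kf, n):
    # Freundlich isotherm: qe = KF * Ce^(1/n)
    return kf * ce ** (1.0 / n)

# Hypothetical equilibrium data: Ce in mg/L, qe in mg/g,
# generated here from known parameters so the fit can be checked
ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
qe = freundlich(ce, kf=25.0, n=2.5)

params, _ = curve_fit(freundlich, ce, qe, p0=[10.0, 2.0])
kf_fit, n_fit = params
print(round(kf_fit, 2), round(n_fit, 2))
```

With measured (noisy) data, the same call returns the best-fit KF and n, and the residuals can be compared against a Langmuir fit to decide which model describes the sorption better.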

Keywords: magnetic activated carbon, removal, vanadium, nanocomposite, Freundlich

Procedia PDF Downloads 455
7930 Study of Heat Exchangers in Small Modular Reactors

Authors: Harish Aryal, Roger Hague, Daniel Sotelo, Felipe Astete Salinas

Abstract:

This paper presents a comparative study of different coolants, materials, and temperatures that can affect the effectiveness of heat exchangers used in small modular reactors. Corrugated plate heat exchangers were chosen from among the available plate options for testing because of their accessibility and better performance than other existing heat exchangers in recent years. SolidWorks simulations compare water and helium coolants acting upon different conducting metals, selected from the fluids and materials that satisfied accessibility requirements and were compatible with the software. Though not every element, material, fluid, or method was used in the testing phase, the purpose is to support the research to come, since the innovation of nuclear power is the future. The tests performed help to better understand the demands placed on heat exchangers and, through each adjustment, to identify the breaking points or improvements in the machine. The results may give consumers and researchers further feedback on why different materials and fluids would be preferred, and why recording failures is necessary to improve future research.
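As background for coolant comparisons like the one above, the standard effectiveness-NTU relation for a counterflow arrangement (a common idealization of plate exchangers; this is textbook theory, not the paper's SolidWorks model) can be sketched:

```python
import math

def effectiveness_counterflow(ntu, cr):
    # Standard effectiveness-NTU relation for a counterflow exchanger,
    # where cr = Cmin/Cmax is the heat capacity rate ratio
    if abs(cr - 1.0) < 1e-12:
        return ntu / (1.0 + ntu)            # balanced-flow limit
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)

# Compare two illustrative capacity ratios at the same NTU
for cr in (0.25, 0.75):
    print(cr, round(effectiveness_counterflow(2.0, cr), 3))
```

Effectiveness rises with NTU and falls as the capacity ratio approaches one, which is one reason coolant choice (and hence capacity rate) changes exchanger performance at fixed geometry.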

Keywords: heat exchangers, SolidWorks, coolants, small modular reactors, nuclear power, nanofluids, Nusselt number, friction factor, Reynolds number

Procedia PDF Downloads 69
7929 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety

Authors: Atheer Al-Nuaimi, Harry Evdorides

Abstract:

Every day, many people die or are disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. The International Road Assessment Programme (iRAP) model is one of the comprehensive road safety models, accounting for many factors that affect road safety in a cost-effective way in low- and middle-income countries. In the iRAP model, road safety is divided into five star ratings, from 1 star (the lowest level) to 5 stars (the highest level). These star ratings are based on a star rating score calculated by the iRAP methodology from road attributes, traffic volumes, and operating speeds. The outcome of the iRAP methodology is a set of treatments that can be used to improve road safety and reduce the number of fatalities and serious injuries (FSI). These countermeasures can be applied to a location separately, as a single countermeasure, or combined, as multiple countermeasures. There is general agreement that the effectiveness of a countermeasure declines when it is used in combination with others; that is, the crash-reduction estimates of individual countermeasures cannot simply be added together. The iRAP methodology therefore applies multiple-countermeasure adjustment factors to predict the reduction in effectiveness of road safety countermeasures when more than one countermeasure is chosen. These correction factors are computed for every 100-meter segment and for every crash type. However, a limitation of this approach is a likely over-estimation of the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models were used to establish relationships between crash frequencies and the factors that affect their rates. Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can address the effectiveness of using multiple countermeasures. Analyses conducted in the R statistical computing environment showed that a model developed with the negative binomial regression technique gives more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the iRAP model. The results also showed that the negative binomial approach is more precise than multiple linear and Poisson regression because of overdispersion and standard error issues.
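The preference for negative binomial over Poisson rests on overdispersion in crash counts. The following sketch simulates overdispersed counts (a gamma-mixed Poisson, which is exactly the negative binomial model) and computes the dispersion ratio; the parameter values are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated crash counts per road segment with extra-Poisson variation:
# a gamma-mixed Poisson is exactly the negative binomial distribution
mu, alpha = 4.0, 0.8   # mean and NB dispersion parameter (illustrative)
lam = rng.gamma(shape=1 / alpha, scale=mu * alpha, size=2000)
crashes = rng.poisson(lam)

mean, var = crashes.mean(), crashes.var()
dispersion = var / mean   # ~1 for a Poisson model, > 1 under overdispersion
print(round(mean, 2), round(dispersion, 2))
```

When this ratio is well above one, a Poisson model understates the standard errors, which is the "overdispersion and standard error" issue the abstract refers to.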

Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety

Procedia PDF Downloads 235
7928 Passive Solar Techniques to Improve Thermal Comfort and Reduce Energy Consumption of Domestic Use

Authors: Naci Kalkan, Ihsan Dagtekin

Abstract:

Passive design responds to the need to improve indoor thermal comfort and minimize energy consumption. The present research analyzed how efficiently passive solar technologies generate heating and cooling, and how they can be integrated for domestic applications. In addition, the aim of this study is to increase the efficiency of solar systems through integration, innovation, and optimization. As a result, the outputs of the project might start a new sector that provides environmentally friendly and cheap cooling for domestic use.

Keywords: passive solar systems, heating, cooling, thermal comfort, ventilation systems

Procedia PDF Downloads 291
7927 Numerical Analysis of Fire Performance of Timber Structures

Authors: Van Diem Thi, Mourad Khelifa, Mohammed El Ganaoui, Yann Rogaume

Abstract:

An efficient numerical method has been developed to incorporate the effects of heat transfer in timber panels of partition walls exposed to real building fires. The procedure has been added to the software package Abaqus/Standard as a user-defined subroutine (UMATHT) and has been verified using both time- and spatially-dependent heat fluxes in two- and three-dimensional problems. The aim is to contribute to the development of simulation tools needed to assist structural engineers and fire testing laboratories in technical assessment exercises. The presented method can also be used during the development stages of building components to optimize performance in real fire conditions. The accuracy of the thermal properties used and of the finite element models was validated by comparing the predicted results with three different fire tests available in the literature. It was found that the model calibrated to results from standard fire conditions provided reasonable predictions of temperatures within assemblies exposed to real building fire.
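The kind of transient conduction such a subroutine handles can be illustrated with a minimal one-dimensional explicit finite-difference model of a timber panel. The material properties, boundary treatment, and fire temperature below are simplified assumptions for illustration, not the paper's UMATHT implementation:

```python
import numpy as np

# Illustrative timber properties (assumed, not the paper's values)
k, rho, cp = 0.13, 450.0, 1500.0   # W/m.K, kg/m3, J/kg.K
alpha = k / (rho * cp)             # thermal diffusivity, m2/s

L, nx = 0.02, 41                   # 20 mm panel, grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha           # satisfies the explicit stability limit r < 0.5
T = np.full(nx, 20.0)              # initial temperature, deg C

for _ in range(2000):
    T[0] = 600.0                   # fire-exposed face held hot (simplified)
    T[-1] = T[-2]                  # adiabatic unexposed face
    # Explicit update of the interior nodes (FTCS scheme)
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print(round(T[-1], 1))             # unexposed-face temperature after 2000 steps
```

The profile decays monotonically from the exposed face, and the unexposed-face temperature rise over time is exactly the quantity compared against furnace test data in such validations.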

Keywords: timber panels, heat transfer, thermal properties, standard fire tests

Procedia PDF Downloads 337
7926 Political Manipulation in Global Discourse

Authors: Gohar Madoyan, Kristine Harutyunyan, Gevorg Barseghyan

Abstract:

It is common knowledge that linguistic manipulation is, and has always been, a powerful instrument of political discourse. Politicians from different countries, and through the centuries, have successfully used linguistic means to persuade the public. Yet, this persuasion should be linguistically unobtrusive: small changes in wording may result in a huge difference in perception by the audience. Thus, manipulation is a strategy used by manipulators to convey a certain message; they must be aware of the vulnerabilities of their audience and use them to achieve control. Political manipulation, though commonly observed in the 21st century, can easily be traced back to ancient rhetoric, which warns us to choose words carefully while addressing the audience. On the other hand, modern manipulative techniques have become more sophisticated, making use of all scientific advances.

Keywords: manipulators, politics, persuasion, political discourse, linguo-stylistic analysis, rhetoric

Procedia PDF Downloads 75
7925 The Use of Geographically Weighted Regression for Deforestation Analysis: Case Study in Brazilian Cerrado

Authors: Ana Paula Camelo, Keila Sanches

Abstract:

Geographically Weighted Regression (GWR) was proposed in the geography literature to allow the relationships in a regression model to vary over space. In Brazil, the agricultural exploitation of the Cerrado biome is the main cause of deforestation. In this study, we propose a methodology using geostatistical methods to characterize the spatial dependence of deforestation in the Cerrado based on agricultural production indicators. The set of exploratory spatial data analysis (ESDA) tools was used, followed by confirmatory analysis using GWR. A non-spatial model was calibrated, the nature of the regression curve evaluated, variables selected by a stepwise process, and multicollinearity analyzed. After evaluation of the non-spatial model, the spatial regression model was processed, with statistical evaluation of the intercept and verification of its effect on calibration. Spearman's correlation between deforestation and livestock was +0.783, and between deforestation and soybeans +0.405. The model presented R² = 0.936 and showed a strong spatial dependence of the agricultural activity of soybeans associated with maize and cotton crops. GWR is a very effective tool, presenting results closer to the reality of deforestation in the Cerrado when compared with other analyses.
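The core of GWR (a weighted least-squares fit at each location, with weights from a spatial kernel) can be sketched in a few lines. The one-dimensional "locations", the predictor, and the bandwidth below are toy assumptions, not the study's Cerrado data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic municipalities along a 1-D coordinate u in [0, 1], with one
# predictor x (e.g. an agricultural production indicator)
n = 200
u = rng.uniform(0, 1, n)
x = rng.normal(size=n)
beta_true = 1.0 + 2.0 * u              # coefficient varies smoothly over space
y = beta_true * x + rng.normal(scale=0.2, size=n)

def gwr_local_beta(u0, bandwidth=0.2):
    # Weighted least squares at location u0 with a Gaussian spatial kernel
    w = np.exp(-0.5 * ((u - u0) / bandwidth) ** 2)
    X = np.column_stack([np.ones(n), x])
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

b_left = gwr_local_beta(0.1)[1]    # local slope near u = 0.1
b_right = gwr_local_beta(0.9)[1]   # local slope near u = 0.9
print(round(b_left, 2), round(b_right, 2))
```

A global (non-spatial) regression would return a single averaged slope; GWR recovers the spatial variation, which is what makes it suited to deforestation drivers that differ across the biome.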

Keywords: deforestation, geographically weighted regression, land use, spatial analysis

Procedia PDF Downloads 355
7924 Sensitivity Analysis of Oil Spills Modeling with ADIOS II for Iranian Fields in Persian Gulf

Authors: Farzingohar Mehrnaz, Yasemi Mehran, Esmaili Zinat, Baharlouian Maedeh

Abstract:

Aboozar (Ardeshir) and Bahregansar are two important Iranian oilfields in Persian Gulf waters. Operational activities create spills that impact the marine environment. Assumed spills are modeled with ADIOS II (Automated Data Inquiry for Oil Spills), NOAA's oil-weathering software. Various atmospheric and marine data with different oil types are used for the modeling. Numerous scenarios for 100 bbl spills, with mean daily air temperature and wind speed, are input for 5 days. To find the model's sensitivity to each setting, one parameter is changed while the others are kept constant. In both fields, the evaporated and dispersed output values increased, and hence the remaining fraction decreased. The results clarified that wind speed, followed by air temperature and finally oil type, were the most influential factors in the oil weathering process. The obtained results can help emergency systems predict the floating (dispersed and remaining) spill volume in order to select suitable cleanup tools and methods.

Keywords: ADIOS, modeling, oil spill, sensitivity analysis

Procedia PDF Downloads 293
7923 Feasibility of Risk Assessment for Type 2 Diabetes in Community Pharmacies Using Two Different Approaches: A Pilot Study in Thailand

Authors: Thitaporn Thoopputra, Tipaporn Pongmesa, Shuchuen Li

Abstract:

Aims: To evaluate the application of a non-invasive diabetes risk assessment tool in the community pharmacy setting. Methods: The Thai diabetes risk score was applied to assess individuals at risk of developing type 2 diabetes. Interactive computer-based risk screening (IT) and paper-based risk screening (PT) tools were applied. Participants aged over 25 years with no known diabetes were recruited in six participating pharmacies. Results: A total of 187 clients were screened; mean age (±SD) was 48.6 (±10.9) years, and 35% were at high risk. The mean willingness-to-pay for the service fee in the IT group was significantly higher than in the PT group (p=0.013). No significant difference was observed in satisfaction between groups. Conclusions: A non-invasive risk assessment tool, whether paper-based or computer-based, can be applied in community pharmacy to support the enhanced role of pharmacists in chronic disease management. Long-term follow-up is needed to determine the impact of its application on clinical, humanistic, and economic outcomes.
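The screening logic of such a tool can be sketched as a simple additive score. The point values and cutoff below are hypothetical, chosen only to illustrate the mechanism; they are not the validated Thai diabetes risk score:

```python
# A simplified, hypothetical additive risk score. The real Thai diabetes
# risk score uses different, validated point values; this only shows the
# non-invasive screening logic (no blood test required).
def risk_points(age, bmi, waist_cm, hypertension, family_history):
    points = 0
    points += 0 if age < 35 else (2 if age < 45 else 4)
    points += 0 if bmi < 23 else (3 if bmi < 27.5 else 5)
    points += 2 if waist_cm >= 90 else 0
    points += 2 if hypertension else 0
    points += 4 if family_history else 0
    return points

def risk_category(points, cutoff=9):
    # Clients at or above the cutoff would be referred for confirmatory testing
    return "high" if points >= cutoff else "low"

p = risk_points(age=50, bmi=28.0, waist_cm=95, hypertension=True, family_history=False)
print(p, risk_category(p))
```

Either delivery mode in the study (paper form or interactive computer screen) evaluates the same kind of scoring function; only the interface differs.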

Keywords: community pharmacy, intervention, prevention, risk assessment, type 2 diabetes

Procedia PDF Downloads 508
7922 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability is very important. Process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the influencing parameters of process stability, instead of the removed volume, as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
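A minimal sketch of the dexel idea: the workpiece is a field of height values (single-direction dexels here, rather than the paper's multi-dexel model), a flat-end tool clips every dexel beneath it, and the depth and width of cut follow directly from what was removed. All dimensions are illustrative:

```python
import numpy as np

# Workpiece as a single-direction dexel field: one height value per grid cell
nx, ny = 60, 40
dx = 0.5                               # mm grid spacing
height = np.full((nx, ny), 10.0)       # 10 mm stock

def sweep_tool(height, cx, cy, radius, z_tool):
    # Flat-end mill at (cx, cy): clip every dexel under the tool to z_tool
    xs = (np.arange(nx) + 0.5) * dx
    ys = (np.arange(ny) + 0.5) * dx
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    under = (X - cx) ** 2 + (Y - cy) ** 2 <= radius ** 2
    removed = np.where(under, np.maximum(height - z_tool, 0.0), 0.0)
    height[under] = np.minimum(height[under], z_tool)
    return removed

removed = sweep_tool(height, cx=10.0, cy=10.0, radius=3.0, z_tool=8.0)
depth_of_cut = removed.max()                     # axial depth: 10 - 8 = 2 mm
engaged = removed > 0
width_of_cut = engaged.any(axis=0).sum() * dx    # radial engagement extent
print(depth_of_cut, width_of_cut)
```

Extracting depth and width of cut per tool position in this way, rather than only the removed volume, is what feeds the stability prediction described in the abstract.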

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 522
7921 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach

Authors: Theertha Chandroth

Abstract:

This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its unique syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
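One simple validation technique, assuming only the Python standard library (the payloads and converter here are illustrative): normalize the XML into a plain dictionary and compare it with the parsed JSON.

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(elem):
    # Toy converter: leaves become their text, nested elements become dicts.
    # Ignores attributes and repeated tags for brevity.
    children = list(elem)
    if not children:
        return elem.text
    return {child.tag: xml_to_dict(child) for child in children}

json_payload = '{"user": {"name": "Ada", "role": "admin"}}'
xml_payload = "<user><name>Ada</name><role>admin</role></user>"

from_json = json.loads(json_payload)["user"]
from_xml = xml_to_dict(ET.fromstring(xml_payload))

# Test oracle: both representations must describe the same record
print(from_json == from_xml)
```

In an integration test, the equality check becomes the assertion: any field dropped, renamed, or mistyped during JSON-to-XML conversion (or vice versa) makes the normalized structures differ.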

Keywords: XML, JSON, data comparison, integration testing, Python, SQL

Procedia PDF Downloads 130
7920 Cold Spray Fabrication of Coating for Highly Corrosive Environment

Authors: Harminder Singh

Abstract:

Cold spray is a novel and emerging technology for the fabrication of coatings. In this study, a coating is successfully developed by this process on a superalloy surface. The selected coating composition has already been proven corrosion resistant. The microstructure of the newly developed coating is examined by various characterization techniques to test its suitability for the high-temperature corrosive conditions of waste incinerators. Energy-producing waste incinerators still run at low efficiency, mainly due to their highly corrosive, chlorine-based conditions. The characterization results show that the structure of the developed cold-sprayed coating is suitable for further testing in highly aggressive conditions.

Keywords: coating, cold spray, corrosion, microstructure

Procedia PDF Downloads 386
7919 The Development of Monk’s Food Bowl Production on Occupational Health Safety and Environment at Work for the Strength of Rattanakosin Local Wisdom

Authors: Thammarak Srimarut, Witthaya Mekhum

Abstract:

This study analysed and developed a model for monk's food bowl production with respect to occupational health, safety, and environment at work, for the encouragement of Rattanakosin local wisdom at the Banbart Community. The process of blowpipe welding necessary to produce the bowl was very dangerous (93.59% risk). After the adoption of a new sitting posture, the work risk was lower, at 48.41% (moderate risk). In detail: 1) the traditional sitting posture created a work risk of 88.89%, while the new sitting posture created a work risk of 58.86%; 2) regarding environmental pollution, with the traditional sitting posture workers were exposed to polluted fumes from welding at 61.11%, while with the new sitting posture the exposure was 40.47%; 3) on accident risk, with the traditional sitting posture workers were exposed to welding accidents at 94.44%, while with the new sitting posture the exposure was 62.54%.

Keywords: occupational health safety, environment at work, Monk’s food bowl, machine intelligence

Procedia PDF Downloads 435
7918 Long-Term Exposure, Health Risk, and Loss of Quality-Adjusted Life Expectancy Assessments for Vinyl Chloride Monomer Workers

Authors: Tzu-Ting Hu, Jung-Der Wang, Ming-Yeng Lin, Jin-Luh Chen, Perng-Jy Tsai

Abstract:

Vinyl chloride monomer (VCM) has been classified as a group 1 (human) carcinogen by the IARC. Workers exposed to VCM are known to be at risk of developing liver cancer, which causes economic and health losses. Exposures in the petrochemical industry in particular have been a serious concern in the environmental and occupational health field. Because assessing workers' health risks and the resultant economic and health losses requires long-term VCM exposure data for any similar exposure group (SEG) of interest, the development of suitable technologies has become an urgent and important issue. In the present study, VCM exposures of petrochemical industry workers were first determined based on the database of the 'Workplace Environmental Monitoring Information Systems (WEMIS)' provided by Taiwan OSHA. Considering the existence of missing data, historical exposure reconstruction techniques were then used to complete the long-term exposure data for SEGs with routine operations. For SEGs with non-routine operations, exposure modeling techniques, together with their time/activity records, were adopted to determine their long-term exposure concentrations. Bayesian decision analysis (BDA) was adopted for conducting exposure and health risk assessments for any given SEG in the petrochemical industry. The resultant excess cancer risk was then used to determine the corresponding loss of quality-adjusted life expectancy (QALE). Results show low average concentrations for SEGs with routine operations (e.g., VCM rectification 0.0973 ppm, polymerization 0.306 ppm, reaction tank 0.33 ppm, VCM recovery 1.4 ppm, control room 0.14 ppm, VCM storage tanks 0.095 ppm, and wastewater treatment 0.390 ppm); these values are much lower than the permissible exposure limit (PEL; 3 ppm) of VCM promulgated in Taiwan. For non-routine workers, despite high exposure concentrations, short exposure times and low frequencies result in low corresponding health risks. By considering the exposure assessment, health risk assessment, and QALE results simultaneously, it is concluded that the proposed method is useful for prioritizing SEGs for exposure abatement measures. In particular, the QALE results further indicate the importance of reducing workers' VCM exposures, even though those exposures were low in comparison with the PEL and the acceptable health risk.
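The excess-risk and QALE arithmetic can be sketched as follows. All numbers (the slope factor, the QALY loss per case, and the exposure pattern) are hypothetical placeholders chosen for illustration, not the study's values:

```python
# Hypothetical numbers only: the slope factor and QALY loss per case below
# are illustrative, not the study's estimates.
def excess_cancer_risk(conc_ppm, hours_per_week, years, slope_factor=1.4e-2):
    # Simplified linear model: lifetime-average exposure fraction x potency.
    # Exposure is averaged over a 70-year lifetime of continuous hours.
    working_fraction = (hours_per_week * 52 * years) / (24 * 365 * 70)
    return conc_ppm * working_fraction * slope_factor

# Illustrative SEG: VCM recovery at 1.4 ppm, 40 h/week for 30 years
risk = excess_cancer_risk(conc_ppm=1.4, hours_per_week=40, years=30)
qale_loss = risk * 12.0   # assumed 12 QALYs lost per induced cancer case
print(f"{risk:.2e}", round(qale_loss, 5))
```

Ranking SEGs by this expected QALE loss, rather than by concentration alone, is what lets short, infrequent non-routine exposures come out as low priority despite their high concentrations.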

Keywords: exposure assessment, health risk assessment, petrochemical industry, quality-adjusted life years, vinyl chloride monomer

Procedia PDF Downloads 193
7917 A Literature Review on Sustainability Appraisal Methods for Highway Infrastructure Projects

Authors: S. Kaira, S. Mohamed, A. Rahman

Abstract:

Traditionally, highway infrastructure projects are initiated based on their economic benefits; thereafter, environmental, social, and governance impacts are addressed discretely for the project selected from a set of pre-determined alternatives. Alongside cost-benefit analysis (CBA), multi-criteria decision-making (MCDM) has been used as the default assessment tool. But this tool has been critiqued because it does not mimic the real-world dynamic environment in which public sector projects like highways operate, with intense exposure to changing conditions. Therefore, it is essential to appreciate the impacts of various dynamic factors (factors that change or progress with the system) on project performance. Thus, this paper presents various sustainability assessment tools that have been developed globally to determine the sustainability performance of infrastructure projects during the design, procurement, and commissioning phases. Identification of the current gaps in the available assessment methods offers the potential to add to the body of knowledge on the road project development systems and procedures generally used by road agencies.
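The default MCDM approach mentioned above, in its simplest additive-weighting form, can be sketched as follows; the alternatives, scores, and weights are invented for illustration only:

```python
import numpy as np

# Three hypothetical highway design alternatives scored on four criteria
# (rows = alternatives, columns = criteria); all values are illustrative.
criteria = ["economic", "environmental", "social", "governance"]
scores = np.array([
    [0.9, 0.3, 0.5, 0.6],
    [0.6, 0.8, 0.7, 0.5],
    [0.7, 0.6, 0.6, 0.8],
])
weights = np.array([0.4, 0.3, 0.2, 0.1])
assert abs(weights.sum() - 1.0) < 1e-12   # weights must sum to one

totals = scores @ weights                 # simple additive weighting
best = int(np.argmax(totals))
print(np.round(totals, 3), "best option:", best)
```

The critique in the abstract is visible even in this sketch: the scores and weights are fixed snapshots, so any factor that changes over the project's life (traffic growth, material costs, policy) is invisible unless the matrix is re-evaluated dynamically.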

Keywords: dynamic impact factors, micro and macro factors, sustainability assessment framework, sustainability performance

Procedia PDF Downloads 135
7916 Using Bidirectional Encoder Representations from Transformers to Extract Topic-Independent Sentiment Features for Social Media Bot Detection

Authors: Maryam Heidari, James H. Jones Jr.

Abstract:

Millions of online posts about different topics and products are shared on popular social media platforms. One use of this content is to provide crowd-sourced information about a specific topic, event, or product. However, this use raises an important question: what percentage of the information available through these services is trustworthy? In particular, might some of this information be generated by a machine, i.e., a bot, instead of a human? Bots can be, and often are, purposely designed to generate enough volume to skew an apparent trend or position on a topic, yet the consumer of such content cannot easily distinguish a bot post from a human post. In this paper, we introduce a model for social media bot detection which uses Bidirectional Encoder Representations from Transformers (Google BERT) for sentiment classification of tweets to identify topic-independent features. Our use of a natural language processing approach to derive topic-independent features for our new bot detection model distinguishes this work from previous bot detection models. We achieve 94% accuracy classifying content as generated by a bot or a human, where the most accurate prior work achieved an accuracy of 92%.
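The role of topic-independent sentiment features can be illustrated with a toy stand-in: two summary features per account (mean and spread of per-tweet sentiment scores) and a nearest-centroid classifier. The feature distributions are invented and merely mimic the intuition that bots push a single sentiment with low variance; this is not the paper's BERT pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for BERT-derived sentiment features: each account is summarized
# by (mean sentiment, sentiment variance) over its tweets. Real BERT outputs
# would replace these two synthetic columns.
n = 300
human = np.column_stack([rng.normal(0.0, 0.6, n), rng.normal(1.0, 0.2, n)])
bot = np.column_stack([rng.normal(0.8, 0.3, n), rng.normal(0.3, 0.2, n)])
X = np.vstack([human, bot])
y = np.array([0] * n + [1] * n)   # 0 = human, 1 = bot

# Nearest-centroid classifier on the sentiment features
c0 = X[y == 0].mean(axis=0)
c1 = X[y == 1].mean(axis=0)
pred = (np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)).astype(int)
accuracy = (pred == y).mean()
print(round(accuracy, 3))
```

Because the features summarize sentiment behavior rather than vocabulary, the same classifier transfers across topics, which is the property the paper exploits.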

Keywords: bot detection, natural language processing, neural network, social media

Procedia PDF Downloads 111
7915 Promoting the Contractor's Reputation in the Nigerian Construction Industry

Authors: Abdulkadir Adamu Shehu

Abstract:

A company’s reputation is an elusive asset, and the reputation gained by a company must be preserved for its sustainability. However, construction firms still suffer reputational decline due to the factors that affect their reputation. This problem has led to loss of projects, project abandonment, and more, contributing to a negative impact on contractors in the construction industry. To date, previous studies have not investigated this issue. For that reason, this paper examines the factors that could promote a contractor's reputation in the construction industry in Nigeria. To achieve this aim, 140 questionnaires were distributed to Nigerian contractors. Based on the 67% response rate, descriptive analysis and analysis of variance (ANOVA) were applied to the data obtained. The results show that a good communication system and improved product quality are the most significant variables that can promote a contractor's reputation. The homogeneity analyses indicate that respondents' perceptions of the significant effects differ. The research concluded that a contractor's reputation in the construction industry must be maintained, and further research was suggested focusing on qualitative methods to gain in-depth knowledge of contractors' reputation in the construction industry.

Keywords: construction industry, contractor’s reputation, effects of delay, Nigeria

Procedia PDF Downloads 426
7914 Method of Nursing Education: History Review

Authors: Cristina Maria Mendoza Sanchez, Maria Angeles Navarro Perán

Abstract:

Introduction: Nursing as a profession, from its initial formation and after its development in practice, has been built and identified mainly from its technical competence and professionalization within the positivist approach of the XIX century that provides a conception of the disease built on the basis of to the biomedical paradigm, where the care provided is more focused on the physiological processes and the disease than on the suffering person understood as a whole. The main issue that is in need of study here is a review of the nursing profession's history to get to know how the nursing profession was before the XIX century. It is unclear if there were organizations or people with knowledge about looking after others or if many people survived by chance. The holistic care, in which the appearance of the disease directly affects all its dimensions: physical, emotional, cognitive, social and spiritual. It is not a concept from the 21st century. It is common practice, most probably since established life in this world, with the final purpose of covering all these perspectives through quality care. Objective: In this paper, we describe and analyze the history of education in nursing learning in terms of reviewing and analysing theoretical foundations of clinical teaching and learning in nursing, with the final purpose of determining and describing the development of the nursing profession along the history. Method: We have done a descriptive systematic review study, doing a systematically searched of manuscripts and articles in the following health science databases: Pubmed, Scopus, Web of Science, Temperamentvm and CINAHL. The selection of articles has been made according to PRISMA criteria, doing a critical reading of the full text using the CASPe method. 
To complement this, we read a range of historical and contemporary sources to support the review, such as the manuals of Florence Nightingale and John of God, as primary manuscripts establishing the origin of modern nursing and its professionalization. Ethical considerations of data processing were observed throughout. Results: After applying inclusion and exclusion criteria to our search of Pubmed, Scopus, Web of Science, Temperamentvm and CINAHL, we obtained 51 research articles, which we analyzed and classified by year of publication and type of study. The articles obtained show the importance of our background as a profession in public health before modern times, and the value of reviewing our past to face the challenges of the near future. Discussion: The important influence of key figures other than Nightingale has been overlooked, and it emerges that nursing management and the development of the professional body have a longer and more complex history than is generally accepted. Conclusions: There is a paucity of studies on the subject of this review from which to extract precise evidence and recommendations about nursing before modern times. Even so, as a representative trend, an increase in research on nursing history has been observed. In light of the aspects analyzed, the need for new research on the history of nursing emerges from this perspective, in order to foster studies of the historical construction of care before the XIX century and of the theories created then. We can affirm that bodies of knowledge and ways of caring were taught before the XIX century, but they were not called theories, as these concepts were created in modern times.

Keywords: nursing history, nursing theory, Saint John of God, Florence Nightingale, learning, nursing education

Procedia PDF Downloads 103
7913 The Effect of a Theoretical and Practical Training Program on Student Teachers’ Acquisition of Objectivity in Self-Assessments

Authors: Zilungile Sosibo

Abstract:

Constructivism in teacher education is growing tremendously in both the developed and developing world. Proponents of constructivism emphasize the active engagement of students in the teaching and learning process. In an effort to keep students engaged while they learn to learn, teachers use a variety of methods to incorporate constructivism into teaching-learning situations. One area with potential for realizing constructivism in the classroom is self-assessment. Sadly, students are rarely involved in the assessment of their own work; instead, the teacher, as the knowing authority, dominates this process. Involving student teachers in self-assessment has the potential to teach them to become objective assessors of their students’ work by the time they become credentialed. This is important, as objectivity in assessment is a much-needed skill in classroom contexts in which teachers deal with students from diverse backgrounds and in which biased assessments must be avoided at all costs. The purpose of the study presented in this paper was to investigate whether student teachers acquired the skills of administering self-assessments objectively after they had been immersed in a formal training program and had participated in four sets of self-assessments. The objectives were to determine the extent to which they had mastered the skills of objective self-assessment, their growth and development in this area, and the challenges they encountered in administering self-assessments objectively. The research question was: To what extent did student teachers acquire objectivity in self-assessments after their theoretical and practical engagement in this activity? Data were collected from student teachers through participant observation and semi-structured interviews. The design was a qualitative case study. The sample consisted of 39 final-year student teachers enrolled in a Bachelor of Education teacher education program at a university in South Africa.
Results revealed that the formal training program and participation in self-assessments had a minimal effect on students’ acquisition of objectivity in self-assessments, due to the factors associated with self-aggrandizement and hegemony, the latter resulting from gender, religious and racial differences. These results have serious implications for the need to incorporate self-assessments in the teacher-education curriculum, as well as for extended formal training programs for student teachers on assessment in general.

Keywords: objectivity, self-assessment, student teachers, teacher education curriculum

Procedia PDF Downloads 267
7912 Integration of Magnetoresistance Sensor in Microfluidic Chip for Magnetic Particles Detection

Authors: Chao-Ming Su, Pei-Sheng Wu, Yu-Chi Kuo, Yin-Chou Huang, Tan-Yueh Chen, Jefunnie Matahum, Tzong-Rong Ger

Abstract:

Magnetic particles (MPs) have been applied in the biomedical field for many years. This mediator offers many advantages, including high biocompatibility and a wide diversity of bio-applications. However, techniques for quantifying magnetically labeled sample assays are still rare. In this paper, a Wheatstone bridge giant magnetoresistance (GMR) sensor integrated with a homemade detection system was fabricated and used to quantify the concentration of MPs. The homemade detection system showed a high detection sensitivity of 10 μg/μl of MPs under optimized parameters: a vertical magnetic field of 100 G, a horizontal magnetic field of 2 G, and a flow rate of 0.4 ml/min.
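The detection principle rests on bridge imbalance: the stray field of captured particles changes the resistance of the GMR sensing arms. A minimal sketch of that relation follows; it is a generic Wheatstone-bridge expression, not the authors' actual circuit or calibration, and the resistance values are invented for illustration.

```python
def bridge_output(v_in, r1, r2, r3, r4):
    """Differential output voltage of a Wheatstone bridge.

    In a GMR bridge, the stray field of magnetic particles near the
    sensor surface changes the sensing-arm resistances, unbalancing
    the bridge and producing a measurable voltage.
    """
    return v_in * (r2 / (r1 + r2) - r4 / (r3 + r4))

# A balanced bridge gives zero output ...
v_balanced = bridge_output(5.0, 1000, 1000, 1000, 1000)

# ... while a 1% GMR resistance change in one arm produces a signal
# that grows with the particle concentration at the sensor.
v_signal = bridge_output(5.0, 1000, 1010, 1000, 1000)
```

Mapping such a voltage to a concentration in μg/μl would require an empirical calibration curve like the one obtained with the homemade detection system.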

Keywords: magnetic particles, magnetoresistive sensors, microfluidics, biosensor

Procedia PDF Downloads 396
7911 Bioeconomic Modeling for the Sustainable Exploitation of Three Key Marine Species in Morocco

Authors: I. Ait El Harch, K. Outaaoui, Y. El Foutayeni

Abstract:

This study aims to deepen the understanding and optimization of fishing activity in Morocco by integrating biological and economic aspects holistically. We develop a biological equilibrium model in which three competing species follow logistic natural growth, taking into account density dependence and competition between them. The integration of human intervention adds a realistic dimension to the model: a company specifically targets the three species, influencing population dynamics through its fishing activities. The aim of this work is to determine the fishing effort that maximizes the company’s profit, subject to the constraints associated with conserving ecosystem equilibrium.
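The three-species model with competition and complementarity constraints is beyond a short sketch, but the core idea of a profit-maximizing effort over a logistic stock can be illustrated with the classical single-species Gordon-Schaefer reduction. All parameter values below are illustrative, not taken from the study.

```python
def equilibrium_stock(E, r, K, q):
    """Logistic stock level sustained under constant fishing effort E."""
    return K * (1.0 - q * E / r)

def profit(E, r, K, q, p, c):
    """Sustained profit: revenue p*q*E*x at equilibrium, minus cost c*E."""
    return p * q * E * equilibrium_stock(E, r, K, q) - c * E

def optimal_effort(r, K, q, p, c):
    """Effort maximizing sustained profit, from d(profit)/dE = 0."""
    return r * (p * q * K - c) / (2.0 * p * q ** 2 * K)

# Illustrative parameters: growth rate, carrying capacity,
# catchability, price per unit catch, cost per unit effort.
r, K, q, p, c = 0.5, 1000.0, 0.01, 2.0, 5.0
E_star = optimal_effort(r, K, q, p, c)  # 18.75 for these values
```

In the paper's setting the same first-order conditions become a linear complementarity problem because the species interact and the ecological equilibrium constraints can bind.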

Keywords: bioeconomic modeling, optimization techniques, linear complementarity problem (LCP), biological equilibrium, profit maximization

Procedia PDF Downloads 11
7910 Cost-Effective Mechatronic Gaming Device for Post-Stroke Hand Rehabilitation

Authors: A. Raj Kumar, S. Bilaloglu

Abstract:

Stroke is a leading cause of adult disability worldwide. We depend on our hands for our activities of daily living (ADL). Although many patients regain the ability to walk, they continue to experience long-term hand motor impairments. As the number of younger individuals with stroke increases, there is a critical need for effective approaches to the rehabilitation of hand function post-stroke. Motor relearning for dexterity requires task-specific kinesthetic, tactile and visual feedback. However, when a stroke results in both sensory and motor impairment, it becomes difficult to ascertain when and what type of sensory substitution can facilitate motor relearning. In an ideal situation, real-time task-specific data on the ability to learn, and data-driven feedback to assist such learning, would greatly aid rehabilitation for dexterity. We have found that kinesthetic and tactile information from the unaffected hand can help patients relearn the use of optimal fingertip forces during a grasp-and-lift task. Measurement of fingertip grip force (GF), load force (LF), their corresponding rates (GFR and LFR), and other metrics can be used to gauge the impairment level and progress during learning. Currently, ATI mini force-torque sensors are used in research settings to measure and compute the LF, GF, and their rates while grasping objects of different weights and textures. Use of the ATI sensor is cost-prohibitive for deployment in clinical or at-home rehabilitation. We therefore developed a cost-effective mechatronic device that quantifies GF, LF, and their rates for stroke rehabilitation using off-the-shelf components such as load cells, flexi-force sensors, and an Arduino UNO microcontroller. A salient feature of the device is its integration with an interactive gaming environment to render a highly engaging user experience. This paper elaborates on the integration of kinesthetic and tactile sensing through the real-time computation of LF, GF and their corresponding rates, information processing, and interactive interfacing through augmented reality for visual feedback.
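The force rates (GFR, LFR) are, in essence, finite differences over the sampled force signals. A minimal sketch of that computation follows; the sampling rate and sample values are assumptions for illustration, not the authors' firmware.

```python
def force_rates(samples, dt):
    """Forward-difference rates of grip and load force.

    samples: list of (gf, lf) tuples in newtons, sampled every dt seconds.
    Returns (gfr, lfr), each of length len(samples) - 1, mirroring what a
    microcontroller loop would compute between consecutive readings.
    """
    gfr = [(samples[i + 1][0] - samples[i][0]) / dt
           for i in range(len(samples) - 1)]
    lfr = [(samples[i + 1][1] - samples[i][1]) / dt
           for i in range(len(samples) - 1)]
    return gfr, lfr

# Three readings of (grip force, load force) at an assumed 100 Hz
samples = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.5)]
gfr, lfr = force_rates(samples, dt=0.01)
# gfr is approximately [100, 100] N/s; lfr approximately [50, 100] N/s
```

In practice the raw load-cell and flexi-force readings would be low-pass filtered before differencing, since differentiation amplifies sensor noise.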

Keywords: feedback, gaming, kinesthetic, rehabilitation, tactile

Procedia PDF Downloads 238
7909 Teaching Linguistic Humour Research Theories: Egyptian Higher Education EFL Literature Classes

Authors: O. F. Elkommos

Abstract:

“Humour studies” is a relatively recent interdisciplinary research area. It interests researchers from the disciplines of psychology, sociology, medicine, nursing, workplace studies, and gender studies, among others, and certainly teaching, language learning, linguistics, and literature. Linguistic theories of humour research are numerous, some of which are of interest to the present study. Although humour courses are now taught in universities around the world, in the Egyptian context they are not included. The purpose of the present study is two-fold: to review the state of the art and to show how linguistic theories of humour can be used in the art and craft of teaching and learning in EFL literature classes. In the present study, linguistic theories of humour were applied to selected literary texts to interpret humour as an intrinsic artistic challenge to communicative competence. In linguistics, humour has been seen as a fifth component of the communicative competence of the second language learner; in literature, it has been studied as satire, irony, wit, or comedy. Linguistic theories of humour now describe its linguistic structure, mechanism, function, and linguistic deviance. The Semantic Script Theory of Humour (SSTH), the General Theory of Verbal Humour (GTVH), the Audience-Based Theory of Humour (ABTH), their extensions and subcategories, as well as the pragmatic perspective, were employed in the analyses. This research analysed the linguistic semantic structure of humour, its mechanism, and how the audience reader (teacher or learner) becomes an interactive interpreter of the humour. This promotes humour competence together with linguistic, social, cultural, and discourse communicative competence. Studying humour as part of literary texts, and perceiving its function in the work, also brings its positive associations into class for educational purposes. Humour is by default a provoking, laughter-generating device. Recognizing, perceiving, and resolving incongruity is a cognitive mastery. This cognitive process involves a humour experience that lightens up the classroom and the mind, and establishes connections necessary for the learning process. In this context, the study examined selected narratives to exemplify the application of the theories. It is, therefore, recommended that the theories be taught and applied to literary texts for a better understanding of the language. Students will then develop their language competence. Teachers in EFL/ESL classes will teach the theories, help students apply them to interpret texts, and in the process will also use humour themselves, thus easing students' acquisition of the second language and making the classroom an enjoyable, cheerful, self-assuring, and self-illuminating experience for both themselves and their students. It is further recommended that courses in humour research studies become an integral part of higher education curricula in Egypt.

Keywords: ABTH, deviance, disjuncture, episodic, GTVH, humour competence, humour comprehension, humour in the classroom, humour in the literary texts, humour research linguistic theories, incongruity-resolution, isotopy-disjunction, jab line, longer text joke, narrative story line (macro-micro), punch line, six knowledge resource, SSTH, stacks, strands, teaching linguistics, teaching literature, TEFL, TESL

Procedia PDF Downloads 297
7908 Urdu Text Extraction Method from Images

Authors: Samabia Tehsin, Sumaira Kausar

Abstract:

Due to the vast increase in multimedia data in recent years, efficient and robust retrieval techniques are needed to retrieve and index images and videos. Text embedded in images can serve as a strong retrieval cue, which is why text extraction is a research area receiving increasing attention. English text extraction has been the focus of many researchers, but far less work has been done on other languages like Urdu. This paper focuses on Urdu text extraction from video frames. It presents a text detection feature set with the ability to deal with most of the problems connected with the text extraction process. To test the validity of the method, it is evaluated on an Urdu news dataset, where it gives promising results.
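The abstract does not disclose its feature set, but a classic building block in this kind of caption-text detection is the projection profile, which localizes candidate text bands in a binarized frame. A toy sketch, under the assumption of a simple binary image where 1 marks an ink pixel:

```python
def horizontal_projection(binary_img):
    """Row-wise count of ink pixels (1 = text) in a binarized frame."""
    return [sum(row) for row in binary_img]

def text_rows(binary_img, threshold=1):
    """Indices of rows whose ink density reaches the threshold --
    a crude localization of candidate horizontal text bands."""
    return [i for i, s in enumerate(horizontal_projection(binary_img))
            if s >= threshold]

# A 4x4 toy frame with a two-row "caption" in the middle
img = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
]
bands = text_rows(img)  # [1, 2]
```

Real detectors for cursive scripts like Urdu combine several such cues (edge density, stroke width, temporal persistence across frames) rather than a single projection.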

Keywords: caption text, content-based image retrieval, document analysis, text extraction

Procedia PDF Downloads 509
7907 Design of DNA Origami Structures Using LAMP Products as a Combined System for the Detection of Extended Spectrum B-Lactamases

Authors: Kalaumari Mayoral-Peña, Ana I. Montejano-Montelongo, Josué Reyes-Muñoz, Gonzalo A. Ortiz-Mancilla, Mayrin Rodríguez-Cruz, Víctor Hernández-Villalobos, Jesús A. Guzmán-López, Santiago García-Jacobo, Iván Licona-Vázquez, Grisel Fierros-Romero, Rosario Flores-Vallejo

Abstract:

β-lactam antibiotics include some of the most frequently used small drug molecules against bacterial infections. Nevertheless, an alarming decrease in their efficacy has been reported due to the emergence of antibiotic-resistant bacteria. Infections caused by bacteria expressing extended-spectrum β-lactamases (ESBLs) are difficult to treat and account for higher morbidity and mortality rates, delayed recovery, and a high economic burden. According to the Global Report on Antimicrobial Resistance Surveillance, mortality due to resistant bacteria is estimated to rise to 10 million cases per year worldwide. These facts highlight the importance of developing low-cost, readily accessible methods for detecting drug-resistant ESBL-producing bacteria, to prevent their spread and promote accurate and fast diagnosis. Bacterial detection is commonly done using molecular diagnostic techniques, where PCR stands out for its high performance. However, this technique requires specialized equipment that is not available everywhere, is time-consuming, and has a high cost. Loop-Mediated Isothermal Amplification (LAMP) is an alternative technique that works at a constant temperature, significantly decreasing the equipment cost. It yields double-stranded DNA products of several lengths containing repetitions of the target DNA sequence. Although positive and negative LAMP results can be discriminated by colorimetry, fluorescence, and turbidity, there is still large room for improvement in point-of-care implementation. DNA origami is a technique that allows the formation of 3D nanometric structures by folding a large single-stranded DNA (scaffold) into a determined shape with the help of short DNA sequences (staples), which hybridize with the scaffold. This research aimed to generate DNA origami structures using LAMP products as scaffolds to improve the sensitivity of ESBL detection in point-of-care diagnosis. For this study, the coding sequence of the CTX-M-15 ESBL of E. coli was used to generate the LAMP products. The set of LAMP primers was designed using PrimerExplorer V5. As a result, a target sequence of 200 nucleotides from CTX-M-15 ESBL was obtained. Afterward, eight different DNA origami structures were designed from the target sequence in caDNAno and analyzed with CanDo to evaluate the stability of the 3D structures. The designs were constructed minimizing the total number of staples, to reduce cost and complexity for point-of-care applications. After analyzing the DNA origami designs, two structures were selected: the first a zig-zag flat structure, the second a wall-like shape. Given the sequence repetitions in the scaffold, both could be assembled with only six different staples each, ranging from 18 to 80 nucleotides. Simulations of both structures were performed using scaffolds of different sizes, yielding stable structures in all cases. The generation of the LAMP products was verified by colorimetry and electrophoresis, and the formation of the DNA structures was analyzed using electrophoresis and colorimetry. Modeling novel detection methods with bioinformatics tools allows reliable control and prediction of results. To our knowledge, this is the first study that combines LAMP products and DNA origami to detect ESBL-producing bacterial strains, which represents a promising methodology for point-of-care diagnosis.
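The key design step, staples hybridizing with a repetitive LAMP-derived scaffold, can be illustrated with a toy complementarity check. The sequences below are invented for illustration, not the real amplicon or the authors' staples.

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Watson-Crick reverse complement of a DNA sequence."""
    return "".join(COMPLEMENT[b] for b in reversed(seq))

def staple_sites(scaffold, staple):
    """Start positions where the staple can hybridize with the scaffold,
    i.e. where the scaffold contains the staple's reverse complement.
    The repeat structure of LAMP concatemers gives one staple several
    binding sites, which is what keeps the staple count low."""
    target = reverse_complement(staple)
    return [i for i in range(len(scaffold) - len(target) + 1)
            if scaffold[i:i + len(target)] == target]

scaffold = "ATGCGTATGCGT"              # toy two-repeat "LAMP product"
staple = reverse_complement("ATGCGT")  # staple binding each repeat unit
sites = staple_sites(scaffold, staple)  # one site per repeat: [0, 6]
```

Real design tools additionally score melting temperature, crossover geometry, and secondary structure, which is why caDNAno and CanDo were used rather than sequence matching alone.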

Keywords: beta-lactamases, antibiotic resistance, DNA origami, isothermal amplification, LAMP technique, molecular diagnosis

Procedia PDF Downloads 215
7906 System Identification and Quantitative Feedback Theory Design of a Lathe Spindle

Authors: M. Khairudin

Abstract:

This paper investigates system identification and quantitative feedback theory (QFT) design for the robust control of a lathe spindle. The dynamics of the lathe spindle are uncertain and time-varying due to variations in cutting depth during the cutting process. System identification was used to obtain a dynamic model of the lathe spindle. In this work, real-time system identification is used to construct linear models of the nonlinear system; these linear models and their uncertainty bounds can then be used for controller synthesis. The real-time nonlinear system identification process yields a set of linear models of the lathe spindle that represents the operating ranges of the dynamic system. With a selected input signal, the output response data are acquired, and system identification is performed using Matlab to obtain a linear model of the system. Practical design steps are presented in which the QFT-based conditions are formulated to obtain a compensator and pre-filter to control the lathe spindle. The performance of the proposed controller is evaluated in terms of the velocity responses of the lathe machine spindle, incorporating depth variations in the cutting process.
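The identification step, fitting a linear model to measured input/output data, can be sketched with an ordinary least-squares fit of a first-order ARX model. The paper uses Matlab and presumably higher-order models; this pure-Python version, with invented input data, is only a didactic reduction of the same principle.

```python
def identify_arx(u, y):
    """Least-squares fit of y[k] = a*y[k-1] + b*u[k-1] (first-order ARX).

    Solves the 2x2 normal equations by Cramer's rule; a spindle model
    would use higher orders and a toolbox, but the principle is the same.
    """
    syy = suu = syu = sy1y = su1y = 0.0
    for k in range(1, len(y)):
        syy += y[k - 1] * y[k - 1]    # sum of y[k-1]^2
        suu += u[k - 1] * u[k - 1]    # sum of u[k-1]^2
        syu += y[k - 1] * u[k - 1]    # cross term of the regressors
        sy1y += y[k - 1] * y[k]       # regressor-output correlations
        su1y += u[k - 1] * y[k]
    det = syy * suu - syu * syu
    a = (sy1y * suu - su1y * syu) / det
    b = (su1y * syy - sy1y * syu) / det
    return a, b

# Simulate a known system with an invented excitation, then recover it
a_true, b_true = 0.8, 0.5
u = [1.0, -1.0, 0.5, 1.5, -0.5, 1.0, 0.2, -1.2]
y = [0.0]
for k in range(1, len(u)):
    y.append(a_true * y[k - 1] + b_true * u[k - 1])
a_est, b_est = identify_arx(u, y)  # recovers 0.8, 0.5 on noise-free data
```

Running such a fit over a sliding window of recent samples at each operating point is one simple way to obtain the set of linear models whose spread defines the uncertainty templates QFT needs.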

Keywords: lathe spindle, QFT, robust control, system identification

Procedia PDF Downloads 539