Search results for: manual testing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 678

138 Assessment of Impact of Urbanization in High Mountain Urban Watersheds

Authors: D. M. Rey, V. Delgado, J. Zambrano Nájera

Abstract:

Increases in urbanization during the twentieth century have changed the natural dynamics of basins, resulting in higher runoff volumes, peak flows and flow velocities, which in turn increase flood risk. Higher runoff volumes reduce the hydraulic capacity of sewerage networks and can cause them to fail. This in turn generates increasingly recurrent floods, causing mobility problems and general economic detriment in cities. In Latin America, and especially Colombia, this is a major problem: by the late twentieth century more than 70% of the population lived in urban areas, after the urban population grew by approximately 790% between 1940 and 1990. In addition, the steep slopes produced by Andean topography and the high precipitation typical of tropical climates increase velocities and volumes even further, bringing cities to a standstill during storms. It is therefore very important to understand the hydrological behavior of Andean urban watersheds. This research aims to determine the impact of urbanization on the hydrology of steep urban watersheds. To this end, the experimental Palogrande-San Luis urban watershed, located in the city of Manizales, Colombia, is used as the study area. Manizales is a city in central-western Colombia, located in the Colombian Central Mountain Range (part of the Andes) with abrupt topography (average altitude of 2,153 m). The climate in Manizales is quite uniform, but due to its high altitude the city receives high precipitation (1,545 mm/year on average) with high humidity (83% on average). The HEC-HMS hydrologic model was applied to the watershed. The inputs to the model were derived from Geographic Information Systems (GIS) theme layers of the Instituto de Estudios Ambientales (IDEA, Institute of Environmental Studies) of Universidad Nacional de Colombia, Manizales, and from aerial photography taken for the research, in conjunction with available literature and look-up tables. Rainfall data from a network of 4 rain gauges and historical streamflow data were used to calibrate and validate runoff depth with the hydrologic model. Manual calibration was performed, and the simulation results show that the selected model is able to characterize the runoff response of the watershed to urban land use in high mountain watersheds.
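For readers unfamiliar with the rainfall-runoff transformation that such a model calibrates, the sketch below illustrates the SCS curve-number relation, one of the loss methods offered by HEC-HMS; the storm depth and curve numbers are hypothetical values chosen only for illustration, not data from the Palogrande-San Luis watershed.

    # Minimal sketch of the SCS curve-number (CN) rainfall-runoff relation, one of the
    # loss methods offered by HEC-HMS. All numbers below are illustrative assumptions,
    # not measured values from the Palogrande-San Luis watershed.

    def scs_runoff_depth(rain_mm: float, curve_number: float) -> float:
        """Direct runoff depth (mm) for a storm of depth rain_mm over a basin with a given CN."""
        s = 25400.0 / curve_number - 254.0      # potential maximum retention (mm)
        ia = 0.2 * s                            # initial abstraction (mm), standard assumption
        if rain_mm <= ia:
            return 0.0
        return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

    # Urbanization is usually represented by raising the composite curve number,
    # which increases runoff for the same storm.
    storm_mm = 60.0                             # hypothetical 60 mm storm
    for label, cn in [("pre-urban (pasture/forest mix)", 70.0),
                      ("urbanized (impervious surfaces)", 88.0)]:
        q = scs_runoff_depth(storm_mm, cn)
        print(f"{label}: CN={cn:.0f} -> runoff depth {q:.1f} mm")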

Keywords: Andean watersheds modelling, high mountain urban hydrology, urban planning, hydrologic modelling

Procedia PDF Downloads 211
137 A Pilot Study on the Development and Validation of an Instrument to Evaluate Inpatient Beliefs, Expectations and Attitudes toward Reflexology (IBEAR)-16

Authors: Samuel Attias, Elad Schiff, Zahi Arnon, Eran Ben-Arye, Yael Keshet, Ibrahim Matter, Boker Lital Keinan

Abstract:

Background: Despite the extensive use of manual therapies, reflexology in particular, no validated tools have been developed to evaluate patients' beliefs, attitudes and expectations regarding reflexology. Such tools, however, are essential to improve the results of reflexology treatment by better adjusting it to the patients' attitudes and expectations. The tool also enables assessing correlations with clinical results of interventional studies using reflexology. Methods: The IBEAR (Inpatient Beliefs, Expectations and Attitudes toward Reflexology) tool contains 25 questions (8 demographic and 17 specifically addressing reflexology) and was constructed in several stages: brainstorming by a multidisciplinary team of experts; evaluation of each of the proposed questions by the expert team; and assessment of the experts' degree of agreement for each question, based on a 1-7 Likert scale (1 = do not agree at all; 7 = agree completely). Cronbach's alpha was computed to evaluate the questionnaire's reliability, while factor analysis was used for further validation (228 patients). The questionnaire was tested and re-tested (48 h) on a group of 199 patients to assure clarity and reliability, using the Pearson coefficient and the Kappa test, and was modified based on these results into its final form. Results: After its construction, the IBEAR questionnaire passed the expert group's preliminary consensus, evaluation of the questions' clarity (from 5.1 to 7.0), inner validation (from 5.5 to 7) and structural validation (from 5.5 to 6.75). Factor analysis pointed to two content domains: four questions addressing attitudes and expectations versus five questions on beliefs and attitudes. From the 221 questionnaires collected, Cronbach's alpha was calculated on the nine questions relating to beliefs, expectations, and attitudes regarding reflexology. This measure stood at 0.716 (satisfactory reliability). At the test-retest stage, 199 research participants filled in the questionnaire a second time. The Pearson coefficient for all questions ranged between 0.73 and 0.94 (good to excellent reliability). For dichotomous answers, Kappa scores ranged between 0.66 and 1.0 (moderate to high). One of the questions was removed from the IBEAR following questionnaire validation. Conclusions: The present study provides evidence that the proposed IBEAR-16 questionnaire is a valid and reliable tool for the characterization of potential reflexology patients and may be effectively used in settings which include the evaluation of inpatients' beliefs, expectations, and attitudes toward reflexology.
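As an illustration of the reliability statistic reported above, the following sketch computes Cronbach's alpha for a set of Likert-scored items; the small response matrix is invented purely to show the calculation and is not IBEAR data.

    # Illustrative computation of Cronbach's alpha for Likert-scored questionnaire items.
    # The response matrix below is invented for demonstration; it is not IBEAR data.
    import numpy as np

    def cronbach_alpha(responses: np.ndarray) -> float:
        """responses: rows = respondents, columns = items (e.g., the nine belief/attitude items)."""
        k = responses.shape[1]                         # number of items
        item_vars = responses.var(axis=0, ddof=1)      # variance of each item
        total_var = responses.sum(axis=1).var(ddof=1)  # variance of the summed scale
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Five hypothetical respondents answering four items on a 1-7 Likert scale.
    demo = np.array([[6, 5, 6, 7],
                     [4, 4, 5, 4],
                     [7, 6, 6, 6],
                     [3, 4, 3, 4],
                     [5, 5, 6, 5]])
    print(f"Cronbach's alpha = {cronbach_alpha(demo):.3f}")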

Keywords: reflexology, attitude, expectation, belief, CAM, inpatient

Procedia PDF Downloads 208
136 Safer Staff: A Survey of Staff Experiences of Violence and Aggression at Work in Coventry and Warwickshire Partnership National Health Service Trust

Authors: Rupinder Kaler, Faith Ndebele, Nadia Saleem, Hafsa Sheikh

Abstract:

Background: Workplace-related violence and aggression seem to be considered an acceptable occupational hazard for staff in mental health services. There is literature evidence that healthcare workers in mental health settings are at higher risk of aggression from patients. Aggressive behaviours pose a physical and psychological threat to psychiatric staff and can result in stress, burnout, sickness, and exhaustion. Further evidence shows that health professionals are among the most exposed to psychological disorders such as anxiety, depression and post-traumatic stress disorder. Fear that results from working in a dangerous environment, together with exhaustion, can have a damaging impact on patient care and the healthcare relationship. Aim: The aim of this study is to investigate the prevalence and impact of aggressive behaviour on staff working at Coventry and Warwickshire Partnership Trust (CWPT). Methodology: The study used a manual, anonymised, multi-disciplinary, cross-sectional survey questionnaire across all clinical and non-clinical staff at CWPT, from both inpatient and community settings. Findings: The unsurprising finding was a higher prevalence of aggressive behaviours towards inpatient staff in comparison to community staff. Conclusion: There is a high rate of verbal and physical aggression at work, and this has a negative impact on staff emotional and physical well-being. There is also a higher reliance on colleagues for support on an informal basis than on formal organisational support systems. Recommendations: A workforce that is well and functioning is the biggest resource for an organisation. Staff safety during working hours is everyone's responsibility and sits with both individual staff members and the organisation. Post-incident organisational support needs to be consolidated, and hands-on, timely support offered to help maintain emotionally well staff at CWPT. The authors recommend the development of preventative and practical protocols for aggression, with patient and carer involvement.

Keywords: safer staff, survey of staff experiences, violence and aggression, mental health

Procedia PDF Downloads 179
135 Sequential Mixed Methods Study to Examine the Potentiality of Blackboard-Based Collaborative Writing as a Solution Tool for Saudi Undergraduate EFL Students’ Writing Difficulties

Authors: Norah Alosayl

Abstract:

English is considered the most important foreign language in the Kingdom of Saudi Arabia (KSA) because of its usefulness as a global language compared to Arabic. As students' desire to improve their English language skills has grown, writing has been identified as the most difficult problem for Saudi students in their language learning. Although English is taught in Saudi Arabia from the seventh grade, many students have problems at the university level, especially in writing, due to a gap between what is taught in secondary and high schools and university expectations: pupils generally study English at school from a single book with few vocabulary and grammar exercises, and there are no specific writing lessons. Moreover, personal teaching experience at King Saud bin Abdulaziz University shows that students face real problems with their writing. This paper revolves around Blackboard-based collaborative writing to help first-year undergraduate Saudi EFL students, enrolled in two sections of ENGL 101 in the first semester of 2021 at King Saud bin Abdulaziz University, practice the skill they find most difficult in their writing through small groups. Therefore, a sequential mixed methods design is suited to this study. The first phase of the study aims to highlight the most difficult skill experienced by students in an official writing exam that is evaluated by their teachers with an official rubric used at King Saud bin Abdulaziz University. In the second phase, the study investigates the benefits of social interaction on the process of learning writing. Students will be provided with five collaborative writing tasks via the discussion feature on Blackboard to practice the skill they found difficult in writing. The tasks will be designed based on social constructivist theory and pedagogic frameworks. The interaction will take place between peers and their teachers. The frequency of students' participation and the quality of their interaction will be observed through manual counting and screenshots. This will help the researcher understand how actively students work on the task through the amount of their participation and will also distinguish the type of interaction (on task, about task, or off task). Semi-structured interviews will be conducted with students to understand their perceptions of the Blackboard-based collaborative writing tasks, and questionnaires will be distributed to identify students' attitudes toward the tasks.

Keywords: writing difficulties, blackboard-based collaborative writing, process of learning writing, interaction, participation

Procedia PDF Downloads 166
134 The Compliance of Safe-Work Behaviors among Undergraduate Nursing Students with Different Clinical Experiences

Authors: K. C. Wong, K. L. Siu, S. N. Ng, K. N. Yip, Y. Y. Yuen, K. W. Lee, K. W. Wong, C. C. Li, H. P. Lam

Abstract:

Background: Previous studies have found occupational injuries in the nursing profession to be related to repeated bedside nursing care, such as transferring, lifting and manually handling patients. Likewise, undergraduate nursing students are exposed to potential safety hazards because their work is similar in nature to that of registered nurses. In particular, students who work as temporary undergraduate nursing students (TUNS), a part-time clinical job in Hong Kong hospitals that mainly involves assisting with bedside care, appear to be at high risk of work-related injuries. Several studies have suggested that the level of compliance with safe work behaviors is highly associated with work-related injuries, yet this has rarely been studied among nursing students. This study was conducted to assess and compare compliance with safe work behaviors and the levels of awareness of different workplace safety issues between undergraduate nursing students with and without TUNS experience. Methods: This is a quantitative descriptive study using convenience sampling. 362 undergraduate nursing students in Hong Kong were recruited. The Safe Work Behavior relating to Patient Handling (SWB-PH) scale was used to assess their compliance with safe work behaviors and their level of awareness of different workplace safety issues. Results: Most of the participants (n=250, 69.1%) were working as TUNS. However, students who worked as TUNS had significantly lower safe-work behavior compliance (mean SWB-PH score = 3.64±0.54) than those who did not work as TUNS (mean SWB-PH score = 4.21±0.54) (p<0.001). In particular, these students had higher awareness of seeking help and using assistive devices but lower awareness of workplace safety issues and of proper work posture than students without TUNS experience. The higher engagement of students with TUNS experience in help-seeking behaviors might be explained by their richer clinical experience, which served as a facilitator to seek help from clinical staff whenever necessary. Besides, these experienced students were more likely to accept the risk of occupational injury and to work alone when no aid was available, which might be related to the busy working environment, heightened work pressures and the high expectations placed on TUNS. Eventually, students who work as TUNS may focus on completing the assigned tasks and gradually neglect occupational safety. Conclusion: The findings contribute to an understanding of the level of compliance with safe work behaviors among nursing students with different clinical experiences. The results might guide the modification of current safety protocols and the introduction of multiple clinical training courses to improve nursing students' engagement in safe work behaviors.
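A minimal sketch of the kind of between-group comparison reported above (mean SWB-PH scores of students with and without TUNS experience) is shown below; the generated score samples only mirror the reported means and spread and are not the study data.

    # Sketch of an independent-samples comparison of mean SWB-PH scores between students
    # with and without TUNS experience. The generated scores are illustrative, not study data.
    import numpy as np
    from scipy import stats

    tuns = np.random.default_rng(1).normal(loc=3.64, scale=0.54, size=250)     # hypothetical TUNS group
    no_tuns = np.random.default_rng(2).normal(loc=4.21, scale=0.54, size=112)  # hypothetical non-TUNS group

    t, p = stats.ttest_ind(tuns, no_tuns, equal_var=False)  # Welch's t-test
    print(f"mean(TUNS)={tuns.mean():.2f}, mean(non-TUNS)={no_tuns.mean():.2f}, t={t:.2f}, p={p:.3g}")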

Keywords: Occupational safety, Safety compliance, Safe-work behavior, Nursing students

Procedia PDF Downloads 118
133 Different Processing Methods to Obtain a Carbon Composite Element for Cycling

Authors: Maria Fonseca, Ana Branco, Joao Graca, Rui Mendes, Pedro Mimoso

Abstract:

The present work is focused on the production of a carbon composite element for cycling through different techniques, namely blow-molding and high-pressure resin transfer molding (HP-RTM). The main objective of this work is to compare both processes for producing carbon composite elements for the cycling industry. It is well known that carbon composite components for cycling are produced mainly through blow-molding; however, this technique depends strongly on manual labour, resulting in a time-consuming production process. Comparatively, HP-RTM offers a more automated process, which should lead to higher production rates. Nevertheless, the elements produced through both techniques must be compared in order to assess whether the final products comply with the required standards of the industry. The main difference between the techniques lies in the material used. Blow-molding uses carbon prepreg (carbon fibres pre-impregnated with a resin system), and the material is laid up by hand, piece by piece, on a mould or on a hard male tool. After that, the material is cured at high temperature. In the HP-RTM technique, on the other hand, dry carbon fibres are placed in a mould, and resin is then injected at high pressure. After some research regarding the best material systems (prepregs and braids) and suppliers, an element (similar to a handlebar) was designed for construction. The next step was to perform FEM simulations in order to determine the best layup of the composite material. The simulations were done for the prepreg material, and the obtained layup was transposed to the braids. The material selected for the blow-molding technique was a prepreg with T700 carbon fibre (24K) and an epoxy resin system. For HP-RTM, carbon fibre elastic UD tubes and ±45° braids were used, with both 3K and 6K filaments per tow, and the resin system was also an epoxy. After the simulations for the prepreg material, the optimized layup was [45°, -45°, 45°, -45°, 0°, 0°]. For HP-RTM, the transposed layup was [±45° (6K); 0° (6K); partial ±45° (6K); partial ±45° (6K); ±45° (3K); ±45° (3K)]. The mechanical tests showed that both elements can withstand the maximum load (in this case, 1000 N); however, the one produced through blow-molding can support higher loads (≈1300 N against 1100 N for HP-RTM). Regarding the fibre volume fraction (FVF), the HP-RTM element has a slightly higher value (>61% compared to 59% for the blow-molding technique). Optical microscopy showed that both elements have a low void content. In conclusion, the elements produced using HP-RTM are comparable to the ones produced through blow-molding, both in mechanical testing and in visual aspect. Nevertheless, there is still room for improvement in the HP-RTM elements, since the layup of the braids and UD tubes could be optimized.
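To illustrate the kind of layup evaluation behind the FEM phase, the sketch below assembles the in-plane stiffness (A) matrix of the optimized prepreg layup [45°, -45°, 45°, -45°, 0°, 0°] using classical lamination theory; the ply constants and thickness are typical textbook values assumed for a T700/epoxy prepreg, not the properties used in the study's models.

    # Sketch: in-plane laminate stiffness (A-matrix) for the optimized prepreg layup
    # [45, -45, 45, -45, 0, 0] using classical lamination theory. Ply properties and
    # thickness are typical textbook values assumed for a T700/epoxy prepreg.
    import numpy as np

    E1, E2, G12, nu12 = 130e9, 8e9, 4e9, 0.30      # Pa, assumed unidirectional ply constants
    t_ply = 0.15e-3                                 # m, assumed cured ply thickness
    nu21 = nu12 * E2 / E1
    den = 1.0 - nu12 * nu21
    Q = np.array([[E1 / den, nu12 * E2 / den, 0.0],
                  [nu12 * E2 / den, E2 / den, 0.0],
                  [0.0, 0.0, G12]])                 # reduced stiffness in ply axes

    def q_bar(theta_deg: float) -> np.ndarray:
        """Transform the ply stiffness Q to the laminate axes for a ply at theta degrees."""
        c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
        T = np.array([[c*c, s*s, 2*c*s],
                      [s*s, c*c, -2*c*s],
                      [-c*s, c*s, c*c - s*s]])
        R = np.diag([1.0, 1.0, 2.0])                # Reuter matrix (engineering shear strain)
        return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

    layup = [45, -45, 45, -45, 0, 0]
    A = sum(q_bar(angle) * t_ply for angle in layup)  # in-plane stiffness matrix (N/m)
    print(np.round(A / 1e6, 1))                       # MN/m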

Keywords: HP-RTM, carbon composites, cycling, FEM

Procedia PDF Downloads 105
132 Double Wishbone Pushrod Suspension Systems Co-Simulation for Racing Applications

Authors: Suleyman Ogul Ertugrul, Mustafa Turgut, Serkan Inandı, Mustafa Gorkem Coban, Mustafa Kıgılı, Ali Mert, Oguzhan Kesmez, Murat Ozancı, Caglar Uyulan

Abstract:

In high-performance automotive engineering, the realistic simulation of suspension systems is crucial for enhancing vehicle dynamics and handling. This study focuses on the double wishbone suspension system, prevalent in racing vehicles due to its superior control and stability characteristics. Utilizing MATLAB and Adams Car simulation software, we conduct a comprehensive analysis of displacement behaviors and damper sizing under various dynamic conditions. The initial phase involves using MATLAB to simulate the entire suspension system, allowing for the preliminary determination of damper size based on the system's response under simulated conditions. Following this, manual calculations of wheel loads are performed to assess the forces acting on the front and rear suspensions during scenarios such as braking, cornering, maximum vertical loads, and acceleration. Further dynamic force analysis is carried out using MATLAB Simulink, focusing on the interactions between suspension components during key movements such as bumps and rebounds. This simulation helps in formulating precise force equations and in calculating the stiffness of the suspension springs. To enhance the accuracy of our findings, we focus on a detailed kinematic and dynamic analysis. This includes the creation of kinematic loops, derivation of relevant equations, and computation of Jacobian matrices to accurately determine damper travel and compression metrics. The calculated spring stiffness is crucial in selecting appropriate springs to ensure optimal suspension performance. To validate and refine our results, we replicate the analyses using the Adams Car software, renowned for its detailed handling of vehicular dynamics. The goal is to achieve a robust, reliable suspension setup that maximizes performance under the extreme conditions encountered in racing scenarios. This study exemplifies the integration of theoretical mechanics with advanced simulation tools to achieve a high-performance suspension setup that can significantly improve race car performance, providing a methodology that can be adapted for different types of racing vehicles.
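The sketch below illustrates the manual wheel-load step described above, namely quasi-static longitudinal load transfer under braking and lateral transfer while cornering; the vehicle parameters are hypothetical FSAE-like values assumed only for illustration, not the ones used in the study.

    # Sketch of the manual wheel-load step: quasi-static longitudinal load transfer under
    # braking and simplified lateral transfer while cornering. All parameters are assumed.
    g = 9.81
    m = 300.0              # kg, car plus driver (assumed)
    wheelbase = 1.6        # m (assumed)
    track = 1.2            # m (assumed)
    h_cg = 0.30            # m, centre-of-gravity height (assumed)
    front_fraction = 0.45  # static front weight distribution (assumed)

    front_static = front_fraction * m * g
    rear_static = (1.0 - front_fraction) * m * g

    def axle_loads_braking(decel_g: float):
        """Front/rear axle loads (N) under straight-line braking at decel_g * g."""
        transfer = m * g * decel_g * h_cg / wheelbase        # longitudinal load transfer
        return front_static + transfer, rear_static - transfer

    def wheel_loads_cornering(lat_g: float, axle_load: float):
        """Outer/inner wheel loads (N) on one axle, ignoring roll stiffness distribution."""
        transfer = axle_load * lat_g * h_cg / track          # simplified lateral load transfer
        return axle_load / 2 + transfer, axle_load / 2 - transfer

    front_brk, rear_brk = axle_loads_braking(1.5)            # hypothetical 1.5 g braking
    outer, inner = wheel_loads_cornering(1.8, front_static)  # hypothetical 1.8 g cornering
    print(f"braking 1.5 g: front axle {front_brk:.0f} N, rear axle {rear_brk:.0f} N")
    print(f"cornering 1.8 g: outer front wheel {outer:.0f} N, inner front wheel {inner:.0f} N")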

Keywords: FSAE, suspension system, Adams Car, kinematic

Procedia PDF Downloads 21
131 On Grammatical Metaphors: A Corpus-Based Reflection on the Academic Texts Written in the Field of Environmental Management

Authors: Masoomeh Estaji, Ahdie Tahamtani

Abstract:

Considering the necessity of conducting research and publishing academic papers during Master's and Ph.D. programs, graduate students are in dire need of improving their writing skills, through either writing courses or self-study planning. One key feature that could help academic papers look more sophisticated is the application of grammatical metaphors (GMs). These metaphors represent 'non-congruent' and 'implicit' ways of decoding meaning, through which one grammatical category is replaced by another, more implied counterpart, which can also alter the readers' understanding of the text. Although a number of studies have been conducted on the application of GMs across various disciplines, almost none has been devoted to the field of environmental management, and the scope of the previous studies has been relatively limited compared to the present work. In the current study, attempts were made to analyze the different types of GMs used in academic papers published in top-tier journals in the field of environmental management, and to compile a list of the most frequently used GMs based on their functions in this particular discipline, in order to make the teaching of academic writing courses more explicit and the composition of academic texts more well-structured. To fulfill these purposes, a corpus-based analysis based on the two theoretical models of Martin et al. (1997) and Liardet (2014) was run. Through two stages of manual analysis and concordancers, ten recent academic articles comprising 132,490 words, published in two prestigious journals, were precisely scrutinized. The results showed that, across the IMRaD sections of the articles, material processes were the most frequent type of ideational GM. The relational and mental categories ranked second and third, respectively. Regarding the use of interpersonal GMs, objective expanding metaphors were the most numerous. In contrast, subjective interpersonal metaphors, whether expanding or contracting, were the least significant. This suggests that scholars in the field of environmental management tend to focus on the main procedures and explain technical phenomena in detail, rather than to compare and contrast other statements and subjective beliefs. Moreover, since no instances of verbal ideational metaphors were detected, it could be deduced that the act of 'saying or articulating' something might be against the standards of the academic genre. Another assumption would be that the application of ideational GMs is context-embedded and that the more technical they are, the less frequent they become. For further studies, it is suggested that the employment of GMs be studied in a wider scope and in other disciplines, and that the third type of GM, known as 'textual' metaphors, be included as well.
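Since concordancers are part of the workflow described above, the sketch below shows a minimal keyword-in-context routine of the kind such tools provide when locating candidate GMs (e.g., nominalizations); the sample sentence and the target word list are invented for illustration.

    # Minimal keyword-in-context (KWIC) routine of the kind a concordancer provides when
    # locating candidate grammatical metaphors (e.g., nominalizations) in a corpus.
    # The sample text and the target word list are invented for illustration.
    import re

    def kwic(text: str, targets: set[str], window: int = 4):
        """Yield (left context, keyword, right context) for every target token."""
        tokens = re.findall(r"\w+|[^\w\s]", text.lower())
        for i, tok in enumerate(tokens):
            if tok in targets:
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                yield left, tok, right

    sample = ("The implementation of the monitoring programme led to a reduction "
              "of emissions, and this reduction supported further assessment.")
    for left, kw, right in kwic(sample, {"implementation", "reduction", "assessment"}):
        print(f"{left:>35} | {kw:^15} | {right}")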

Keywords: English for specific purposes, grammatical metaphor, academic texts, corpus-based analysis

Procedia PDF Downloads 146
130 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting

Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero

Abstract:

In this work, two ways of creating small-sized metal sculptures are proposed: the first by means of microcasting and the second by electroforming, from models printed in 3D using an FDM (Fused Deposition Modeling) printer or a DLP (Digital Light Processing) printer. It is viable to replace the wax in artistic foundry processes with 3D printed objects. In this technique, the digital models are manufactured with a low-cost FDM 3D printer in polylactic acid (PLA). This material is used because its properties make it a viable substitute for wax within artistic casting processes using the lost-wax, ceramic shell technique. This technique consists of covering a sculpture made of wax, or in this case PLA, with several layers of thermoresistant material. This material is heated to melt the PLA, obtaining an empty mold that is later filled with the molten metal. It is verified that the PLA models reduce cost and time compared with hand modeling of the wax. In addition, parts can be manufactured with 3D printing that are not possible to create with manual techniques. However, the sculptures created with this technique have a size limit. The problem is that when pieces printed in PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP printer allows obtaining smaller and more detailed pieces than FDM. Such small models are quite difficult and complex to melt using the lost-wax, ceramic shell technique. As an alternative, there are microcasting and electroforming, which specialize in creating small metal pieces such as jewelry. Microcasting is a variant of lost-wax casting that consists of placing the model in a cylinder into which the refractory material is also poured. The molds are heated in an oven to melt the model and fire them. Finally, the metal is poured into the still-hot cylinders, which rotate in a machine at high speed to distribute the metal properly. Because microcasting requires expensive material and machinery to melt a piece of metal, electroforming is an alternative to this process. Electroforming uses models in different materials; for this study, micro-sculptures printed in 3D are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for using 3D printers, both with PLA and resin, and first tests are being done to validate the electroforming process on micro-sculptures printed in resin using a DLP printer.

Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling

Procedia PDF Downloads 110
129 Mechanical Characterization and CNC Rotary Ultrasonic Grinding of Crystal Glass

Authors: Ricardo Torcato, Helder Morais

Abstract:

The manufacture of crystal glass parts is based on obtaining the rough geometry by blowing and/or injection, generally followed by a set of manual finishing operations using cutting and grinding tools. The forming techniques used do not allow parts with complex shapes to be obtained with repeatability, and the finishing operations use intensive specialized labor, resulting in high cycle times and production costs. This work aims to explore the digital manufacture of crystal glass parts by investigating new subtractive techniques for the automated, flexible finishing of these parts. Finishing operations are essential to respond to customer demands in terms of crystal feel and shine. It is intended to investigate the applicability to crystal processing of different computerized finishing technologies, namely milling and grinding in a CNC machining center with or without ultrasonic assistance. Research in the field of grinding hard and brittle materials, despite not being extensive, has increased in recent years, and scientific knowledge about the machinability of crystal glass is still very limited. However, it can be said that the unique properties of glass, such as high hardness and very low toughness, make any glass machining technology a very challenging process. This work will measure the performance improvement brought about by the use of ultrasound compared to conventional crystal grinding. This presentation is focused on the mechanical characterization and analysis of the cutting forces in CNC machining of superior crystal glass (Pb ≥ 30%). For the mechanical characterization, the Vickers hardness test provides an estimate of the material hardness (Hv) and of the fracture toughness, based on the cracks that appear in the indentation. The mechanical impulse excitation test estimates the Young's modulus, shear modulus and Poisson ratio of the material. For the cutting forces, a dynamometer was used to measure the forces in the face grinding process. The tests were designed based on the Taguchi method to correlate the input parameters (feed rate, tool rotation speed and depth of cut) with the output parameters (surface roughness and cutting forces) and to optimize the process (better roughness with cutting forces that do not compromise the material structure or the tool life), using ANOVA. This study was conducted for conventional grinding and for the ultrasonic grinding process with the same cutting tools. It was possible to determine the optimum cutting parameters for minimum cutting forces and for minimum surface roughness in both grinding processes. Ultrasonic-assisted grinding provides a better surface roughness than conventional grinding.
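As a sketch of the Taguchi-style analysis described above, the example below computes smaller-the-better signal-to-noise ratios and factor main effects for cutting force over an L9 design; the factor levels and force values are invented and are not the study's measurements.

    # Sketch of Taguchi-style analysis for a grinding experiment: smaller-the-better
    # signal-to-noise (S/N) ratios and factor main effects for cutting force. The L9
    # design levels and force values below are invented, not measured data from the study.
    import numpy as np
    import pandas as pd

    runs = pd.DataFrame({
        "feed":  [1, 1, 1, 2, 2, 2, 3, 3, 3],     # factor levels (coded 1-3)
        "speed": [1, 2, 3, 1, 2, 3, 1, 2, 3],
        "depth": [1, 2, 3, 2, 3, 1, 3, 1, 2],
        "force_N": [12.1, 14.3, 16.8, 13.5, 15.9, 13.0, 17.2, 12.6, 15.1],  # hypothetical
    })

    # Smaller-the-better S/N ratio: -10 * log10(mean(y^2)); here one replicate per run.
    runs["sn"] = -10 * np.log10(runs["force_N"] ** 2)

    # Main effect of each factor = mean S/N at each level; the level with the highest
    # mean S/N is the preferred setting for minimum cutting force.
    for factor in ["feed", "speed", "depth"]:
        print(factor, runs.groupby(factor)["sn"].mean().round(2).to_dict())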

Keywords: CNC machining, crystal glass, cutting forces, hardness

Procedia PDF Downloads 126
128 Occupational Heat Stress Related Adverse Pregnancy Outcome: A Pilot Study in South India Workplaces

Authors: Rekha S., S. J. Nalini, S. Bhuvana, S. Kanmani, Vidhya Venugopal

Abstract:

Introduction: Pregnant women's occupational heat exposure has been linked to foetal abnormalities and pregnancy complications. Heat in the workplace is expected to lead to Adverse Pregnancy Outcomes (APO), especially in tropical countries where temperatures are rising and workplace cooling interventions are minimal. For effective interventions, in-depth understanding and evidence about occupational heat stress and APO are required. Methodology: Approximately 800 pregnant women in and around Chennai who were employed in jobs requiring moderate to hard labour participated in the cohort research. During the study period (2014-2019), environmental heat exposures were measured using a Questemp WBGT monitor, and heat strain markers, such as core body temperature (CBT) and urine specific gravity (USG), were evaluated using an infrared thermometer and a refractometer, respectively. Self-reported health symptoms were collected using a validated HOTHAPS questionnaire. In addition, a postpartum follow-up with the mothers was done to collect APO-related data. Major findings of the study: Approximately 47.3% of pregnant workers had workplace WBGTs over the safe manual work threshold value for moderate/heavy work (average WBGT of 26.6°C±1.0°C). About 12.5% of the workers had CBT levels above the normal range, and 24.8% had USG levels above 1.020, indicating mild dehydration. Miscarriages (3%), stillbirths/preterm births (3.5%), and low birth weights (8.8%) were the most common adverse outcomes among pregnant employees. In addition, WBGT exposures above TLVs during all trimesters were associated with a 2.3-fold increased risk of adverse fetal/maternal outcomes (95% CI: 1.4-3.8), after adjusting for potential confounding variables including age, education, socioeconomic status, abortion history, stillbirth, preterm birth, LBW, and BMI. The study determined that workplace WBGTs had direct short- and long-term effects on the health of both the mother and the foetus. Despite the study's limited scope, the findings provided valuable insights and highlighted the need for future comprehensive cohort studies and extensive data in order to establish effective policies to protect vulnerable pregnant women from the dangers of heat stress and to promote reproductive health.
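As an illustration of the kind of risk estimate reported above, the sketch below computes an unadjusted relative risk with a 95% confidence interval from a 2x2 exposure/outcome table; the counts are invented, and the study's 2.3-fold estimate came from an adjusted analysis.

    # Sketch of an unadjusted relative risk with a 95% confidence interval from a 2x2
    # exposure/outcome table (log method). The counts are invented for illustration; the
    # study's reported 2.3-fold risk (95% CI 1.4-3.8) came from an adjusted model.
    import math

    exposed_cases, exposed_total = 40, 380        # hypothetical: APO among WBGT > TLV
    unexposed_cases, unexposed_total = 20, 420    # hypothetical: APO among WBGT <= TLV

    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    rr = risk_exposed / risk_unexposed

    se_log_rr = math.sqrt(1/exposed_cases - 1/exposed_total + 1/unexposed_cases - 1/unexposed_total)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")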

Keywords: adverse outcome, heat stress, interventions, physiological strain, pregnant women

Procedia PDF Downloads 48
127 Exploration of Building Information Modelling Software to Develop Modular Coordination Design Tool for Architects

Authors: Muhammad Khairi bin Sulaiman

Abstract:

The utilization of Building Information Modelling (BIM) in the construction industry has given designers in the Architecture, Engineering and Construction (AEC) industry the opportunity to move from conventional manual drafting to a way of working that creates alternative designs quickly and produces more accurate, reliable and consistent outputs. With BIM software, designers can create digital content that manipulates data through BIM's parametric model, more alternative designs can be created quickly, and design problems can be explored further to produce a better design faster than with conventional design methods. Generally, however, BIM is used as a documentation mechanism, and its capabilities as a design tool have not been fully explored and utilised. In this context, Modular Coordination (MC) design is encouraged as a sustainable design practice, since MC design reduces material wastage through standard dimensioning, prefabrication, and repetitive, modular construction and components. However, MC design involves a complex process of rules and dimensions, so a tool is needed to make this process easier. Since the parameters in BIM can easily be manipulated to follow MC rules and dimensioning, the integration of BIM software with MC design is proposed for architects during the design stage. With this tool, the acceptance and effective application of MC design in practice should improve. Consequently, this study analyses and explores the function and customization of BIM objects and the capability of BIM software to expedite the application of MC design during the design stage for architects. With this application, architects will be able to create building models and locate objects within reference modular grids that adhere to MC rules and dimensions. The parametric modeling capabilities of BIM will also act as a visual tool that further enhances the automation of the 3-dimensional space planning process. (Method) The study first analyses and explores the parametric modeling capabilities of rule-based BIM objects, eventually customizing a reference grid within the rules and dimensions of MC, as sketched below. This approach further enhances the architect's overall design process and enables architects to automate complex modeling, which was nearly impossible before. A prototype using a residential quarter will be modeled. A set of reference grids guided by specific MC rules and dimensions will be used to develop a variety of space planning configurations. The tool is expected to expedite the design process and encourage the use of MC design in the construction industry.
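A minimal sketch of an MC dimensioning rule that such a BIM customization could enforce is given below, snapping a designed dimension to the nearest multiple of the basic module M = 100 mm or of a chosen multimodule; the example dimensions are arbitrary.

    # Sketch of a modular coordination (MC) dimensioning rule of the kind a BIM
    # customization could enforce: snapping a designed dimension to the nearest multiple
    # of the basic module M = 100 mm (or a chosen multimodule such as 3M = 300 mm).
    # The example dimensions are arbitrary.

    BASIC_MODULE_MM = 100          # 1M, the MC basic module

    def snap_to_module(dimension_mm: float, multimodule: int = 1) -> int:
        """Round a dimension to the nearest multiple of multimodule * 100 mm."""
        step = BASIC_MODULE_MM * multimodule
        return int(round(dimension_mm / step) * step)

    for raw in (2450.0, 3120.0, 4075.0):
        print(f"{raw:.0f} mm -> 1M grid: {snap_to_module(raw)} mm, "
              f"3M grid: {snap_to_module(raw, 3)} mm")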

Keywords: building information modeling, modular coordination, space planning, customization, BIM application, MC space planning

Procedia PDF Downloads 61
126 The Principal-Agent Model with Moral Hazard in the Brazilian Innovation System: The Case of 'Lei do Bem'

Authors: Felippe Clemente, Evaldo Henrique da Silva

Abstract:

The need to adopt some type of industrial and innovation policy in Brazil is a recurring theme in the discussion of public interventions aimed at boosting economic growth. For many years, the country has adopted various policies to change its productive structure in order to increase the participation of sectors with the greatest potential to generate innovation and economic growth. Only in the 2000s were tax incentives adopted in Brazil as a policy to support industrial and technological innovation, a phenomenon associated with rates of productivity growth and economic development. In this context, in late 2004 and 2005, Brazil reformulated its institutional apparatus for innovation in order to move closer to the OECD conventions and the Frascati Manual. The Innovation Law (2004) and the 'Lei do Bem' (2005) reduced some institutional barriers to innovation, provided incentives for university-business cooperation, and modified access to tax incentives for innovation. Chapter III of the 'Lei do Bem' (no. 11,196/05) is currently the most comprehensive fiscal incentive to stimulate innovation. It stipulates that the Union should encourage innovation in companies and industry by granting tax incentives. With its introduction, the bureaucratic procedure was simplified by no longer requiring pre-approval of projects or participation in bidding documents. However, preliminary analysis suggests that this instrument has not yet been able to stimulate the sectoral diversification of these investments in Brazil, since its benefits are mostly captured by sectors that already developed this activity, thus showing problems with moral hazard. It is necessary, then, to analyze the 'Lei do Bem' to determine whether there is indeed a need for change and to investigate what changes should be implemented in Brazilian innovation policy. This work is therefore a first effort to analyze a current national problem, evaluating the effectiveness of the 'Lei do Bem' and suggesting public policies that help direct the State towards legislation capable of encouraging agents to comply with what it prescribes. As a preliminary result, it is known that 130 firms used fiscal incentives for innovation in 2006, 320 in 2007 and 552 in 2008. Although this number is on the rise, it is still small, considering that around 6,000 firms perform research and development (R&D) activities in Brazil. Moreover, another obstacle to the 'Lei do Bem' is the percentage of tax incentives provided to companies. These percentages reveal a significant sectoral correlation between the R&D expenditures of large companies and the R&D expenses of companies that accessed the 'Lei do Bem', reaching a correlation of 95.8% in 2008. Given these results, it becomes relevant to investigate the law's ability to stimulate private investment in R&D.

Keywords: brazilian innovation system, moral hazard, R&D, Lei do Bem

Procedia PDF Downloads 312
125 Historical Development of Negative Emotive Intensifiers in Hungarian

Authors: Martina Katalin Szabó, Bernadett Lipóczi, Csenge Guba, István Uveges

Abstract:

In this study, an exhaustive analysis of the historical development of negative emotive intensifiers in the Hungarian language was carried out using NLP methods. Intensifiers are linguistic elements which modify or reinforce a variable character in the lexical unit they apply to. Intensifiers therefore appear with other lexical items such as adverbs, adjectives and verbs, and infrequently with nouns. Due to the complexity of this phenomenon (a set of sociolinguistic, semantic, and historical aspects), there are many lexical items which can operate as intensifiers. The group of intensifiers is admittedly one of the most rapidly changing elements in the language. From a linguistic point of view, a special group of intensifiers is particularly interesting: the so-called negative emotive intensifiers, which on their own, without context, have semantic content that can be associated with negative emotion, but which in particular cases may function as intensifiers (e.g., borzasztóan jó 'awfully good', which means 'excellent'). Despite their special semantic features, negative emotive intensifiers have scarcely been examined in the literature on the basis of large historical corpora via NLP methods. In order to become better acquainted with trends over time concerning these intensifiers, the authors exhaustively analysed a specific historical corpus, namely the Magyar Történeti Szövegtár (Hungarian Historical Corpus). This corpus (containing 3 million text words) is a collection of texts of various genres and styles produced between 1772 and 2010. Since the corpus consists of raw texts and does not contain any additional information about the language features of the data (such as stemming or morphological analysis), a large amount of manual work was required to process the data. Thus, based on a lexicon of negative emotive intensifiers compiled in a previous phase of the research, every occurrence of each intensifier was queried, and the results were stored in a separate data frame. Then, basic linguistic processing (POS-tagging, lemmatization, etc.) was carried out automatically with the 'magyarlanc' NLP toolkit. Finally, the frequency and collocation features of all the negative emotive words were automatically analyzed in the corpus. The outcomes of the research reveal in detail how these words have proceeded through grammaticalization over time, i.e., they change from lexical elements to grammatical ones and slowly go through a delexicalization process (their negative content diminishes over time). What is more, it was also pointed out which negative emotive intensifiers are at the same stage of this process in the same time period. Taking a closer look at the different domains of the analysed corpus, it also became clear that during this process the importance of the pragmatic role increases: the newer use expresses the speaker's subjective, evaluative opinion to a certain degree.
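As a sketch of the final frequency step described above, the example below counts occurrences of each intensifier per time period after lemmatization; the lexicon subset and the tiny hit table are invented and are not drawn from the Magyar Történeti Szövegtár.

    # Sketch of the final frequency step: counting each negative emotive intensifier per
    # time period after lemmatization. The lexicon subset and hit table below are invented
    # and are not taken from the Magyar Történeti Szövegtár.
    from collections import Counter
    import pandas as pd

    intensifiers = {"borzasztóan", "rettenetesen", "szörnyen"}   # lexicon subset (assumed)
    hits = pd.DataFrame([
        {"year": 1801, "lemma": "borzasztóan"},
        {"year": 1854, "lemma": "rettenetesen"},
        {"year": 1921, "lemma": "borzasztóan"},
        {"year": 1975, "lemma": "szörnyen"},
        {"year": 2003, "lemma": "borzasztóan"},
    ])

    hits = hits[hits["lemma"].isin(intensifiers)].copy()
    hits["period"] = (hits["year"] // 50) * 50                   # 50-year bins
    freq = hits.groupby(["period", "lemma"]).size().unstack(fill_value=0)
    print(freq)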

Keywords: historical corpus analysis, historical linguistics, negative emotive intensifiers, semantic changes over time

Procedia PDF Downloads 203
124 A Semi-Automated GIS-Based Implementation of Slope Angle Design Reconciliation Process at Debswana Jwaneng Mine, Botswana

Authors: K. Mokatse, O. M. Barei, K. Gabanakgosi, P. Matlhabaphiri

Abstract:

The mining of pit slopes is often associated with some level of deviation from design recommendations, and this may translate into changes in the stability of the excavated pit slopes. Slope angle design reconciliations are therefore essential for assessing and monitoring the compliance of excavated pit slopes with accepted slope designs. These deviations may be reflected in changes in the calculated factors of safety and/or probabilities of failure. Reconciliations of as-mined and design slope profiles are conducted periodically to assess the implications of these deviations for pit slope stability. Currently, the slope design reconciliation process implemented at Jwaneng Mine involves the measurement of as-mined and design slope angles along vertical sections cut along the established geotechnical design section lines in the GEOVIA GEMS™ software. Bench retention is calculated as the percentage of the available catchment area, less over-mined and under-mined areas, relative to the designed catchment area. This process has proven tedious and requires considerable manual effort and time to execute. Consequently, a new semi-automated mine-to-design reconciliation approach that utilizes laser scanning and GIS-based tools is being proposed at Jwaneng Mine. This method involves high-resolution scanning of targeted bench walls, subsequent creation of 3D surfaces from point cloud data and the derivation of slope toe lines and crest lines in the Maptek I-Site Studio software. The toe lines and crest lines are then exported to the ArcGIS software, where distance offsets between the design and actual bench toe lines and crest lines are calculated. Retained bench catchment capacity is measured as the distance between the toe lines and crest lines at the same bench elevations. The assessment of the performance of the inter-ramp and overall slopes entails the measurement of excavated and design slope angles along vertical sections in the ArcGIS software. Excavated and design toe-to-toe or crest-to-crest slope angles are measured for inter-ramp stack slope reconciliations, and crest-to-toe slope angles are measured for overall slope angle design reconciliations. The proposed approach allows for a more automated, accurate, quicker and easier workflow for carrying out slope angle design reconciliations. This process has proved highly effective and timeous in the assessment of slope performance at Jwaneng Mine. This paper presents the newly proposed process for assessing compliance with slope angle designs at Jwaneng Mine.
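The geometric core of the reconciliation, the slope angle measured between a toe point and a crest point and its deviation from design, can be sketched as follows; the coordinates and the design angle are invented for illustration.

    # Sketch of the geometric core of the reconciliation: the slope angle between two
    # points (e.g., toe of the lowest bench and crest of the highest bench in a stack),
    # and its deviation from design. Coordinates and the design angle are invented.
    import math

    def slope_angle_deg(toe_xyz, crest_xyz) -> float:
        """Angle from horizontal between a toe point and a crest point (degrees)."""
        dx = crest_xyz[0] - toe_xyz[0]
        dy = crest_xyz[1] - toe_xyz[1]
        dz = crest_xyz[2] - toe_xyz[2]
        horizontal = math.hypot(dx, dy)
        return math.degrees(math.atan2(dz, horizontal))

    toe = (1000.0, 2000.0, 840.0)      # hypothetical as-mined toe (x, y, elevation in m)
    crest = (1035.0, 2008.0, 900.0)    # hypothetical as-mined crest
    design_angle = 58.0                # hypothetical design inter-ramp angle (degrees)

    mined_angle = slope_angle_deg(toe, crest)
    print(f"as-mined angle {mined_angle:.1f} deg, deviation from design {mined_angle - design_angle:+.1f} deg")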

Keywords: slope angle designs, slope design recommendations, slope performance, slope stability

Procedia PDF Downloads 201
123 Advanced Techniques in Semiconductor Defect Detection: An Overview of Current Technologies and Future Trends

Authors: Zheng Yuxun

Abstract:

This review critically assesses the advancements and prospective developments in defect detection methodologies within the semiconductor industry, an essential domain that significantly affects the operational efficiency and reliability of electronic components. As semiconductor devices continue to decrease in size and increase in complexity, the precision and efficacy of defect detection strategies become increasingly critical. Tracing the evolution from traditional manual inspections to the adoption of advanced technologies employing automated vision systems, artificial intelligence (AI), and machine learning (ML), the paper highlights the significance of precise defect detection in semiconductor manufacturing by discussing various defect types, such as crystallographic errors, surface anomalies, and chemical impurities, which profoundly influence the functionality and durability of semiconductor devices, underscoring the necessity for their precise identification. The narrative transitions to the technological evolution in defect detection, depicting a shift from rudimentary methods like optical microscopy and basic electronic tests to more sophisticated techniques including electron microscopy, X-ray imaging, and infrared spectroscopy. The incorporation of AI and ML marks a pivotal advancement towards more adaptive, accurate, and expedited defect detection mechanisms. The paper addresses current challenges, particularly the constraints imposed by the diminutive scale of contemporary semiconductor devices, the elevated costs associated with advanced imaging technologies, and the demand for rapid processing that aligns with mass production standards. A critical gap is identified between the capabilities of existing technologies and the industry's requirements, especially concerning scalability and processing velocities. Future research directions are proposed to bridge these gaps, suggesting enhancements in the computational efficiency of AI algorithms, the development of novel materials to improve imaging contrast in defect detection, and the seamless integration of these systems into semiconductor production lines. By offering a synthesis of existing technologies and forecasting upcoming trends, this review aims to foster the dialogue and development of more effective defect detection methods, thereby facilitating the production of more dependable and robust semiconductor devices. This thorough analysis not only elucidates the current technological landscape but also paves the way for forthcoming innovations in semiconductor defect detection.

Keywords: semiconductor defect detection, artificial intelligence in semiconductor manufacturing, machine learning applications, technological evolution in defect analysis

Procedia PDF Downloads 10
122 Radiomics: Approach to Enable Early Diagnosis of Non-Specific Breast Nodules in Contrast-Enhanced Magnetic Resonance Imaging

Authors: N. D'Amico, E. Grossi, B. Colombo, F. Rigiroli, M. Buscema, D. Fazzini, G. Cornalba, S. Papa

Abstract:

Purpose: To characterize, through a radiomic approach, the nature of nodules considered non-specific by expert radiologists and recognized in magnetic resonance mammography (MRm) with T1-weighted (T1w) sequences with paramagnetic contrast. Material and Methods: 47 cases out of 1200 undergoing MRm, in which the MRm assessment gave an uncertain classification (non-specific nodules), were admitted to the study. The clinical outcome of the non-specific nodules was later established through follow-up or further exams (biopsy), yielding 35 benign and 12 malignant cases. All MR images were acquired at 1.5 T: a first basal T1w sequence and then four T1w acquisitions after the paramagnetic contrast injection. After manual segmentation of the lesions by a radiologist and the extraction of 150 radiomic features (30 features at each of 5 subsequent time points), a machine learning (ML) approach was used. An evolutionary algorithm (the TWIST system, based on the KNN algorithm) was used to subdivide the dataset into training and validation sets and to select the features yielding the maximal amount of information. After this pre-processing, different machine learning systems were applied to develop a predictive model based on a training-testing crossover procedure. 10 cases with a benign nodule (follow-up older than 5 years) and 18 with an evident malignant tumor (clear malignant histological exam) were added to the dataset in order to allow the ML system to learn better from the data. Results: The Naive Bayes algorithm, working on 79 features selected by the TWIST system, turned out to be the best-performing ML system, with a sensitivity of 96%, a specificity of 78% and a global accuracy of 87% (average values of two training-testing procedures, ab-ba). The results showed that, in the subset of 47 non-specific nodules, the algorithm predicted the outcome of 45 nodules which an expert radiologist could not identify. Conclusion: In this pilot study, we identified a radiomic approach allowing ML systems to perform well in the diagnosis of non-specific nodules at MR mammography. This algorithm could be a great support for the early diagnosis of malignant breast tumors when the radiologist is not able to identify the kind of lesion, and it reduces the need for long follow-up. Clinical Relevance: This machine learning algorithm could be essential to support the radiologist in the early diagnosis of non-specific nodules, in order to avoid strenuous follow-up and painful biopsy for the patient.
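As a sketch of the final classification step, the example below trains a Gaussian Naive Bayes model on a synthetic radiomic-style feature matrix and reports sensitivity and specificity; scikit-learn's GaussianNB is used here as a stand-in, and the data are not those of the study.

    # Sketch of the final classification step: a Gaussian Naive Bayes model on selected
    # radiomic features, reporting sensitivity and specificity. The feature matrix is
    # synthetic, and scikit-learn's GaussianNB stands in for the system used in the study.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(0)
    X_benign = rng.normal(0.0, 1.0, size=(45, 79))       # 79 selected features, benign
    X_malignant = rng.normal(0.8, 1.0, size=(30, 79))    # malignant shifted, synthetic
    X = np.vstack([X_benign, X_malignant])
    y = np.array([0] * 45 + [1] * 30)                    # 1 = malignant

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, stratify=y, random_state=0)
    pred = GaussianNB().fit(X_tr, y_tr).predict(X_te)

    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")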

Keywords: breast, machine learning, MRI, radiomics

Procedia PDF Downloads 249
121 Enhancement of Fracture Toughness for Low-Temperature Applications in Mild Steel Weldments

Authors: Manjinder Singh, Jasvinder Singh

Abstract:

Existing theories about the Titanic, Liberty ship and Sydney bridge accidents, together with practical experience, generated an interest in developing weldments that have high toughness under sub-zero temperature conditions. The purpose was to protect the joint from undergoing the ductile-to-brittle transition (DBT) when the ambient temperature reaches sub-zero levels. Metallurgical improvements such as low carbonization or the addition of deoxidizing elements like Mn and Si were effective in preventing fracture (cracking) in weldments at low temperature. In the present research, an attempt has been made to investigate the reason behind the ductile-to-brittle transition of mild steel weldments subjected to sub-zero temperatures and a method for its mitigation. Nickel is added to the weldments using manual metal arc welding (MMAW), preventing the DBT but with a progressive reduction in Charpy impact values as the temperature is lowered. The variation in toughness with respect to the nickel content added to the weld pool is analyzed quantitatively to evaluate the rise in toughness with increasing nickel content. The impact performance of the welded specimens was evaluated by Charpy V-notch impact tests at various temperatures (20 °C, 0 °C, -20 °C, -40 °C, -60 °C). A notch is made in the weldments, as notch-sensitive failure is particularly likely to occur at zones of high stress concentration caused by a notch. The effect of nickel on the weldments at various temperatures was then studied by mechanical and metallurgical tests. It was noted that a large gain in impact toughness could be achieved by adding nickel. The highest yield strength (462 J) in combination with good impact toughness (over 220 J at -60 °C) was achieved with an alloying content of 16 wt.% nickel. Based on the metallurgical behavior, it was concluded that the weld metals solidify as austenite as nickel increases. The microstructure was characterized using optical and high-resolution scanning electron microscopy (SEM). Mainly martensite was found at inter-dendritic regions. In the dendrite core regions of the low-carbon weld metals, a mixture of upper bainite, lower bainite and a novel constituent, coalesced bainite, formed. Coalesced bainite is characterized by large bainitic ferrite grains with cementite precipitates and is believed to form when the bainite and martensite start temperatures are close to each other. The mechanical properties could be rationalized in terms of microstructural constituents as a function of nickel content.

Keywords: MMAW, Toughness, DBT, Notch, SEM, Coalesced bainite

Procedia PDF Downloads 501
120 Scalable UI Test Automation for Large-scale Web Applications

Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani

Abstract:

This research mainly concerns optimizing UI test automation for large-scale web applications. The test target application is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for this web application. The quality assurance team must execute many manual user interface test cases during the development process to confirm there are no regression bugs. The team automated 346 test cases; the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, and the quality assurance automation team modernized the test automation framework to optimize the execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language to write test code leads to an inefficient flow when introducing scalability into a traditional test automation environment, so a scripting language was adopted in order to introduce scalability efficiently. The scalability is implemented mainly with AWS serverless technology, the Elastic Container Service. The definition of scalability here is the ability to automatically set up computers for test automation and to increase or decrease the number of computers running those tests. This means the scalable mechanism can help test cases run in parallel, and the test execution time is then dramatically decreased. Introducing scalable test automation also does more than reduce test execution time: there is a possibility that some challenging bugs, such as race conditions, are detected, since test cases can be executed at the same time. If API and unit tests are implemented, test strategies can be adopted more efficiently for this scalability testing. However, in web applications, as a practical matter, API and unit testing cannot cover 100% of functional testing since they do not reach the front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed the optimization of the test case execution time and the detection of a challenging bug. This study first describes the detailed architecture of the scalable test automation environment, then describes the actual reduction in execution time and an example of challenging issue detection.
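A sketch of the fan-out mechanism described above is given below: one AWS Fargate task is launched per shard of the UI test suite so that shards run in parallel on ECS; the cluster, task definition, subnet and container names are placeholders, not the actual HHAexchange pipeline configuration.

    # Sketch of the fan-out step: launching one AWS Fargate task per shard of the UI test
    # suite so shards run in parallel on ECS. Cluster, task definition, subnet and container
    # names are placeholders; the actual pipeline configuration is not shown here.
    import boto3

    ecs = boto3.client("ecs")

    def launch_test_shards(num_shards: int) -> list[str]:
        """Start num_shards Fargate tasks, passing the shard index to the test container."""
        task_arns = []
        for shard in range(num_shards):
            resp = ecs.run_task(
                cluster="ui-test-cluster",                       # placeholder cluster name
                taskDefinition="selenium-ui-tests:1",            # placeholder task definition
                launchType="FARGATE",
                count=1,
                networkConfiguration={"awsvpcConfiguration": {
                    "subnets": ["subnet-0123456789abcdef0"],     # placeholder subnet
                    "assignPublicIp": "ENABLED"}},
                overrides={"containerOverrides": [{
                    "name": "ui-tests",                          # placeholder container name
                    "environment": [
                        {"name": "SHARD_INDEX", "value": str(shard)},
                        {"name": "SHARD_COUNT", "value": str(num_shards)},
                    ]}]},
            )
            task_arns.extend(t["taskArn"] for t in resp["tasks"])
        return task_arns

    print(launch_test_shards(8))   # e.g., split the 346 test cases across 8 parallel tasks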

Keywords: AWS, Elastic Container Service, scalability, serverless, UI automation test

Procedia PDF Downloads 67
119 Evaluation of a Driver Training Intervention for People on the Autism Spectrum: A Multi-Site Randomized Control Trial

Authors: P. Vindin, R. Cordier, N. J. Wilson, H. Lee

Abstract:

Engagement in community-based activities such as education, employment, and social relationships can improve the quality of life for individuals with Autism Spectrum Disorder (ASD). Community mobility is vital to attaining independence for individuals with ASD. Learning to drive and gaining a driver’s license is a critical link to community mobility; however, for individuals with ASD, acquiring safe driving skills can be a challenging process. Issues related to anxiety, executive function, and social communication may affect driving behaviours. Driving training and education aimed at addressing barriers faced by learner drivers with ASD can help them improve their driving performance. A multi-site randomized controlled trial (RCT) was conducted to evaluate the effectiveness of an autism-specific driving training intervention for improving the on-road driving performance of learner drivers with ASD. The intervention was delivered via a training manual and an interactive website consisting of five modules covering varying driving environments, starting with a focus on off-road preparations and progressing through basic to complex driving skill mastery. Seventy-two learner drivers with ASD aged 16 to 35 were randomized, using a blinded group allocation procedure, into either the intervention or the control group. The intervention group received 10 driving lessons with instructors trained in the use of an autism-specific driving training protocol, whereas the control group received 10 driving lessons as usual. Learner drivers completed a pre- and post-observation drive using a standardized driving route, with driving performance measured using the Driving Performance Checklist (DPC). They also completed anxiety, executive function, and social responsiveness measures. The findings showed that there were significant improvements in driving performance for both the intervention (d = 1.02) and the control group (d = 1.15). However, the differences were not significant between groups (p = 0.614) or study sites (p = 0.842). None of the potential moderator variables (anxiety, cognition, social responsiveness, and driving instructor experience) influenced driving performance. This study is an important step toward improving community mobility for individuals with ASD, showing that an autism-specific driving training intervention can improve the driving performance of learner drivers with ASD. It also highlighted the complexity of conducting a multi-site design even when sites were matched according to geography and traffic conditions. Driving instructors also need more and clearer information on how to communicate with learner drivers with restricted verbal expression.
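As an illustration of the effect sizes reported above, the sketch below computes Cohen's d from pre- and post-intervention Driving Performance Checklist scores; the score arrays are invented and are not the trial data.

    # Sketch of the effect-size calculation behind the reported d values: Cohen's d for
    # pre- versus post-intervention Driving Performance Checklist (DPC) scores. The score
    # arrays are invented for illustration.
    import numpy as np

    def cohens_d(pre: np.ndarray, post: np.ndarray) -> float:
        """Cohen's d using the pooled standard deviation of the two sets of scores."""
        n1, n2 = len(pre), len(post)
        pooled_sd = np.sqrt(((n1 - 1) * pre.std(ddof=1) ** 2 + (n2 - 1) * post.std(ddof=1) ** 2)
                            / (n1 + n2 - 2))
        return (post.mean() - pre.mean()) / pooled_sd

    rng = np.random.default_rng(3)
    pre = rng.normal(60, 10, 36)    # hypothetical pre-intervention DPC scores
    post = rng.normal(70, 10, 36)   # hypothetical post-intervention DPC scores
    print(f"Cohen's d = {cohens_d(pre, post):.2f}")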

Keywords: autism spectrum disorder, community mobility, driving training, transportation

Procedia PDF Downloads 105
118 Surgical School Project: Implementation Educational Plan for Adolescents Awaiting Bariatric Surgery

Authors: Brooke Sweeney, David White, Felix Amparano, Nick A. Clark, Amy R. Beck, Mathew Lindquist, Lora Edwards, Julie Vandal, Jennifer Lisondra, Katie Cox, Renee Arensberg, Allen Cummins, Jazmine Cedeno, Jason D. Fraser, Kelsey Dean, Helena H. Laroche, Cristina Fernandez

Abstract:

Background: National organizations call for standardized pre-surgical requirements and education to optimize postoperative outcomes. Since 2017, our surgery program has used defined protocols and educational curricula pre- and post-surgery. In response to patient outcomes, our educational content was refined to include quizzes that assess patient knowledge and surgical preparedness. We aim to optimize adolescent pre-bariatric surgery preparedness by improving overall aggregate pre-surgical assessment performance from 68% to 80% within 12 months. Methods: A multidisciplinary improvement team was developed within the weight management clinic (WMC) of our tertiary care, free-standing children’s hospital. A manual has been utilized since 2017, with limitations in consistent delivery and patient uptake of information. The curriculum was improved to include quizzes administered during WMC visits prior to bariatric surgery. The initial outcome measure is the pre-surgical quiz score of adolescents preparing for bariatric surgery; the process measure was the number of quiz questions answered correctly. Baseline performance was determined by a patient assessment survey of pre-surgical preparedness at patient visits. Plan-Do-Study-Act (PDSA) cycles included: 1) creation and implementation of a refined curriculum, 2) development of 5 new quizzes based upon learning objectives, and 3) improving provider-led teaching and quiz administration within the clinic workflow. Run charts assessed impact over time. Results: A total of 346 quiz questions were administered to 34 adolescents. The outcome measure improved from a baseline mean of 68% to 86% following the second PDSA cycle, and the improvement was sustained. Conclusion/Implication: Patient/family comprehension of surgical preparedness improved with standardized education via team member-led teaching and assessment using quizzes during pre-surgical clinic visits. The next steps include launching redesigned teaching materials with modules correlated to quizzes and assessing comprehension and outcomes post-surgically.

Keywords: bariatric surgery, adolescent, clinic, pre-bariatric training

Procedia PDF Downloads 41
117 Impact of the Dog-Technic for D1-D4 and Longitudinal Stroke Technique for Diaphragm on Peak Expiratory Flow (PEF) in Asthmatic Patients

Authors: Victoria Eugenia Garnacho-Garnacho, Elena Sonsoles Rodriguez-Lopez, Raquel Delgado-Delgado, Alvaro Otero-Campos, Jesus Guodemar-Perez, Angelo Michelle Vagali, Juan Pablo Hervas-Perez

Abstract:

Asthma is a heterogeneous disease that has traditionally been managed with drug treatment. The osteopathic treatment we propose aims to determine, through a dorsal manipulation (Dog Technic D1-D4) and a diaphragm technique (Longitudinal Stroke), whether changes in forced expiratory flow occur on spirometry, in particular whether Peak Flow volumes increase post-intervention and post-effort, and whether applying the two techniques together is more effective than applying the Longitudinal Stroke alone. We also evaluated whether this type of treatment affects breathlessness, a very common symptom in asthma, and finally whether facilitated vertebra pain decreased after a manipulation. Methods: Participants were recruited among students and professors of the University, aged 18-65; patients (n = 18) were randomly assigned to one of two groups, group 1 (Longitudinal Stroke and Dog Technic dorsal manipulation) and group 2 (Longitudinal Stroke diaphragmatic technique only). The statistical analysis is based on the comparison of the main indicator of airway obstruction, PEF (peak expiratory flow), in various situations, measured with the Datospir Peak-10 peak flow meter. Measurements were carried out in four phases: at rest, after the stress test, after the treatment, and after treatment plus the stress test. After each stress test, the level of dyspnea of each patient was evaluated using the Borg scale, regardless of group. In group 1, spinous process pain was additionally measured with an algometer before and after the manipulation. All data were recorded at one minute. Results: Twelve group 1 (Dog Technic and Longitudinal Stroke) patients responded positively to treatment; PEF increased by 5.1% post-treatment and by 6.1% post-treatment plus effort. The Borg scale results, used to measure the level of dyspnea, were positive: 54.95% of patients noted an improvement in breathing. In addition, comparison of the group means confirmed that group 1, in which the two techniques were applied, was 34.05% more effective than group 2, in which only one was applied. After the manipulation, pain decreased in 38% of the cases. Conclusions: The impact of the Dog-Technic for D1-D4 and the Longitudinal Stroke technique for the diaphragm on peak expiratory flow (PEF) volumes in asthmatic patients was positive: PEF changed post-intervention and post-treatment plus effort, and the group receiving both techniques was more effective than the group receiving only one. Furthermore, this type of treatment decreased facilitated vertebra pain and improved dyspnea and the general well-being of the patient.

Keywords: ANS, asthma, manipulation, manual therapy, osteopathic

Procedia PDF Downloads 264
116 Influencing Factors for Job Satisfaction and Turnover Intention of Surgical Team in the Operating Rooms

Authors: Shu Jiuan Chen, Shu Fen Wu, I. Ling Tsai, Chia Yu Chen, Yen Lin Liu, Chen-Fuh Lam

Abstract:

Background: Increased emotional stress in the workplace and depressed job satisfaction may significantly affect the turnover intention and career life of personnel. However, very few studies have reported the factors influencing the turnover intention of surgical team members in the operating rooms, where extraordinary stress normally exists in this isolated medical care unit. Therefore, this study aimed to determine the environmental and personal characteristic factors that might be associated with job satisfaction and turnover intention in the non-physician staff who work in the operating rooms. Methods: This was a cross-sectional, descriptive study performed in a metropolitan teaching hospital in southern Taiwan between May 2017 and July 2017. A structured self-administered questionnaire, modified from the Practice Environment Scale of the Nursing Work Index (PES-NWI), Occupational Stress Indicator-2 (OSI-2) and Maslach Burnout Inventory (MBI) manual, was collected from the operating room nurses, nurse anesthetists, surgeon assistants, orderlies and other non-physician staff. Numerical and categorical data were analyzed using the unpaired t-test and chi-square test, as appropriate (SPSS, version 20.0). Results: A total of 167 effective questionnaires were collected from 200 eligible non-physician personnel who worked in the operating room (response rate 83.5%). The overall satisfaction of all responders was 45.64 ± 7.17. In comparison to those with more than 4 years of working experience in the operating rooms, the junior staff (≤ 4 years of experience) reported significantly higher satisfaction with the workplace environment and job contentment, as well as lower intention to quit (t = 6.325, P < 0.001). Among the different specialties of surgical team members, nurse anesthetists were associated with significantly lower levels of job satisfaction (P = 0.043) and intention to stay (χ² = 8.127, P < 0.05). Multivariate regression analysis demonstrated that job title, seniority, working shifts and job satisfaction are significant independent predictors of intention to quit. Conclusion: The results of this study highlight that greater work seniority (> 4 years of working experience) is associated with significantly lower job satisfaction, and these staff are also more likely to leave their current job. Increased workload in supervising juniors without appropriate job compensation (such as promotions in job title and work shifts) may precipitate their intention to quit. Since the senior staff are usually the leaders and core members in the operating rooms, the retention of this fundamental manpower is essential to ensure the safety and efficacy of surgical interventions in the operating rooms.

Keywords: surgical team, job satisfaction, resignation intention, operating room

Procedia PDF Downloads 232
115 Detection of Glyphosate Using Disposable Sensors for Fast, Inexpensive and Reliable Measurements by Electrochemical Technique

Authors: Jafar S. Noori, Jan Romano-deGea, Maria Dimaki, John Mortensen, Winnie E. Svendsen

Abstract:

Pesticides have been used intensively in agriculture to control weeds, insects, fungi, and pests. One of the most commonly used pesticides is glyphosate. Glyphosate can attach to soil colloids and be degraded by soil microorganisms. As glyphosate use led to the appearance of resistant species, the pesticide was applied even more intensively. As a consequence of this heavy use, residues of the compound are increasingly observed in food and water. Recent studies have reported a direct link between glyphosate and chronic effects such as teratogenic, tumorigenic and hepatorenal effects, even though exposure was below the lowest regulatory limit. Today, pesticides are detected in water by complicated and costly manual procedures conducted by highly skilled personnel, and it can take up to several days to get an answer regarding the pesticide content of a water sample. An alternative to this demanding procedure is offered by electrochemical measuring techniques. Electrochemistry is an emerging technology with the potential to identify and quantify several compounds in a few minutes. It is currently not possible to detect glyphosate directly in water samples, and intensive research is underway to enable direct, selective and quantitative detection of glyphosate in water. This study focuses on developing and modifying a sensor chip that can selectively measure glyphosate and minimize signal interference from other compounds. The sensor is a silicon-based chip with dimensions of 10×20 mm, fabricated in a cleanroom facility and comprising a three-electrode configuration. The deposited electrodes consist of a 20 nm chromium layer and 200 nm of gold; the working electrode is 4 mm in diameter. The working electrodes are modified by creating molecularly imprinted polymers (MIP) using an electrodeposition technique that allows the chip to measure glyphosate selectively at low concentrations. The modification uses gold nanoparticles with a diameter of 10 nm functionalized with 4-aminothiophenol; this configuration allows the nanoparticles to bind to the working electrode surface and create the template for glyphosate. An initial potential for the identification of glyphosate was estimated to be around -0.2 V. The developed sensor was tested at 6 different concentrations and was able to detect glyphosate down to 0.5 mg L⁻¹. This value is below the accepted pesticide limit of 0.7 mg L⁻¹ set by US regulation. The current focus is to optimize the functionalization procedure in order to achieve glyphosate detection at the EU regulatory limit of 0.1 µg L⁻¹. To the best of our knowledge, this is the first attempt to modify miniaturized sensor electrodes with functionalized nanoparticles for glyphosate detection.

Keywords: pesticides, glyphosate, rapid, detection, modified, sensor

Procedia PDF Downloads 156
114 A Quality Improvement Approach for Reducing Stigma and Discrimination against Young Key Populations in the Delivery of Sexual Reproductive Health and Rights Services

Authors: Atucungwiire Rwebiita

Abstract:

Introduction: In Uganda, the provision of adolescent sexual reproductive health and rights (SRHR) services for key populations is still hindered by negative attitudes, stigma and discrimination (S&D) at both the community and facility levels. To address this barrier, Integrated Community Based Initiatives (ICOBI), with support from SIDA, is currently implementing an innovative quality improvement (QI) approach for strengthening the capacity of key population (KP) peer leaders and health workers to deliver friendly SRHR services without S&D. Methods: Our innovative approach involves continuous mentorship and coaching of 8 QI teams at 8 health facilities and their catchment areas. Each of the 8 teams (comprising 5 health workers and 5 KP peer leaders) is facilitated twice a month by two QI mentors in a 2-hour mentorship session over a period of 4 months. The QI mentors received a 2-week training on QI approaches for reducing S&D against young key populations in the delivery of SRHR services. The mentorship sessions are guided by a manual on which the teams base their analysis of the root causes of S&D in the first session and their development of key performance indicators (KPIs) in the second. The teams then develop action plans in the third session and review implementation progress on the KPIs at the end of subsequent sessions. The KPIs capture information on the attitudes of health workers and peer leaders, the general service delivery setting, and clients’ experience. A dashboard was developed to routinely track the KPIs for S&D across all the supported health facilities and catchment areas. After 4 months, the QI teams share documented QI best practices and tested change packages on S&D in a learning and exchange session involving all the teams. Findings: The implementation of this approach is showing positive results. So far, the QI teams have identified the root causes of S&D against key populations, including poor information among health workers, fear of a perceived risk of infection, and perceived links between HIV and disreputable behaviour. Others are perceptions that HIV and STIs are divine punishment and that sex work and homosexuality are against religious and cultural values. They have also noted the perception that MSM are mentally ill and a danger to everyone. Eight QI teams have developed action plans to address the root causes of S&D. Conclusion: This approach is promising and offers a novel, scalable means of implementing stigma-reduction interventions in facility and community settings.

Keywords: key populations, sexual reproductive health and rights, stigma and discrimination, quality improvement approach

Procedia PDF Downloads 132
113 Evaluation of the Energy Performance and Emissions of an Aircraft Engine: J69 Using Fuel Blends of Jet A1 and Biodiesel

Authors: Gabriel Fernando Talero Rojas, Vladimir Silva Leal, Camilo Bayona-Roa, Juan Pava, Mauricio Lopez Gomez

Abstract:

The substitution of conventional aviation fuels with biomass-derived alternative fuels is an emerging field of study in aviation transport, mainly because of the sector's energy consumption, its contribution to global greenhouse gas (GHG) emissions, and fossil fuel price fluctuations. Nevertheless, several challenges remain, such as the cost of biofuel production and its degradative effect on fuel systems, which alters operating safety. Moreover, experimentation on full-scale aeronautic turbines is expensive and complex, so most research has been limited to testing small-size turbojets, with a major absence of information regarding the effects on energy performance and emissions. The main purpose of the current study is to present the results of experimentation on a full-scale military turbojet engine J69-T-25A (presented in Fig. 1) with a power rating of 640 kW, using blends of Jet A1 with oil palm biodiesel. The main findings are related to the thrust specific fuel consumption (TSFC), the engine global efficiency (η), the air/fuel ratio (AFR) and the volume fractions of O2, CO2, CO, and HC. Two fuels are used in the present study: a commercial Jet A1 and a Colombian palm oil biodiesel. The experimental plan was conducted using biodiesel volume contents (w_BD) from 0% (B0) to 50% (B50). The engine operating regimes were set to Idle, Cruise, and Take-off conditions. The turbojet engine J69 is used by the Colombian Air Force and is installed on a test bench with the instrumentation specified in the engine's technical manual. Increasing w_BD from 0% to 50% reduces η by nearly 3.3% and the thrust force by 26.6% at the Idle regime. These variations are related to the reduction of the higher heating value (HHV_ad) of the fuel blend. The evolved CO and HC tend to be reduced in all operating conditions as w_BD increases. Furthermore, a reduction of the atomization angle is presented in Fig. 2, indicating poorer atomization in the fuel nozzle injectors at higher biodiesel contents as the viscosity of the fuel blend increases. An evolution of cloudiness is also observed during the shutdown procedure, as presented in Fig. 3a, particularly above 20% biodiesel content in the fuel blend. This promotes the contamination of some components of the combustion chamber of the J69 engine with soot and unburned matter (Fig. 3). Thus, biodiesel contents above 20% are not recommended in order to avoid a significant decrease of η and the thrust force. A more detailed examination of the mechanical wear of the main components of the engine is advised in further studies.

Keywords: aviation, air to fuel ratio, biodiesel, energy performance, fuel atomization, gas turbine

Procedia PDF Downloads 86
112 Tracing the Developmental Repertoire of the Progressive: Evidence from L2 Construction Learning

Authors: Tianqi Wu, Min Wang

Abstract:

Research investigating language acquisition from a constructionist perspective has demonstrated that language is learned as constructions at various linguistic levels, a process related to frequency, semantic prototypicality, and form-meaning contingency. However, previous research on construction learning has tended to focus on clause-level constructions such as verb argument constructions, and few attempts have been made to study morpheme-level constructions such as the progressive construction, which is regarded as a source of acquisition problems for English learners from diverse L1 backgrounds, especially those whose L1 does not have an equivalent construction, such as German and Chinese. To trace the developmental trajectory of Chinese EFL learners’ use of the progressive with respect to verb frequency, verb-progressive contingency, and verbal prototypicality and generality, a learner corpus consisting of three sub-corpora representing three different English proficiency levels was extracted from the Chinese Learners of English Corpora (CLEC). As the reference point, a native speakers’ corpus extracted from the Louvain Corpus of Native English Essays was also established. All the texts were annotated with the C7 tagset by part-of-speech tagging software. After annotation, all valid progressive hits were retrieved with AntConc 3.4.3, followed by a manual check. Frequency-related data showed that from the lowest to the highest proficiency level, (1) the type-token ratio increased steadily from 23.5% to 35.6%, approaching the 36.4% found in the native speakers’ corpus, indicating a wider use of verbs in the progressive; (2) the normalized entropy value rose from 0.776 to 0.876, approaching the 0.886 found in the native speakers’ corpus, revealing that upper-intermediate learners exhibited a more even distribution and more productive use of verbs in the progressive; and (3) activity verbs (i.e., verbs with prototypical progressive meanings like running and singing) dropped from 59% to 34%, while non-prototypical verbs such as state verbs (e.g., being and living) and achievement verbs (e.g., dying and finishing) were increasingly used in the progressive. Apart from raw frequency analyses, collostructional analyses were conducted to quantify verb-progressive contingency and to determine which verbs were distinctively associated with the progressive construction. The results were in line with the raw frequency findings, showing that the contingency between the progressive and non-prototypical verbs, represented by light verbs (e.g., going, doing, making, and coming), increased with English proficiency. These findings altogether suggested that beginning Chinese EFL learners were less productive in using the progressive construction: they were constrained by a small set of verbs with concrete and typical progressive meanings (e.g., the activity verbs). But with increasing English proficiency, their use of the progressive began to spread to marginal members such as the light verbs.
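
The two distributional measures reported above can be reproduced with a few lines of Python; the sketch below computes the type-token ratio and the normalized Shannon entropy of the verbs found in progressive hits. The verb list is a hypothetical toy example, not data from the CLEC sub-corpora.

```python
import math
from collections import Counter

def ttr_and_normalized_entropy(progressive_verbs):
    """Type-token ratio and normalized Shannon entropy of the verbs
    appearing in the progressive construction."""
    counts = Counter(progressive_verbs)
    tokens = sum(counts.values())
    types = len(counts)
    ttr = types / tokens
    # Normalized entropy: H / log2(number of types); 1.0 = perfectly even use.
    probs = [c / tokens for c in counts.values()]
    entropy = -sum(p * math.log2(p) for p in probs)
    norm_entropy = entropy / math.log2(types) if types > 1 else 0.0
    return ttr, norm_entropy

# Hypothetical list of lemmatized verbs retrieved from progressive hits
hits = ["run", "sing", "go", "do", "make", "run", "go", "go", "live", "die"]
print(ttr_and_normalized_entropy(hits))
```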

Keywords: construction learning, corpus-based, progressives, prototype

Procedia PDF Downloads 108
111 Shark Detection and Classification with Deep Learning

Authors: Jeremy Jenrette, Z. Y. C. Liu, Pranav Chimote, Edward Fox, Trevor Hastie, Francesco Ferretti

Abstract:

Suitable shark conservation depends on well-informed population assessments. Direct methods such as scientific surveys and fisheries monitoring are adequate for defining population statuses, but species-specific indices of abundance and distribution from these sources are rare for most shark species. We can rapidly fill these information gaps by boosting media-based remote monitoring efforts with machine learning and automation. We created a database of shark images by sourcing 24,546 images covering 219 species of sharks from the web application spark pulse and the social network Instagram. We used object detection to extract shark features and inflate this database to 53,345 images. We packaged object-detection and image-classification models into a Shark Detector bundle. We developed the Shark Detector to recognize and classify sharks from videos and images using transfer learning and convolutional neural networks (CNNs). We applied these models to common data-generation approaches for sharks: boosting training datasets, processing baited remote camera footage and online videos, and data-mining Instagram. We examined the accuracy of each model and tested genus and species prediction correctness as a function of training data quantity. The Shark Detector located sharks in baited remote footage and YouTube videos with an average accuracy of 89%, and classified located subjects to the species level with 69% accuracy (n = 8 species). The Shark Detector sorted heterogeneous datasets of images sourced from Instagram with 91% accuracy and classified species with 70% accuracy (n = 17 species). Data-mining Instagram can inflate training datasets and increase the Shark Detector’s accuracy, as well as facilitate archiving of historical and novel shark observations. Base accuracy of genus prediction was 68% across 25 genera. The average base accuracy of species prediction within each genus class was 85%. The Shark Detector can classify 45 species. All data-generation methods were processed without manual interaction. As media-based remote monitoring becomes a dominant approach to observing sharks in nature, we developed an open-source Shark Detector to facilitate common identification applications. Prediction accuracy of the software pipeline increases as more images are added to the training dataset. We provide public access to the software on our GitHub page.
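
A minimal transfer-learning sketch in the spirit of the Shark Detector’s species-classification stage is shown below. The backbone network (MobileNetV2), directory layout, class count and hyperparameters are illustrative assumptions rather than the authors’ published pipeline.

```python
# Transfer-learning sketch: freeze an ImageNet-pretrained backbone and train
# a species-classification head on a directory of labelled shark images.
import tensorflow as tf

NUM_SPECIES = 45          # number of classifiable species reported in the abstract
IMG_SIZE = (224, 224)

# Assumed directory layout: shark_images/train/<species_name>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "shark_images/train", image_size=IMG_SIZE, batch_size=32)

base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False    # keep pretrained features fixed (transfer learning)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 preprocessing
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_SPECIES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```

Freezing the pretrained base and training only the classification head is a common way to exploit a modest but growing training set, such as images mined from Instagram.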

Keywords: classification, data mining, Instagram, remote monitoring, sharks

Procedia PDF Downloads 91
110 Theta-Phase Gamma-Amplitude Coupling as a Neurophysiological Marker in Neuroleptic-Naive Schizophrenia

Authors: Jun Won Kim

Abstract:

Objective: Theta-phase gamma-amplitude coupling (TGC) was used as a novel evidence-based tool to reflect the dysfunctional cortico-thalamic interaction in patients with schizophrenia. However, to the best of our knowledge, no studies have reported the diagnostic utility of TGC in the resting-state electroencephalogram (EEG) of neuroleptic-naive patients with schizophrenia compared to healthy controls. Thus, the purpose of this EEG study was to understand the underlying mechanisms in patients with schizophrenia by comparing resting-state TGC between the two groups and to evaluate the diagnostic utility of TGC. Method: The subjects included 90 patients with schizophrenia and 90 healthy controls. All patients were diagnosed with schizophrenia according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) by two independent psychiatrists using semi-structured clinical interviews. Because patients were either drug-naïve (first episode) or had not been taking psychoactive drugs for one month before the study, we could exclude the influence of medications. Six frequency bands were defined for spectral analyses: delta (1–4 Hz), theta (4–8 Hz), slow alpha (8–10 Hz), fast alpha (10–13.5 Hz), beta (13.5–30 Hz), and gamma (30–80 Hz). The spectral power of the EEG data was calculated with the fast Fourier transform using the 'spectrogram.m' function of the Signal Processing Toolbox in Matlab. An analysis of covariance (ANCOVA) was performed to compare the TGC results between the groups, adjusted using a Bonferroni correction (P < 0.05/19 = 0.0026). Receiver operating characteristic (ROC) analysis was conducted to examine the ability of the TGC data to discriminate schizophrenia. Results: The patients with schizophrenia showed a significant increase in resting-state TGC at all electrodes. The delta, theta, slow alpha, fast alpha, and beta powers showed low accuracies of 62.2%, 58.4%, 56.9%, 60.9%, and 59.0%, respectively, in discriminating the patients with schizophrenia from the healthy controls. The ROC analysis performed on the TGC data generated the most accurate result among the EEG measures, with an overall classification accuracy of 92.5%. Conclusion: Because TGC includes phase, which carries information about neuronal interactions in the EEG recording, TGC is expected to be useful for understanding the mechanisms of the dysfunctional cortico-thalamic interaction in patients with schizophrenia. The resting-state TGC value was increased in the patients with schizophrenia compared to the healthy controls and had a higher discriminating ability than the other parameters. These findings may be related to compensatory hyper-arousal patterns of the dysfunctional default-mode network (DMN) in schizophrenia. Further research exploring the association between TGC and medical or psychiatric conditions that may confound EEG signals will help clarify the potential utility of TGC.
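
The abstract does not specify which coupling metric was computed, but a common choice is the mean-vector-length modulation index; the Python sketch below estimates theta-phase gamma-amplitude coupling for a single-channel EEG signal under that assumption, using the theta (4–8 Hz) and gamma (30–80 Hz) bands defined above.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def theta_gamma_coupling(eeg, fs):
    """Mean-vector-length modulation index (an assumed PAC metric):
    theta phase from the Hilbert transform of the 4-8 Hz signal,
    gamma amplitude from the Hilbert envelope of the 30-80 Hz signal."""
    theta_phase = np.angle(hilbert(bandpass(eeg, 4, 8, fs)))
    gamma_amp = np.abs(hilbert(bandpass(eeg, 30, 80, fs)))
    return np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))

# Example with a synthetic coupled signal (fs = 250 Hz, 10 s)
fs = 250
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
gamma = 0.3 * (1 + theta) * np.sin(2 * np.pi * 40 * t)
print(theta_gamma_coupling(theta + gamma + 0.1 * np.random.randn(t.size), fs))
```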

Keywords: quantitative electroencephalography (QEEG), theta-phase gamma-amplitude coupling (TGC), schizophrenia, diagnostic utility

Procedia PDF Downloads 117
109 The Association of Work Stress with Job Satisfaction and Occupational Burnout in Nurse Anesthetists

Authors: I. Ling Tsai, Shu Fen Wu, Chen-Fuh Lam, Chia Yu Chen, Shu Jiuan Chen, Yen Lin Liu

Abstract:

Purpose: Following the introduction of the National Health Insurance (NHI) system in Taiwan in 1995, the demand for anesthesia services has continued to increase in the operating rooms and other medical units. It is well recognized that increased work stress not only affects the clinical performance of medical staff but that a long-term workload may also result in occupational burnout. Our study aimed to determine the influence of the working environment, work stress and job satisfaction on occupational burnout in nurse anesthetists. The ultimate goal of this research project is to develop a strategy for establishing a friendly, less stressful workplace for nurse anesthetists to enhance their job satisfaction, thereby reducing occupational burnout and increasing the career life of nurse anesthetists. Methods: This was a cross-sectional, descriptive study performed in a metropolitan teaching hospital in southern Taiwan between May 2017 and July 2017. A structured self-administered questionnaire, modified from the Practice Environment Scale of the Nursing Work Index (PES-NWI), Occupational Stress Indicator-2 (OSI-2) and Maslach Burnout Inventory (MBI) manual, was collected from the nurse anesthetists. Relationships between numeric datasets were analyzed by the Pearson correlation test (SPSS 20.0). Results: A total of 66 completed questionnaires were collected from 75 nurses (response rate 88%). The average scores for the working environment, job satisfaction, and work stress were 69.6%, 61.5%, and 63.9%, respectively. The three perspectives used to assess occupational burnout, namely emotional exhaustion, depersonalization and sense of personal accomplishment, were 26.3, 13.0 and 24.5, suggesting the presence of moderate to high degrees of burnout in our nurse anesthetists. The presence of occupational burnout was closely correlated with an unsatisfactory working environment (r = -0.385, P = 0.001) and reduced job satisfaction (r = -0.430, P < 0.001). Junior nurse anesthetists (< 1 year of clinical experience) reported higher satisfaction with the working environment than seniors (5 to 10 years of clinical experience) (P = 0.02). Although the average scores for work stress, job satisfaction, and occupational burnout were lower in junior nurses, the differences were not statistically significant. In the linear regression model, the working environment was the independent factor that predicted occupational burnout in nurse anesthetists, explaining up to 19.8% of the variance. Conclusions: High occupational burnout is more likely to develop in senior nurse anesthetists who experience a dissatisfying working environment, work stress and lower job satisfaction. In addition to the regulation of clinical duties, the increased workload of supervising junior nurse anesthetists may result in emotional stress and burnout in senior nurse anesthetists. Therefore, appropriate adjustment of the clinical and teaching loads of senior nurse anesthetists could help improve occupational burnout and enhance the retention rate.

Keywords: nurse anesthetists, working environment, work stress, job satisfaction, occupational burnout

Procedia PDF Downloads 259