Search results for: psychotherapy techniques
5445 Teaching Turn-Taking Rules and Pragmatic Principles to Empower EFL Students and Enhance Their Learning in Speaking Modules
Authors: O. F. Elkommos
Abstract:
Speaking modules are among the most challenging productive modules in EFL teaching and learning, for both instructors and learners. In a student-centered, interactive, communicative language teaching approach, learners and instructors should be aware that the target language must be taught as, and for, communication. The student must be empowered by tools that will work on more than one level of their communicative competence. Communicative learning requires a teaching and learning methodology that addresses this goal. Teaching turn-taking rules, pragmatic principles and speech acts will enhance students' sociolinguistic, strategic and discourse competence. Sociolinguistic competence entails mastering speech act conventions and illocutionary acts of refusing and agreeing/disagreeing; emotive acts such as thanking, apologizing, inviting and offering; and directives such as ordering, requesting, advising and hinting, among others. Strategic competence includes raising students' awareness of the systemic turn-taking rules and organizing techniques: opening and closing a conversation, adjacency pairs, interrupting, back-channeling, asking for/giving an opinion, agreeing/disagreeing, using natural fillers for pauses and gaps, speaker-select, self-select, and silence, among others. Students will have the tools to manage a conversation. Students are given opportunities to experience natural language not as mere extra student talking time but as empowerment through knowing and using these strategies. They will have the component items they need as well as the opportunity to communicate in the target language on topics of their own interest and choice. This enhances students' communicative abilities. Available websites and textbooks now use one or more of these turn-taking or pragmatic tools. These support students' self-study during their independent learning hours and provide reinforcement practice through interactive e-learning activities. The students' target is to communicate the intended meaning to an addressee who, in turn, is able to infer that intended meaning. The combination of these tools encourages students to overcome the struggle with what to say, how to say it, and when to say it. Teaching these rules, principles and techniques is an awareness-raising method that engages students in activities leading to pragmatic discourse competence. The aim of the paper is to show how the suggested pragmatic model empowers students with tools and systems that support their learning. Supporting students with turn-taking rules and speech act theory, applying both to texts and practical analysis, and using them in speaking classes strengthens students' pragmatic discourse competence and helps them understand language in its context. They become more spontaneous and ready to learn the discourse-pragmatic dimension of speaking techniques and suitable content. Students showed better performance and good motivation to learn. The model is therefore suggested for speaking modules in EFL classes.
Keywords: communicative competence, EFL, empowering learners, enhance learning, speech acts, teaching speaking, turn taking, learner centred, pragmatics
Procedia PDF Downloads 175
5444 An Investigation on the Pulse Electrodeposition of Ni-TiO2/TiO2 Multilayer Structures
Authors: S. Mohajeri
Abstract:
Electrocodeposition of Ni-TiO2 nanocomposite single layers and Ni-TiO2/TiO2 multilayers from a Watts bath containing TiO2 sol was carried out on a copper substrate. Pulse plating and pulse reverse plating techniques were applied to facilitate higher incorporation of TiO2 nanoparticles in the Ni-TiO2 nanocomposite single layers, and the results revealed that by prolonging the current-off durations and the anodic cycles, deposits containing 11.58 wt.% and 13.16 wt.% TiO2 were produced, respectively. Multilayer coatings consisting of Ni-TiO2 and TiO2-rich layers were deposited by pulse potential deposition, limiting the nickel deposition through a diffusion-controlled mechanism. The TiO2-rich layer thickness and, accordingly, the content of TiO2 reinforcement reached 104 nm and 18.47 wt.%, respectively, under the optimum conditions. The phase structure and surface morphology of the nanocomposite coatings were characterized by X-ray diffraction (XRD) and scanning electron microscopy (SEM). The cross-sectional morphology and line scans of the layers were studied by field emission scanning electron microscopy (FESEM). It was confirmed that the preferred orientations and the crystallite sizes of the nickel matrix were influenced by the deposition parameters, and higher contents of codeposited TiO2 nanoparticles refined the microstructure. The corrosion behavior of the coatings in 1M NaCl and 0.5M H2SO4 electrolytes was compared by means of potentiodynamic polarization and electrochemical impedance spectroscopy (EIS) techniques. Increased corrosion resistance and passivation tendency were favored by TiO2 incorporation, while the degree of passivation declined as embedded particles disturbed the continuity of the passive layer. The role of TiO2 incorporation in the improvement of mechanical properties, including hardness, elasticity, scratch resistance and friction coefficient, was investigated by means of atomic force microscopy (AFM). Hydrophilicity and wettability of the composite coatings were investigated under UV illumination, and the water contact angle of the multilayer was reduced to 7.23° after 1 hour of UV irradiation.
Keywords: electrodeposition, hydrophilicity, multilayer, pulse-plating
Procedia PDF Downloads 247
5443 Analyzing Impacts of Road Network on Vegetation Using Geographic Information System and Remote Sensing Techniques
Authors: Elizabeth Malebogo Mosepele
Abstract:
Road transport has become increasingly common in the world; people rely on road networks for transportation purposes on a daily basis. However, the environmental impact of roads extends well beyond the roads themselves into the surrounding landscape. This study investigates the impact of the road network on natural vegetation. The study provides baseline knowledge regarding roadside vegetation and will be helpful in the future for the conservation of biodiversity along road verges and for their improvement. The general hypothesis of this study is that the amount and condition of roadside vegetation can be explained by road network conditions. Remote sensing techniques were used to analyze vegetation condition. A Landsat 8 OLI image was used to assess vegetation cover condition. An NDVI image was generated and used as the base from which land cover classes were extracted, comprising four categories: healthy vegetation, degraded vegetation, bare surface, and water. The classification of the image was achieved using the supervised classification technique. Road networks were digitized from Google Earth. For observed data, transect-based quadrats of 50 x 50 m were surveyed next to road segments for vegetation assessment. Vegetation condition was related to the road network, with a multinomial logistic regression confirming a significant relationship between vegetation condition and the road network. The null hypothesis formulated was that 'there is no variation in vegetation condition as we move away from the road.' Analysis of vegetation condition revealed degraded vegetation within close proximity of a road segment and healthy vegetation as the distance increases away from the road. The chi-squared value was compared with the critical value of 3.84, at the significance level of 0.05, to determine the significance of the relationship. Given that the chi-squared value was 395.5004, the null hypothesis was rejected; there is significant variation in vegetation condition as the distance increases away from the road. The conclusion is that the road network plays an important role in the condition of vegetation.
Keywords: Chi squared, geographic information system, multinomial logistic regression, remote sensing, road side vegetation
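A minimal sketch of the two computational steps this abstract describes: deriving NDVI from the red and near-infrared Landsat 8 OLI bands, and testing observed vegetation-condition counts against distance-from-road classes with a chi-squared statistic. The band arrays, class thresholds, and contingency counts below are illustrative placeholders, not values from the study.

```python
import numpy as np
from scipy.stats import chi2_contingency

# --- NDVI from Landsat 8 OLI: band 5 = NIR, band 4 = red (placeholder arrays) ---
nir = np.random.rand(100, 100).astype(np.float32)   # stand-in for the NIR band
red = np.random.rand(100, 100).astype(np.float32)   # stand-in for the red band
ndvi = (nir - red) / (nir + red + 1e-9)              # small epsilon avoids division by zero

# Simple threshold-based classes analogous to the four categories in the study
classes = np.digitize(ndvi, bins=[0.0, 0.2, 0.5])    # 0 water, 1 bare, 2 degraded, 3 healthy

# --- Chi-squared test on a contingency table of vegetation condition vs. distance band ---
# Rows: distance from road (near, far); columns: degraded, healthy (invented counts)
observed = np.array([[120, 30],
                     [40, 110]])
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# With dof = 1, chi2 is compared against the 3.84 critical value at alpha = 0.05,
# mirroring the comparison made in the abstract.
```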
Procedia PDF Downloads 432
5442 Spatial Analysis of the Impact of City Developments Degradation of Green Space in Urban Fringe Eastern City of Yogyakarta Year 2005-2010
Authors: Pebri Nurhayati, Rozanah Ahlam Fadiyah
Abstract:
Urban development often expands into rural areas, and the accompanying land use change leads to the degradation of green open space on the city fringe. In the long run, this degradation of green open space can contribute to ecological, psychological and public health decline. Therefore, this research aims to (1) determine the relationship between urban development parameters and the rate of green space degradation, and (2) develop a spatial model of the impact of urban development on the degradation of green open space using remote sensing techniques and Geographical Information Systems in an integrated manner. This is descriptive research with data collected through observation and from secondary sources. In the data analysis, 2005-2010 ASTER imagery and NDVI are used to interpret the direction of urban development and the degradation of green open space. The interpretation generates two maps: a built-up land development map and a green open space degradation map. Secondary data related to the rate of population growth, the level of accessibility, and the main activities of the city are processed into a population growth rate map, an accessibility map, and a map of the town's main activities. Each map is used as a parameter of green space degradation and analyzed by non-parametric statistical analysis using crosstabs, from which the value of C (contingency coefficient) is obtained. C values are then compared with Cmaximum to determine the strength of the relationship. This research produces a spatial model map of the impact of city development on the degradation of green space in the urban fringe of the eastern city of Yogyakarta in 2005-2010. In addition, it generates statistical test results for each parameter against the degradation of green open space in the same area and period.
Keywords: spatial analysis, urban development, degradation of green space, urban fringe
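The crosstab step can be reproduced in outline as follows: build a contingency table of a development parameter against green-space degradation classes, compute the chi-squared statistic, and derive the contingency coefficient C and its maximum Cmax for comparison. The table values below are invented for illustration only.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative crosstab: rows = population growth rate class (low/medium/high),
# columns = green-space degradation class (none/moderate/severe)
crosstab = np.array([[50, 30, 10],
                     [25, 45, 30],
                     [10, 35, 65]])

chi2, p, dof, _ = chi2_contingency(crosstab)
n = crosstab.sum()

# Pearson contingency coefficient C and its maximum for a k x k table
C = np.sqrt(chi2 / (chi2 + n))
k = min(crosstab.shape)
C_max = np.sqrt((k - 1) / k)

print(f"C = {C:.3f}, C_max = {C_max:.3f}, C/C_max = {C / C_max:.3f}")
# The closer C is to C_max, the stronger the association between the
# development parameter and green-space degradation.
```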
Procedia PDF Downloads 312
5441 Study of the ZnO Effect on the Properties of HDPE/ ZnO Nanocomposites
Authors: F. Z. Benabid, F. Zouai, N. Kharchi, D. Benachour
Abstract:
HDPE/ZnO nanocomposites were successfully prepared using co-mixing. The ZnO was first co-mixed with stearic acid and then added to the polymer in the plastograph. The nanocomposites prepared with the co-mixed ZnO were compared to those prepared with the neat ZnO. The nanocomposites were characterized by different techniques such as wide-angle X-ray scattering (WAXS). The micro- and nanostructure/properties relationships were investigated. The present study allowed establishing good correlations between the different measured properties.
Keywords: exfoliation, ZnO, nano composites, HDPE, co-mixing
Procedia PDF Downloads 349
5440 Implementation and Challenges of Assessment Methods in the Case of Physical Education Class in Some Selected Preparatory Schools of Kirkos Sub-City
Authors: Kibreab Alene Fenite
Abstract:
The purpose of this study is to investigate the implementation and challenges of different assessment methods for physical education classes in selected preparatory schools of Kirkos sub-city. The participants in this study are teachers, students, department heads and school principals from 4 selected schools. Of the total 8 schools in Kirkos sub-city, 4 schools (Dandi Boru, Abiyot Kirse, Assay, and Adey Ababa) were selected using simple random sampling techniques, and from these schools all (100%) of the teachers, department heads and school principals were taken as a sample, as their number is manageable. From the total of 2520 students, 252 (10%) were selected using simple random sampling. Accordingly, 13 teachers, 252 students, 4 department heads and 4 school principals were taken as a sample from the 4 selected schools purposively. As data gathering tools, a questionnaire and interviews were employed. To analyze the collected data, both quantitative and qualitative methods were used. The results of the study revealed that assessment in physical education is not implemented properly: lack of sufficient materials, inadequate time allotment, large class sizes, lack of collaboration among teachers in assessing student performance, absence of guidelines for assessing the physical education subject, and the lack of assessment methods adapted to students with disabilities in line with their special needs were found to be the major challenges in implementing the current assessment methods of physical education. To overcome these problems, the following recommendations have been forwarded: the necessary facilities and equipment should be made available; in order to make assessment reliable, accurate, objective and relevant, physical education teachers should be familiarized with different assessment techniques; physical education assessment guidelines should be prepared and should include different types of assessment methods; qualified teachers should be employed; and adequate teaching rooms must be built.
Keywords: assessment, challenges, equipment, guidelines, implementation, performance
Procedia PDF Downloads 279
5439 Structural Properties of Surface Modified PVA: Zn97Pr3O Polymer Nanocomposite Free Standing Films
Authors: Pandiyarajan Thangaraj, Mangalaraja Ramalinga Viswanathan, Karthikeyan Balasubramanian, Héctor D. Mansilla, José Ruiz
Abstract:
Rare-earth-ion-doped semiconductor nanostructures have gained much attention due to their novel physical and chemical properties, which lead to potential applications in laser technology as inexpensive luminescent materials. Doping rare earth ions into ZnO semiconductors alters their electronic structure and emission properties. Surface modification (polymer covering) is one of the simplest techniques to modify the emission characteristics of host materials. The present work reports the synthesis and structural properties of PVA:Zn97Pr3O polymer nanocomposite free-standing films. To prepare Pr3+ doped ZnO nanostructures and PVA:Zn97Pr3O polymer nanocomposite free-standing films, colloidal chemical and solution casting techniques were adopted, respectively. The formation of PVA:Zn97Pr3O films was confirmed through X-ray diffraction (XRD), absorption and Fourier transform infrared (FTIR) spectroscopy analyses. XRD measurements confirm that the prepared materials are crystalline with a hexagonal wurtzite structure. The polymer composite film exhibits the diffraction peaks of both PVA and ZnO structures. TEM images reveal that the pure and Pr3+ doped ZnO nanostructures exhibit sheet-like morphology. Optical absorption spectra show the free excitonic absorption band of ZnO at 370 nm and, for the PVA:Zn97Pr3O polymer film, absorption bands at ~282 and 368 nm; these arise, respectively, from carbonyl-containing structures connected to the PVA polymeric chains, mainly at the chain ends, and from the free excitonic absorption of the ZnO nanostructures. The transmission spectrum of the as-prepared film shows 57 to 69% transparency in the visible and near-IR region. FTIR spectral studies confirm the presence of A1 (TO) and E1 (TO) modes of Zn-O bond vibration and the formation of the polymer composite material.
Keywords: rare earth doped ZnO, polymer composites, structural characterization, surface modification
Procedia PDF Downloads 361
5438 Indian Premier League (IPL) Score Prediction: Comparative Analysis of Machine Learning Models
Authors: Rohini Hariharan, Yazhini R, Bhamidipati Naga Shrikarti
Abstract:
In the realm of cricket, particularly within the context of the Indian Premier League (IPL), the ability to predict team scores accurately holds significant importance for cricket enthusiasts and stakeholders alike. This paper presents a comprehensive study on IPL score prediction utilizing various machine learning algorithms, including Support Vector Machines (SVM), XGBoost, Multiple Regression, Linear Regression, K-nearest neighbors (KNN), and Random Forest. Through meticulous data preprocessing, feature engineering, and model selection, we aimed to develop a robust predictive framework capable of forecasting team scores with high precision. Our experimentation involved the analysis of historical IPL match data encompassing diverse match and player statistics. Leveraging this data, we employed state-of-the-art machine learning techniques to train and evaluate the performance of each model. Notably, Multiple Regression emerged as the top-performing algorithm, achieving an accuracy of 77.19% and a precision of 54.05% (within a threshold of +/- 10 runs). This research contributes to the advancement of sports analytics by demonstrating the efficacy of machine learning in predicting IPL team scores. The findings underscore the potential of advanced predictive modeling techniques to provide valuable insights for cricket enthusiasts, team management, and betting agencies. Additionally, this study serves as a benchmark for future research endeavors aimed at enhancing the accuracy and interpretability of IPL score prediction models.
Keywords: Indian Premier League (IPL), cricket, score prediction, machine learning, support vector machines (SVM), XGBoost, multiple regression, linear regression, k-nearest neighbors (KNN), random forest, sports analytics
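A condensed sketch of the model-comparison workflow the abstract describes, using scikit-learn with a synthetic stand-in for the historical IPL match features. The feature set, data, and the "within +/- 10 runs" metric below are assumptions for illustration; XGBoost is omitted here to avoid an extra dependency.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                                   # stand-in match/player statistics
y = 160 + X @ rng.normal(size=8) * 5 + rng.normal(scale=10, size=1000)  # stand-in team scores

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "Linear/Multiple Regression": LinearRegression(),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "KNN": KNeighborsRegressor(n_neighbors=7),
    "SVM (SVR)": SVR(),
}

for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    within_10 = np.mean(np.abs(pred - y_te) <= 10)   # "accuracy" within +/- 10 runs
    print(f"{name:28s} within-10-runs rate: {within_10:.2%}")
```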
Procedia PDF Downloads 51
5437 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs
Authors: Junaid Mahmood Alam
Abstract:
Revalidating and harmonizing clinical chemistry analytical principles and optimizing methods through quality control programs and assessments is the preeminent means of attaining optimal outcomes within clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic, cardiac and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression analysis. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunological analyzers, namely Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electro-chemiluminescence immunoassay (ECLi) techniques. The validation and revalidation process was completed by evaluating and assessing the precision-analyzed Preci-control data of the various instruments, plotted against each other, with regression analysis (R2). Results showed that revalidation and optimization of the respective parameters, accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization of the methodology and instruments used for the analyses. The regression R2 for BilT was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R2 values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of the analytical methods and instrumentation, thus revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, guaranteeing maximum patient confidence.
Keywords: revalidation, standardized, IFCC, CAP, harmonized
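The harmonization check described above essentially amounts to regressing paired quality-control results from two analyzers against each other and examining R². A minimal sketch with invented paired control values follows; the analyte, units, and instruments named in the comments are placeholders.

```python
import numpy as np
from scipy.stats import linregress

# Invented paired precision-control results for one analyte (e.g., ALT in U/L)
# measured on two analyzers (e.g., Hitachi 912 vs. Cobas c501).
analyzer_a = np.array([32.1, 45.0, 58.7, 71.2, 84.9, 98.3, 110.5])
analyzer_b = np.array([31.8, 45.6, 59.1, 70.4, 85.7, 97.5, 111.2])

fit = linregress(analyzer_a, analyzer_b)
print(f"slope = {fit.slope:.3f}, intercept = {fit.intercept:.2f}, R^2 = {fit.rvalue**2:.4f}")
# An R^2 close to 1 (as reported for most analytes in the abstract) indicates
# that the two methods/instruments are well harmonized.
```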
Procedia PDF Downloads 268
5436 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance
Authors: Sokkhey Phauk, Takeo Okazaki
Abstract:
The challenging task in educational institutions is to maximize the high performance of students and minimize the failure rate of poor-performing students. An effective way to address this task is to understand student learning patterns, with their highly influential factors, and to obtain an early prediction of student learning outcomes at a timely stage so that policies for improvement can be set up. Educational data mining (EDM) is an emerging discipline within data mining, statistics, and machine learning concerned with extracting useful knowledge and information for the sake of improvement and development in the education environment. The aim of this work is to propose EDM techniques and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted. Subsequently, high-performing models are developed to achieve better performance. The hybrid random forest (Hybrid RF) produces the most successful classification. For the context of intervention and improving learning outcomes, a feature selection method called MICHI, a combination of mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves prediction performance; the obtained dominant set is then used as information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is subsequently developed for educational stakeholders to obtain an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system. The system is used to help educational stakeholders and related individuals intervene and improve student performance.
Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance
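A rough sketch of the MICHI idea as the abstract describes it: rank features by a combination of mutual information and chi-square scores, then feed the dominant set to a random forest. The dataset, the score-combination rule (simple rank averaging here), and the feature count are assumptions, not the authors' exact recipe.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import chi2, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=500, n_features=20, n_informative=6, random_state=0)
X_pos = MinMaxScaler().fit_transform(X)          # chi2 requires non-negative features

mi = mutual_info_classif(X_pos, y, random_state=0)
chi, _ = chi2(X_pos, y)

# Combine the two criteria by averaging their rank positions (one possible MICHI-style variant)
ranks = np.argsort(np.argsort(-mi)) + np.argsort(np.argsort(-chi))
dominant = np.argsort(ranks)[:8]                 # keep the 8 best-ranked features

rf = RandomForestClassifier(n_estimators=300, random_state=0)
full = cross_val_score(rf, X_pos, y, cv=5).mean()
reduced = cross_val_score(rf, X_pos[:, dominant], y, cv=5).mean()
print(f"accuracy, all features: {full:.3f}; dominant set: {reduced:.3f}")
```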
Procedia PDF Downloads 104
5435 Comparative Analysis of the Performance Between Public and Private Companies: Explanatory Factors
Authors: Atziri Moreno Vite, David Silva Gutiérrez
Abstract:
Oil companies have become the key players in the world energy scenario thanks to their strong control of the level of hydrocarbon reserves and production. The present research aims to identify the main factors that explain the results of these companies, through an in-depth review of the specialized literature, and to analyze their performance by means of econometric techniques such as Data Envelopment Analysis (DEA). The results show the relevance and impact of factors such as the company's level of employment or investment.
Keywords: oil companies, performance, determinants, productive
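An input-oriented CCR DEA model of the kind mentioned above can be solved with a small linear program per company (DMU). The inputs and outputs chosen below (employment and investment as inputs, production as output) and their values are illustrative assumptions, not data from the study.

```python
import numpy as np
from scipy.optimize import linprog

# Rows = DMUs (companies); columns = inputs (employment, investment) and outputs (production)
inputs = np.array([[100., 50.], [120., 60.], [80., 45.], [150., 90.]])
outputs = np.array([[300.], [330.], [290.], [360.]])
n, m, s = inputs.shape[0], inputs.shape[1], outputs.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o: minimize theta such that a lambda-combination
    of all DMUs uses at most theta * inputs of o and produces at least the outputs of o."""
    c = np.r_[1.0, np.zeros(n)]                      # decision variables: [theta, lambda_1..lambda_n]
    A_ub, b_ub = [], []
    for i in range(m):                               # sum_j lam_j x_ij - theta * x_io <= 0
        A_ub.append(np.r_[-inputs[o, i], inputs[:, i]])
        b_ub.append(0.0)
    for r in range(s):                               # -sum_j lam_j y_rj <= -y_ro
        A_ub.append(np.r_[0.0, -outputs[:, r]])
        b_ub.append(-outputs[o, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds, method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

Efficiency scores of 1 indicate companies on the efficient frontier; scores below 1 indicate how much the inputs could, in principle, be scaled down for the same output.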
Procedia PDF Downloads 121
5434 Study of Land Use Changes around an Archaeological Site Using Satellite Imagery Analysis: A Case Study of Hathnora, Madhya Pradesh, India
Authors: Pranita Shivankar, Arun Suryawanshi, Prabodhachandra Deshmukh, S. V. C. Kameswara Rao
Abstract:
Many undesirable and significant changes occur in landscapes and in the regions surrounding historically important structures as a result of anthropogenic activities over time. A better understanding of such influences, using recently developed satellite remote sensing techniques, helps in planning strategies to minimize negative impacts on the existing environment. In 1982, a fossilized hominid skull cap was discovered at a site located along the northern bank of the east-west flowing river Narmada in the village of Hathnora. Close to the same site, Late Acheulian and Middle Palaeolithic tools have been discovered in the immediately overlying pebbly gravel, suggesting that the 'Narmada skull' may be from the Middle Pleistocene age. Reviews of recent research on hominid remains from Late Acheulian and Middle Palaeolithic sites all over the world suggest succession and contemporaneity of cultures there, enhancing the importance of Hathnora as a rare and precious site. In this context, maximum likelihood classification using digital interpretation techniques was carried out for this study area using satellite imagery from Landsat ETM+ for the year 2006 and Landsat 8 (OLI and TIRS) for the year 2016. The overall accuracy of the Land Use Land Cover (LULC) classification of the 2016 imagery was around 77.27%, based on ground truth data. The significant reduction in the main river course and in agricultural activities and the increase in built-up area observed in the remote sensing data analysis are undoubtedly the outcome of human encroachment in the vicinity of this eminent heritage site.
Keywords: cultural succession, digital interpretation, Hathnora, Homo Sapiens, Late Acheulian, Middle Palaeolithic
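Supervised maximum likelihood classification of the kind applied to the Landsat imagery is essentially a per-pixel Gaussian classifier trained on labelled samples. The compact sketch below uses synthetic band values; the band count, class names, and training data are placeholders, not the study's own.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training samples: rows = pixels, columns = spectral bands (e.g., 6 Landsat bands)
classes = ["water", "vegetation", "built-up", "agriculture"]
train = {c: rng.normal(loc=10 * i, scale=2.0, size=(200, 6)) for i, c in enumerate(classes)}

# Per-class mean vector and covariance matrix estimated from the training pixels
stats = {c: (x.mean(axis=0), np.cov(x, rowvar=False)) for c, x in train.items()}

def log_likelihood(pixels, mean, cov):
    """Gaussian log-likelihood of each pixel under one class (constant terms kept)."""
    d = pixels - mean
    inv, (_, logdet) = np.linalg.inv(cov), np.linalg.slogdet(cov)
    return -0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + logdet + pixels.shape[1] * np.log(2 * np.pi))

# Classify a block of unlabelled pixels: assign each to the class with maximum likelihood
pixels = rng.normal(loc=10.0, scale=4.0, size=(1000, 6))
scores = np.column_stack([log_likelihood(pixels, *stats[c]) for c in classes])
labels = np.array(classes)[scores.argmax(axis=1)]
print({c: int((labels == c).sum()) for c in classes})
```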
Procedia PDF Downloads 170
5433 Examining Statistical Monitoring Approach against Traditional Monitoring Techniques in Detecting Data Anomalies during Conduct of Clinical Trials
Authors: Sheikh Omar Sillah
Abstract:
Introduction: Monitoring is an important means of ensuring the smooth implementation and quality of clinical trials. For many years, traditional site monitoring approaches have been critical in detecting data errors, but they are not optimal for identifying fabricated and implanted data or non-random data distributions that may significantly invalidate study results. The objective of this paper was to provide recommendations, based on best statistical monitoring practices, for detecting data-integrity issues suggestive of fabrication and implantation early in study conduct, to allow the implementation of meaningful corrective and preventive actions. Methodology: Electronic bibliographic databases (Medline, Embase, PubMed, Scopus, and Web of Science) were used for the literature search, and both qualitative and quantitative studies were sought. Search results were uploaded into Eppi-Reviewer software, and only publications written in English from 2012 onwards were included in the review. Gray literature not considered to present reproducible methods was excluded. Results: A total of 18 peer-reviewed publications were included in the review. The publications demonstrated that traditional site monitoring techniques are not efficient in detecting data anomalies. By specifying project-specific parameters such as laboratory reference range values, visit schedules, etc., together with appropriate interactive data monitoring, statistical monitoring can offer early signals of data anomalies to study teams. The review further revealed that statistical monitoring is useful for identifying unusual data patterns that might reveal issues affecting data integrity or potentially impacting study participants' safety. However, subjective measures may not be good candidates for statistical monitoring. Conclusion: The statistical monitoring approach requires a combination of education, training, and experience sufficient to implement its principles in detecting data anomalies for the statistical aspects of a clinical trial.
Keywords: statistical monitoring, data anomalies, clinical trials, traditional monitoring
Procedia PDF Downloads 73
5432 Aerial Photogrammetry-Based Techniques to Rebuild the 30-Years Landform Changes of a Landslide-Dominated Watershed in Taiwan
Authors: Yichin Chen
Abstract:
Taiwan is an island characterized by active tectonics and high erosion rates. Monitoring the dynamic landscape of Taiwan is an important issue for disaster mitigation, geomorphological research, and watershed management. Long-term, high-spatiotemporal-resolution landform data are essential for quantifying and simulating geomorphological processes and developing warning systems. Recently, advances in unmanned aerial vehicle (UAV) and computational photogrammetry technology have provided an effective way to rebuild and monitor topographic changes at high spatiotemporal resolution. This study rebuilds the 30-year landform changes in the Aiyuzi watershed over 1986-2017 by using aerial photogrammetry-based techniques. The Aiyuzi watershed, located in central Taiwan with an area of 3.99 km², is known for its frequent landslide and debris flow disasters. Aerial photos were taken using a UAV, and multi-temporal historical stereo photographs taken by the Aerial Survey Office of Taiwan's Forestry Bureau were collected. To rebuild the orthoimages and digital surface models (DSMs), Pix4DMapper, a photogrammetry software, was used. Furthermore, to control model accuracy, a set of ground control points was surveyed using eGPS. The results show that the generated DSMs have a ground sampling distance (GSD) of ~10 cm and ~0.3 cm from the UAV and historical photographs, respectively, and a vertical error of ~1 m. Comparison of the DSMs shows that many deep-seated landslides (over 20 m deep) occurred in the upstream part of the Aiyuzi watershed. Even though a large amount of sediment is delivered from the landslides, the steep main channel has sufficient capacity to transport sediment out of the channel and to erode the river bed to a depth of ~20 m. Most sediment is transported to the outlet of the watershed and deposited in the downstream channel. This case study shows that UAV and photogrammetry technology are effective tools for monitoring topographic change.
Keywords: aerial photogrammetry, landslide, landform change, Taiwan
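The landform-change analysis rests on differencing co-registered DSMs from two epochs (a "DEM of Difference"). A bare-bones sketch with synthetic elevation grids is shown below; in practice the rasters produced by Pix4DMapper would be loaded with a GIS/raster library and aligned to the ground control points first, and the grid resolution and uncertainty threshold used here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic co-registered DSMs (metres) for two epochs; real grids would come from
# the photogrammetric reconstruction after alignment to the ground control points.
dsm_1986 = 500 + rng.normal(scale=1.0, size=(500, 500)).cumsum(axis=0) / 10
dsm_2017 = dsm_1986 + rng.normal(scale=0.5, size=(500, 500))
dsm_2017[100:150, 200:260] -= 20.0        # imitate a deep-seated landslide scar

dod = dsm_2017 - dsm_1986                 # DEM of Difference: negative = erosion/failure
uncertainty = 1.0                         # ~1 m vertical error, as reported for the models
significant = np.abs(dod) > 2 * uncertainty

cell_area = 1.0                           # m^2 per cell (assumed grid resolution)
erosion = -dod[(dod < 0) & significant].sum() * cell_area
deposition = dod[(dod > 0) & significant].sum() * cell_area
print(f"erosion volume ~ {erosion:.0f} m^3, deposition volume ~ {deposition:.0f} m^3")
```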
Procedia PDF Downloads 155
5431 Comparison of Regional and Local Indwelling Catheter Techniques to Prolong Analgesia in Total Knee Arthroplasty Procedures: Continuous Peripheral Nerve Block and Continuous Periarticular Infiltration
Authors: Jared Cheves, Amanda DeChent, Joyce Pan
Abstract:
Total knee replacements (TKAs) are among the most common but painful surgical procedures performed in the United States. Currently, the gold standard for postoperative pain management is the utilization of opioids. However, in the wake of the opioid epidemic, the healthcare system is attempting to reduce opioid consumption by trialing innovative opioid-sparing analgesic techniques such as continuous peripheral nerve blocks (CPNB) and continuous periarticular infiltration (CPAI). The alleviation of pain, particularly during the first 72 hours postoperatively, is of utmost importance because inadequate pain control is associated with delayed recovery, impaired rehabilitation, immunosuppression, the development of chronic pain, the development of rebound pain, and decreased patient satisfaction. While both CPNB and CPAI are used today, there is limited evidence comparing the two to the current standard of care or to each other. An extensive literature review was performed to explore the safety profiles and effectiveness of CPNB and CPAI in reducing reported pain scores and decreasing opioid consumption. The literature revealed that the use of CPNB contributed to lower pain scores and decreased opioid use when compared to opioid-only control groups. Additionally, CPAI did not improve pain scores or decrease opioid consumption when combined with a multimodal analgesic (MMA) regimen. When comparing CPNB and CPAI to each other, neither consistently lowered pain scores to a greater degree, but the literature indicates that CPNB decreased opioid consumption more than CPAI. More research is needed to further cement the efficacy of CPNB and CPAI as standard components of MMA in TKA procedures. In addition, future research could focus on novel catheter-free applications to reduce the complications of continuous catheter analgesics.
Keywords: total knee arthroplasty, continuous peripheral nerve blocks, continuous periarticular infiltration, opioid, multimodal analgesia
Procedia PDF Downloads 95
5430 Long-Term Results of Coronary Bifurcation Stenting with Drug Eluting Stents
Authors: Piotr Muzyk, Beata Morawiec, Mariusz Opara, Andrzej Tomasik, Brygida Przywara-Chowaniec, Wojciech Jachec, Ewa Nowalany-Kozielska, Damian Kawecki
Abstract:
Background: Coronary bifurcation is one of the most complex lesions in patients with coronary artery disease. Provisional T-stenting is currently one of the recommended techniques. The aim was to assess optimal methods of treatment in the era of drug-eluting stents (DES). Methods: The registry consisted of data from 1916 patients treated with percutaneous coronary interventions (PCI) using either first- or second-generation DES. Patients with bifurcation lesions entered the analysis. Major adverse cardiac and cerebrovascular events (MACCE) were assessed at one year of follow-up and comprised death, acute myocardial infarction (AMI), repeated PCI (re-PCI) of the target vessel, and stroke. Results: Of the 1916 registry patients, 204 (11%) were diagnosed with a bifurcation lesion >50% and entered the analysis. The most commonly used technique was provisional T-stenting (141 patients, 69%). Optimization with the kissing-balloon technique was performed in 45 patients (22%). In 59 patients (29%) a second-generation DES was implanted, while in 112 patients (55%) a first-generation DES was used. In 33 patients (16%) both types of DES were used. Procedural success (TIMI 3 flow) was achieved in 98% of patients. At one-year follow-up, there were 39 MACCE (19%): 9 deaths, 17 AMI, 16 re-PCI and 5 strokes. Provisional T-stenting resulted in a rate of MACCE similar to other techniques (16% vs. 5%, p=0.27) and a similar occurrence of re-PCI (6% vs. 2%, p=0.78). The post-PCI kissing-balloon technique gave outcomes equal to those in patients in whom no optimization technique was used (3% vs. 16% MACCE, p=0.39). The type of implanted DES (second- vs. first-generation) had no influence on MACCE (4% vs. 14%, respectively, p=0.12) or re-PCI (1.7% vs. 51% of patients, respectively, p=0.28). Conclusions: The treatment of bifurcation lesions with PCI represents a high-risk procedure with a high rate of MACCE. The stenting technique, optimization of PCI and the generation of the implanted stent should be personalized for each case to balance the risk of the procedure. In this setting, operator experience might be a factor in better outcomes, which should be further investigated.
Keywords: coronary bifurcation, drug eluting stents, long-term follow-up, percutaneous coronary interventions
Procedia PDF Downloads 202
5429 Comparison of Inexpensive Cell Disruption Techniques for an Oleaginous Yeast
Authors: Scott Nielsen, Luca Longanesi, Chris Chuck
Abstract:
Palm oil is obtained from the flesh and kernel of the fruit of oil palms and is the most productive and inexpensive oil crop. The global demand for palm oil is approximately 75 million metric tonnes, a 29% increase in global production since 2016. This expansion of oil palm cultivation has resulted in mass deforestation, vast biodiversity destruction and increasing net greenhouse gas emissions. One possible alternative is to produce a saturated oil, similar to palm, from microbes such as oleaginous yeast. The yeasts can be cultured on sugars derived from second-generation sources and do not compete with tropical forests for land. One highly promising oleaginous yeast for this application is Metschnikowia pulcherrima. However, recent techno-economic modeling has shown that cell lysis and standard lipid extraction are major contributors to the cost of the oil. Typical cell disruption techniques used to extract either single-cell oils or proteins have been based around bead-beating, homogenization and acid lysis. However, these can have a detrimental effect on lipid quality and are energy-intensive. In this study, a vortex separator, which produces high shear with minimal energy input, was investigated as a potential low-energy method of lysing cells. This was compared to four more traditional methods (thermal lysis, acid lysis, alkaline lysis, and osmotic lysis). For each method, the yeast loading was also examined at 1 g/L, 10 g/L and 100 g/L. The quality of the cell disruption was measured by optical cell density, cell counting and comparison of the particle size distribution profile over a 2-hour period. This study demonstrates that the vortex separator is highly effective at lysing the cells and could potentially be used as a simple apparatus for lipid recovery in an oleaginous yeast process. Further development of this technology could reduce the overall cost of microbial lipids in the future.
Keywords: palm oil substitute, Metschnikowia pulcherrima, cell disruption, cell lysis
Procedia PDF Downloads 203
5428 The Strategy for Detection of Catecholamines in Body Fluids: Optical Sensor
Authors: Joanna Cabaj, Sylwia Baluta, Karol Malecha, Kamila Drzozga
Abstract:
Catecholamines are the principal neurotransmitters that mediate a variety of central nervous system functions, such as motor control, cognition, emotion, memory processing, and endocrine modulation. Dysfunctions in catecholamine neurotransmission occur in some neurologic and neuropsychiatric diseases. Changing neurotransmitter levels in biological fluids can be a marker of several neurological disorders. Because of its significance for analytical techniques and diagnostics, sensitive and selective detection of neurotransmitters is attracting increasing attention in different areas of bio-analysis and biomedical research. Recently, fluorescent techniques for the detection of catecholamines have attracted interest due to their reasonable cost, convenient control, and maneuverability in biological environments. Nevertheless, despite the evident need for a sensitive and selective catecholamine sensor, the development of a convenient method for these neurotransmitters is still at a basic level. The manipulation of nanostructured materials in conjunction with biological molecules has led to the development of a new class of hybrid modified biosensors in which both enhancement of charge transport and preservation of biological activity may be obtained. Immobilization of biomaterials on electrode surfaces is the crucial step in fabricating electrochemical as well as optical biosensors and bioelectronic devices. Continuing systematic investigation into the manufacturing of enzyme-conducting sensitive systems, a convenient fluorescence sensing strategy for catecholamine detection is presented here, based on the FRET (fluorescence resonance energy transfer) phenomenon observed for, e.g., complexes of Fe²⁺ and epinephrine. The biosensor was constructed using low temperature co-fired ceramics (LTCC) technology. The sensing system used the catalytic oxidation of catecholamines and the quenching of the strong luminescence of the obtained complexes due to FRET. The detection process was based on the oxidation of the substrate in the presence of the enzyme (laccase/tyrosinase).
Keywords: biosensor, conducting polymer, enzyme, FRET, LTCC
Procedia PDF Downloads 256
5427 A Comparative Analysis of Clustering Approaches for Understanding Patterns in Health Insurance Uptake: Evidence from Sociodemographic Kenyan Data
Authors: Nelson Kimeli Kemboi Yego, Juma Kasozi, Joseph Nkruzinza, Francis Kipkogei
Abstract:
The study investigated the low uptake of health insurance in Kenya despite efforts to achieve universal health coverage through various health insurance schemes. Unsupervised machine learning techniques were employed to identify patterns in health insurance uptake based on sociodemographic factors among Kenyan households. The aim was to identify key demographic groups that are underinsured and to provide insights for the development of effective policies and outreach programs. Using the 2021 FinAccess Survey, the study clustered Kenyan households based on their health insurance uptake and sociodemographic features to reveal patterns in health insurance uptake across the country. The effectiveness of k-prototypes clustering, hierarchical clustering, and agglomerative hierarchical clustering in clustering households based on sociodemographic factors was compared. The k-prototypes approach was found to be the most effective at uncovering distinct and well-separated clusters in the Kenyan sociodemographic data related to health insurance uptake, based on the silhouette, Calinski-Harabasz, Davies-Bouldin, and Rand indices, and it was therefore used to uncover the patterns in uptake. The results of the analysis indicate that inclusivity in health insurance is strongly related to affordability. The findings suggest that targeted policy interventions and outreach programs are necessary to increase health insurance uptake in Kenya, with the ultimate goal of achieving universal health coverage. The study provides important insights for policymakers and stakeholders in the health insurance sector to address the low uptake of health insurance and to ensure that healthcare services are accessible and affordable to all Kenyans, regardless of their sociodemographic status. The study highlights the potential of unsupervised machine learning techniques to provide insights into complex health policy issues and improve decision-making in the health sector.
Keywords: health insurance, unsupervised learning, clustering algorithms, machine learning
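A skeletal version of the mixed-type clustering step, assuming the third-party kmodes package for k-prototypes. The variables, number of clusters, and synthetic data below are placeholders rather than the FinAccess survey itself.

```python
import numpy as np
import pandas as pd
from kmodes.kprototypes import KPrototypes  # assumed dependency: pip install kmodes

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),                       # numeric
    "household_size": rng.integers(1, 9, n),              # numeric
    "residence": rng.choice(["rural", "urban"], n),       # categorical
    "education": rng.choice(["none", "primary", "secondary", "tertiary"], n),   # categorical
    "insured": rng.choice(["yes", "no"], n, p=[0.25, 0.75]),                     # categorical
})

X = df.to_numpy()
categorical_idx = [2, 3, 4]                               # positions of the categorical columns

kp = KPrototypes(n_clusters=4, init="Cao", random_state=0)
df["cluster"] = kp.fit_predict(X, categorical=categorical_idx)

# Profile clusters: the share of insured households per cluster hints at underinsured groups
print(df.groupby("cluster")["insured"].value_counts(normalize=True).unstack())
```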
Procedia PDF Downloads 138
5426 Genetically Modified Organisms
Authors: Mudrika Singhal
Abstract:
The research paper examines how genetically modified organisms evolved and their significance in today's world. It also highlights the various pros and cons of genetically modified organisms and the progress India has made in this field. A genetically modified organism is one whose genetic material has been altered using genetic engineering techniques. GMOs have a wide range of uses, such as transgenic plants, genetically modified mammals such as mice, and genetically modified insects and aquatic life. Their use dates back to around 12,000 B.C., when humans domesticated plants and animals. At that time, humans used genetically modified organisms produced by selective breeding rather than by genetic engineering techniques. Selective breeding is the procedure in which selected traits are bred into plants and animals, which are then domesticated; the domestication of wild plants into suitable cultigens is a well-known example of this technique. GMOs have uses in varied fields ranging from biological and medical research and the production of pharmaceutical drugs to agriculture. The first organisms to be genetically modified were microbes because of their simpler genetics. At present, insulin produced through genetic modification is used to treat diabetes. In the case of plants, transgenic plants, genetically modified crops and cisgenic plants are examples of genetic modification. In the case of mammals, transgenic animals such as mice and rats serve various purposes, such as researching human diseases and improving animal health. Turning to the pros and cons of genetically modified organisms: the pros include crops with higher yield, shorter growth time and more predictable outcomes than traditional breeding. The cons include possible harm to mammals such as rats and products containing proteins that could trigger allergic reactions. In India, the GMOs presently in use include GM microorganisms, transgenic crops and animals, with varied applications in healthcare and agriculture. In a nutshell, the research paper is about progress in the field of genetic modification and its effects on today's world.
Keywords: applications, mammals, transgenic, engineering and technology
Procedia PDF Downloads 595
5425 Rapid Algorithm for GPS Signal Acquisition
Authors: Fabricio Costa Silva, Samuel Xavier de Souza
Abstract:
A Global Positioning System (GPS) receiver is responsible for determining position, velocity and timing information by using satellite signals. Obtaining this information requires combining an incoming signal with a locally generated one. The procedure, called acquisition, needs to find two quantities: the frequency and the code phase of the incoming signal. This is very time-consuming, so there are several techniques that reduce the computational complexity, but each of them puts project requirements in conflict. In this paper, we present a method that reduces the computational complexity by reducing the search space and parallelizing the search.
Keywords: GPS, acquisition, complexity, parallelism
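One common way to cut acquisition time, consistent with the search-space reduction and parallelization mentioned above, is the FFT-based parallel code phase search: for each Doppler bin, the correlation over all code phases is obtained with a single FFT/IFFT pair instead of a serial phase sweep. The toy sketch below uses a random ±1 placeholder code (not a real C/A sequence) and assumed sampling parameters, so it illustrates the search structure rather than the paper's specific algorithm.

```python
import numpy as np

fs = 4_000_000            # sampling rate [Hz] (assumed)
n = 4000                  # samples in one 1 ms code period
rng = np.random.default_rng(0)

code = rng.choice([-1.0, 1.0], size=n)         # placeholder spreading code (not real C/A)
true_phase, true_doppler = 1234, 2500.0        # unknowns the receiver must find

t = np.arange(n) / fs
incoming = np.roll(code, true_phase) * np.cos(2 * np.pi * true_doppler * t)
incoming += rng.normal(scale=0.5, size=n)      # add measurement noise

code_fft_conj = np.conj(np.fft.fft(code))
best = (0.0, None, None)
for doppler in np.arange(-5000, 5001, 500):    # coarse Doppler bins
    wiped = incoming * np.exp(-2j * np.pi * doppler * t)          # carrier hypothesis removal
    corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft_conj)) ** 2
    peak = corr.max()                          # all code phases evaluated in one FFT/IFFT pair
    if peak > best[0]:
        best = (peak, doppler, int(corr.argmax()))

print(f"estimated Doppler ~ {best[1]} Hz, code phase ~ {best[2]} samples "
      f"(true: {true_doppler} Hz, {true_phase} samples)")
```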
Procedia PDF Downloads 537
5424 Advanced Stability Criterion for Time-Delayed Systems of Neutral Type and Its Application
Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon
Abstract:
This paper investigates the stability problem for linear systems of neutral type with time-varying delay. By constructing various Lyapunov-Krasovskii functionals and utilizing some mathematical techniques, sufficient stability conditions for the systems are established in terms of linear matrix inequalities (LMIs), which can be easily solved by various effective optimization algorithms. Finally, some illustrative examples are given to show the effectiveness of the proposed criterion.
Keywords: neutral systems, time-delay, stability, Lyapunov method, LMI
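For concreteness, a typical (illustrative, not the paper's exact) setup is the neutral time-delay system and Lyapunov-Krasovskii functional below; the matrices and delay bounds are generic symbols introduced only for this sketch.

```latex
% Illustrative neutral-type system and Lyapunov-Krasovskii functional (generic form)
\begin{aligned}
&\dot{x}(t) - C\,\dot{x}(t-\tau) = A\,x(t) + B\,x\big(t-h(t)\big),
\qquad 0 \le h(t) \le h_M,\quad \dot{h}(t) \le \mu < 1,\\[4pt]
&V(x_t) = x^{\top}(t)\,P\,x(t)
 + \int_{t-h(t)}^{t} x^{\top}(s)\,Q\,x(s)\,\mathrm{d}s
 + \int_{t-\tau}^{t} \dot{x}^{\top}(s)\,R\,\dot{x}(s)\,\mathrm{d}s
 + h_M \int_{-h_M}^{0}\!\!\int_{t+\theta}^{t} \dot{x}^{\top}(s)\,Z\,\dot{x}(s)\,\mathrm{d}s\,\mathrm{d}\theta ,
\end{aligned}
```

with P, Q, R, Z positive definite. Requiring the derivative of V along the trajectories to be negative definite for all admissible delays, after bounding the integral terms (e.g., with Jensen-type inequalities), yields a condition of the form Ξ(P, Q, R, Z) ≺ 0, a linear matrix inequality in the decision matrices that can be checked with standard semidefinite-programming solvers.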
Procedia PDF Downloads 346
5423 Effects of Plyometric Exercises on Agility, Power and Speed Improvement of U-17 Female Sprinters in Case of Burayu Athletics Project, Oromia, Ethiopia
Authors: Abdeta Bayissa Mekessa
Abstract:
The purpose of this study was to examine the effects of plyometric exercises on the agility, power, and speed improvement of U-17 female sprinters in the case of the Burayu Athletics project. A true experimental research design was employed. The total population of the study was 14 U-17 female sprinters from the Burayu Athletics project. Because the population was small, the researcher took all of them as a sample using comprehensive sampling techniques. The subjects were then assigned to an experimental group (N=7) and a control group (N=7) using simple random sampling. The experimental group participated in plyometric training for 8 weeks, 3 days per week, with a duration of 60 minutes per day, in addition to their regular training, while the control group followed only their regular training program. The variables selected for the purpose of this study were agility, power and speed, measured by the Illinois agility test, the standing long jump test, and the 30 m sprint test, respectively. Both groups were tested before (pre-test) and after (post-test) the 8 weeks of plyometric training. For data analysis, the researcher used SPSS version 26.0 software. The collected data were analyzed using a paired-samples t-test to observe the difference between the pre-test and post-test results. A significance level of p<0.05 was used. The results show that, after 8 weeks of plyometric training, significant improvements were found in agility (MD=0.45, p<0.05), power (MD=-1.157, p<0.05) and speed (MD=0.37, p<0.05) for the experimental group. On the other hand, there was no significant change (p>0.05) in those variables in the control group. Finally, the findings of the study showed that eight weeks of plyometric exercises had a positive effect on the agility, power and speed improvement of female sprinters. Therefore, athletics coaches and athletes are highly recommended to include plyometric exercise in their training programs.
Keywords: plyometric exercise, speed, power, agility, female sprinter
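The pre-/post-test comparison described above corresponds to a paired-samples t-test. A minimal sketch with made-up Illinois agility test times (seconds) is shown below; the values are not the study's data.

```python
import numpy as np
from scipy.stats import ttest_rel

# Made-up Illinois agility test times (seconds) for the 7 experimental-group sprinters
pre  = np.array([18.9, 19.4, 18.5, 19.8, 18.7, 19.1, 19.6])
post = np.array([18.4, 18.9, 18.1, 19.3, 18.3, 18.7, 19.1])

t_stat, p_value = ttest_rel(pre, post)
mean_diff = (pre - post).mean()
print(f"mean difference = {mean_diff:.2f} s, t = {t_stat:.2f}, p = {p_value:.4f}")
# p < 0.05 would indicate a statistically significant improvement after the
# 8-week plyometric programme, mirroring the comparison reported in the abstract.
```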
Procedia PDF Downloads 37
5422 Basic Evaluation for Polyetherimide Membrane Using Spectroscopy Techniques
Authors: Hanan Alenezi
Abstract:
Membrane performance depends on the kind of solvent used in preparation. A membrane made of polyetherimide (PEI) was evaluated for gas separation using X-ray diffraction (XRD), scanning electron microscopy (SEM), and energy dispersive X-ray spectroscopy (EDS). The purity and thickness were determined to evaluate the membrane in order to optimize PEI membrane preparation.
Keywords: energy dispersive X-ray spectroscopy (EDS), membrane, polyetherimide (PEI), scanning electron microscope (SEM), solvent, X-ray diffraction (XRD)
Procedia PDF Downloads 180
5421 Pedagogical Tools In The 21st Century
Authors: M. Aherrahrou
Abstract:
Moroccan education is currently facing many difficulties and problems due to traditional methods of teaching. Neuro-Linguistic Programming (NLP) appears to hold much potential for education at all levels. In this paper, the major aim is to explore the effect of certain Neuro-Linguistic Programming techniques in one educational institution in Morocco. Quantitative and qualitative methods are used. The findings demonstrate the effectiveness of this new approach for Moroccan education and show it to be a promising tool for improving the quality of learning.
Keywords: learning and teaching environment, Neuro-Linguistic Programming, education, quality of learning
Procedia PDF Downloads 354
5420 Personalizing Human Physical Life Routines Recognition over Cloud-based Sensor Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) based on body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is progressively increasing. On the other hand, personalizing human life routines using numerous machine learning techniques has always been an intriguing topic. Various methods have demonstrated the ability to recognize basic movement patterns; however, they still need improvement to anticipate the dynamics of human living patterns. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and forecasting these challenging activities from multi-fused sensors. Furthermore, numerous MEMS signals are extracted from one self-annotated IM-WSHA dataset and two benchmark datasets. First, the acquired raw data are filtered with z-normalization and denoising methods. Then, statistical, local binary pattern, autoregressive model, and intrinsic time-scale decomposition features are extracted from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the whole system's performance. As a result, a 90.27% recognition rate was attained for the self-annotated dataset, while HARTH and KU-HAR achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines, respectively. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)
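A simplified, greedy mRMR selection followed by a small neural-network classifier, roughly in the spirit of the pipeline above. The dataset is synthetic, and the redundancy measure (feature-feature mutual information) is one common choice, not necessarily the authors' exact formulation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=16, n_informative=6, random_state=0)

relevance = mutual_info_classif(X, y, random_state=0)   # relevance of each feature to the labels

def mrmr(X, relevance, k):
    """Greedy mRMR: maximize relevance minus mean redundancy with already-selected features."""
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            redundancy = np.mean([
                mutual_info_regression(X[:, [j]], X[:, s], random_state=0)[0] for s in selected
            ])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

features = mrmr(X, relevance, k=6)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
acc = cross_val_score(clf, X[:, features], y, cv=5).mean()
print(f"selected features: {features}\n5-fold accuracy with mRMR subset: {acc:.3f}")
```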
Procedia PDF Downloads 19
5419 A Review on Investigating the Relations between Water Harvesting and Water Conflicts
Authors: B. Laurita
Abstract:
The importance of Water Harvesting (WH) as an effective means of dealing with water scarcity is universally recognized. The collection and storage of rainwater, floodwater or quick runoff and their conversion to productive uses can ensure water availability for domestic and agricultural use, enabling lower exploitation of the aquifer, preventing erosion events and providing significant ecosystem services. At the same time, it has been proven that water harvesting can reduce the emergence of water conflicts if it is supported by a cooperative process of planning and management. On the other hand, the construction of water harvesting structures changes the hydrological regime, affecting upstream-downstream dynamics and changing water allocation, often causing contention. Furthermore, the dynamics existing between water harvesting and water conflict have not yet been properly investigated. Thus, the objective of this study is to analyze the relations between water harvesting and the emergence of water conflicts, providing a solid theoretical basis and foundation for future studies. Two search engines were selected to perform the study: Google Scholar and Scopus. Separate searches were conducted on the mutual influences between water conflicts and the four main water harvesting techniques: rooftop harvesting, surface harvesting, underground harvesting, and runoff harvesting. Some of the aforementioned water harvesting techniques have been developed and implemented on scales ranging from small, household-sized systems to gargantuan dam systems. Instead of focusing on the conflicts related to large-scale systems, this review collects examples of the effects that the implementation of small water harvesting systems has had on access to water resources and on water governance. The present review highlights that, in the studies conducted up to now, water harvesting, and in particular structures that allow the collection and storage of water for domestic use, is usually recognized as a positive, palliative element during contention. On the other hand, water harvesting can worsen and, in some cases, even generate conflicts over water management. This shows the necessity of studies that consider both the benefits and the negative influences of water harvesting, analyzing its role respectively as a triggering or a mitigating factor in conflict situations.
Keywords: arid areas, governance, water conflicts, water harvesting
Procedia PDF Downloads 202
5418 Experimental Modeling of Spray and Water Sheet Formation Due to Wave Interactions with Vertical and Slant Bow-Shaped Model
Authors: Armin Bodaghkhani, Bruce Colbourne, Yuri S. Muzychka
Abstract:
The process of spray-cloud formation and the flow kinematics produced by breaking wave impact on vertical and slant lab-scale bow-shaped models were experimentally investigated. Bubble Image Velocimetry (BIV) and Image Processing (IP) techniques were applied to study the various types of wave-model impacts. Different waves were generated in a tow tank to investigate the effects of wave characteristics, such as wave phase velocity and wave steepness, on droplet velocities and on the process of spray cloud formation. The phase ensemble-averaged vertical velocity and turbulent intensity were computed. A high-speed camera and diffused LED backlights were utilized to capture images for further post-processing. Various pressure sensors and capacitive wave probes were used to measure the wave impact pressure and the free surface profile at different locations on the model and in the wave tank, respectively. BIV and IP techniques were used to trace bubbles and droplets and to measure their sizes and velocities by correlating the texture in these images. The impact pressure and droplet size distributions were compared to several previously published experimental models, and satisfactory agreement was achieved. The distribution of droplets in front of both models is presented. Because spray formation is a highly transient process, the drag coefficient for several stages of this transient displacement, for various droplet size ranges and different Reynolds numbers, was calculated based on the ensemble-average method. The experimental results show that the slant model produces less spray than the vertical model, and the droplets generated by wave impact on the slant model have lower velocities than those from the vertical model.
Keywords: spray characteristics, droplet size and velocity, wave-body interactions, bubble image velocimetry, image processing
Procedia PDF Downloads 299
5417 Ancient Iran Water Technologies
Authors: Akbar Khodavirdizadeh, Ali Nemati Babaylou, Hassan Moomivand
Abstract:
Human techniques for accessing water have been one of the factors in the formation of civilizations in the ancient world. The techniques that made surface water and groundwater accessible have been an ingenious part of human efforts to reach water. In this study, while examining the water techniques of ancient Iran based on the qanat, the water supply systems of different regions of the ancient world were also studied and compared. Six groups were studied under water supply systems: ancient Greece (Archaic 750-480 BC and Classical 480-323 BC), Urartu in Tuspa (850-600 BC), Petra (168-106 BC), ancient Rome (265 BC), the ancient United States (1450 BC), and ancient Iranian water technologies. Past water technologies in these areas included water transmission systems in early urban centers, the use of water structures in water control, the use of bridges in water transfer, the construction of waterways, the storage of rainfall, the construction of various types of pottery-ceramic, lead, wood and stone pipes for water transfer, flood control, water reservoirs, dams, channels, wells, and qanats. The central plateau of Iran is an arid, desert region. Archaeological, geomorphological, and paleontological studies of the central region of the Iranian plateau showed that without the use of qanats, urban civilization in this region would have been difficult and even impossible. The Zarch aqueduct is the most important qanat in the Yazd region. The Qanat of Zarch is a plain qanat with a gallery length of 80 km; its mother well is 85 m deep, and it has 2115 well shafts. The main purpose of building the Qanat of Zarch was to access a groundwater source and transfer it to the ground surface. The structure of the qanat and its technique of transferring water from a groundwater source to the surface set it apart from other water techniques of the ancient world. The results show that the study of water technologies in antiquity is very important for understanding the history of humanity's use of hydraulic techniques.
Keywords: ancient water technologies, groundwaters, qanat, human history, Ancient Iran
Procedia PDF Downloads 109
5416 Brain-Computer Interfaces That Use Electroencephalography
Authors: Arda Ozkurt, Ozlem Bozkurt
Abstract:
Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was introduced by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measuring methods. Although it has low spatial resolution (it can only detect when a group of neurons fires at the same time), it is a non-invasive method, making it easy to use without posing significant risks. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded, which is then used to accomplish the intended task. EEG recordings include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, because it is a non-invasive method, there are sources of noise that may affect the reliability of the EEG signals. For instance, noise from the EEG equipment and the leads, and signals coming from the subject (such as cardiac activity or muscle movements), affect the signals detected by the EEG electrodes. However, new techniques have been developed to differentiate between these signals and the intended ones. Furthermore, an EEG device alone is not enough to analyze the data from the brain for a BCI application. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it. These algorithms convert complex data into meaningful and useful information that neuroscientists can use to design BCI devices. Even though invasive BCIs are needed for neurological conditions that require highly precise data, non-invasive BCIs, such as EEG-based ones, are used in many cases to help disabled people or to ease everyday life by assisting with basic tasks. For example, EEG can be used to detect an impending seizure in epilepsy patients, allowing a BCI device to help prevent it. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed in the future.
Keywords: BCI, EEG, non-invasive, spatial resolution
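As a small illustration of the signal processing that typically precedes the AI step described above, the sketch below estimates alpha-band (8-13 Hz) power from a single EEG channel with Welch's method. The signal is synthetic, and the sampling rate and band choice are assumptions used only for this example.

```python
import numpy as np
from scipy.signal import welch

fs = 250                                  # sampling rate [Hz], typical for EEG hardware
t = np.arange(0, 10, 1 / fs)              # 10 s of data
rng = np.random.default_rng(0)

# Synthetic single-channel EEG: 10 Hz alpha rhythm + broadband noise + 50 Hz mains artefact
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.normal(size=t.size) \
      + 2e-6 * np.sin(2 * np.pi * 50 * t)

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)      # power spectral density (V^2/Hz)

band = (freqs >= 8) & (freqs <= 13)                 # alpha band
alpha_power = np.trapz(psd[band], freqs[band])
total_power = np.trapz(psd, freqs)
print(f"alpha-band power: {alpha_power:.3e} V^2 ({alpha_power / total_power:.1%} of total)")
# Band-power features like this one are a common input to the AI algorithms
# that map EEG activity to BCI commands.
```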
Procedia PDF Downloads 71