Search results for: GIS techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6641

5381 Formation of in-situ Ceramic Phase in N220 Nano Carbon Containing Low Carbon Mgo-C Refractory

Authors: Satyananda Behera, Ritwik Sarkar

Abstract:

In iron and steel industries, MgO–C refractories are widely used in basic oxygen furnaces, electric arc furnaces and steel ladles due to their excellent corrosion resistance, thermal shock resistance, and other excellent hot properties. Conventional magnesia carbon refractories contain about 8-20 wt% of carbon, but the use of carbon is also associated with disadvantages such as oxidation, low fracture strength, high heat loss and higher carbon pick-up in steel. Producing MgO-C refractories with low carbon content without compromising the beneficial properties is therefore the challenge. Nano carbon, having finer particles, can mix and distribute uniformly within the entire matrix and can result in improved mechanical, thermo-mechanical, corrosion and other refractory properties. Previous experience with the use of nano carbon in low carbon MgO-C refractories has indicated an optimum nano carbon content of around 1 wt%. This optimum nano carbon content was used in MgO-C compositions with flaky graphite, followed by aluminum and silicon metal powder as anti-oxidants. These low carbon MgO-C refractory compositions were prepared by conventional manufacturing techniques. In parallel, a conventional MgO-C refractory containing 16 wt% flaky graphite was also prepared under similar conditions. The developed products were characterized for various refractory-related properties. Nano carbon containing compositions showed better mechanical and thermo-mechanical properties and better oxidation resistance compared to the conventional composition. The improvement in properties is associated with the formation of in-situ ceramic phases such as aluminum carbide, silicon carbide, and magnesium aluminum spinel. The higher surface area and higher reactivity of N220 nano carbon black resulted in greater formation of in-situ ceramic phases, even at a much lower amount. Nano carbon containing compositions were thus found to give improved MgO-C refractory properties at a much lower total carbon content than the conventional ones.

Keywords: N220 nano carbon black, refractory properties, conventional manufacturing techniques, conventional magnesia carbon refractories

Procedia PDF Downloads 360
5380 A Levinasian Perspective on the Field of Applied Ethics

Authors: Payman Tajalli, Steven Segal

Abstract:

Applied ethics is the area of ethics looked upon most favorably as the most appropriate and useful for educational purposes; after all, if ethics found no application, would any investment of time, effort and finance by educational institutions be warranted? Current approaches to ethics in business and management often entail appealing to various types of moral theories, and to this end almost every major philosophical approach has been enlisted. In this paper, we look at ethics through the philosophy of Emmanuel Levinas to argue that since ethics is ‘first philosophy’ it can neither be rule-based nor rule-governed, not something that can be worked out first and then applied to a given situation; hence the overwhelming emphasis on ‘applied ethics’ as a field of study in business and management education is unjustified. True ethics is not applied ethics. This assertion does not mean that teaching ethical theories and philosophies needs to be abandoned; rather, it is the acceptance of the fact that an increase in cognitive awareness of such theories and ethical models and frameworks, or the mastering of techniques and procedures for ethical decision making, will not effect the desired ethical transformation in our students. Levinas himself argued for an ethics without a foundation, not one that requires us to go ‘beyond good and evil’ as Nietzsche contended, but rather an ethics which necessitates going ‘before good and evil’. Such an ethics does not provide us with a set of methods or techniques or a decision tree that enables us to determine the rightness of an action and what we ought to do; rather, it is about a way of being, an ethical posture or approach one takes in the inter-subjective relationship with the other that holds the promise of ethical conduct. Ethics in this Levinasian sense is one of infinite and unconditional responsibility for the other person in relationship, an ethics which is not subject to negotiation, calculation or reciprocity, and as such it can neither be applied nor taught through conventional pedagogy, with its focus on knowledge transfer from teacher to student; to this end Levinas offers a non-maieutic, non-conventional approach to pedagogy. The paper concludes that, from a Levinasian perspective on ethics and education, we may need to guide our students to move away from the clear and objective professionalism of management and applied ethics towards a murkier individual spiritualism. For Levinas, this is ‘the Copernican revolution’ in ethics.

Keywords: business ethics, ethics education, Levinas, maieutic teaching, ethics without foundation

Procedia PDF Downloads 317
5379 Methods and Techniques for Lower Danube Sturgeon Monitoring Used for the Assessment of Anthropic Activities Pressures and the Quantification of Risks on These Species

Authors: Gyorgy Deak, Marius C. Raischi, Lucian P. Georgescu, Tiberius M. Danalache, Elena Holban, Madalina G. Boboc, Monica Matei, Catalina Iticescu, Marius V. Olteanu, Stefan Zamfir, Gabriel Cornateanu

Abstract:

At present, different types of pressures have been identified on the Lower Danube that affect anadromous sturgeon stocks, with an impact that leads to their decline. This paper presents techniques and procedures used by Romanian experts in the tagging and monitoring of anadromous sturgeons, as well as results unique at the international level, obtained from an informational volume collected over more than 7 years of monitoring the behavior of these species (both adults and ultrasonically tagged juveniles) on the Lower Danube. The local impact of hydrotechnical constructions (bottom sill, maritime navigation channel), the global impact of the poaching phenomenon and the impact of restocking programs with sturgeon juveniles were assessed. Thus, the impact of the bottom sill on the Bala branch, the Bastroe Channel (cross-border impact) and the poaching phenomenon at the level of the Lower Danube were analyzed on the basis of a unique informational volume obtained through the use of monitoring systems patented by the Romanian experts (DKTB and DKMR-01T, respectively). At the same time, the results from the monitoring of ultrasonically tagged sturgeon juveniles from the 2015 repopulation program are presented. The conclusions resulting from this research can provide favorable premises for finding conservation solutions for CITES-protected sturgeon species that have survived for millions of years, with currently 1 species on the brink of extinction - Russian sturgeon, 2 species in danger of extinction - Beluga sturgeon and Stellate sturgeon, and 2 species already extinct from the Lower Danube, namely common sturgeon and ship sturgeon.

Keywords: Lower Danube, sturgeons monitoring (adults and juveniles), tagging, impact on conservation

Procedia PDF Downloads 236
5378 Comprehensive Profiling and Characterization of Untargeted Extracellular Metabolites in Fermentation Processes: Insights and Advances in Analysis and Identification

Authors: Marianna Ciaccia, Gennaro Agrimi, Isabella Pisano, Maurizio Bettiga, Silvia Rapacioli, Giulia Mensa, Monica Marzagalli

Abstract:

Objective: Untargeted metabolomic analysis of extracellular metabolites is a powerful approach that focuses on comprehensively profiling metabolites in the extracellular space. In this study, we applied extracellular metabolomic analysis to investigate the metabolism of two probiotic microorganisms with health benefits that extend far beyond the digestive tract and the immune system. Methods: Analytical techniques employed in extracellular metabolomic analysis encompass various technologies, including mass spectrometry (MS), which enables the identification of metabolites present in the fermentation media, as well as the comparison of metabolic profiles under different experimental conditions. Multivariate statistical analysis techniques such as principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA) play a crucial role in uncovering metabolic signatures and understanding the dynamics of metabolic networks. Results: Different types of supernatants from fermentation processes, such as dairy-free and non-dairy-free media, and media with no cells or with pasteurized cells, were subjected to metabolite profiling; these contained a complex mixture of metabolites, including substrates, intermediates, and end-products. This profiling provided insights into the metabolic activity of the microorganisms. The integration of advanced software tools facilitated the identification and characterization of metabolites across different fermentation conditions and microorganism strains. Conclusions: In conclusion, untargeted extracellular metabolomic analysis, combined with software tools, allowed the study of the metabolites consumed and produced during the fermentation processes of probiotic microorganisms. Ongoing advancements in data analysis methods will further enhance the application of extracellular metabolomic analysis in fermentation research, leading to improved bioproduction and the advancement of sustainable manufacturing processes.
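
To make the multivariate step concrete, the following minimal Python sketch runs a PCA on a synthetic metabolite intensity matrix; the sample counts, group labels, and preprocessing choices are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): PCA on an untargeted extracellular
# metabolite intensity matrix, as commonly done to compare fermentation conditions.
# The data here are synthetic placeholders; real inputs would be MS feature tables.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_samples, n_metabolites = 12, 200          # e.g. 12 supernatants, 200 MS features
X = rng.lognormal(mean=2.0, sigma=0.5, size=(n_samples, n_metabolites))
conditions = ["dairy"] * 6 + ["dairy_free"] * 6   # hypothetical group labels

X_scaled = StandardScaler().fit_transform(np.log1p(X))   # log-transform and autoscale
pca = PCA(n_components=2)
scores = pca.fit_transform(X_scaled)

for label, (pc1, pc2) in zip(conditions, scores):
    print(f"{label:>10s}  PC1={pc1:+.2f}  PC2={pc2:+.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_)
```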

Keywords: biotechnology, metabolomics, lactic bacteria, probiotics, postbiotics

Procedia PDF Downloads 65
5377 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Predication of Future Data

Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill

Abstract:

Health-care management systems are of great interest because they provide simple and fast management of all aspects relating to a patient, not necessarily only medical ones. Moreover, there are more and more cases of pathologies in which diagnosis and treatment can only be carried out using medical imaging techniques. With an ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for their storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets through algorithms and techniques drawn from the fields of statistics, machine learning and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted from the independent variables. Forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences regarding a new product; these consumer preference models provide a platform on which product developers can decide the engineering characteristics in order to satisfy consumer preferences before developing the product. Recent analysis shows that such fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of an exponential regression model against a linear regression model for predicting future health-care data.
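
As a rough illustration of the proposed model comparison, the sketch below fits an exponential and a linear regression model to a toy health-care time series and compares their R²; the data and parameter values are assumptions, not results from the paper.

```python
# Illustrative sketch only: comparing an exponential regression model against a
# linear regression model on a toy health-care time series, in the spirit of the
# comparison the abstract proposes. The data and model forms are assumptions.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(1, 25)                          # e.g. 24 monthly observations
y = 50 * np.exp(0.08 * t) + np.random.default_rng(1).normal(0, 10, t.size)

def exponential(t, a, b):
    return a * np.exp(b * t)

# Linear model: y = m*t + c
m, c = np.polyfit(t, y, 1)
y_lin = m * t + c

# Exponential model: y = a*exp(b*t)
(a, b), _ = curve_fit(exponential, t, y, p0=(50, 0.05))
y_exp = exponential(t, a, b)

def r2(y_obs, y_hat):
    ss_res = np.sum((y_obs - y_hat) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    return 1 - ss_res / ss_tot

print(f"linear      R^2 = {r2(y, y_lin):.3f}")
print(f"exponential R^2 = {r2(y, y_exp):.3f}")
```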

Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function

Procedia PDF Downloads 273
5376 A Distributed Mobile Agent Based on Intrusion Detection System for MANET

Authors: Maad Kamal Al-Anni

Abstract:

This study concerns an Artificial Neural Network algorithm based on the Multilayer Perceptron (MLP) for the classification and clustering of Mobile Ad hoc Network vulnerabilities. A mobile ad hoc network (MANET) is a ubiquitous, intelligent internetworking arrangement of autonomous mobile nodes connected via wireless links that can sense their environment. Security is the most important subject in MANETs due to the easy penetration scenarios that occur in such self-configuring networks. One of the powerful techniques used for inspecting network packets is the Intrusion Detection System (IDS); in this article, we show the effectiveness of artificial neural networks used as machine learning, along with a stochastic approach (information gain), to classify malicious behaviors in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data is collected from the collecting agent by simulating node attacks, and the outputs are contrasted with the normal behaviors of the framework. Whenever there is any deviation from the ordinary behaviors, the monitoring agent considers the event an attack. In this article we demonstrate the signature-based IDS approach in a MANET by implementing the back propagation algorithm over an ensemble-based Traffic Table (TT), so that the signatures of malicious behaviors or undesirable activities can be significantly prognosticated and efficiently figured out by tuning the parametric set-up of the back propagation algorithm; the experimental results empirically show its effectiveness, with a detection rate of up to 98.6 percent. Performance metrics are also included in this article, with Xgraph screen plots of measures such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD).
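
To illustrate the classification step described above, the following hedged sketch trains an MLP (backpropagation) classifier on information-gain-selected features of a synthetic traffic dataset; the feature set, data, and hyperparameters are placeholders, not the paper's simulated MANET traces.

```python
# Rough sketch under assumed data: an MLP (backpropagation) classifier combined
# with an information-gain style feature ranking for labelling simulated traffic
# as normal or malicious. The synthetic dataset is illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report

# Synthetic stand-in for audit records collected from monitoring agents
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                           weights=[0.8, 0.2], random_state=42)

# Information gain (mutual information) to keep the most discriminative features
gain = mutual_info_classif(X, y, random_state=42)
top = np.argsort(gain)[::-1][:6]
X_sel = X[:, top]

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3,
                                          stratify=y, random_state=42)
mlp = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=42)
mlp.fit(X_tr, y_tr)
print(classification_report(y_te, mlp.predict(X_te), target_names=["normal", "attack"]))
```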

Keywords: Intrusion Detection System (IDS), Mobile Adhoc Networks (MANET), Back Propagation Algorithm (BPA), Neural Networks (NN)

Procedia PDF Downloads 189
5375 Experimental Study of Reflective Roof as a Passive Cooling Method in Homes Under the Paradigm of Appropriate Technology

Authors: Javier Ascanio Villabona, Brayan Eduardo Tarazona Romero, Camilo Leonardo Sandoval Rodriguez, Arly Dario Rincon, Omar Lengerke Perez

Abstract:

Efficient energy consumption in the housing sector in relation to refrigeration is a concern in the construction and rehabilitation of houses in tropical areas. Thermal comfort is aggravated by heat gain at the roof surface. Thus, within the group of passive cooling techniques, the practices and technologies in solar control that provide improvements in comfort conditions are thermal insulation and geometric changes to the roofs. On the other hand, reflection and radiation methods are used to decrease heat gain by facilitating the removal of excess heat inside a building in order to maintain a comfortable environment. Since the potential of these techniques varies across climatic zones, their application in different zones should be examined. This research is based on the experimental study of a prototype roof radiator as a passive cooling method in homes, developed through an experimental research methodology with measurements on a prototype built under the paradigm of appropriate technology, with the aim of establishing the initial behavior of the internal temperature resulting from the climate of the external environment. As a starting point, a selection matrix was prepared to identify the typologies of passive cooling systems in order to model the system and its subsequent implementation, establishing its constructive characteristics. This was followed by the measurement of climatic variables (outside the prototype) and microclimatic variables (inside the prototype) to obtain a database for analysis. As a final result, the decrease in temperature inside the chamber with respect to the outside temperature was evidenced, as well as a linearity in its behavior in relation to the variations of the climatic variables.

Keywords: appropriate technology, enveloping, energy efficiency, passive cooling

Procedia PDF Downloads 88
5374 Research on Design Methods for Riverside Spaces of Deep-cut Rivers in Mountainous Cities: A Case Study of Qingshuixi River in Chongqing City

Authors: Luojie Tang

Abstract:

Riverside space is an important public space and ecological corridor in urban areas, but mountainous urban rivers are often overlooked due to their deep valleys and poor accessibility. This article takes the Qing Shui Xi River in Chongqing as an example, and through long-term field inspections, measurements, interviews, and online surveys, summarizes the problems of poor accessibility, limited space for renovation, lack of waterfront facilities, excessive artificial intervention, low average runoff, severe river water pollution, and difficulty in integrated watershed management in riverside space. Based on the current situation and drawing on relevant experiences, this article summarizes the design methods for riverside space in deep valley rivers in mountainous urban areas. Regarding spatial design techniques, the article emphasizes the importance of integrating waterfront spaces into the urban public space system and vertical linkages. Furthermore, the article suggests different design methods and improvement strategies for the already developed areas and new development areas. Specifically, the article proposes a planning and design strategy of "protection" and "empowerment" for new development areas and an updating and transformation strategy of "improvement" and "revitalization" for already developed areas. In terms of ecological restoration methods, the article suggests three focus points: increasing the runoff of urban rivers, raising the landscape water level during dry seasons, and restoring vegetation and wetlands in the riverbank buffer zone while protecting the overall pattern of the watershed. Additionally, the article presents specific design details of the Qingshuixi River to illustrate the proposed design and restoration techniques.

Keywords: deep-cut river, design method, mountainous city, Qingshuixi river in Chongqing, waterfront space design

Procedia PDF Downloads 101
5373 Intrusion Detection in SCADA Systems

Authors: Leandros A. Maglaras, Jianmin Jiang

Abstract:

The protection of national infrastructures from cyberattacks is one of the main issues for national and international security. The funded European Framework-7 (FP7) research project CockpitCI introduces intelligent intrusion detection, analysis and protection techniques for Critical Infrastructures (CI). The paradox is that CIs massively rely on the newest interconnected, and vulnerable, Information and Communication Technology (ICT), whilst the control equipment, legacy software/hardware, is typically old. Such a combination of factors may lead to very dangerous situations, exposing systems to a wide variety of attacks. To overcome such threats, the CockpitCI project combines machine learning techniques with ICT technologies to produce advanced intrusion detection, analysis and reaction tools and to provide intelligence to field equipment. This allows the field equipment to make local decisions in order to self-identify and self-react to abnormal situations introduced by cyberattacks. In this paper, an intrusion detection module capable of detecting malicious network traffic in a Supervisory Control and Data Acquisition (SCADA) system is presented. Malicious data in a SCADA system disrupt its correct functioning and tamper with its normal operation. OCSVM is an intrusion detection mechanism that does not need any labeled data for training or any information about the kind of anomaly it is expecting for the detection process. This feature makes it ideal for processing SCADA environment data and automating SCADA performance monitoring. The OCSVM module developed is trained offline with network traces and detects anomalies in the system in real time. The module is part of an IDS (intrusion detection system) developed under the CockpitCI project and communicates with the other parts of the system through the exchange of IDMEF messages that carry information about the source of the incident, the time and a classification of the alarm.
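
A minimal sketch of the OCSVM idea follows: train on unlabeled "normal" traces offline, then flag deviating traffic. The feature construction and numbers are assumptions for illustration, not the CockpitCI implementation.

```python
# Simplified sketch of one-class SVM anomaly detection on SCADA-like network traces.
# The per-flow features and thresholds here are assumptions, not the project's code.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(7)
# Hypothetical features per flow: packet rate, mean packet size, inter-arrival time
normal = rng.normal(loc=[100, 200, 0.05], scale=[10, 20, 0.01], size=(500, 3))
attack = rng.normal(loc=[400, 60, 0.005], scale=[50, 10, 0.002], size=(20, 3))

scaler = StandardScaler().fit(normal)
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
ocsvm.fit(scaler.transform(normal))              # offline training on normal traces

test = np.vstack([normal[:5], attack[:5]])
pred = ocsvm.predict(scaler.transform(test))     # +1 = normal, -1 = anomaly
print(pred)
```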

Keywords: cyber-security, SCADA systems, OCSVM, intrusion detection

Procedia PDF Downloads 544
5372 Deep Brain Stimulation and Motor Cortex Stimulation for Post-Stroke Pain: A Systematic Review and Meta-Analysis

Authors: Siddarth Kannan

Abstract:

Objectives: Deep Brain Stimulation (DBS) and Motor Cortex Stimulation (MCS) are innovative interventions to treat various neuropathic pain disorders such as post-stroke pain. While each treatment has a varying degree of success in managing pain, a comparative analysis has not yet been performed, and the success rates of these techniques using validated, objective pain scores have not been synthesised. The aim of this study was to compare the pain relief offered by MCS and DBS in patients with post-stroke pain and to assess whether either of these procedures offers better results. Methods: A systematic review and meta-analysis were conducted in accordance with PRISMA guidelines (PROSPERO ID CRD42021277542). Three databases were searched, and articles published from 2000 to June 2023 were included (last search date 25 June 2023). Meta-analysis was performed using random effects models. We evaluated the performance of DBS and MCS by assessing studies that reported pain relief using the Visual Analogue Scale (VAS). Descriptive statistics were analysed using SPSS (Version 27; IBM; Armonk; NY; USA), and R (RStudio, Version 4.0.1) was used to perform the meta-analysis. Results: Of the 478 articles identified, 27 were included in the analysis (232 patients: 117 DBS and 115 MCS). The pooled proportion of patients who improved after DBS was 0.68 (95% CI, 0.57-0.77, I2=36%). The pooled proportion of patients who improved after MCS was 0.72 (95% CI, 0.62-0.80, I2=59%). A further sensitivity analysis included only studies with a minimum of 5 patients in order to assess any impact on the overall results; nine studies each for DBS and MCS met these criteria, and there was no significant difference in results. Conclusions: The use of surgical interventions such as DBS and MCS is an emerging field for the treatment of post-stroke pain, with limited studies exploring and comparing these two techniques. While our study shows that MCS might be a slightly better treatment option, further research is needed to determine the appropriate surgical intervention for post-stroke pain.
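
The pooled proportions above come from a random-effects model; a hedged sketch of one common estimator (DerSimonian-Laird on logit-transformed proportions) is shown below. The study counts are invented placeholders, not the reviewed data, and the authors may have used a different estimator in R.

```python
# Hedged sketch of a random-effects (DerSimonian-Laird) pooled proportion, the kind
# of pooling reported above for the share of patients improving after DBS or MCS.
import numpy as np

events = np.array([6, 10, 4, 12, 7])      # patients improved per study (hypothetical)
totals = np.array([9, 15, 6, 18, 10])     # patients treated per study (hypothetical)

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)   # approximate variance of the logit

w = 1 / var                                # fixed-effect inverse-variance weights
pooled_fe = np.sum(w * logit) / np.sum(w)
Q = np.sum(w * (logit - pooled_fe) ** 2)
k = len(events)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1 / (var + tau2)                    # random-effects weights
pooled = np.sum(w_re * logit) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se

expit = lambda x: 1 / (1 + np.exp(-x))
print(f"pooled proportion = {expit(pooled):.2f} (95% CI {expit(lo):.2f}-{expit(hi):.2f})")
print(f"I^2 = {max(0.0, (Q - (k - 1)) / Q) * 100:.0f}%")
```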

Keywords: post-stroke pain, deep brain stimulation, motor cortex stimulation, pain relief

Procedia PDF Downloads 130
5371 A Supply Chain Risk Management Model Based on Both Qualitative and Quantitative Approaches

Authors: Henry Lau, Dilupa Nakandala, Li Zhao

Abstract:

In today’s business environment, it is well recognized that risk is an important factor that needs to be taken into consideration before a decision is made. Studies indicate that both the number of risks faced by organizations and their potential consequences are growing. Supply chain risk management has become one of the major concerns for practitioners and researchers, and supply chain leaders and scholars are now focusing on the importance of managing supply chain risk. In order to meet the challenge of managing and mitigating supply chain risk (SCR), we must first identify the different dimensions of SCR and assess its relevant probability and severity. SCR has been classified in many different ways; there are no consistently accepted dimensions of SCRs, and several different classifications are reported in the literature. Basically, supply chain risks can be classified into two dimensions, namely disruption risk and operational risk. Disruption risks are those caused by events such as bankruptcy, natural disasters and terrorist attacks. Operational risks are related to supply and demand coordination and uncertainty, such as uncertain demand and uncertain supply. Disruption risks are rare but severe and hard to manage, while operational risk can be reduced through effective SCM activities. Other SCRs include supply risk, process risk, demand risk and technology risk. In fact, the disorganized classification of SCR has created confusion for SCR scholars, and practitioners need to identify and assess SCR. As such, it is important to have an overarching framework tying all these SCR dimensions together, for two reasons. First, it helps researchers use these terms for communication of ideas based on the same concepts. Second, a shared understanding of the SCR dimensions supports researchers in focusing on the more important research objective: the operationalization of SCR, which is essential for assessing SCR. In general, the fresh food supply chain is subject to a certain level of risk, such as supply risk (low quality, delivery failure, hot weather, etc.) and demand risk (seasonal food imbalance, new competitors). Effective strategies to mitigate fresh food supply chain risk are required to enhance operations. Before implementing effective mitigation strategies, we need to identify the risk sources and evaluate the risk level. However, assessing supply chain risk is not an easy matter, and existing research mainly uses qualitative methods, such as the risk assessment matrix. To address the relevant issues, this paper aims to analyze the risk factors of the fresh food supply chain using an approach comprising both fuzzy logic and hierarchical holographic modeling techniques. This novel approach is able to take advantage of the benefits of both of these well-known techniques and at the same time offset their drawbacks in certain aspects. In order to develop this integrated approach, substantial research work is needed to effectively combine the two techniques in a seamless way. To validate the proposed integrated approach, a case study in a fresh food supply chain company was conducted to verify the feasibility of its functionality in a real environment.
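
To give a flavor of the fuzzy side of such an approach, the toy sketch below rates probability and severity of a few fresh-food risks with triangular fuzzy numbers and defuzzifies them into a risk index; the risk items, the scale, and the combination rule are assumptions for illustration and do not reproduce the paper's integrated fuzzy/HHM model.

```python
# Toy sketch only: triangular fuzzy ratings of probability and severity for
# fresh-food supply chain risks, defuzzified to rank them.
import numpy as np

# Triangular fuzzy rating scale (low, mode, high) on a 0-10 scale
SCALE = {"low": (0, 1, 3), "medium": (2, 5, 8), "high": (7, 9, 10)}

risks = {
    "supplier quality failure":  ("medium", "high"),    # (probability, severity)
    "cold-chain breakdown":      ("low", "high"),
    "demand imbalance (season)": ("high", "medium"),
}

def centroid(tri):
    """Defuzzify a triangular fuzzy number by its centroid."""
    a, b, c = tri
    return (a + b + c) / 3.0

def fuzzy_multiply(p, s):
    """Approximate product of two triangular fuzzy numbers (vertex arithmetic)."""
    return tuple(np.multiply(p, s))

for name, (prob, sev) in risks.items():
    risk_index = centroid(fuzzy_multiply(SCALE[prob], SCALE[sev]))
    print(f"{name:<28s} fuzzy risk index = {risk_index:6.1f}")
```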

Keywords: fresh food supply chain, fuzzy logic, hierarchical holographic modelling, operationalization, supply chain risk

Procedia PDF Downloads 236
5370 Contour Defects of Face with Hyperpigmentation

Authors: Afzaal Bashir, Sunaina Afzaal

Abstract:

Background: Facial contour deformities associated with pigmentary changes are of major concern for plastic surgeons, being both important and difficult to treat. No definite ideal treatment option is available to simultaneously address both the contour defect and the related pigmentation. Objectives: The aim of the current study is to compare the long-term effects of conventional adipose tissue grafting and ex-vivo expanded Mesenchymal Stem Cell enriched adipose tissue grafting for the treatment of contour deformities with pigmentary changes on the face. Material and Methods: In this study, eighty (80) patients with contour deformities of the face with hyperpigmentation were recruited after informed consent. Two techniques, i.e., conventional fat grafting (C-FG) and fat grafts enriched with expanded adipose stem cells (FG-ASCs), were used to address the pigmentation. Both techniques were explained to patients, and enrolled patients were divided into two groups, i.e., C-FG and FG-ASCs, per the patients’ choice and satisfaction. Patients of the FG-ASCs group were treated with fat grafts enriched with expanded adipose stem cells, while patients of the C-FG group were treated with conventional fat grafting. Patients were followed for 12 months, and improvement in face pigmentation was assessed clinically as well as measured objectively. Patient satisfaction was also documented as highly satisfied, satisfied, or unsatisfied. Results: The mean age of patients was 24.42 (±4.49) years, and 66 patients were female. The forehead was involved in 61.20% of cases, the cheek in 21.20% of cases, the chin in 11.20% of cases, and the nose in 6.20% of cases. In the FG-ASCs group, the integrated color density (ICD) was decreased (1.08×10⁶ ±4.64×10⁵) as compared to the C-FG group (2.80×10⁵±1.69×10⁵). Patients treated with fat grafts enriched with expanded adipose stem cells were significantly more satisfied than patients treated with conventional fat grafting only. Conclusion: Mesenchymal stem cell-enriched autologous fat grafting is a preferred option for improving contour deformities related to increased pigmentation of the facial skin.

Keywords: hyperpigmentation, color density, enriched adipose tissue graft, fat grafting, contour deformities, Image J

Procedia PDF Downloads 103
5369 Teaching Turn-Taking Rules and Pragmatic Principles to Empower EFL Students and Enhance Their Learning in Speaking Modules

Authors: O. F. Elkommos

Abstract:

Teaching and learning EFL speaking modules is one of the most challenging productive modules for both instructors and learners. In a student-centered interactive communicative language teaching approach, learners and instructors should be aware of the fact that the target language must be taught as/for communication. The student must be empowered by tools that will work on more than one level of their communicative competence. Communicative learning will need a teaching and learning methodology that will address the goal. Teaching turn-taking rules, pragmatic principles and speech acts will enhance students' sociolinguistic competence, strategic competence together with discourse competence. Sociolinguistic competence entails the mastering of speech act conventions and illocutionary acts of refusing, agreeing/disagreeing; emotive acts like, thanking, apologizing, inviting, offering; directives like, ordering, requesting, advising, and hinting, among others. Strategic competence includes enlightening students’ consciousness of the various particular turn-taking systemic rules of organizing techniques of opening and closing conversation, adjacency pairs, interrupting, back-channeling, asking for/giving opinion, agreeing/disagreeing, using natural fillers for pauses, gaps, speaker select, self-select, and silence among others. Students will have the tools to manage a conversation. Students are engaged in opportunities of experiencing the natural language not as a mere extra student talking time but rather an empowerment of knowing and using the strategies. They will have the component items they need to use as well as the opportunity to communicate in the target language using topics of their interest and choice. This enhances students' communicative abilities. Available websites and textbooks now use one or more of these tools of turn-taking or pragmatics. These will be students' support in self-study in their independent learning study hours. This will be their reinforcement practice on e-Learning interactive activities. The students' target is to be able to communicate the intended meaning to an addressee that is in turn able to infer that intended meaning. The combination of these tools will be assertive and encouraging to the student to beat the struggle with what to say, how to say it, and when to say it. Teaching the rules, principles and techniques is an act of awareness raising method engaging students in activities that will lead to their pragmatic discourse competence. The aim of the paper is to show how the suggested pragmatic model will empower students with tools and systems that would support their learning. Supporting students with turn taking rules, speech act theory, applying both to texts and practical analysis and using it in speaking classes empowers students’ pragmatic discourse competence and assists them to understand language and its context. They become more spontaneous and ready to learn the discourse pragmatic dimension of the speaking techniques and suitable content. Students showed a better performance and a good motivation to learn. The model is therefore suggested for speaking modules in EFL classes.

Keywords: communicative competence, EFL, empowering learners, enhance learning, speech acts, teaching speaking, turn taking, learner centred, pragmatics

Procedia PDF Downloads 169
5368 An Investigation on the Pulse Electrodeposition of Ni-TiO2/TiO2 Multilayer Structures

Authors: S. Mohajeri

Abstract:

Electrocodeposition of Ni-TiO2 nanocomposite single layers and Ni-TiO2/TiO2 multilayers from Watts bath containing TiO2 sol was carried out on copper substrate. Pulse plating and pulse reverse plating techniques were applied to facilitate higher incorporations of TiO2 nanoparticles in Ni-TiO2 nanocomposite single layers, and the results revealed that by prolongation of the current-off durations and the anodic cycles, deposits containing 11.58 wt.% and 13.16 wt.% TiO2 were produced, respectively. Multilayer coatings which consisted of Ni-TiO2 and TiO2-rich layers were deposited by pulse potential deposition through limiting the nickel deposition by diffusion control mechanism. The TiO2-rich layers thickness and accordingly, the content of TiO2 reinforcement reached 104 nm and 18.47 wt.%, respectively in the optimum condition. The phase structure and surface morphology of the nanocomposite coatings were characterized by X-ray diffraction (XRD) and scanning electron microscopy (SEM). The cross sectional morphology and line scans of the layers were studied by field emission scanning electron microscopy (FESEM). It was confirmed that the preferred orientations and the crystallite sizes of nickel matrix were influenced by the deposition technique parameters, and higher contents of codeposited TiO2 nanoparticles refined the microstructure. The corrosion behavior of the coatings in 1M NaCl and 0.5M H2SO4 electrolytes were compared by means of potentiodynamic polarization and electrochemical impedance spectroscopy (EIS) techniques. Increase of corrosion resistance and the passivation tendency were favored by TiO2 incorporation, while the degree of passivation declined as embedded particles disturbed the continuity of passive layer. The role of TiO2 incorporation on the improvement of mechanical properties including hardness, elasticity, scratch resistance and friction coefficient was investigated by the means of atomic force microscopy (AFM). Hydrophilicity and wettability of the composite coatings were investigated under UV illumination, and the water contact angle of the multilayer was reduced to 7.23° after 1 hour of UV irradiation.

Keywords: electrodeposition, hydrophilicity, multilayer, pulse-plating

Procedia PDF Downloads 246
5367 Analyzing Impacts of Road Network on Vegetation Using Geographic Information System and Remote Sensing Techniques

Authors: Elizabeth Malebogo Mosepele

Abstract:

Road transport has become increasingly common in the world; people rely on road networks for transportation purposes on a daily basis. However, the environmental impact of roads on surrounding landscapes extends their effects even further. This study investigates the impact of the road network on natural vegetation. The study provides baseline knowledge regarding roadside vegetation and will be helpful in the future for the conservation of biodiversity along road verges and for improvements of road verges. The general hypothesis of this study is that the amount and condition of roadside vegetation can be explained by road network conditions. Remote sensing techniques were used to analyze vegetation condition. A Landsat 8 OLI image was used to assess vegetation cover condition. An NDVI image was generated and used as a base from which land cover classes were extracted, comprising four categories, viz. healthy vegetation, degraded vegetation, bare surface, and water. The classification of the image was achieved using the supervised classification technique. Road networks were digitized from Google Earth. For observed data, transect-based quadrats of 50 × 50 m were surveyed next to road segments for vegetation assessment. Vegetation condition was related to the road network, with a multinomial logistic regression confirming a significant relationship between vegetation condition and road network. The null hypothesis formulated was that 'there is no variation in vegetation condition as we move away from the road.' Analysis of vegetation condition revealed degraded vegetation within close proximity of a road segment and healthy vegetation as the distance from the road increases. The chi-squared value was compared with a critical value of 3.84, at the significance level of 0.05, to determine the significance of the relationship. Given that the chi-squared value was 395.5004, the null hypothesis was rejected: there is significant variation in vegetation condition as the distance from the road increases. The conclusion is that the road network plays an important role in the condition of vegetation.
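
As an illustration of the NDVI-based classification and the chi-squared test described above, the following sketch computes NDVI on small synthetic red/NIR rasters, bins pixels into condition classes, and tests their association with a distance-from-road class; the arrays, bin thresholds, and distance classes are assumptions, not the study's Landsat data.

```python
# Illustrative sketch, not the exact study workflow: NDVI from red/NIR bands,
# condition classes by thresholding, and a chi-squared test of association.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(3)
red = rng.uniform(0.05, 0.3, size=(100, 100))
nir = rng.uniform(0.1, 0.6, size=(100, 100))

ndvi = (nir - red) / (nir + red)                       # NDVI = (NIR - Red)/(NIR + Red)
condition = np.digitize(ndvi, bins=[0.2, 0.4])         # 0=bare, 1=degraded, 2=healthy

# Hypothetical distance-to-road classes (near / far) for the same pixels
distance_class = (np.indices(ndvi.shape)[1] > 50).astype(int)

table = np.zeros((2, 3), dtype=int)                    # rows: near/far, cols: condition
for d, c in zip(distance_class.ravel(), condition.ravel()):
    table[d, c] += 1

chi2, p_value, dof, _ = chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.4f}")
```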

Keywords: Chi squared, geographic information system, multinomial logistic regression, remote sensing, road side vegetation

Procedia PDF Downloads 425
5366 Spatial Analysis of the Impact of City Developments Degradation of Green Space in Urban Fringe Eastern City of Yogyakarta Year 2005-2010

Authors: Pebri Nurhayati, Rozanah Ahlam Fadiyah

Abstract:

Urban development often makes use of rural areas, and this cannot be separated from the changes in land use that lead to the degradation of urban green space on the city fringe. In the long run, this degradation of green open space can contribute to the decline of ecological conditions, psychological well-being and public health. Therefore, this research aims to (1) determine the relationship between the parameters of urban development and the degradation rate of green space, and (2) develop a spatial model of the impact of urban development on the degradation of green open space using remote sensing techniques and Geographical Information Systems in an integrated manner. This is descriptive research with data collection techniques of observation and secondary data. In the data analysis, ASTER images with NDVI for 2005-2010 are required to interpret the direction of urban development and the degradation of green open space. The interpretation generates two maps, namely a built-up land development map and a green open space degradation map. Secondary data related to the rate of population growth, the level of accessibility, and the main activities of the city are processed into a population growth rate map, an accessibility level map, and a map of the main activities of the town. Each map is used as a parameter of the degradation of green space and analyzed by non-parametric statistical analysis using crosstabs, thus obtaining the value of C (contingency coefficient). C values are then compared with Cmaximum to determine the relationship. This research produces a spatial model map of the impact of city development on the degradation of green space in the urban fringe east of the city of Yogyakarta in 2005-2010. In addition, this research also generates statistical analyses of the test results of each parameter against the degradation of green open space in the urban fringe east of the city of Yogyakarta in 2005-2010.
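
A minimal sketch of the crosstab statistic mentioned above follows: Pearson's chi-squared on a parameter-versus-degradation cross table, converted to the contingency coefficient C and compared with Cmax. The counts are invented for illustration, not the study's data.

```python
# Minimal sketch: contingency coefficient C from a crosstab, compared with C_max.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: population growth rate class (low/medium/high); columns: degradation (no/yes)
table = np.array([[120, 30],
                  [ 90, 70],
                  [ 40, 110]])

chi2, p, dof, _ = chi2_contingency(table)
n = table.sum()
C = np.sqrt(chi2 / (chi2 + n))                 # contingency coefficient
k = min(table.shape)                           # smaller of rows/columns
C_max = np.sqrt((k - 1) / k)

print(f"chi2 = {chi2:.1f}, p = {p:.4g}")
print(f"C = {C:.3f}, C_max = {C_max:.3f}, C/C_max = {C / C_max:.3f}")
```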

Keywords: spatial analysis, urban development, degradation of green space, urban fringe

Procedia PDF Downloads 306
5365 Study of the ZnO Effect on the Properties of HDPE/ ZnO Nanocomposites

Authors: F. Z. Benabid, F. Zouai, N. Kharchi, D. Benachour

Abstract:

HDPE/ZnO nanocomposites have been successfully prepared using co-mixing. The ZnO was first co-mixed with stearic acid and then added to the polymer in the plastograph. The nanocomposites prepared with the co-mixed ZnO were compared to those prepared with the neat TiO2. The nanocomposites were characterized by different techniques such as wide-angle X-ray scattering (WAXS). The micro- and nanostructure/properties relationships were investigated. The present study allowed establishing good correlations between the different measured properties.

Keywords: exfoliation, ZnO, nano composites, HDPE, co-mixing

Procedia PDF Downloads 342
5364 Implementation and Challenges of Assessment Methods in the Case of Physical Education Class in Some Selected Preparatory Schools of Kirkos Sub-City

Authors: Kibreab Alene Fenite

Abstract:

The purpose of this study is to investigate the implementation and challenges of different assessment methods for physical education classes in some selected preparatory schools of Kirkos sub-city. The participants in this study are teachers, students, department heads and school principals from 4 selected schools. Of the total of 8 schools in Kirkos sub-city, 4 schools (Dandi Boru, Abiyot Kirse, Assay, and Adey Ababa) were selected using simple random sampling techniques, and from these schools all (100%) of the teachers, department heads and school principals were taken as a sample, as their number is manageable. From the total of 2520 students, 252 (10%) were selected using simple random sampling. Accordingly, 13 teachers, 252 students, 4 department heads and 4 school principals were taken purposively as a sample from the 4 selected schools. As data gathering tools, a questionnaire and interviews were employed. To analyze the collected data, both quantitative and qualitative methods were used. The results of the study revealed that assessment in physical education is not implemented properly: lack of sufficient materials, inadequate time allotment, large class size, lack of collaboration among teachers in assessing the performance of students, absence of guidelines to assess the physical education subject, and absence of different assessment methods implemented for students with disabilities in line with their special needs were found to be the major challenges in implementing the current assessment methods of physical education. To overcome these problems, the following recommendations are put forward: the necessary facilities and equipment should be made available; in order to make assessment reliable, accurate, objective and relevant, teachers of physical education should be familiarized with different assessment techniques; physical education assessment guidelines that include different types of assessment methods should be prepared; qualified teachers should be employed; and dedicated teaching rooms must be built.

Keywords: assessment, challenges, equipment, guidelines, implementation, performance

Procedia PDF Downloads 277
5363 Structural Properties of Surface Modified PVA: Zn97Pr3O Polymer Nanocomposite Free Standing Films

Authors: Pandiyarajan Thangaraj, Mangalaraja Ramalinga Viswanathan, Karthikeyan Balasubramanian, Héctor D. Mansilla, José Ruiz

Abstract:

Rare earth ions doped semiconductor nanostructures gained much attention due to their novel physical and chemical properties which lead to potential applications in laser technology as inexpensive luminescent materials. Doping of rare earth ions into ZnO semiconductor alter its electronic structure and emission properties. Surface modification (polymer covering) is one of the simplest techniques to modify the emission characteristics of host materials. The present work reports the synthesis and structural properties of PVA:Zn97Pr3O polymer nanocomposite free standing films. To prepare Pr3+ doped ZnO nanostructures and PVA:Zn97Pr3O polymer nanocomposite free standing films, the colloidal chemical and solution casting techniques were adopted, respectively. The formation of PVA:Zn97Pr3O films were confirmed through X-ray diffraction (XRD), absorption and Fourier transform infrared (FTIR) spectroscopy analyses. XRD measurements confirm the prepared materials are crystalline having hexagonal wurtzite structure. Polymer composite film exhibits the diffraction peaks of both PVA and ZnO structures. TEM images reveal the pure and Pr3+ doped ZnO nanostructures exhibit sheet like morphology. Optical absorption spectra show free excitonic absorption band of ZnO at 370 nm and, the PVA:Zn97Pr3O polymer film shows absorption bands at ~282 and 368 nm and these arise due to the presence of carbonyl containing structures connected to the PVA polymeric chains, mainly at the ends and free excitonic absorption of ZnO nanostructures, respectively. Transmission spectrum of as prepared film shows 57 to 69% of transparency in the visible and near IR region. FTIR spectral studies confirm the presence of A1 (TO) and E1 (TO) modes of Zn-O bond vibration and the formation of polymer composite materials.

Keywords: rare earth doped ZnO, polymer composites, structural characterization, surface modification

Procedia PDF Downloads 359
5362 Indian Premier League (IPL) Score Prediction: Comparative Analysis of Machine Learning Models

Authors: Rohini Hariharan, Yazhini R, Bhamidipati Naga Shrikarti

Abstract:

In the realm of cricket, particularly within the context of the Indian Premier League (IPL), the ability to predict team scores accurately holds significant importance for both cricket enthusiasts and stakeholders alike. This paper presents a comprehensive study on IPL score prediction utilizing various machine learning algorithms, including Support Vector Machines (SVM), XGBoost, Multiple Regression, Linear Regression, K-nearest neighbors (KNN), and Random Forest. Through meticulous data preprocessing, feature engineering, and model selection, we aimed to develop a robust predictive framework capable of forecasting team scores with high precision. Our experimentation involved the analysis of historical IPL match data encompassing diverse match and player statistics. Leveraging this data, we employed state-of-the-art machine learning techniques to train and evaluate the performance of each model. Notably, Multiple Regression emerged as the top-performing algorithm, achieving an impressive accuracy of 77.19% and a precision of 54.05% (within a threshold of +/- 10 runs). This research contributes to the advancement of sports analytics by demonstrating the efficacy of machine learning in predicting IPL team scores. The findings underscore the potential of advanced predictive modeling techniques to provide valuable insights for cricket enthusiasts, team management, and betting agencies. Additionally, this study serves as a benchmark for future research endeavors aimed at enhancing the accuracy and interpretability of IPL score prediction models.
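
As a hedged sketch of the model-comparison workflow described above, the code below fits several regressors to synthetic match features and compares cross-validated error; the feature set, data, and models shown are placeholders, not the historical IPL dataset or the exact algorithms and scores reported in the paper.

```python
# Hedged sketch: comparing regressors for a score-prediction task on synthetic data.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Stand-in for engineered features: runs at 6 overs, wickets, venue index, etc.
X, y = make_regression(n_samples=600, n_features=8, noise=15.0, random_state=0)
y = 160 + 0.1 * y                                   # rescale to plausible T20 totals

models = {
    "linear regression": LinearRegression(),
    "knn (k=5)":         KNeighborsRegressor(n_neighbors=5),
    "random forest":     RandomForestRegressor(n_estimators=200, random_state=0),
}

for name, model in models.items():
    mae = -cross_val_score(model, X, y, cv=5,
                           scoring="neg_mean_absolute_error").mean()
    print(f"{name:<18s} mean absolute error = {mae:.1f} runs")
```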

Keywords: indian premier league (IPL), cricket, score prediction, machine learning, support vector machines (SVM), xgboost, multiple regression, linear regression, k-nearest neighbors (KNN), random forest, sports analytics

Procedia PDF Downloads 39
5361 Revalidation and Hormonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precison Optimization and External Quality Assurance Programs

Authors: Junaid Mahmood Alam

Abstract:

Revalidating and harmonizing clinical chemistry analytical principles and optimizing methods through quality control programs and assessments is the preeminent means to attain optimal outcomes within clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic, cardiac and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunological analyzers, namely Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electro-chemiluminescence immunoassay (ECLi) techniques. The process of validation and revalidation was completed by evaluating and assessing precision-analyzed Preci-control data from the various instruments, plotted against each other with regression (R2) analyses. Results showed that revalidation and optimization of the respective parameters, which were accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for analyses. The regression R2 of BilT was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R2 values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of analytical methods and instrumentation, thus revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care that will guarantee maximum patient confidence.

Keywords: revalidation, standardized, IFCC, CAP, harmonized

Procedia PDF Downloads 266
5360 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance

Authors: Sokkhey Phauk, Takeo Okazaki

Abstract:

The challenging task in educational institutions is to maximize the high performance of students and minimize the failure rate of poor-performing students. An effective method to leverage this task is to know students' learning patterns, with their highly influential factors, and to get an early prediction of student learning outcomes at a timely stage in order to set up policies for improvement. Educational data mining (EDM) is an emerging disciplinary field of data mining, statistics, and machine learning concerned with extracting useful knowledge and information for the sake of improvement and development of the education environment. The aim of this work is to propose techniques in EDM and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted, and high-performing models are subsequently developed to obtain higher performance. The hybrid random forest (Hybrid RF) produces the most successful classification. For the context of intervention and improving the learning outcomes, a feature selection method MICHI, which is the combination of mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves the performance of prediction, and the obtained dominant set is used as information for intervention. By using the proposed EDM techniques, an academic performance prediction system (APPS) is subsequently developed for educational stakeholders to get an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system. The system is used to help educational stakeholders and related individuals intervene and improve student performance.
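
The sketch below illustrates the general MICHI idea (combining mutual-information and chi-square feature scores) feeding a random-forest classifier; the dataset, the rank-combination rule, and the "hybrid RF" details are assumptions for illustration only, not the authors' implementation.

```python
# Sketch: combine mutual-information and chi-square rankings to pick a dominant
# feature set, then compare a random forest with and without the selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.preprocessing import MinMaxScaler
from sklearn.feature_selection import mutual_info_classif, chi2
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=800, n_features=30, n_informative=8,
                           n_classes=3, random_state=1)
X_pos = MinMaxScaler().fit_transform(X)            # chi2 requires non-negative inputs

mi_scores = mutual_info_classif(X_pos, y, random_state=1)
chi_scores, _ = chi2(X_pos, y)

# Combine the two rankings: average of per-criterion ranks (lower = better)
mi_rank = np.argsort(np.argsort(-mi_scores))
chi_rank = np.argsort(np.argsort(-chi_scores))
combined = (mi_rank + chi_rank) / 2.0
dominant = np.argsort(combined)[:10]               # keep a dominant 10-feature set

rf = RandomForestClassifier(n_estimators=300, random_state=1)
acc_all = cross_val_score(rf, X_pos, y, cv=5).mean()
acc_sel = cross_val_score(rf, X_pos[:, dominant], y, cv=5).mean()
print(f"accuracy, all features:      {acc_all:.3f}")
print(f"accuracy, dominant features: {acc_sel:.3f}")
```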

Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance

Procedia PDF Downloads 103
5359 Comparative Analysis of the Performance Between Public and Private Companies: Explanatory Factors

Authors: Atziri Moreno Vite, David Silva Gutiérrez

Abstract:

Oil companies have become the key player in the world energy scenario thanks to their strong control of the level of hydrocarbon reserves and production. The present research aims to identify the main factors that explain the results of these companies through an in-depth review of the specialized literature and to analyze the results of these companies by means of econometric analysis with techniques such as Data Envelopment Analysis (DEA). The results show the relevance and impact of factors such as the level of employment or investment of the company.
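
To illustrate the kind of efficiency analysis DEA performs, the hedged sketch below solves an input-oriented CCR model as a linear program; the company names, inputs (employment, investment) and output (production) are made-up numbers, not the study's data or exact model specification.

```python
# Hedged sketch of an input-oriented CCR DEA model solved with scipy's linprog.
import numpy as np
from scipy.optimize import linprog

# Each row is one company (DMU): inputs X = [employees, investment], output Y = [production]
X = np.array([[100., 50.], [120., 60.], [ 80., 70.], [150., 40.]])
Y = np.array([[500.], [520.], [450.], [480.]])
n, m = X.shape          # number of DMUs, number of inputs
s = Y.shape[1]          # number of outputs

def ccr_efficiency(o):
    """Efficiency score of DMU o: minimise theta under the envelopment constraints."""
    c = np.concatenate([[1.0], np.zeros(n)])           # variables: [theta, lambda_1..n]
    A_ub, b_ub = [], []
    for i in range(m):                                  # inputs:  sum λ_j x_ji <= θ x_oi
        A_ub.append(np.concatenate([[-X[o, i]], X[:, i]]))
        b_ub.append(0.0)
    for r in range(s):                                  # outputs: sum λ_j y_jr >= y_or
        A_ub.append(np.concatenate([[0.0], -Y[:, r]]))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(n):
    print(f"company {o + 1}: efficiency = {ccr_efficiency(o):.3f}")
```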

Keywords: oil companies, performance, determinants, productive

Procedia PDF Downloads 117
5358 Study of Land Use Changes around an Archaeological Site Using Satellite Imagery Analysis: A Case Study of Hathnora, Madhya Pradesh, India

Authors: Pranita Shivankar, Arun Suryawanshi, Prabodhachandra Deshmukh, S. V. C. Kameswara Rao

Abstract:

Many undesirable significant changes in landscapes and the regions in the vicinity of historically important structures occur as impacts due to anthropogenic activities over a period of time. A better understanding of such influences using recently developed satellite remote sensing techniques helps in planning the strategies for minimizing the negative impacts on the existing environment. In 1982, a fossilized hominid skull cap was discovered at a site located along the northern bank of the east-west flowing river Narmada in the village Hathnora. Close to the same site, the presence of Late Acheulian and Middle Palaeolithic tools have been discovered in the immediately overlying pebbly gravel, suggesting that the ‘Narmada skull’ may be from the Middle Pleistocene age. The reviews of recently carried out research studies relevant to hominid remains all over the world from Late Acheulian and Middle Palaeolithic sites suggest succession and contemporaneity of cultures there, enhancing the importance of Hathnora as a rare precious site. In this context, the maximum likelihood classification using digital interpretation techniques was carried out for this study area using the satellite imagery from Landsat ETM+ for the year 2006 and Landsat TM (OLI and TIRS) for the year 2016. The overall accuracy of Land Use Land Cover (LULC) classification of 2016 imagery was around 77.27% based on ground truth data. The significant reduction in the main river course and agricultural activities and increase in the built-up area observed in remote sensing data analysis are undoubtedly the outcome of human encroachments in the vicinity of the eminent heritage site.

Keywords: cultural succession, digital interpretation, Hathnora, Homo Sapiens, Late Acheulian, Middle Palaeolithic

Procedia PDF Downloads 165
5357 Examining Statistical Monitoring Approach against Traditional Monitoring Techniques in Detecting Data Anomalies during Conduct of Clinical Trials

Authors: Sheikh Omar Sillah

Abstract:

Introduction: Monitoring is an important means of ensuring the smooth implementation and quality of clinical trials. For many years, traditional site monitoring approaches have been critical in detecting data errors but not optimal in identifying fabricated and implanted data as well as non-random data distributions that may significantly invalidate study results. The objective of this paper was to provide recommendations based on best statistical monitoring practices for detecting data-integrity issues suggestive of fabrication and implantation early in the study conduct to allow implementation of meaningful corrective and preventive actions. Methodology: Electronic bibliographic databases (Medline, Embase, PubMed, Scopus, and Web of Science) were used for the literature search, and both qualitative and quantitative studies were sought. Search results were uploaded into Eppi-Reviewer Software, and only publications written in the English language from 2012 were included in the review. Gray literature not considered to present reproducible methods was excluded. Results: A total of 18 peer-reviewed publications were included in the review. The publications demonstrated that traditional site monitoring techniques are not efficient in detecting data anomalies. By specifying project-specific parameters such as laboratory reference range values, visit schedules, etc., with appropriate interactive data monitoring, statistical monitoring can offer early signals of data anomalies to study teams. The review further revealed that statistical monitoring is useful to identify unusual data patterns that might be revealing issues that could impact data integrity or may potentially impact study participants' safety. However, subjective measures may not be good candidates for statistical monitoring. Conclusion: The statistical monitoring approach requires a combination of education, training, and experience sufficient to implement its principles in detecting data anomalies for the statistical aspects of a clinical trial.

Keywords: statistical monitoring, data anomalies, clinical trials, traditional monitoring

Procedia PDF Downloads 67
5356 Aerial Photogrammetry-Based Techniques to Rebuild the 30-Year Landform Changes of a Landslide-Dominated Watershed in Taiwan

Authors: Yichin Chen

Abstract:

Taiwan is an island characterized by active tectonics and high erosion rates. Monitoring Taiwan's dynamic landscape is an important issue for disaster mitigation, geomorphological research, and watershed management. Long-term landform data at high spatiotemporal resolution are essential for quantifying and simulating geomorphological processes and for developing warning systems. Recently, advances in unmanned aerial vehicle (UAV) and computational photogrammetry technology have provided an effective way to rebuild and monitor topographic changes at high spatiotemporal resolution. This study rebuilds the 30-year landform changes (1986-2017) in the Aiyuzi watershed using aerial photogrammetry-based techniques. The Aiyuzi watershed, located in central Taiwan with an area of 3.99 km², is known for its frequent landslide and debris flow disasters. Aerial photos were taken by UAV, and multi-temporal historical stereo photographs taken by the Aerial Survey Office of Taiwan’s Forestry Bureau were collected. Pix4DMapper, a photogrammetry software package, was used to rebuild the orthoimages and digital surface models (DSMs). Furthermore, to control model accuracy, a set of ground control points was surveyed using eGPS. The results show that the generated DSMs have a ground sampling distance (GSD) of ~10 cm and ~0.3 cm from the UAV and historical photographs, respectively, and a vertical error of ~1 m. Comparison of the DSMs shows that many deep-seated landslides (with depths over 20 m) occurred in the upstream part of the Aiyuzi watershed. Even though a large amount of sediment is delivered from the landslides, the steep main channel has sufficient capacity to transport it out of the channel and to erode the river bed to ~20 m in depth. Most of the sediment is transported to the watershed outlet and deposited in the downstream channel. This case study shows that UAV and photogrammetry technology are effective tools for monitoring topographic change.
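
As a sketch of how such landform change is typically quantified from multi-temporal DSMs, the following differences two co-registered elevation grids and sums erosion and deposition volumes above the stated ~1 m vertical error; the grid layout and threshold handling are illustrative assumptions, not the study's actual processing chain.

import numpy as np

def dsm_of_difference(dsm_old, dsm_new, cell_size_m, vertical_error_m=1.0):
    """dsm_old, dsm_new: co-registered elevation arrays of identical shape."""
    dod = dsm_new - dsm_old                        # elevation change per cell
    significant = np.abs(dod) > vertical_error_m   # mask changes below the noise level

    cell_area = cell_size_m ** 2
    erosion_volume = -dod[significant & (dod < 0)].sum() * cell_area
    deposition_volume = dod[significant & (dod > 0)].sum() * cell_area
    return dod, erosion_volume, deposition_volume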

Keywords: aerial photogrammetry, landslide, landform change, Taiwan

Procedia PDF Downloads 153
5355 Comparison of Regional and Local Indwelling Catheter Techniques to Prolong Analgesia in Total Knee Arthroplasty Procedures: Continuous Peripheral Nerve Block and Continuous Periarticular Infiltration

Authors: Jared Cheves, Amanda DeChent, Joyce Pan

Abstract:

Total knee replacements (TKAs) are among the most common but painful surgical procedures performed in the United States. Currently, the gold standard for postoperative pain management is the use of opioids. However, in the wake of the opioid epidemic, the healthcare system is attempting to reduce opioid consumption by trialing innovative opioid-sparing analgesic techniques such as continuous peripheral nerve blocks (CPNB) and continuous periarticular infiltration (CPAI). The alleviation of pain, particularly during the first 72 hours postoperatively, is of utmost importance due to its association with delayed recovery, impaired rehabilitation, immunosuppression, the development of chronic pain, the development of rebound pain, and decreased patient satisfaction. While both CPNB and CPAI are in use today, there is limited evidence comparing the two to the current standard of care or to each other. An extensive literature review was performed to explore the safety profiles and effectiveness of CPNB and CPAI in reducing reported pain scores and decreasing opioid consumption. The literature revealed that the use of CPNB contributed to lower pain scores and decreased opioid use compared with opioid-only control groups. Additionally, CPAI did not improve pain scores or decrease opioid consumption when combined with a multimodal analgesic (MMA) regimen. When CPNB and CPAI were compared with each other, neither consistently lowered pain scores to a greater degree, but the literature indicates that CPNB decreased opioid consumption more than CPAI. More research is needed to further establish the efficacy of CPNB and CPAI as standard components of MMA in TKA procedures. In addition, future research can focus on novel catheter-free applications to reduce the complications of continuous catheter analgesia.

Keywords: total knee arthroplasty, continuous peripheral nerve blocks, continuous periarticular infiltration, opioid, multimodal analgesia

Procedia PDF Downloads 87
5354 Long-Term Results of Coronary Bifurcation Stenting with Drug Eluting Stents

Authors: Piotr Muzyk, Beata Morawiec, Mariusz Opara, Andrzej Tomasik, Brygida Przywara-Chowaniec, Wojciech Jachec, Ewa Nowalany-Kozielska, Damian Kawecki

Abstract:

Background: Coronary bifurcation lesions are among the most complex lesions in patients with coronary artery disease. Provisional T-stenting is currently one of the recommended techniques. The aim was to assess optimal methods of treatment in the era of drug-eluting stents (DES). Methods: The registry consisted of data from 1916 patients treated with percutaneous coronary interventions (PCI) using either first- or second-generation DES. Patients with bifurcation lesions entered the analysis. Major adverse cardiac and cerebrovascular events (MACCE) were assessed at one year of follow-up and comprised death, acute myocardial infarction (AMI), repeated PCI (re-PCI) of the target vessel, and stroke. Results: Of the 1916 registry patients, 204 (11%) were diagnosed with a bifurcation lesion >50% and entered the analysis. The most commonly used technique was provisional T-stenting (141 patients, 69%). Optimization with the kissing-balloon technique was performed in 45 patients (22%). In 59 patients (29%) a second-generation DES was implanted, while in 112 patients (55%) a first-generation DES was used. In 33 patients (16%) both types of DES were used. Procedural success (TIMI 3 flow) was achieved in 98% of patients. At one-year follow-up, there were 39 MACCE (19%) (9 deaths, 17 AMI, 16 re-PCI, and 5 strokes). Provisional T-stenting resulted in a rate of MACCE similar to that of other techniques (16% vs. 5%, p=0.27) and a similar occurrence of re-PCI (6% vs. 2%, p=0.78). The post-PCI kissing-balloon technique gave outcomes equivalent to those in patients in whom no optimization technique was used (3% vs. 16% MACCE, p=0.39). The type of implanted DES (second- vs. first-generation) had no influence on MACCE (4% vs. 14%, respectively, p=0.12) or re-PCI (1.7% vs. 51% of patients, respectively, p=0.28). Conclusions: The treatment of bifurcation lesions with PCI represents a high-risk procedure with a high rate of MACCE. The stenting technique, PCI optimization, and the generation of the implanted stent should be personalized for each case to balance the risk of the procedure. In this setting, operator experience might be a factor in better outcomes, which should be further investigated.

Keywords: coronary bifurcation, drug eluting stents, long-term follow-up, percutaneous coronary interventions

Procedia PDF Downloads 200
5353 Comparison of Inexpensive Cell Disruption Techniques for an Oleaginous Yeast

Authors: Scott Nielsen, Luca Longanesi, Chris Chuck

Abstract:

Palm oil is obtained from the flesh and kernel of the fruit of oil palms and is the most productive and inexpensive oil crop. Global demand for palm oil is approximately 75 million metric tonnes, a 29% increase in global production since 2016. This expansion of oil palm cultivation has resulted in mass deforestation, vast biodiversity destruction, and increasing net greenhouse gas emissions. One possible alternative is to produce a saturated oil, similar to palm oil, from microbes such as oleaginous yeast. The yeasts can be cultured on sugars derived from second-generation sources and do not compete with tropical forests for land. One highly promising oleaginous yeast for this application is Metschnikowia pulcherrima. However, recent techno-economic modeling has shown that cell lysis and standard lipid extraction are major contributors to the cost of the oil. Typical cell disruption techniques used to extract either single-cell oils or proteins have been based around bead-beating, homogenization, and acid lysis. However, these can have a detrimental effect on lipid quality and are energy-intensive. In this study, a vortex separator, which produces high shear with minimal energy input, was investigated as a potential low-energy method of lysing cells. It was compared with four more traditional methods (thermal lysis, acid lysis, alkaline lysis, and osmotic lysis). For each method, the yeast loading was also examined at 1 g/L, 10 g/L, and 100 g/L. The quality of cell disruption was measured by optical cell density, cell counts, and comparison of particle size distribution profiles over a 2-hour period. This study demonstrates that the vortex separator is highly effective at lysing the cells and could potentially be used as a simple apparatus for lipid recovery in an oleaginous yeast process. Further development of this technology could reduce the overall cost of microbial lipids in the future.
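
To illustrate how such a comparison might be summarized, the sketch below estimates a disruption efficiency for each method and loading from the drop in optical density over the 2-hour sampling window; the tabular layout and the OD-based efficiency definition are illustrative assumptions rather than the study's own analysis.

import pandas as pd

def disruption_efficiency(df):
    """df columns: method, loading_g_per_l, time_min, od600."""
    keys = ["method", "loading_g_per_l"]
    initial = df[df["time_min"] == 0].set_index(keys)["od600"]
    final = df[df["time_min"] == 120].set_index(keys)["od600"]
    # Percentage drop in optical density, taken as a proxy for cells lysed.
    return ((initial - final) / initial * 100).rename("disruption_pct")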

Keywords: palm oil substitute, metschnikowia pulcherrima, cell disruption, cell lysis

Procedia PDF Downloads 196
5352 The Strategy for Detection of Catecholamines in Body Fluids: Optical Sensor

Authors: Joanna Cabaj, Sylwia Baluta, Karol Malecha, Kamila Drzozga

Abstract:

Catecholamines are the principal neurotransmitters that mediate a variety of central nervous system functions, such as motor control, cognition, emotion, memory processing, and endocrine modulation. Dysfunctions in catecholamine neurotransmission are observed in some neurologic and neuropsychiatric diseases. Changes in neurotransmitter levels in biological fluids can be a marker of several neurological disorders. Because of its significance for analytical techniques and diagnostics, the sensitive and selective detection of neurotransmitters is attracting increasing attention in different areas of bio-analysis and biomedical research. Recently, fluorescence techniques for the detection of catecholamines have attracted interest due to their reasonable cost, convenient operation, and suitability for biological environments. Nevertheless, despite the clear need for a sensitive and selective catecholamine sensor, the development of a convenient method for these neurotransmitters is still at a basic level. The manipulation of nanostructured materials in conjunction with biological molecules has led to the development of a new class of hybrid modified biosensors in which both enhanced charge transport and preserved biological activity may be obtained. Immobilization of biomaterials on electrode surfaces is the crucial step in fabricating electrochemical as well as optical biosensors and bioelectronic devices. Continuing systematic investigation into the manufacture of enzyme–conducting polymer sensing systems, a convenient fluorescence sensing strategy for catecholamine detection is presented here, based on the FRET (fluorescence resonance energy transfer) phenomenon observed, e.g., for complexes of Fe²⁺ and epinephrine. The biosensor was constructed using low temperature co-fired ceramics (LTCC) technology. The sensing system relies on the catalytic oxidation of catecholamines and the FRET-induced quenching of the strong luminescence of the resulting complexes. The detection process is based on oxidation of the substrate in the presence of the enzymes laccase/tyrosinase.
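
A quenching-based optical sensor of this kind is typically calibrated by fitting fluorescence readings to the Stern-Volmer relation F0/F = 1 + Ksv[Q]; the sketch below shows such a fit on made-up data, and the concentrations and intensity ratios are purely illustrative, not measurements from the described biosensor.

import numpy as np
from scipy.optimize import curve_fit

def stern_volmer(concentration, ksv):
    # Ratio of unquenched to quenched fluorescence as a function of quencher concentration.
    return 1.0 + ksv * concentration

# Hypothetical epinephrine concentrations (uM) and measured F0/F ratios.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])
f0_over_f = np.array([1.00, 1.08, 1.17, 1.42, 1.86, 2.72])

(ksv,), _ = curve_fit(stern_volmer, conc, f0_over_f)
print(f"Stern-Volmer constant Ksv ~ {ksv:.3f} per uM")  # slope of the calibration line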

Keywords: biosensor, conducting polymer, enzyme, FRET, LTCC

Procedia PDF Downloads 253