Search results for: general data protection regulation
21655 Pre-Service Teachers’ Opinions on Disabled People
Authors: Sinem Toraman, Aysun Öztuna Kaplan, Hatice Mertoğlu, Esra Macaroğlu Akgül
Abstract:
This study aims to examine pre-service teachers’ opinions on disabled people, taking various variables into consideration. The participants were 170 first-year pre-service teachers from different branches of the Education Departments of Yıldız Technical, Yeditepe, Marmara, and Sakarya Universities. The data were collected in the 2013-2014 fall term. The study was designed as a phenomenological study within the qualitative research paradigm. To examine pre-service teachers’ opinions about disabled people, an open-ended question form prepared by the researcher and focus group interviews were used as data collection tools. The study presents the opinions pre-service teachers expressed about disabled people and offers suggestions for teacher education.
Keywords: pre-service teachers, disabled people, teacher education, teachers' opinions
Procedia PDF Downloads 458
21654 On Modeling Data Sets by Means of a Modified Saddlepoint Approximation
Authors: Serge B. Provost, Yishan Zhang
Abstract:
A moment-based adjustment to the saddlepoint approximation is introduced in the context of density estimation. First applied to univariate distributions, this methodology is extended to the bivariate case. It then entails estimating the density function associated with each marginal distribution by means of the saddlepoint approximation and applying a bivariate adjustment to the product of the resulting density estimates. The connection to the distribution of empirical copulas will be pointed out. As well, a novel approach is proposed for estimating the support of a distribution. As these results rely solely on sample moments and empirical cumulant-generating functions, they are particularly well suited for modeling massive data sets. Several illustrative applications will be presented.
Keywords: empirical cumulant-generating function, endpoints identification, saddlepoint approximation, sample moments, density estimation
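A minimal sketch of the kind of moment-based saddlepoint density estimate described above, built from an empirical cumulant-generating function of sample data; the bivariate adjustment step is omitted, and all names and the test data are illustrative rather than the authors' implementation.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
x_sample = rng.gamma(shape=3.0, scale=1.0, size=5000)  # illustrative data

def K(s, x):   # empirical cumulant-generating function
    return np.log(np.mean(np.exp(s * x)))

def K1(s, x):  # first derivative of K
    w = np.exp(s * x)
    return np.sum(x * w) / np.sum(w)

def K2(s, x):  # second derivative of K
    w = np.exp(s * x)
    m1 = np.sum(x * w) / np.sum(w)
    m2 = np.sum(x**2 * w) / np.sum(w)
    return m2 - m1**2

def saddlepoint_density(t, x, s_lo=-5.0, s_hi=5.0):
    # solve K'(s_hat) = t, then apply the standard saddlepoint formula
    s_hat = brentq(lambda s: K1(s, x) - t, s_lo, s_hi)
    return np.exp(K(s_hat, x) - s_hat * t) / np.sqrt(2 * np.pi * K2(s_hat, x))

for t in (1.0, 3.0, 6.0):
    print(t, round(saddlepoint_density(t, x_sample), 4))
```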
Procedia PDF Downloads 162
21653 Performance Evaluation of 3D Printed ZrO₂ Ceramic Components by Nanoparticle Jetting™
Authors: Shengping Zhong, Qimin Shi, Yaling Deng, Shoufeng Yang
Abstract:
Additive manufacturing has exerted a tremendous fascination on the development of the manufacturing and materials industries over the past three decades, and zirconia-based advanced ceramics have received substantial attention for structural and functional applications. As a novel material jetting process for selectively depositing nanoparticles, NanoParticle Jetting™ is capable of fabricating dense zirconia components with a high-detail surface, precisely controllable shrinkage, and remarkable mechanical properties, raising the bar for both the printing process and printing accuracy. Emphasis is placed on the performance evaluation of NPJ™-printed ceramic components, for which the physical, chemical, and mechanical properties were evaluated. The experimental results suggest that the Y₂O₃-stabilized ZrO₂ boxes exhibit a high relative density of 99.5%, a glossy surface with a minimum roughness of 0.33 µm, a general linear shrinkage factor of 17.47%, outstanding hardness and fracture toughness of 12.43±0.09 GPa and 7.52±0.34 MPa·m¹/², comparable flexural strength of 699±104 MPa, and a dense and homogeneous grain distribution in the microstructure. This NanoParticle Jetting system shows considerable potential in dental, medical, and electronic applications.
Keywords: nanoparticle jetting, ZrO₂ ceramic, materials jetting, performance evaluation
Procedia PDF Downloads 177
21652 Implementation of an Autonomous Driving, On-Demand Bus System for Public Transportation
Authors: Eric Neidhardt
Abstract:
A well-functioning public transport system that is accepted and used by the general population contributes a great deal to a sustainable city. Young and elderly people in particular rely on public transport to get to work, go shopping, visit a doctor, and take advantage of entertainment options. The sustainability of a public transport system can be considered from different points of view. In urban areas, acceptance is particularly important: as many people as possible should use public transport rather than their private vehicles, which reduces traffic jams and improves air quality. In rural areas, cost efficiency is especially important, since longer distances and low population density mean that these modes of transportation can rarely be operated cost-effectively. It is crucial to avoid low utilization, because empty rides are neither sustainable nor cost-effective. With a demand-oriented approach, we try both to improve flexibility, and therefore attractiveness for the user, and to improve cost efficiency: the vehicles operate only when and where they are needed, and empty rides are avoided to improve sustainability. In the subproject "Autonomous public driving" of the project RealLabHH, such a system was implemented and tested in Hamburg-Bergedorf, a suburb of Hamburg. In this paper, some of the steps necessary for this are considered from a technical point of view, and problems that arose in real-life use are addressed.
Keywords: public transport, demand-oriented, autonomous driving, RealLabHH
Procedia PDF Downloads 193
21651 Demystifying Mathematics: Handling Learning Disabilities in Mathematics Among Low Achievers in Kenyan Schools
Authors: Gladys Gakenia Njoroge
Abstract:
Mathematics is a compulsory subject in both primary and secondary schools in Kenya. However, learners’ poor performance in the subject in Kenyan national examinations, year in and year out, remains a serious concern for Mathematics teachers, parents, curriculum developers, and the general public. This is particularly worrying because of the importance attached to the subject in national development, hence the need to find out what could be affecting the learning of Mathematics in Kenyan schools. The research on which this paper is based sought to examine the factors that influence performance in Mathematics in Kenyan schools; identify the characteristics of Mathematics learning disabilities; and determine how learners with such disabilities can be assessed and identified and how interventions for these difficulties can be implemented. A case study was undertaken on class six learners in a primary school in Nairobi County. The tools used for the research were classroom observations and an Individualized Education Program (IEP) developed by the teachers with the help of the researcher. This paper therefore highlights the findings from the research, discusses their implications, and suggests the way forward for teaching, learning, and assessment of Mathematics in Kenyan schools. Perhaps with the application of the right interventions, poor performance in Mathematics in the national examinations in Kenya will become a thing of the past.
Keywords: demystifying mathematics, individualized education program, learning difficulties, assessment
Procedia PDF Downloads 92
21650 Rabies Surveillance Data Analysis in Addis Ababa, Ethiopia during 2012/13: Retrospective Cross Sectional Study
Authors: Fantu Lombamo Untiso, Sylvia Murphy, Emily Pieracci
Abstract:
Background: Rabies is a highly fatal viral disease of all warm-blooded animals, including humans, worldwide. However, an effective rabies control program has yet to become a reality and needs to be strengthened. Objective: To review and analyze recorded data in order to generate information on the status of rabies in Addis Ababa in the year 2012/13. Methods: Retrospective data were used from the Ethiopian Public Health Institute rabies case record book registered in the year 2012/13. Results: Among 1357 suspected rabid animals clinically examined, only 8.84% were positive for rabies. Out of 216 animal brains investigated in the laboratory with the Fluorescent Antibody Technique, 55.5% were confirmed rabies positive. Among the laboratory-confirmed positive rabies cases, the highest percentage of animals came from Yeka (20%) and the lowest from Kirkos subcity (3.3%). Out of 1149 humans who came to the institute seeking anti-rabies post-exposure prophylaxis, 85.65% and 7.87% had been exposed to suspected dogs and cats, respectively. Three human deaths due to rabies were reported in the year after exposure to bites from dogs of unknown vaccination status. Conclusion: The principal vector of rabies in Addis Ababa is the dog. Effective rabies management and control based on confirmed cases, together with mass immunization and control of stray dog populations, is recommended.
Keywords: Addis Ababa, exposure, rabies, surveillance
Procedia PDF Downloads 179
21649 Modelling and Simulation of Diffusion Effect on the Glycol Dehydration Unit of a Natural Gas Plant
Authors: M. Wigwe, J. G Akpa, E. N Wami
Abstract:
Mathematical models of the absorber of a glycol dehydration facility were developed using the principles of conservation of mass and energy. Models were developed which predict the variation of the water content of the gas (in mole fraction) and the variation of gas and liquid temperatures across the packing height. These models contain contributions from bulk and diffusion flows. The effect of diffusion on the process occurring in the absorber was studied in this work. The models were validated using the initial conditions in the plant data from Company W's TEG unit in Nigeria. The results obtained showed that the effect of diffusion was noticed between z=0 and z=0.004 m. A deviation from plant data of 0% was observed for the gas water content at a residence time of 20 seconds, at z=0.004 m. Similarly, deviations of 1.584% and 2.844% were observed for the gas and TEG temperatures, respectively.
Keywords: separations, absorption, simulation, dehydration, water content, triethylene glycol
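A minimal sketch of a bulk-transfer-only packed-column water balance of the general kind described above, integrating the gas-phase water mole fraction over the first few millimetres of packing with a constant overall mass-transfer coefficient; the diffusion term, the energy balances, and every parameter value are illustrative assumptions, not the plant model.

```python
import numpy as np

# illustrative parameters (not the plant data)
Kya = 50.0       # overall gas-phase mass-transfer coefficient x interfacial area, kmol/(m3.s)
G = 5.0          # gas molar flux, kmol/(m2.s)
y_star = 1e-5    # water mole fraction in equilibrium with the lean TEG (assumed constant)
y_in = 1e-3      # inlet gas water mole fraction
height = 0.02    # packing height considered, m
nz = 2000
dz = height / nz

y = y_in
for _ in range(nz):
    # steady-state bulk-transfer balance: G * dy/dz = -Kya * (y - y_star)
    y += -(Kya / G) * (y - y_star) * dz

print(f"water mole fraction after {height} m of packing: {y:.2e}")
```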
Procedia PDF Downloads 499
21648 Applying Critical Realism to Qualitative Social Work Research: A Critical Realist Approach for Social Work Thematic Analysis Method
Authors: Lynne Soon-Chean Park
Abstract:
Critical Realism (CR) has emerged as an alternative to both the positivist and constructivist perspectives that have long dominated social work research. By unpacking the epistemic weaknesses of the two dogmatic perspectives, CR provides a useful philosophical approach that incorporates both objectivist and subjectivist ontological stances. The CR perspective suggests an alternative approach for social work researchers who have long sought to engage with the complex interplay between perceived reality at the empirical level and the objective reality that lies behind empirical events as a causal mechanism. However, despite the usefulness of CR in informing social work research, little practical guidance is available on how CR can inform methodological considerations in social work research studies. This presentation aims to provide a detailed description of CR-informed thematic analysis, drawing examples from a social work doctoral study of Korean migrants’ experiences and understanding of trust associated with their settlement in New Zealand. Because of its theoretical flexibility and accessibility as a qualitative analysis method, thematic analysis can be applied both to search for demi-regularities in the collected data and to identify the causal mechanisms that lie behind the empirical data. In so doing, this presentation seeks to provide a concrete and detailed exemplar for social work researchers wishing to employ CR in their qualitative thematic analysis process.
Keywords: critical realism, data analysis, epistemology, research methodology, social work research, thematic analysis
Procedia PDF Downloads 212
21647 LaPEA: Language for Preprocessing of Edge Applications in Smart Factory
Authors: Masaki Sakai, Tsuyoshi Nakajima, Kazuya Takahashi
Abstract:
In order to improve the productivity of a factory, it is common to create an inference model by collecting and analyzing operational data off-line and then to develop an edge application (EAP) that evaluates the quality of the products or diagnoses machine faults in real time. To accelerate this development cycle, an edge application framework for the smart factory is proposed, which enables developers to create and modify EAPs based on prepared inference models. In the framework, the preprocessing component is the key part that makes it work. This paper proposes a language for preprocessing of edge applications, called LaPEA, which can flexibly process several sensor data streams from machines into explanatory variables for an inference model, and shows that it meets the requirements for the preprocessing.
Keywords: edge application framework, edgecross, preprocessing language, smart factory
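LaPEA's own syntax is not shown in the abstract, so the snippet below is only a hypothetical illustration of the preprocessing task it targets (turning raw machine sensor streams into explanatory variables for an inference model), written in plain Python with pandas; the column names and window sizes are assumptions.

```python
import pandas as pd
import numpy as np

# hypothetical raw sensor log from one machine (names are assumptions)
raw = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=600, freq="s"),
    "spindle_current": np.random.default_rng(1).normal(10.0, 0.5, 600),
    "vibration": np.random.default_rng(2).normal(0.02, 0.005, 600),
}).set_index("timestamp")

# preprocessing: resample, smooth, and derive explanatory variables
features = pd.DataFrame({
    "current_mean_30s": raw["spindle_current"].resample("30s").mean(),
    "current_std_30s": raw["spindle_current"].resample("30s").std(),
    "vibration_rms_30s": raw["vibration"].pow(2).resample("30s").mean().pow(0.5),
})
features["current_trend"] = features["current_mean_30s"].diff()

print(features.head())
```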
Procedia PDF Downloads 146
21646 Structural Element Vibration Analysis with Finite Element Method: Use of Rayleigh Quotient
Authors: Houari Boumediene University of Science and Technology
Abstract:
Various methods are typically used in the dynamic analysis of transversely vibrating beams. To achieve this, numerical methods are used to solve the general eigenvalue problem. The equations of equilibrium, which describe the motion, are derived from a fourth-order differential equation. Our study is based on the finite element method, and the results of the investigation are the vibration frequencies obtained using the Jacobi method. Two types of elementary mass matrices are considered: one representing a uniform distribution of mass along the element and the other consisting of concentrated masses located at fixed points, whose number increases progressively with equal spacing at each evaluation stage. The beams studied have different boundary constraints, representing several classical situations. Comparisons are made for beams where the distributed mass is replaced by n concentrated masses. As expected, the first calculation stage involves determining the lowest number of beam parts that gives a frequency comparable to that obtained from the Rayleigh formula. The obtained values are then compared to theoretical results based on the assumptions of the Bernoulli-Euler theory. These steps are repeated for the second type of mass representation in the same manner.
Keywords: finite element method, Bernoulli-Euler theory, structural analysis, vibration analysis, Rayleigh quotient
Procedia PDF Downloads 92
21645 Key Factors for Stakeholder Engagement and Sustainable Development
Authors: Jo Rhodes, Bruce Bergstrom, Peter Lok, Vincent Cheng
Abstract:
The aim of this study is to determine key factors and processes for multinationals (MNCs) to develop an effective stakeholder engagement and sustainable development framework. A qualitative multiple-case approach was used, and a triangulation method (interviews, archival documents, and observations) was adopted to collect data on three global firms (MNCs). Nine senior executives were interviewed for this study (three from each firm). An initial literature review was conducted to explore possible practices and factors for sustainable development (the deductive approach). Interview data were analysed using NVivo to obtain appropriate nodes and themes for the framework. A comparison of the findings from the interview data with the themes and factors developed from the literature review, together with a cross-case comparison, was used to develop the final conceptual framework (the inductive approach). The results suggested that stakeholder engagement is a key mediator between the ‘stakeholder network’ (internal and external factors) and outcomes (corporate social responsibility, social capital, shared value, and sustainable development). Key internal factors such as human capital/talent, technology, culture, and leadership, and processes such as collaboration, knowledge sharing, and co-creation of value with stakeholders, were identified. These internal factors and processes must be integrated and aligned with external factors such as social, political, cultural, and environmental conditions and NGOs to achieve effective stakeholder engagement.
Keywords: stakeholder, engagement, sustainable development, shared value, corporate social responsibility
Procedia PDF Downloads 513
21644 Natural Factors of Interannual Variability of Winter Precipitation over the Altai Krai
Authors: Sukovatov K.Yu., Bezuglova N.N.
Abstract:
Winter precipitation variability over the Altai Krai was investigated by retrieving temporal patterns. Singular spectrum analysis (SSA) was used to describe the variance distribution and to reduce the precipitation data to a few components (modes). The associated time series were related to large-scale atmospheric and oceanic circulation indices using lag cross-correlation and wavelet-coherence analysis. GPCC monthly precipitation data for a rectangular field bounded by 50-55°N, 77-88°E and monthly climatological circulation index data for the cold season were used to perform the SSA decomposition and retrieve statistics for the analyzed parameters over the period 1951-2017. Interannual variability of winter precipitation over the Altai Krai is mostly caused by three natural factors: variations in the intensity of momentum exchange between mid and polar latitudes over the North Atlantic (explained variance 11.4%); wind speed variations in the equatorial stratosphere (quasi-biennial oscillation, explained variance 15.3%); and sea surface temperature variations in the equatorial Pacific (ENSO, explained variance 2.8%). It is concluded that under current climate conditions (Arctic amplification and the increasing frequency of meridional processes in mid-latitudes), the second and third factors make a more significant contribution to the explained variance of interannual variability of cold-season precipitation over the Altai Krai than the first factor.
Keywords: interannual variability, winter precipitation, Altai Krai, wavelet-coherence
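A minimal sketch of a basic singular spectrum analysis (SSA) decomposition, building the trajectory matrix of a precipitation-like series, taking its SVD, and reporting the variance explained by the leading components; the synthetic series and window length are illustrative, not the GPCC data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 67                                  # e.g. 67 winters, 1951-2017
t = np.arange(n)
# synthetic winter-precipitation-like series: a quasi-biennial cycle plus noise
x = 2.0 * np.sin(2 * np.pi * t / 2.3) + rng.normal(0, 1.5, n)
x = x - x.mean()

L = 20                                  # SSA window length
K = n - L + 1
X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix, shape (L, K)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)         # variance captured by each SSA mode

for i in range(5):
    print(f"SSA component {i + 1}: {100 * explained[i]:.1f}% of variance")
```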
Procedia PDF Downloads 188
21643 Determination of Non-CO2 Greenhouse Gas Emission in Electronics Industry
Authors: Bong Jae Lee, Jeong Il Lee, Hyo Su Kim
Abstract:
Both developed and developing countries adopted the decision to join the Paris Agreement to reduce greenhouse gas (GHG) emissions at the Conference of the Parties (COP) 21 meeting in Paris. As a result, developed and developing countries have to submit their Intended Nationally Determined Contributions (INDC) by 2020, and each country will be assessed on its performance in reducing GHG; after that, they shall propose a reduction target higher than the previous target every five years. Therefore, an accurate method for calculating greenhouse gas emissions is essential as a rationale for implementing GHG reduction measures based on the reduction targets. Non-CO2 GHGs (CF4, NF3, N2O, SF6, and so on) are widely used in the fabrication processes of semiconductor manufacturing and the etching/deposition processes of display manufacturing. The Global Warming Potential (GWP) values of these non-CO2 gases are much higher than that of CO2, which means they have a greater effect on global warming than CO2. GHG calculation methods for the electronics industry are therefore provided by the Intergovernmental Panel on Climate Change (IPCC) and the U.S. Environmental Protection Agency (EPA), and they will be discussed at the ISO/TC 146 meeting. As discussed earlier, being precise and accurate in calculating non-CO2 GHG is becoming more important. Thus, this study aims to discuss the implications of the calculation methods by comparing the methods of the IPCC and EPA. In conclusion, after analyzing the methods of the IPCC and EPA, the EPA method is more detailed and also provides a calculation for N2O. In the case of the default emission factors, the IPCC provides more conservative results than the EPA; the IPCC factor was developed for calculating national GHG emissions, while the EPA factor was developed specifically for the U.S., which means it was developed to address the environmental situation of the U.S. Semiconductor factory ‘A’ measured F-gas according to the EPA Destruction and Removal Efficiency (DRE) protocol and estimated its own DRE, and it was observed that its emission factor shows a higher DRE compared to the default DRE factors of the IPCC and EPA. Therefore, each country can improve its GHG emission calculation by developing its own emission factors (if possible) at the time of reporting its Nationally Determined Contributions (NDC). Acknowledgements: This work was supported by the Korea Evaluation Institute of Industrial Technology (No. 10053589).
Keywords: non-CO2 GHG, GHG emission, electronics industry, measuring method
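A simplified sketch of the general structure shared by the IPCC and EPA approaches for fluorinated-gas emissions in electronics: gas consumed, reduced by the fraction utilized in the process and by the fraction routed through abatement at a given destruction and removal efficiency (DRE), then converted to CO2-equivalent with a GWP. All factor values below are illustrative placeholders, not the published defaults.

```python
# illustrative placeholder factors -- not IPCC or EPA default values
GWP = {"CF4": 6630, "NF3": 16100, "SF6": 23500}   # AR5-era 100-yr values, shown for illustration

def f_gas_emissions_tCO2e(gas, purchased_kg, utilization, abated_fraction, dre):
    """Emissions = purchases x (1 - fraction utilized in process)
                 x (1 - fraction abated x DRE), converted with GWP to tonnes CO2e."""
    emitted_kg = purchased_kg * (1.0 - utilization) * (1.0 - abated_fraction * dre)
    return emitted_kg * GWP[gas] / 1000.0

# example: 500 kg CF4, 20% utilized in the chamber, 80% of exhaust abated at 90% DRE
print(round(f_gas_emissions_tCO2e("CF4", 500, 0.20, 0.80, 0.90), 1), "t CO2e")
```

A higher measured DRE, as in the factory 'A' case described above, lowers the last bracket and therefore the reported CO2-equivalent emissions.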
Procedia PDF Downloads 289
21642 The Interoperability between CNC Machine Tools and Robot Handling Systems Based on an Object-Oriented Framework
Authors: Pouyan Jahanbin, Mahmoud Houshmand, Omid Fatahi Valilai
Abstract:
A flexible manufacturing system (FMS) is a manufacturing system capable of handling variations in product features that result from ever-changing customer demands. The flexibility of such manufacturing systems helps to utilize resources more effectively; however, the control of such systems is complicated and challenging. An FMS needs CNC machines, robots, and other resources to establish flexibility and enhance the efficiency of the whole system, and it also needs to integrate these resources to reach the required efficiency and flexibility. In order to reach this goal, an integrator framework is proposed in which the machining data of CNC machine tools are received through a STEP-NC file. The interoperability of the system is achieved through an information system. This paper proposes an information system whose data model is designed based on an object-oriented approach and is implemented through a knowledge-based system. The framework is connected to a database that is filled with the robots’ control commands. The framework programs the robots via rules embedded in its knowledge-based system, and it also controls the interactions of the CNC machine tools for loading and unloading actions by the robot. As a result, the proposed framework improves the integration of manufacturing resources in flexible manufacturing systems.
Keywords: CNC machine tools, industrial robots, knowledge-based systems, manufacturing resources integration, flexible manufacturing system (FMS), object-oriented data model
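The abstract does not publish the framework's actual schema, so the classes below are only a hypothetical illustration of an object-oriented data model linking STEP-NC machining data, CNC machines, and robot handling commands; every class and attribute name is an assumption.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StepNCWorkingstep:          # one machining step read from a STEP-NC file
    step_id: str
    operation: str                # e.g. "roughing", "drilling"
    tool_id: str

@dataclass
class CNCMachine:
    name: str
    workingsteps: List[StepNCWorkingstep] = field(default_factory=list)
    status: str = "idle"          # "idle" | "machining" | "waiting_robot"

@dataclass
class RobotCommand:
    action: str                   # "load" or "unload"
    target_machine: str

@dataclass
class HandlingRobot:
    name: str
    queue: List[RobotCommand] = field(default_factory=list)

    def schedule(self, cmd: RobotCommand) -> None:
        self.queue.append(cmd)

# integrator rule (knowledge-base style): when a machine finishes, queue an unload job
def on_machining_finished(machine: CNCMachine, robot: HandlingRobot) -> None:
    machine.status = "waiting_robot"
    robot.schedule(RobotCommand(action="unload", target_machine=machine.name))

mill = CNCMachine("mill-01", [StepNCWorkingstep("ws1", "roughing", "T01")])
robot = HandlingRobot("robot-01")
on_machining_finished(mill, robot)
print(robot.queue)
```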
Procedia PDF Downloads 455
21641 The Main Characteristics of Destructive Motivation
Authors: Elen Gasparyan, Naira Hakobyan
Abstract:
One of the leading factors determining the effectiveness of work in a modern organization is the motivation of its employees. In the scientific psychological literature, this phenomenon is understood mainly in terms of constructive forms of motivation and the search for ways to increase it. At the same time, the motivation of employees can sometimes lead to a decrease in the productivity of the organization; such destructive motivation is usually not considered from the point of view of the various motivational theories. This article provides an analysis of various forms of destructive motivation of employees. These forms include formalism in labor behavior, inadequate assessment of the work done, and an imbalance of personal and organizational interests. The destructive motivation of personnel has certain negative consequences both for the employees themselves and for the entire organization: it leads to a decrease in the rate of production and the quality of products or services, increased conflict in the behavior of employees, etc. Currently, there is growing scientific interest in the study of destructive motivation. The subject of psychological research is not only modern socio-psychological processes but also the achievements of scientific thought in the field of theories of motivation and management. This article examines the theoretical approaches of J. S. Adams and Porter-Lawler, provides an analysis of theoretical concepts, and emphasizes the main characteristics of the destructiveness of motivation. Destructive work motivation is presented at the macro, meso, and micro levels. These levels express various directions of development of motivational stimuli: social, organizational, and personal. At the macro level, the most important characteristics of destructive motivation are the high income gap between employers and employees, a high degree of unemployment, weak social protection of workers, non-compliance by employers with labor legislation, and emergencies. At the organizational level, the main characteristics are decreased diversity of work and insufficient working conditions. At the personal level, the main characteristic of destructive motivation is a discrepancy between personal and organizational interests. A comparative analysis of the theoretical and methodological foundations of the study of motivation makes it possible not only to identify the main characteristics of destructive motivation but also to determine the contours of psychological counseling aimed at reducing destructiveness in the behavior of employees.
Keywords: destructive, motivation, organization, behavior
Procedia PDF Downloads 44
21640 NanoCelle®: A Nano Delivery Platform to Enhance Medicine
Authors: Sean Hall
Abstract:
Nanosystems for drug delivery are not new; as medicines evolve, so too does the desire to deliver more targeted, patient-compliant medicines. Historically, though, the widespread use of nanosystems for drug delivery has been hampered by non-replicability, scalability and toxicity issues, and economics. Examples include the number of manufacturing steps and thus the cost to manufacture, toxicity of nanoparticle scaffolding, autoimmune responses, and the considerable technical expertise required for small non-commercial yields. This, unfortunately, demonstrates the not-so-obvious chasm between science and drug formulation for regulatory approval. Regardless, there is a general and global desire to improve the delivery of medicines, reduce potential side effect profiles, promote increased patient compliance, and increase and/or speed public access to medicines. In this paper, the author discusses NanoCelle®, a nano-delivery platform that specifically addresses degradation and solubility issues and that builds on fundamental micellar preparations. NanoCelle® has been deployed in several Australian listed medicines and is in use in several drug candidates across small molecules, with research endeavors now extending into large molecules. The author discusses several research initiatives as they relate to NanoCelle® to demonstrate similarities seen across various drug substances; these examples include both in vitro and in vivo work.
Keywords: NanoCelle®, micellar, degradation, solubility, toxicity
Procedia PDF Downloads 180
21639 Nanoparticles of Hyaluronic Acid for Radiation Induced Lung Damages
Authors: Anna Lierova, Jitka Kasparova, Marcela Jelicova, Lucie Korecka, Zuzana Bilkova, Zuzana Sinkorova
Abstract:
Hyaluronic acid (HA) is a simple linear, unbranched polysaccharide with many exceptional physiological and chemical properties, such as high biocompatibility and biodegradability, strong hydration, and viscoelasticity, that depend on the size of the molecule. It plays an important role in a variety of molecular events such as tissue hydration and mechanical protection of tissues, as well as during inflammation, leukocyte migration, and extracellular matrix remodeling. HA-based biomaterials, including HA scaffolds, hydrogels, thin membranes, matrix grafts, and nanoparticles, are also widely used in various biomedical applications. Our goal is to determine the radioprotective effect of hyaluronic acid nanoparticles (HA NPs). We are investigating the effect of ionizing radiation on the stability of HA NPs, the relative in vitro toxicity of the nanoscale material, as well as its effect on cell lines and specific surface receptors and their response to ionizing radiation. Exposure to ionizing radiation (IR) can irreversibly damage various cell types and may thus have implications at the level of the whole tissue. Characteristic manifestations are the formation of over-granulated tissue, remodeling of the extracellular matrix (ECM), and abortive wound healing. Damage is caused either by direct interaction of IR with DNA and proteins or indirectly by radicals formed during the radiolysis of water. Accumulation and turnover of ECM are a hallmark of radiation-induced lung injury, characterized by inflammation and by repair or remodeling of healthy pulmonary tissue. HA is a major component of the ECM in the lung and plays an important role in regulating tissue injury, accelerating tissue repair, and controlling disease outcomes. For this reason, HA NPs were applied to an in vivo model (C57Bl/6J mice) before total-body or partial-thorax irradiation. This part of our research targets the effect of exogenous HA on the development and/or mitigation of acute radiation syndrome and radiation-induced lung injuries.
Keywords: hyaluronic acid, ionizing radiation, nanoparticles, radiation-induced lung damage
Procedia PDF Downloads 167
21638 Corporate Governance, Performance, and Financial Reporting Quality of Listed Manufacturing Firms in Nigeria
Authors: Jamila Garba Audu, Shehu Usman Hassan
Abstract:
The widespread failures in financial information quality have created the need to improve that quality and to strengthen the control of managers by setting up good firm structures. Published accounting information in financial statements is required to provide various users (shareholders, employees, suppliers, creditors, financial analysts, stockbrokers, and government agencies) with timely and reliable information useful for making prudent, effective, and efficient decisions. Examining the relationship of corporate governance and performance to financial reporting quality is imperative because, despite rapid research in this area, the findings obtained from these studies remain inconclusive. Data for the study were extracted from the firms’ annual reports and accounts. After running the OLS regression, a robustness test was conducted to validate the statistical inferences, and the data were empirically tested. Multiple regression was employed to test the model as the technique for data analysis. The results from the analysis revealed a negative association between all the regressors and financial reporting quality, with the exception of performance, for listed manufacturing firms in Nigeria. This indicates that corporate governance plays a significant role in mitigating earnings management and improving financial reporting quality, while performance does not. The study recommended, among other things, that the composition of the audit committee should follow the provisions of the code of corporate governance: no more than six (6) members with at least one (1) financial expert.
Keywords: corporate governance, financial reporting quality, manufacturing firms, Nigeria, performance
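A minimal sketch of the estimation step described above: an OLS regression of a financial-reporting-quality proxy on governance and performance variables, re-estimated with heteroskedasticity-robust standard errors as a simple robustness check. The variable names and simulated data are assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 120  # simulated firm-year observations, for illustration only
df = pd.DataFrame({
    "board_size": rng.integers(5, 15, n),
    "audit_committee": rng.integers(3, 7, n),
    "roa": rng.normal(0.08, 0.05, n),          # performance proxy
})
# simulated financial reporting quality proxy (e.g. inverse of discretionary accruals)
df["frq"] = 0.5 - 0.01 * df["board_size"] + 0.4 * df["roa"] + rng.normal(0, 0.05, n)

X = sm.add_constant(df[["board_size", "audit_committee", "roa"]])
baseline = sm.OLS(df["frq"], X).fit()                  # baseline OLS
robust = sm.OLS(df["frq"], X).fit(cov_type="HC3")      # robustness: heteroskedasticity-consistent SEs

print(robust.summary().tables[1])
```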
Procedia PDF Downloads 246
21637 A Text Classification Approach Based on Natural Language Processing and Machine Learning Techniques
Authors: Rim Messaoudi, Nogaye-Gueye Gning, François Azelart
Abstract:
Automatic text classification mostly applies natural language processing (NLP) and other AI-guided techniques to classify text automatically in a faster and more accurate manner. This paper discusses the use of predictive maintenance to manage incident tickets within the organization. It focuses on proposing a tool that processes and analyses the comments and notes written by administrators after resolving an incident ticket, with the goal of increasing the quality of these comments. Additionally, this tool is based on NLP and machine learning techniques to perform textual analytics on the extracted data. This approach was tested using real data taken from the French National Railways (SNCF) company and yielded high-quality results.
Keywords: machine learning, text classification, NLP techniques, semantic representation
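A minimal sketch of the kind of NLP-plus-machine-learning pipeline the abstract describes, with TF-IDF features over incident-ticket comments and a linear classifier, using scikit-learn; the toy comments and labels are invented placeholders, not SNCF data.

```python
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# invented placeholder comments (not SNCF data)
comments = [
    "rebooted the signalling server, incident resolved",
    "replaced faulty relay card on track circuit",
    "no action taken, duplicate of ticket 1042",
    "awaiting spare part, ticket left open",
]
labels = ["resolved", "resolved", "duplicate", "pending"]

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),  # text -> sparse features
    ("model", LogisticRegression(max_iter=1000)),              # linear classifier
])
clf.fit(comments, labels)
print(clf.predict(["relay card swapped, closing the incident"]))
```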
Procedia PDF Downloads 100
21636 Results of Twenty Years of Laparoscopic Hernia Repair Surgeries
Authors: Arun Prasad
Abstract:
Introduction: Laparoscopic hernia surgery started in the early 1990s and has had a mixed acceptance across the world, unlike laparoscopic cholecystectomy, which has become a gold standard. Laparoscopic hernia repair claims to cause less pain, fewer recurrences, and fewer wound infections compared to open hernia repair, leading to earlier recovery and return to work. Materials and Methods: Laparoscopic hernia repair has been performed in 2100 patients since 1995, with follow-up data for 1350 patients. The data were analysed for outcomes and satisfaction. Results: There is a recurrence rate of 0.1%. Early complications included bleeding, trocar injury, and nerve pain; late complications were rare. Conclusion: Laparoscopic inguinal hernia repair has a steep learning curve, but after that, the results and patient satisfaction are very good. It should be the procedure of choice in all bilateral and recurrent hernias.
Keywords: laparoscopy, hernia, mesh, surgery
Procedia PDF Downloads 253
21635 Dynamics of Mach Zehnder Modulator in Open and Closed Loop Bias Condition
Authors: Ramonika Sengupta, Stuti Kachhwaha, Asha Adhiya, K. Satya Raja Sekhar, Rajwinder Kaur
Abstract:
Numerous efforts have been made in the past decade to develop methods of secure communication that are free from interception and eavesdropping. In fiber optic communication, chaotic optical carrier signals are used for data encryption in secure data transmission. Mach-Zehnder modulators (MZM) are the key components for generating the chaotic signals to be used as optical carriers. This paper presents the dynamics of a lithium niobate MZM under various biasing conditions. Chaotic fluctuations of the intensity of a laser diode have been generated using the electro-optic MZM operating in a highly nonlinear regime. The modulator is driven in closed loop by its own output at an earlier time. When used as an electro-optic oscillator employing delayed feedback, the MZM displays a wide range of output waveforms of varying complexity. The dynamical behavior of the system ranges from periodic to nonlinear oscillations. The nonlinearity displayed by the system is reproducible and easily controllable. In this paper, we demonstrate a wide variety of optical signals generated by the MZM using easily controllable device parameters in both open and closed loop bias conditions.
Keywords: chaotic carrier, fiber optic communication, Mach-Zehnder modulator, secure data transmission
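A minimal sketch of the standard Ikeda-type delay model often used for an electro-optic MZM oscillator with delayed feedback, x'(t) = -x(t) + beta*cos^2(x(t - tau) + phi), integrated with a simple Euler scheme; the parameter values are illustrative and are not tied to the lithium niobate device in the study.

```python
import numpy as np

beta, phi = 5.0, 0.25 * np.pi   # feedback strength and MZM bias point (illustrative)
tau, dt = 1.0, 1e-3             # feedback delay and time step, normalized units
n_delay = int(tau / dt)
n_steps = 50 * n_delay

x = np.zeros(n_steps + n_delay)
x[:n_delay] = 0.1               # constant initial history

for i in range(n_delay, n_steps + n_delay - 1):
    # Ikeda-type delayed-feedback equation with the MZM cos^2 nonlinearity
    dxdt = -x[i] + beta * np.cos(x[i - n_delay] + phi) ** 2
    x[i + 1] = x[i] + dt * dxdt

tail = x[-5 * n_delay:]
print(f"output range over the last 5 delay times: [{tail.min():.2f}, {tail.max():.2f}]")
```

Varying beta (feedback strength) and phi (bias point) in this kind of model moves the output from steady state through periodic oscillations to chaos, which mirrors the controllability reported in the abstract.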
Procedia PDF Downloads 272
21634 Use of Silicate or Chicken Compost in Calcareous Soil on Productivity and Mineral Status of Wheat Plants under Different Levels of Phosphorus
Authors: Hanan, S. Siam, Safaa A. Mahmoud, A. S. Taalab
Abstract:
A pot experiment was conducted in the greenhouse of the NRC, Dokki, Cairo, Egypt, to study the response of wheat plants to different levels of superphosphate (60 kg P2O5 or 30 kg P2O5), with or without potassium silicate or chicken compost (2.5 ton/fed.), in terms of growth, yield, and nutrient status, especially phosphorus and silica availability. Data reveal that the addition of either potassium silicate or chicken compost significantly affected all the growth and yield parameters as well as the nutrient status and protein of the different parts of the wheat plants compared with the controls (60 kg P2O5 or 30 kg P2O5 alone). Data also reveal that the highest mean values were obtained when potassium silicate was added with 60 kg P2O5, while the lowest values of these parameters were obtained when 30 kg P2O5 alone was added. Furthermore, the data indicate that the highest mean values of all the mentioned parameters were obtained when chicken compost was applied with either rate of P, as compared with silicate addition at the same rates of P. According to the results, the highest values of all the mentioned parameters were obtained with the addition of chicken compost and potassium silicate together with the high rate of P (60 kg P2O5), while the lowest values were obtained when plants received phosphorus (30 kg P2O5) alone.
Keywords: wheat, yield, chicken compost, potassium, phosphorus, silicate, nutrients status
Procedia PDF Downloads 275
21633 Branding Tourism Destinations; The Trending Initiatives for Edifice Image Choices of Foreign Policy
Authors: Mehtab Alam, Mudiarasan Kuppusamy, Puvaneswaran Kunaserkaran
Abstract:
The purpose of this paper is to bridge the gap between tourism destinations and image branding as a choice for building foreign policy. Such options have become a crucial component for individuals interested in leisure and travel activities. The destination management factors were evaluated and analyzed using primary and secondary data in a mixed-methods approach (a quantitative sample of 384 and 8 qualitative semi-structured interviews at the saturation point). The study chose the Environmental Management Accounting (EMA) and Image Restoration (IR) theories, along with a schematic diagram and an analytical framework supported by NVivo software 12, for two locations in Abbottabad, KPK, Pakistan: Shimla Hill and Thandiani. This incorporates the use of the PLS-SEM model for assessing the validity of the data and SPSS for the descriptive statistical screening of the data. The results show that destination management's promotion of tourism has significantly improved Pakistan's state image. The use of institutional setup, environmental drivers, immigration, security, and hospitality, as well as recreational initiatives, in destination management is encouraged. The practical ramifications direct the heads of tourism projects, diplomats, directors, and policymakers to complete destination projects before inviting people to Pakistan. The paper also extends the knowledge available to academic tourism circles on using tourism destinations as brand ambassadors.
Keywords: tourism, management, state image, foreign policy, image branding
Procedia PDF Downloads 69
21632 Liquid-Liquid Equilibrium Study in Solvent Extraction of o-Cresol from Coal Tar
Authors: Dewi Selvia Fardhyanti, Astrilia Damayanti
Abstract:
Coal tar is a liquid by-product of coal gasification and carbonation processes, and also of some industries such as steel, power plants, and cement, among others. This liquid oil mixture contains various kinds of useful compounds, such as aromatic and phenolic compounds. These compounds are widely used as raw materials for insecticides, dyes, medicines, perfumes, coloring matters, and many others. This research investigates the thermodynamic modelling of liquid-liquid equilibria (LLE) in the solvent extraction of o-Cresol from coal tar. The equilibria are modeled for ternary components with the Wohl, Van Laar, and Three-Suffix Margules models. The values of the parameters involved are obtained by curve-fitting to the experimental data. Based on the comparison between calculated and experimental data, it turns out that, among the three models studied, the Three-Suffix Margules model seems to be the best at predicting the LLE of o-Cresol for this system.
Keywords: coal tar, o-Cresol, Wohl, Van Laar, three-suffix Margules
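A minimal sketch of how a three-suffix (two-parameter) Margules activity-coefficient model yields a binary liquid-liquid split by equating the activity of each component in the two phases; the parameter values are illustrative and are not the coefficients fitted to the coal-tar/o-Cresol system.

```python
import numpy as np
from scipy.optimize import fsolve

A12, A21 = 2.6, 3.0   # illustrative Margules parameters (dimensionless, large enough for a split)

def gammas(x1):
    x2 = 1.0 - x1
    # three-suffix Margules activity coefficients for a binary mixture
    ln_g1 = x2**2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1**2 * (A21 + 2.0 * (A12 - A21) * x2)
    return np.exp(ln_g1), np.exp(ln_g2)

def lle_equations(v):
    x1a, x1b = v
    g1a, g2a = gammas(x1a)
    g1b, g2b = gammas(x1b)
    # equal activity of each component in the two liquid phases
    return [x1a * g1a - x1b * g1b,
            (1.0 - x1a) * g2a - (1.0 - x1b) * g2b]

x1_alpha, x1_beta = fsolve(lle_equations, [0.05, 0.95])
print(f"coexisting phase compositions: x1 = {x1_alpha:.3f} and x1 = {x1_beta:.3f}")
```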
Procedia PDF Downloads 277
21631 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker
Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán
Abstract:
The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, some encoding methods, such as one-hot encoding or k-mers, have been explored. This work proposes additional approaches for encoding DNA sequences in order to compare them with existing techniques and determine whether they can provide improvements or whether current methods offer superior results. Data from the 16S rRNA gene, a universal marker, were used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a syntactic analysis to selectively extract relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes. Specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods. The performance of these models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies between encoding methods vary by up to approximately 15%, with the Fourier transform obtaining the best results for the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
Keywords: DNA encoding, machine learning, Fourier transform, Fourier transformation
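A minimal sketch of two of the encodings compared in the study, one-hot encoding and a Fourier-magnitude representation of a DNA sequence, plus a simple k-mer count; the toy sequence is illustrative, not a 16S rRNA record from NCBI.

```python
import numpy as np
from collections import Counter

seq = "ATGCGTACGTTAGC"                      # toy sequence (not a real 16S rRNA record)

# 1) one-hot encoding: length x 4 binary matrix over the alphabet A, C, G, T
alphabet = "ACGT"
one_hot = np.array([[1 if base == b else 0 for b in alphabet] for base in seq])

# 2) k-mer counts (k = 3)
k = 3
kmers = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# 3) Fourier representation: map bases to numbers, then take FFT magnitudes
numeric = np.array([alphabet.index(b) for b in seq], dtype=float)
spectrum = np.abs(np.fft.rfft(numeric - numeric.mean()))

print(one_hot.shape, dict(kmers), np.round(spectrum, 2))
```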
Procedia PDF Downloads 23
21630 Undersea Communications Infrastructure: Risks, Opportunities, and Geopolitical Considerations
Authors: Lori W. Gordon, Karen A. Jones
Abstract:
Today’s high-speed data connectivity depends on a vast global network of infrastructure across space, air, land, and sea, with undersea cable infrastructure (UCI) serving as the primary means of intercontinental and ‘long-haul’ communications. The UCI landscape is changing and includes an increasing variety of state actors, such as the growing economies of Brazil, Russia, India, China, and South Africa. Non-state commercial actors, such as hyper-scale content providers including Google, Facebook, Microsoft, and Amazon, are also seeking to control their data and networks through significant investments in submarine cables. Active investments by both state and non-state actors will invariably influence the growth, geopolitics, and security of this sector. Beyond these hyper-scale content providers, there are new commercial satellite communication providers. These new players include traditional geosynchronous (GEO) satellites that offer broad coverage, high-throughput GEO satellites offering high capacity with spot beam technology, and low earth orbit (LEO) ‘mega constellations’ offering global broadband services. Potential new entrants such as High Altitude Platforms (HAPS) offer low-latency connectivity, while LEO constellations offer high-speed optical mesh networks, i.e., ‘fiber in the sky.’ This paper focuses on understanding the role of submarine cables within the larger context of the global data commons, spanning space, terrestrial, air, and sea networks, including an analysis of national security policy and geopolitical implications. As network operators and commercial and government stakeholders plan for emerging technologies and architectures, hedging risks for future connectivity will ensure that our data backbone remains secure for years to come.
Keywords: communications, global, infrastructure, technology
Procedia PDF Downloads 87
21629 “Divorced Women are Like Second-Hand Clothes” - Hate Language in Media Discourse (Using the Example of Electronic Media Platforms)
Authors: Sopio Totibadze
Abstract:
Although the legal framework of Georgia reflects the main principles of gender equality and is in line with the international situation (UNDP, 2018), Georgia remains a male-dominated society. This means that men prevail in many areas of social, economic, and political life, which frequently gives women a subordinate status in society and the family (UN Women). According to the latest study, “violence against women and girls in Georgia is also recognized as a public problem, and it is necessary to focus on it” (UN Women). Moreover, the Public Defender's report on the protection of human rights in Georgia (2019) reveals that “in the last five years, 151 women were killed in Georgia due to gender and family violence”. Sadly, these statistics have increased significantly since that time. The issue was acutely reflected in the document published by the Organization for Security and Cooperation in Europe, “Gender Hate Crime” (March 10, 2021): “Unfortunately, the rates of femicide ..... are still high in the country, and distrust of law enforcement agencies often makes such cases invisible, which requires special attention from the state.” More precisely, the cited document considers that crimes based on gender-based oppression are frequent in Georgia and pose a threat not only to women but also to people of any gender whose desires and aspirations do not correspond to the gender norms and roles prevailing in society. According to the study, this type of crime has a “significant and lasting impact on the victim(s) and also undermines the safety and cohesion of society and gender equality”. It is well known that language is often used as a tool of gender oppression (Rusieshvili-Cartledge and Dolidze, 2021; Totibadze, 2021). Therefore, feminist and gender studies in linguistics ultimately serve to represent the problem, reflect on it, and propose ways to solve it. Together with technical advancement in communication, a new form of discrimination has arisen: hate language against women in electronic media discourse. Due to the nature of social media and the internet, messages containing hate language can spread in seconds and reach millions of people, yet only a few know about the detrimental effects they may have on the addressee and on society. This paper aims to analyse hateful comments directed at women on various media platforms to determine (1) the linguistic strategies used while attacking women and (2) the reasons why women may fall victim to this type of hate language. The data have been collected over six months, and overall, 500 comments will be examined for the paper. Qualitative and quantitative analysis was chosen as the methodology of the study. The comments posted on various media platforms, including social media posts, articles, and pictures, have been selected manually for several reasons, the most important being the problem of identifying hate speech, as it can disguise itself in different ways (humour, memes, etc.). The comments on the articles, posts, pictures, and videos selected for sociolinguistic analysis depict a woman, a taboo topic, or a scandalous event centred on a woman that triggered a great deal of hatred and hate language towards the person to whom the post or article was dedicated. The study has revealed that a woman can become a victim of hate language if she does something considered a deviation from a societal norm, namely, getting a divorce, being sexually active, being vocal about feminist values, or talking about taboos. Interestingly, the people who use hate language are not only men trying to “normalize” prejudiced patriarchal values but also women, who are equally active in bringing down a "strong" woman. The paper also aims to raise awareness about the hate language directed at women, as being knowledgeable about the issue at hand is the first step towards tackling it.
Keywords: femicide, hate language, media discourse, sociolinguistics
Procedia PDF Downloads 83
21628 Cost Benefit Analysis and Adjustments of Corporate Social Responsibility in the Airline Industry
Authors: Roman Asatryan
Abstract:
The decision-making processes for Corporate Social Responsibility (CSR) among firms in general, and airlines in particular, have to do with the benefits that accrue from those investments. The crux of the matter is not whether to invest in CSR or not, but rather how firms can quantify the benefits derived from such investments. This paper analyzes the cost-benefit adjustment strategies of firms in the airline industry in their CSR strategy adoption and implementation. The adjustment strategies identified will enable firms in the airline industry to have a basis for determining the worth of such CSR investments. This paper discusses the cost-benefit analysis model in order to understand the ways airlines can reduce costs and increase returns on CSR, or balance costs and benefits. The analysis from this study points to the fact that economic concepts, especially cost-benefit analysis (CBA), are useful, though not without challenges. The challenge arises when it is problematic to express the real impact of an externality in monetary terms. The rational maximization of gains may seem a rather optimistic goal, mainly because of environmental variability, perceptual uncertainty, and imperfect knowledge about the potential externality. This paper concludes that the CBA model gives a basic understanding of the motivations for investing in intangible assets like CSR. Consequently, it sets the tone for formulating relevant hypotheses in empirical studies of investment in CSR in particular and other intangible assets in business operations.
Keywords: cost-benefit analysis, corporate social responsibility, airline industry
Procedia PDF Downloads 394
21627 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques
Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev
Abstract:
Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation; thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters), and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was the open statistical database of Eurostat. This source was chosen because the database contains complete and open information necessary for research tasks in the field of public health; in addition, it has a user-friendly interface that allows analytical reports to be built quickly. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made through cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration in order to identify groups of similar countries and to construct separate regression models for them. The original time series were therefore used as the objects of clustering, and the hierarchical agglomerative k-medoids algorithm was used. The sampled objects were used as the centers of the obtained clusters, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters, the MAPE error was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the resource provision of the hospital by medical personnel. The research displays the strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has huge potential to significantly improve health services, and medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
Keywords: data analysis, demand modeling, healthcare, medical facilities
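A minimal sketch of the two-stage idea described above: grouping countries by the similarity of their demand time series and then fitting a separate regression per group, scored with MAPE. The synthetic panel and the use of hierarchical clustering on correlation distance are simplifying assumptions, not the Eurostat workflow or the k-medoids procedure itself.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n_countries, n_years = 28, 10
beds = rng.normal(500, 60, (n_countries, n_years))                    # resource variable
discharges = 3.0 * beds + rng.normal(0, 40, (n_countries, n_years))   # demand variable

# 1) cluster countries by the similarity of their demand series (correlation distance)
dist = pdist(discharges, metric="correlation")
clusters = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")

# 2) fit a separate linear model per cluster and report its MAPE
for c in np.unique(clusters):
    x = beds[clusters == c].ravel()
    y = discharges[clusters == c].ravel()
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    mape = 100 * np.mean(np.abs((y - pred) / y))
    print(f"cluster {c}: {np.sum(clusters == c)} countries, MAPE = {mape:.2f}%")
```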
Procedia PDF Downloads 144
21626 Electronic Six-Minute Walk Test (E-6MWT): Less Manpower, Higher Efficiency, and Better Data Management
Authors: C. M. Choi, H. C. Tsang, W. K. Fong, Y. K. Cheng, T. K. Chui, L. Y. Chan, K. W. Lee, C. K. Yuen, P. W. Lau, Y. L. To, K. C. Chow
Abstract:
The six-minute walk test (6MWT) is a sub-maximal exercise test used to assess the aerobic capacity and exercise tolerance of patients with chronic respiratory disease and heart failure. It has been proven to be a reliable and valid tool and is commonly used in clinical situations. The traditional 6MWT is labour-intensive and time-consuming, especially for patients who require assistance with ambulation and oxygen use. When performing the test with these patients, one staff member assists the patient in walking (with or without aids) while another manually records the patient’s oxygen saturation, heart rate, and walking distance at every minute and/or carries the oxygen cylinder at the same time. The physiotherapist then has to document the test results in the bed notes in detail. With the electronic 6MWT (E-6MWT), patients wear a wireless oximeter that transfers data to a tablet PC via Bluetooth. Real-time recordings of oxygen saturation, heart rate, and distance are displayed, and no manual recording is needed. The tablet generates a comprehensive report that can be directly attached to the patient’s bed notes for documentation; data can also be saved for later patient follow-up. This study was carried out in North District Hospital. Patients who followed commands and required 6MWT assessment were included and assigned to study or control groups. In the study group, patients adopted the E-6MWT, while those in the control group adopted the traditional 6MWT. Manpower and time consumed were recorded, and physiotherapists also completed a questionnaire about the use of the E-6MWT. A total of 12 subjects (study=6; control=6) were recruited during 11-12/2017. The average number of staff required and time consumed in the traditional 6MWT were 1.67 and 949.33 seconds, respectively, while in the E-6MWT, the figures were 1.00 and 630.00 seconds. Compared to the traditional 6MWT, the E-6MWT required 67.00% less manpower and 50.10% less time. Physiotherapists (n=7) found the E-6MWT convenient to use (mean=5.14; satisfied to very satisfied), requiring less manpower and time to complete the test (mean=4.71; rather satisfied to satisfied), offering better data management (mean=5.86; satisfied to very satisfied), and recommended it for clinical use (mean=5.29; satisfied to very satisfied). It is proven that the E-6MWT requires less manpower input with higher efficiency and better data management, and it is welcomed by frontline clinical staff.
Keywords: electronic, physiotherapy, six-minute walk test, 6MWT
Procedia PDF Downloads 154