Search results for: lean tools
745 Marzuq Basin Palaeozoic Petroleum System
Authors: M. Dieb, T. Hodairi
Abstract:
In the Southwest Libya area, the Palaeozoic deposits form an important petroleum system, with Silurian shale considered a hydrocarbon source rock and the Cambro-Ordovician recognized as a good reservoir. The Palaeozoic petroleum system has the greatest potential for conventional resources and is thought to represent the most significant prospect for unconventional petroleum resources in Southwest Libya. Until now, the lateral and vertical heterogeneity of the source rock has not been well evaluated, and oil-source correlation is still a matter of debate. One source rock, considered the main source potential in the Marzuq Basin, was investigated for its uranium content using gamma-ray logs, Rock-Eval pyrolysis, and organic petrography, and for its bulk kinetic characteristics, to determine the petroleum potential qualitatively and quantitatively. Thirty source rock samples and fifteen oil samples from the Tannezzuft source rock were analyzed by Rock-Eval pyrolysis, microscopic investigation, GC, and GC-MS to detect acyclic isoprenoids and aliphatic, aromatic, and NSO biomarkers. Geochemical tools were applied to screen source- and age-significant biomarkers to highlight genetic relationships. A marked heterogeneity exists among source rock zones from different depth levels with varying uranium contents, according to gamma-ray logs, Rock-Eval pyrolysis results, and kinetic features. The uranium-rich Tannezzuft Formations (Hot Shales) produce oil and oil-to-gas hydrocarbons based on their richness, kerogen type, and thermal maturity. Biomarker results such as C₂₇, C₂₈, and C₂₉ sterane concentrations and C₂₄ tetracyclic terpane/C₂₉ tricyclic terpane ratios, together with sterane and hopane ratios, are considered the most promising biomarker information for differentiating within the Silurian Tannezzuft Shale Formation and for correlating it with its expelled oils.
The Tannezzuft Hot Shale is considered the main source rock for the oil and gas accumulations in the Cambro-Ordovician reservoirs within the Marzuq Basin. Migration of the generated and expelled oil and gas from the Tannezzuft source rock to the reservoirs of the Cambro-Ordovician petroleum system is interpreted to have occurred along vertical and lateral pathways along faults in the Palaeozoic strata. The Upper Tannezzuft Formation (cold shale) is considered the primary seal in the Marzuq Basin.
Keywords: heterogeneity, hot shale, kerogen, Silurian, uranium
Procedia PDF Downloads 647
744 Recognising Patients’ Perspective on Health Behaviour Problems Through Laughter: Implications for Patient-Centered Care Practice in Behaviour Change Consultations in General Practice
Authors: Binh Thanh Ta, Elizabeth Sturgiss
Abstract:
Central to patient-centered care is the idea of treating a patient as a person and understanding their perspectives regarding their health conditions and care preferences. Surprisingly, little is known about how GPs can understand their patients’ perspectives. This paper addresses the challenge of understanding patient perspectives in behavior change consultations by adopting Conversation Analysis (CA), an empirical research approach that allows both researchers and the audience to examine patients’ perspectives as displayed in GP-patient interaction. To understand people’s perspectives, CA researchers do not rely on what they say but instead on how they demonstrate their endogenous orientations to social norms when they interact with each other. Underlying CA is the notion that social interaction is orderly at all points. (It is important to note that social orders should not be treated as exogenous sets of rules that predetermine human behaviors; rather, social orders are constructed and oriented to by social members through their interactional practices. Note also that these interactional practices are resources shared by all social members.) As CA offers tools to uncover the orderliness of interactional practices, it not only allows us to understand the perspective of a particular patient in a particular medical encounter but, more importantly, enables us to recognise the shared interactional practice for signifying a particular perspective. Drawing on 10 video-recorded consultations on behavior change in primary care, we discovered an orderliness in patient laughter when reporting health behaviors, which signifies their orientation to the problematic nature of the reported behaviors. Among the 24 cases where patients reported their health behaviors, we found 19 in which they laughed while speaking. In the five cases where patients did not laugh, they explicitly framed their behavior as unproblematic.
This finding echoes the body of CA research on laughter, which suggests that laughter produced by first speakers (as opposed to laughing in response to what has been said earlier) normally indicates some sort of problem oriented to the self (e.g., self-teasing, self-deprecation, etc.). This finding points to the significance of understanding when and why patients laugh; such understanding would assist GPs in recognising whether patients treat their behavior as problematic or not, thereby producing responses sensitive to patient perspectives.
Keywords: patient-centered care, laughter, conversation analysis, primary care, behaviour change consultations
Procedia PDF Downloads 99
743 Capitalizing ‘Ba’ in Knowledge Creation among Medical Researchers in a Malaysian Higher Education Institution
Authors: Connie Edang, Siti Arpah Noordin, Shamila Mohamed Shuhidan
Abstract:
For the past few decades, there have been growing numbers of knowledge-based industries in Malaysia. As competitive edge has become so important nowadays, research and development (R&D) should be given the highest priority. Like other industries, HEIs are contributors to the nation’s development and wealth. Hence, to become a hub for creating a knowledge-based society, HEIs are responsible not only for producing skilful human capital but also for engaging in R&D. With the importance of R&D in today’s modern economy and the rise of science and technology, researchers have opportunities to explore this sector so as to position Malaysia as a provider in some key strategic industries, including the medical and health sciences field. Academic and medical researchers possess unique tacit knowledge and skills in accordance with their experience and areas of professional expertise. In completing collaborative research work, there must be platforms that enable the conversion of their knowledge and hence benefit the creation of new knowledge. The objectives of this study are to: i) explore the knowledge creation activities of medical researchers in a Malaysian Higher Education Institution (HEI); ii) explore the driving forces for knowledge creation activities among the researchers; and iii) explore the medical researchers’ interpretation of the establishment of ‘ba’ in the creation of knowledge. Based on the SECI model introduced by Nonaka and Takeuchi and the Japanese concept of ‘ba’, a qualitative study was conducted in which semi-structured interviews were used to gather the informants’ viewpoints and insights based on their experience of capitalizing on ‘ba’ to support their knowledge creation activities. The single-case study was conducted at one of the HEIs located in Sabah. From this study, both face-to-face interaction and ICT-assisted tools were found to be significant in supporting the interaction of their knowledge.
ICT seems to ease their interaction with other research collaborators. However, this study revealed that interaction conducted in physical settings is still preferred by the medical researchers, especially in situations where their knowledge is hard to externalize. Moreover, it revealed that motivational factors play an important role as driving forces affecting their knowledge creation activities. The medical researchers also noted that mixed interaction brings value in terms of facilitating knowledge creation. This study would therefore benefit the institution in optimizing the use of good platforms so that knowledge can be transferred and used by others in appropriate ways.
Keywords: ‘ba’, knowledge creation dynamics, Malaysia, higher education institution, medical researchers
Procedia PDF Downloads 218
742 Analysis of Constraints and Opportunities in Dairy Production in Botswana
Authors: Som Pal Baliyan
Abstract:
The dairy enterprise has been a major source of employment and income generation in most economies worldwide. The Botswana government has also identified dairy as one of the agricultural sectors for diversifying the country's mineral-dependent economy. The huge gap between local demand and supply of milk and milk products indicates that not only constraints but also opportunities exist in this sub-sector of agriculture. Therefore, this study attempted to identify constraints and opportunities in the dairy production industry in Botswana. Possible ways to mitigate the constraints were also identified. The findings should assist stakeholders, especially policy makers, in formulating effective policies for the growth of the dairy sector in the country. This quantitative study adopted a survey research design. A pilot survey followed by a final survey was conducted for data collection. The purpose of the pilot survey was to collect basic information on the nature and extent of the constraints, opportunities, and ways to mitigate the constraints in dairy production. Based on the information from the pilot survey, a four-point Likert-type scale questionnaire was constructed, validated, and tested for reliability. The data for the final survey were collected from twenty-five purposively selected dairy farms. Descriptive statistical tools were employed to analyze the data. Among the twelve constraints identified, high feed costs, feed shortage and availability, lack of technical support, lack of skilled manpower, high prevalence of pests and diseases, and lack of dairy-related technologies were the six major constraints in dairy production. Grain feed production, roughage feed production, manufacture of dairy feed, establishment of a milk processing industry, and development of transportation systems were the five major opportunities among the eight identified.
Increasing local production of animal feed, increasing local roughage feed production, provision of subsidies on animal feed, easy access to sufficient financial support, training of farmers, and effective control of pests and diseases were identified as the six major ways to mitigate the constraints. It was recommended that the identified constraints and opportunities, as well as the ways to mitigate the constraints, be carefully considered by stakeholders, especially policy makers, during the formulation and implementation of policies for the development of the dairy sector in Botswana.
Keywords: dairy enterprise, milk production, opportunities, production constraints
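The ranking step described in the abstract (mean scores on the four-point Likert-type scale, sorted to identify the major constraints) can be sketched as follows; the constraint names come from the abstract, but the mean scores are hypothetical placeholders, not the study's survey results:

```python
# Rank constraints by their mean four-point Likert score
# (1 = not a constraint, 4 = severe constraint).
# The scores below are illustrative only, not the study's data.
constraint_means = {
    "high feed costs": 3.8,
    "feed shortage and availability": 3.6,
    "lack of technical support": 3.4,
    "lack of skilled manpower": 3.3,
    "high prevalence of pests and diseases": 3.1,
    "lack of dairy-related technologies": 3.0,
    "poor access to markets": 2.2,
}
ranked = sorted(constraint_means.items(), key=lambda kv: kv[1], reverse=True)
for rank, (constraint, mean) in enumerate(ranked, start=1):
    print(f"{rank}. {constraint} (mean = {mean})")
```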
Procedia PDF Downloads 407
741 Evaluating the Feasibility of Chemical Dermal Exposure Assessment Model
Authors: P. S. Hsi, Y. F. Wang, Y. F. Ho, P. C. Hung
Abstract:
The aim of the present study was to explore dermal exposure assessment models for chemicals that have been developed abroad and to evaluate their feasibility for the manufacturing industry in Taiwan. We analyzed six semi-quantitative risk management tools: the UK Control of Substances Hazardous to Health (COSHH) model, the European Risk Assessment of Occupational Dermal Exposure (RISKOFDERM) model, the Dutch Dose-Related Effect Assessment Model (DREAM), the Dutch Stoffenmanager (STOFFEN), the Nicaraguan Dermal Exposure Ranking Method (DERM), and the US/Canadian Public Health Engineering Department (PHED) model. Five types of manufacturing industry were selected for evaluation. Monte Carlo simulation was used to analyze the sensitivity of each factor, and the correlation between the assessment results of each semi-quantitative model and the exposure factors used in the model was analyzed to identify the important indicators of the dermal exposure assessment models. To assess the effectiveness of the semi-quantitative models, this study also produced quantitative dermal exposure results using a prediction model and verified the correlation via Pearson's test. Results show that the strength of COSHH's decision factors could not be determined because the results for all evaluated industries fell into the same risk level. In the DERM model, the transmission process, the exposed area, and the clothing protection factor were all positively correlated. In the STOFFEN model, the fugitive emission, the operation, the near-field and far-field concentrations, and the operating time and frequency showed a positive correlation. Skin exposure, relative working time, and the working environment were positively correlated in the DREAM model. In the RISKOFDERM model, the actual exposure situation and exposure time showed a positive correlation.
We also found high correlations for the DERM and RISKOFDERM models, with correlation coefficients of 0.92 and 0.93 (p < 0.05), respectively. The STOFFEN and DREAM models showed poor correlation, with coefficients of 0.24 and 0.29 (p > 0.05), respectively. According to these results, both the DERM and RISKOFDERM models are suitable for the selected manufacturing industries. However, considering the small sample size evaluated in this study, more categories of industries should be evaluated in the future to reduce uncertainty and enhance applicability.
Keywords: dermal exposure, risk management, quantitative estimation, feasibility evaluation
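The Pearson verification step described above can be sketched in a few lines; the paired values below are hypothetical placeholders, not the study's measurements:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired results for five workplaces (illustrative values):
# a semi-quantitative model score vs. a quantitative exposure estimate.
model_scores = [12.0, 35.5, 20.1, 48.9, 27.3]
quantitative = [0.8, 2.9, 1.5, 4.1, 2.0]
print(round(pearson_r(model_scores, quantitative), 2))
```

A coefficient near 1 with p < 0.05 is the pattern the study reports for DERM and RISKOFDERM.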
Procedia PDF Downloads 170
740 Ikat: Undaunted Journey of a Traditional Textile Practice, a Sublime Connect of Traditionality with Modernity and Calibration for Eco-Sustainable Options
Authors: Purva Khurana
Abstract:
Traditional textile crafts have universally been significantly impeded by the rise of innovative technologies, but sustained human endeavour, in sync with dynamic market nuances, holds the key to these otherwise fast-vanishing marvels. The metamorphosis of such art forms into niche markets presupposes sharp concentration on adaptability. The author has concentrated on the ancient handicraft of ikat in Andhra Pradesh (India), a manifestation of cultural heritage and an esoteric cottage industry intrinsic to the development and support of the local economy and identity. Like any other traditional practice, ikat weaving has been subjected to the challenges of modernization. However, owing to its unique character, personalized production, and adaptability of both material and process, ikat weaving has stood the test of time by judiciously embellishing innovation with contemporary taste. To survive as a living craft, and to justify its role as a universal language of aesthetic sensibility, it is imperative that the ikat tradition lend itself to a continuous process of experiment, change, and growth. Besides, this paper aims to examine the contours of the ikat production process from its pure form to more fashion- and market-oriented production, with upgraded processes, materials, and tools. Over time, the craft has adapted well to new style paradigms, matching the latest fashion trends in tandem with market sensitivities. It is also an effort to investigate how this craft could respond constructively to the pressure of contemporary technical developments in order to remain at the cutting edge while preserving its integrity. To approach these issues, the methodology adopted is a conceptual analysis of the craft practices, their unique strengths, and how they could be used to advance the craft in relation to emerging technical developments.
The paper summarizes the results of the author's study of the peculiar advantages of suitably calibrated vat dyes over natural dyes in terms of recycling ability and eco-friendly properties, which hold a definite edge in terms of both socio-economic and environmental concerns.
Keywords: craft, eco-friendly dyes, ikat, metamorphosis
Procedia PDF Downloads 174
739 Hierarchy and Weight of Influence Factors on Labor Productivity in the Construction Industry of Nepal
Authors: Shraddha Palikhe, Sunkuk Kim
Abstract:
The construction industry is the most labor-intensive in Nepal. Construction is a major sector, and any productivity enhancement activity in it will have a positive impact on the overall improvement of the national economy. Previous studies have stated that Nepal has poor labor productivity compared with other South Asian countries. Though considerable research has been done on productivity factors in other countries, no study has addressed labor productivity issues in Nepal. Therefore, the main objective of this study is to identify and rank the factors behind poor labor productivity. In this study, a questionnaire survey of thirty experts involved in the construction industry, such as architects, civil engineers, project engineers, and site engineers, was chosen as the method. The survey was conducted in Nepal to identify the major factors impacting construction labor productivity. The Analytic Hierarchy Process (AHP) was used to understand the underlying relationships among the factors, categorized into five groups, namely (1) the labor management group; (2) the material management group; (3) the human labor group; (4) the technological group; and (5) the external group, divided into 33 subfactors. AHP was used to establish the relative importance of the criteria; it makes pairwise comparisons of relative importance between hierarchy elements grouped by labor productivity decision criteria. Respondents were asked to answer based on their experience of construction works. On the basis of the responses, the weights of all the factors were calculated and ranked. The AHP results were tabulated based on the weight and ranking of the influence factors. The AHP model consists of five main criteria and 33 sub-criteria. Among the five main criteria, the highest weight, i.e.
26.15%, is assigned to the human labor group, followed by 23.01% for the technological group, 22.97% for the labor management group, 17.61% for the material management group, and 10.25% for the external group. Among the 33 sub-criteria, the most influential factors for poor productivity in Nepal are lack of monetary incentives (20.53%) in the human labor group, unsafe working conditions (17.55%) in the technological group, lack of leadership (18.43%) in the labor management group, unavailability of tools at the site (25.03%) in the material management group, and strikes (35.01%) in the external group. The results show that the criteria associated with the AHP model are helpful in assessing the current situation of labor productivity. It is essential to consider these influence factors to improve labor productivity in the construction industry of Nepal.
Keywords: construction, hierarchical analysis, influence factors, labor productivity
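The AHP weighting step described above, deriving priority weights from pairwise comparisons via the principal eigenvector and checking consistency, can be sketched as follows; the comparison matrix is an illustrative placeholder, not the authors' survey data:

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive AHP priority weights from a pairwise-comparison matrix
    via the principal eigenvector, and report the consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                               # normalise weights to sum to 1
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index
    return w, ci / ri                          # weights, consistency ratio

# Hypothetical reciprocal 5x5 matrix comparing the paper's five main
# criteria (values are illustrative, not the survey responses).
A = [[1,   2,   2,   2,   3],
     [1/2, 1,   1,   2,   2],
     [1/2, 1,   1,   1,   2],
     [1/2, 1/2, 1,   1,   2],
     [1/3, 1/2, 1/2, 1/2, 1]]
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))  # CR below 0.1 is conventionally acceptable
```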
Procedia PDF Downloads 405
738 Laboratory Diagnostic Testing of Peste des Petits Ruminants in Georgia
Authors: Nino G. Vepkhvadze, Tea Enukidze
Abstract:
Every year, countries around the world face the risk of the spread of infectious diseases that bring significant ecological and socio-economic damage. Hence, the importance of food product safety, an issue of interest for many countries, is emphasized. Addressing it requires preventive measures against the diseases, accurate diagnostic results, leadership, and management. Peste des petits ruminants (PPR) is caused by a morbillivirus closely related to the rinderpest virus. PPR is a transboundary disease and, as it emerges and evolves, is considered one of the most damaging animal diseases. The disease poses a serious threat to sheep breeding as sheep and goat farms grow significantly within the country. In January 2016, PPR was detected in Georgia. To date, the origin of the virus, the age relationship of affected ruminants, and the distribution of PPRV in Georgia remain unclear. Given the nature of PPR and breeding practices in the country, re-emergence of the disease in Georgia is highly likely. The purpose of these studies, accomplished under a Biological Threat Reduction Program project with the support of the Defense Threat Reduction Agency (DTRA), is to provide laboratories with efficient tools allowing the early detection of PPR emergence and re-emergence, and to investigate samples and identify areas at high risk of the disease. Georgia has a high density of small ruminant herds bred free-ranging close to international borders. The Kakheti region, in Eastern Georgia, is considered an area of high priority for PPR surveillance. For this reason, in 2019, n=484 sheep and goat serum and blood samples from the same animals were investigated in the Kakheti region using serological and molecular biology methods. All samples were negative by RT-PCR, and n=6 sheep samples were seropositive by ELISA-Ab.
Future efforts will be concentrated in areas where the risk of PPR might be high, such as the regions of Georgia bordering other countries. For diagnostics, it is important to integrate PPRV knowledge with epidemiological data. Based on these diagnostics, the relevant agencies will be able to control disease surveillance.
Keywords: animal disease, especially dangerous pathogen, laboratory diagnostics, virus
Procedia PDF Downloads 116
737 Advancing Circular Economy Principles: Integrating AI Technology in Street Sanitation for Sustainable Urban Development
Authors: Xukai Fu
Abstract:
The concept of circular economy is interdisciplinary, intersecting environmental engineering, information technology, business, and social science domains. Over the course of its 15-year tenure in the sanitation industry, Jinkai has concentrated its efforts in the past five years on integrating artificial intelligence (AI) technology with street sanitation apparatus and systems. This endeavor has led to the development of various innovations, including the Intelligent Identification Sweeper Truck (Intelligent Waste Recognition and Energy-saving Control System), the Intelligent Identification Water Truck (Intelligent Flushing Control System), the intelligent food waste treatment machine, and the Intelligent City Road Sanitation Surveillance Platform. This study will commence with an examination of prevalent global challenges, elucidating how Jinkai effectively addresses each within the framework of circular economy principles. Utilizing a review and analysis of pertinent environmental management data, we will elucidate Jinkai's strategic approach. Following this, we will investigate how Jinkai utilizes the advantages of circular economy principles to guide the design of street sanitation machinery, with a focus on digitalization integration. Moreover, we will scrutinize Jinkai's sustainable practices throughout the invention and operation phases of street sanitation machinery, aligning with the triple bottom line theory. Finally, we will delve into the significance and enduring impact of corporate social responsibility (CSR) and environmental, social, and governance (ESG) initiatives. Special emphasis will be placed on Jinkai's contributions to community stakeholders, with a particular emphasis on human rights. Despite the widespread adoption of circular economy principles across various industries, achieving a harmonious equilibrium between environmental justice and social justice remains a formidable task. 
Jinkai acknowledges that the mere development of energy-saving technologies is insufficient for authentic circular economy implementation; rather, they serve as instrumental tools. To earnestly promote and embody circular economy principles, companies must consistently prioritize the UN Sustainable Development Goals and adapt their technologies to address the evolving exigencies of our world.
Keywords: circular economy, core principles, benefits, the triple bottom line, CSR, ESG, social justice, human rights, Jinkai
Procedia PDF Downloads 50
736 Evaluation of Information Technology Governance Frameworks for Better Governance in South Africa
Authors: Memory Ranga, Phillip Pretorious
Abstract:
The South African Government has invested a great deal of money in Information Technology Governance (ITG) within government departments. The ITG framework was spearheaded by the Department of Public Service and Administration (DPSA). This led to the development of a governing DPSA ITG framework and later the Government Wide Enterprise Architecture (GWEA) Framework to assist departments in implementing ITG. In addition, government departments have adopted the Information Systems Audit and Control Association (ISACA) Control Objectives for Information and Related Technology (COBIT) for ITG processes. Despite all these available frameworks, departments fail to fully capitalise on and improve their ITG processes, mainly because the frameworks are too generic and difficult to apply to specific governance needs. Little research has been done to evaluate the progress of ITG initiatives within government departments. This paper aims to evaluate the existing ITG frameworks within selected government departments in South Africa. A quantitative research approach was used. Data were collected through an online questionnaire targeting ICT managers and directors from government departments. The study was undertaken as a case study, with only the Eastern Cape Province selected for the research. Document review, mainly of ITG frameworks and best practices, was also used. Data were analysed using Google Analytics tools and SPSS. A one-sample chi-squared test was used to verify the evaluation findings. Findings show evidence that the current guiding national governance framework (DPSA) is outdated and does not accommodate changes in other governance frameworks. The Eastern Cape government departments have spent huge amounts of money on ITG but have not yet been able to identify the benefits of their ITG initiatives.
The guiding framework is rigid and does not address some departmental needs, making it difficult to apply the DPSA framework flexibly. Furthermore, despite the large budget for ITG, the departments still face many challenges and are unable to improve some of their processes and services. All the engaged Eastern Cape departments have adopted the COBIT framework, but none has been conducting the COBIT maturity assessment, which is a core functionality of COBIT. There is evidence of too many ITG frameworks and of their underutilisation. The study provides a comprehensive evaluation of the ITG frameworks adopted by South African government departments in the Eastern Cape Province. The evaluation guides and recommends that government departments rethink and adopt ITG frameworks that can be customised to accommodate their needs. The adoption and application of ITG by government departments should assist in better governance and service delivery to citizens.
Keywords: information technology governance, COBIT, evaluate, framework, governance, DPSA framework
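A one-sample chi-squared goodness-of-fit check like the one mentioned in the abstract can be sketched as follows; the response counts are hypothetical placeholders, not the study's questionnaire data:

```python
# One-sample chi-squared goodness-of-fit: observed response counts
# against a uniform expectation (hypothetical counts, for illustration).
observed = [34, 18, 8, 4]          # e.g. four response categories on one item
n = sum(observed)
expected = [n / len(observed)] * len(observed)
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
# Compare chi2 against the critical value for df = 3, alpha = 0.05 (7.815);
# exceeding it means the responses are not uniformly distributed.
print(chi2, chi2 > 7.815)
```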
Procedia PDF Downloads 124
735 Utilization of a Telepresence Evaluation Tool for the Implementation of a Distant Education Program
Authors: Theresa Bacon-Baguley, Martina Reinhold
Abstract:
Introduction: Evaluation and analysis are the cornerstones of any successful program in higher education. When developing a program at a distant campus, it is essential that evaluation and analysis be orchestrated in a timely manner with tools that can identify both the positive and negative components of distant education. We describe the utilization of a newly developed tool to evaluate and analyze the successful expansion to a distant campus using Telepresence technology. Like interactive television (ITV), Telepresence allows live interactive delivery but utilizes broadband cable. The tool is adaptable to any distant campus, as its framework was derived from a systematic review of the literature. Methodology: Because Telepresence is a relatively new delivery system, the evaluation tool was developed based on a systematic review of the literature on distant education and ITV. The literature review identified four potential areas of concern: 1) technology, 2) confidence in the system, 3) faculty delivery of the content, and 4) resources at each site. Each of the four areas included multiple sub-components. Benchmark values were set at 80% or greater positive responses for each of the four areas and their sub-components. The tool was administered each semester during the didactic phase of the curriculum. Results: The data identified site-specific issues (i.e., technology access, student engagement, laboratory access, and resources), as well as issues common to both sites (i.e., projection screen size). More specifically, students at the parent location did not have adequate access to printers or laboratory space, and students at the distant campus did not have adequate access to library resources. The evaluation tool also identified that both sites requested larger screens for visualization of the faculty.
The deficiencies were addressed by replacing printers, providing additional orientation for students on library resources, and increasing the screen size of the Telepresence system. When analyzed over time, the issues identified as deficiencies by the tool were resolved. Conclusions: Utilizing the tool allowed timely adjustments to the Telepresence delivery system, resulting in the successful implementation of an entire curriculum at a distant campus.
Keywords: physician assistant, telepresence technology, distant education, assessment
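The 80%-positive benchmark check described in the abstract can be sketched as follows; the response counts are hypothetical placeholders, not the program's survey data:

```python
# Flag evaluation areas that miss the 80%-positive benchmark.
# The (positive, total) counts below are illustrative only.
BENCHMARK = 0.80
areas = {
    "technology": (41, 50),
    "confidence in the system": (44, 50),
    "faculty delivery of the content": (47, 50),
    "resources at each site": (33, 50),
}
for area, (positive, total) in areas.items():
    share = positive / total
    status = "ok" if share >= BENCHMARK else "needs attention"
    print(f"{area}: {share:.0%} {status}")
```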
Procedia PDF Downloads 126
734 Machine Translation Analysis of Chinese Dish Names
Authors: Xinyu Zhang, Olga Torres-Hostench
Abstract:
This article presents a comparative study evaluating the quality of machine translation (MT) output for Chinese gastronomic nomenclature. Chinese gastronomic culture is experiencing increased international acknowledgment nowadays. The nomenclature of Chinese gastronomy not only reflects a specific aspect of culture but is also related to other areas of society, such as philosophy and traditional medicine. Chinese dish names are composed of several types of cultural references, such as ingredients, colors, flavors, culinary techniques, cooking utensils, toponyms, anthroponyms, metaphors, and historical tales, among others. These cultural references constitute one of the biggest difficulties in translation, usually requiring the use of translation techniques. Given the lack of Chinese food-related translation studies, especially in Chinese-Spanish translation, and the current massive use of MT, the quality of the MT output for Chinese dish names is questioned. Fifty Chinese dish names with different types of cultural components were selected for this study. First, all of these dish names were translated by three different MT tools (Google Translate, Baidu Translate, and Bing Translator). Second, a questionnaire was designed and completed by 12 Chinese online users (Chinese graduates of a Hispanic Philology major) to find out user preferences regarding the collected MT output. Finally, human translation techniques were observed and analyzed to identify which translation techniques appeared more often in the preferred MT proposals. The results reveal that the MT output for Chinese gastronomic nomenclature is not of high quality. It would be advisable not to rely on MT in contexts such as restaurant menus or TV culinary shows. However, the MT output could be used as an aid for tourists to get a general idea of a dish (the main ingredients, for example).
Literal translation turned out to be the most frequently observed technique, followed by borrowing, generalization, and adaptation, while amplification, particularization, and transposition were infrequently observed. This is possibly because current MT engines are limited to relating equivalent terms and offering literal translations without taking into account the whole contextual meaning of the dish name, which is essential to the application of those less observed techniques. This could give insight into the post-editing of Chinese dish name translation. By observing and analyzing the translation techniques in the machine translators' proposals, post-editors can better decide which techniques to apply in each case so as to correct mistakes and improve the quality of the translation.
Keywords: Chinese dish names, cultural references, machine translation, translation techniques
Procedia PDF Downloads 138
733 Variable Mapping: From Bibliometrics to Implications
Authors: Przemysław Tomczyk, Dagmara Plata-Alf, Piotr Kwiatek
Abstract:
Literature review is indispensable in research. One of its key techniques is bibliometric analysis, and one of the methods used there is science mapping. The classic approach that dominates this area today consists of mapping areas, keywords, terms, authors, or citations, and it is also applied to literature reviews in the field of marketing. Technological development means that researchers and practitioners use commercially available software for this purpose. The use of science mapping software tools (e.g., VOSviewer, SciMAT, Pajek) in recent publications supports the implementation of a literature review and is useful in areas with a relatively high number of publications. Although this well-grounded science mapping approach has been applied in literature reviews, performing them is a painstaking task, especially if authors would like to draw precise conclusions about the studied literature and uncover potential research gaps. The aim of this article is to identify to what extent a new approach to science mapping, variable mapping, improves on the classic science mapping approach in terms of research problem formulation and content/thematic analysis for literature reviews. To perform the analysis, a set of 5 articles on customer ideation was chosen. Next, the keyword mapping results produced in the VOSviewer science mapping software were compared with a variable map prepared manually from the same articles. Seven independent expert judges (management scientists at different levels of expertise) assessed the usability of both approaches for formulating the research problem and for content/thematic analysis. The results show the advantage of variable mapping in research problem formulation and thematic/content analysis. 
First, the ability to identify a research gap is clearly visible due to the transparent and comprehensive analysis of the relationships between variables, not only keywords. Second, the analysis of relationships between variables enables the creation of a narrative that indicates the directions of the relationships between variables. Demonstrating the advantage of the new approach over the classic one may be a significant step towards developing a new approach to the synthesis and review of literature. Variable mapping seems to allow scientists to build clear and effective models presenting the scientific achievements of a chosen research area in one simple map. Additionally, the development of software enabling automation of the variable mapping process on large data sets could be a breakthrough in the field of literature research.
Keywords: bibliometrics, literature review, science mapping, variable mapping
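For contrast with the manual variable map, the classic keyword-mapping step that tools such as VOSviewer automate can be sketched as a keyword co-occurrence count over the articles' keyword lists. This is an illustrative sketch only, not the study's actual procedure, and the example keywords below are hypothetical.

```python
from collections import Counter
from itertools import combinations

def keyword_cooccurrence(keyword_lists):
    """Count how often each pair of keywords appears together in one article.
    Science-mapping tools visualise these counts as a weighted network."""
    pairs = Counter()
    for keywords in keyword_lists:
        # sort so (a, b) and (b, a) collapse into one undirected pair
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return pairs
```

The resulting counts are the edge weights of the keyword map; variable mapping replaces the nodes and edges with variables and directed relationships instead.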
Procedia PDF Downloads 122
732 Automated, Objective Assessment of Pilot Performance in Simulated Environment
Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt
Abstract:
Nowadays, flight simulators offer tremendous possibilities for safe and cost-effective pilot training through the use of powerful computational tools. Yet, with technology outpacing methodology, the vast majority of training-related work is still done by human instructors. This makes assessment inefficient and vulnerable to instructors' subjectivity. This research presents an Objective Assessment Tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification, Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission Task Elements (MTEs). The core element of gOAT is an enhanced algorithm that provides the instructor with a new set of information: objective flight parameters fused with a report on the psychophysical state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, data registered by the simulator software (position, orientation, velocity, etc.), as well as measurements of physiological changes in the pilot's psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor's station and shown in a dedicated graphical interface. The tool is based on open-source solutions and is flexible to edit. Additional manoeuvres can easily be added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithm and measurements used allow not only basic stress-level measurement but also a significant reduction of the instructor's workload. The tool can be used for training purposes as well as for periodic checks of aircrew. 
Flexibility and ease of modification allow wide-ranging further development and customization of the tool. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).
Keywords: automated assessment, flight simulator, human factors, pilot training
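The abstract does not disclose gOAT's scoring algorithm. ADS-33-style MTE evaluation, however, is commonly expressed as checking how long recorded flight parameters stay inside "desired" and "adequate" tolerance bands, which the sketch below illustrates. The function name, the bands, and the 90% cut-off are assumptions for demonstration, not values from gOAT.

```python
def mte_rating(samples, target, desired_tol, adequate_tol, cutoff=0.9):
    """Rate one manoeuvre parameter trace against tolerance bands.

    samples: recorded values of a flight parameter (e.g. altitude in a hover).
    The trace is rated 'desired' or 'adequate' if at least `cutoff` of the
    samples stay within the corresponding band around the target value."""
    n = len(samples)
    in_desired = sum(abs(v - target) <= desired_tol for v in samples)
    in_adequate = sum(abs(v - target) <= adequate_tol for v in samples)
    if in_desired / n >= cutoff:
        return "desired"
    if in_adequate / n >= cutoff:
        return "adequate"
    return "inadequate"
```

In a real implementation, one such check would run per parameter of each MTE, with the per-parameter ratings fused with the physiological measurements into the instructor's summary.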
Procedia PDF Downloads 150
731 Medical Ethics in the Hospital: Towards Quality Ethics Consultation
Authors: Dina Siniora, Jasia Baig
Abstract:
During the past few decades, the healthcare system has undergone profound changes in its decision-making competencies and moral aptitudes due to vast advances in technology, clinical skills, and scientific knowledge. Healthcare decision-making deals with morally contentious dilemmas, ranging across illness and life-and-death judgments, that require sensitivity and awareness of the patient's preferences while taking into consideration medicine's abilities and boundaries. As the ever-evolving field of medicine becomes more scientifically and morally multifarious, physicians and hospital administrators increasingly rely on ethics committees to resolve problems that arise in everyday patient care. The role and latitude of responsibilities of ethics committees, which include acting as dispute intermediaries, moral analysts, policy educators, counselors, advocates, and reviewers, suggest the importance and effectiveness of a fully integrated committee. Despite achievements in Integrated Ethics and progress in standards and competencies, there is a pressing need for further improvement in the quality of ethics consultation services in the areas of credentialing, professionalism, and standards of quality, as well as in the quality of healthcare throughout the system. These concerns can be addressed first by collecting data about particular quality gaps and understanding the extent to which ethics committees are consistent with the newly published ASBH quality standards. Policymakers should pursue improvement strategies that target both the academic bioethics community and the major stakeholders at hospitals who directly influence ethics committees. This broader approach, oriented towards education and intervention outcomes in conjunction with preventive ethics, would address disparities in quality at a systemic level. 
Adopting tools for improving competencies and processes within ethics consultation, by implementing a credentialing process, upholding the normative significance of the ASBH core competencies, advocating for a professional Code of Ethics, and further clarifying internal structures, will improve productivity, patient satisfaction, and institutional integrity. This cannot be systemically achieved without a written certification exam for HCEC practitioners, credentialing and privileging of HCEC practitioners at the hospital level, and accreditation of HCEC services at the institutional level.
Keywords: ethics consultation, hospital, medical ethics, quality
Procedia PDF Downloads 190
730 Digital Structural Monitoring Tools @ADaPT for Cracks Initiation and Growth due to Mechanical Damage Mechanism
Authors: Faizul Azly Abd Dzubir, Muhammad F. Othman
Abstract:
The conventional structural health monitoring approach for mechanical equipment uses inspection data from Non-Destructive Testing (NDT) during plant shutdown windows, together with fitness-for-service evaluation, to estimate the integrity of equipment that is prone to crack damage. Yet this forecast is fraught with uncertainty, because it is often based on assumptions about future operational parameters, and the prediction is neither continuous nor online. Advanced Diagnostic and Prognostic Technology (ADaPT) uses Acoustic Emission (AE) technology and a stochastic prognostic model to provide real-time monitoring and prediction of mechanical defects or cracks. The forecast can help the plant authority handle cracked equipment before it ruptures and causes an unscheduled shutdown of the facility. ADaPT employs historical process data trending, finite element analysis, fitness-for-service assessment, and probabilistic statistical analysis to develop a prediction model for crack initiation and growth due to mechanical damage. The prediction model is combined with live equipment operating data for real-time prediction of the remaining lifespan with respect to fracture. ADaPT was first deployed on a hot combined feed exchanger (HCFE) that had suffered creep crack damage. The tool predicted the initiation of a crack in the top weldment area by April 2019; during the shutdown window in April 2019, a crack was indeed discovered and repaired. Furthermore, ADaPT successfully advised the plant owner to run at full capacity, improving output by up to 7%, by April 2019. ADaPT was also used on a coke drum that had extensive fatigue cracking. The initial cracks were declared safe with ADaPT, with the remaining crack lifetimes extended another five (5) months, just in time for another planned facility downtime to execute the repair. 
The prediction model, when combined with plant information data, allows plant operators to continuously monitor crack propagation caused by mechanical damage, improving maintenance planning and avoiding costly immediate shutdowns for repair.
Keywords: mechanical damage, cracks, continuous monitoring tool, remaining life, acoustic emission, prognostic model
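The abstract does not specify the form of ADaPT's stochastic prognostic model. As a purely illustrative stand-in, the remaining fatigue life of a growing crack is often estimated by integrating the Paris crack-growth law; the sketch below does this numerically. The material constants C and m and the geometry factor Y are hypothetical assumptions, not ADaPT parameters.

```python
import math

def remaining_cycles(a0, a_crit, delta_sigma, C, m, Y=1.0, steps=10_000):
    """Integrate the Paris law da/dN = C * (dK)^m from the current crack
    size a0 to the critical size a_crit, with dK = Y * delta_sigma * sqrt(pi*a).
    Returns the estimated number of load cycles until the crack becomes
    critical (a deterministic core that a stochastic model would wrap)."""
    da = (a_crit - a0) / steps
    cycles = 0.0
    a = a0
    for _ in range(steps):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)  # stress-intensity range
        cycles += da / (C * dK ** m)                   # dN = da / (C * dK^m)
        a += da
    return cycles
```

Dividing the predicted cycles by the live loading frequency gives a remaining lifetime, which continuous AE monitoring would update as the measured crack size changes.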
Procedia PDF Downloads 77
729 Detection of Abnormal Process Behavior in Copper Solvent Extraction by Principal Component Analysis
Authors: Kirill Filianin, Satu-Pia Reinikainen, Tuomo Sainio
Abstract:
Frequent measurements of product stream quality create a data overload that becomes more and more difficult to handle. In the current study, plant history data with multiple variables were successfully treated by principal component analysis to detect abnormal process behavior, in particular in copper solvent extraction. The multivariate model is based on the concentration levels of the main process metals recorded by an industrial on-stream x-ray fluorescence analyzer. After mean-centering and normalization of the concentration data set, a two-dimensional multivariate model was constructed using the principal component analysis algorithm. Normal operating conditions were defined through control limits assigned to squared score values on the x-axis and to residual values on the y-axis. Eighty percent of the data set was taken as the training set, and the multivariate model was tested with the remaining 20 percent of the data. Model testing showed successful application of the control limits to detect abnormal behavior of the copper solvent extraction process as an early warning. Compared to the conventional technique of analyzing one variable at a time, the proposed model allows on-line detection of a process failure using information from all process variables simultaneously. Complex industrial equipment combined with advanced mathematical tools may thus be used for on-line monitoring of both process stream composition and final product quality. Defining the normal operating conditions of the process supports reliable decision-making in the process control room. Industrial x-ray fluorescence analyzers equipped with an integrated data processing toolbox therefore allow more flexibility in copper plant operation. It is recommended that the additional multivariate process control and monitoring procedures be applied separately for the major components and for the impurities. 
Principal component analysis may be utilized not only for controlling the content of major elements in process streams but also for continuous monitoring of plant feed. The proposed approach has potential in on-line instrumentation, providing a fast, robust, and cheap application with automation capabilities.
Keywords: abnormal process behavior, failure detection, principal component analysis, solvent extraction
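The monitoring scheme described above can be sketched numerically: mean-center and scale the training set, keep two principal components, and compute for each new observation a squared-score statistic and a residual statistic to compare against control limits. The limit rule used here (training percentiles) and all numbers are illustrative assumptions, not the study's calibration.

```python
import numpy as np

def fit_pca_monitor(X_train, n_components=2):
    """Mean-center/scale the training data and extract PCA loadings."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0, ddof=1)
    Z = (X_train - mu) / sigma
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                            # loadings, shape (p, k)
    lam = S[:n_components] ** 2 / (len(X_train) - 1)   # per-component variances
    return mu, sigma, P, lam

def statistics(x, mu, sigma, P, lam):
    """Squared-score and residual statistics for one new observation."""
    z = (x - mu) / sigma
    t = P.T @ z                      # scores on the retained components
    t2 = float(np.sum(t ** 2 / lam)) # squared-score statistic (x-axis)
    resid = z - P @ t                # variation the 2-PC model cannot explain
    q = float(resid @ resid)         # residual statistic (y-axis)
    return t2, q
```

In use, the model is fitted on the 80% training portion of the plant history, the control limits are set from the training distribution of each statistic, and an early warning is raised whenever a new observation exceeds either limit.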
Procedia PDF Downloads 310
728 A Small-Scale Survey on Risk Factors of Musculoskeletal Disorders in Workers of Logistics Companies in Cyprus and on the Early Adoption of Industrial Exoskeletons as Mitigation Measure
Authors: Kyriacos Clerides, Panagiotis Herodotou, Constantina Polycarpou, Evagoras Xydas
Abstract:
Background: Musculoskeletal disorders (MSDs) in the workplace are a very common problem in Europe, caused by multiple risk factors. In recent years, wearable devices and exoskeletons for the workplace have sought to address the various risk factors associated with strenuous tasks. The logistics sector is a huge sector that includes warehousing, storage, and transportation; however, the tasks associated with logistics are not well studied in terms of MSD risk. This study looked into the MSDs affecting workers of logistics companies. It compares the prevalence of MSDs among workers and evaluates multiple risk factors that contribute to the development of MSDs. Moreover, this study seeks to obtain user feedback on the adoption of exoskeletons in such a work environment. Materials and Methods: The study was conducted among workers in logistics companies in Nicosia, Cyprus, from July to September 2022. A set of standardized questionnaires was used to collect different types of data. Results: A high proportion of logistics professionals reported MSDs in one or more body regions, the lower back being the most commonly affected area. Working in the same position for long periods, working in awkward postures, and handling excessive loads were found to be the most commonly reported job risk factors contributing to the development of MSDs in this study. A significant number of participants considered the back region the most likely to benefit from a wearable exoskeleton device. Half of the participants would like at least a 50% reduction in their daily effort. The most important characteristics for the adoption of exoskeleton devices were found to be how comfortable the device is and its weight. Conclusion: The lower back and posture were the highest risk factors among all logistics professionals assessed in this study. 
A larger-scale study using quantitative analytical tools may give a more accurate estimate of MSDs, which would pave the way for more precise recommendations to eliminate the risk factors and thereby prevent MSDs. A follow-up study using exoskeletons in the workplace should be done to assess whether they assist in MSD prevention.
Keywords: musculoskeletal disorders, occupational health, safety, occupational risk, logistic companies, workers, Cyprus, industrial exoskeletons, wearable devices
Procedia PDF Downloads 108
727 Calcein Release from Liposomes Mediated by Phospholipase A₂ Activity: Effect of Cholesterol and Amphipathic Di and Tri Blocks Copolymers
Authors: Marco Soto-Arriaza, Eduardo Cena-Ahumada, Jaime Melendez-Rojel
Abstract:
Background: Liposomes have been widely used as a lipid bilayer model to study the physicochemical properties of biological membranes and the encapsulation, transport, and release of different molecules. Furthermore, extensive research has focused on improving the efficiency of drug transport, developing tools that improve the release of the encapsulated drug from liposomes. In this context, the enzymatic activity of PLA₂, despite having been shown to be an effective tool to promote the release of drugs from liposomes, is still an open field of research. Aim: The aim of the present study is to explore the effect of cholesterol (Cho) and amphipathic di- and tri-block copolymers on calcein release mediated by the enzymatic activity of PLA₂ in dipalmitoylphosphatidylcholine (DPPC) liposomes under physiological conditions. Methods: Different dispersions of DPPC, cholesterol, di-block POE₄₅-PCL₅₂, or tri-block PCL₁₂-POE₄₅-PCL₁₂ were prepared by the extrusion method after five freeze/thaw cycles in 10 mM phosphate buffer, pH 7.4, in the presence of calcein. DPPC/calcein liposomes were centrifuged at 15,000 rpm for 10 min to separate free calcein. Enzymatic activity assays of PLA₂ were performed at 37°C using TBS buffer, pH 7.4. The size distribution, polydispersity, Z-potential, and calcein encapsulation of the DPPC liposomes were monitored. Results: PLA₂ activity showed slower calcein release kinetics up to 20 mol% cholesterol, evidencing a minimum at 10 mol% and then a maximum at 18 mol%. Regardless of the cholesterol percentage, up to 18 mol%, one hundred percent calcein release was observed. At higher cholesterol concentrations, PLA₂ proved inefficient or was not involved in calcein release. In assays where copolymers were added at a concentration lower than their cmc, behavior similar to that observed in the presence of Cho was seen, that is, slower calcein release kinetics. 
In both experimental approaches, one hundred percent calcein release was observed. PLA₂ was shown to be sensitive to the inhibitor 4-(4-octadecylphenyl)-4-oxobutenoic acid and to calcium, which reduced calcein release to 0%. Cell viability of HeLa cells decreased by 7% in the presence of DPPC liposomes after 3 hours of incubation, and by 17% and 23% at 5 and 15 hours, respectively. Conclusion: Calcein release from DPPC liposomes mediated by PLA₂ activity depends on the percentage of cholesterol and the presence of copolymers. Both cholesterol up to 20 mol% and copolymers below their cmc could be applied to regulate the release kinetics of antitumoral drugs without inducing cell toxicity per se.
Keywords: amphipathic copolymers, calcein release, cholesterol, DPPC liposome, phospholipase A₂
Procedia PDF Downloads 166
726 Reverse Engineering of a Secondary Structure of a Helicopter: A Study Case
Authors: Jose Daniel Giraldo Arias, Camilo Rojas Gomez, David Villegas Delgado, Gullermo Idarraga Alarcon, Juan Meza Meza
Abstract:
Reverse engineering processes are widely used in industry, with the main goal of determining the materials and manufacturing processes used to produce a component. Many characterization techniques and computational tools are used to obtain this information. A case study of reverse engineering applied to a secondary sandwich-hybrid structure used in a helicopter is presented. The methodology consists of five main steps, which can be applied to any other similar component: collecting information about the service conditions of the part; disassembly and dimensional characterization; functional characterization; material properties characterization; and manufacturing process characterization. This yields all the traceability documentation for the materials and processes of aeronautical products that ensures their airworthiness. A detailed explanation of each step is covered. The criticality and functionality of each part, state-of-the-art information, and information obtained from interviews with the technical groups of the helicopter's operators were analyzed. 3D optical scanning, standard and advanced material characterization techniques, and finite element simulation allowed all the characteristics of the materials used in the manufacture of the component to be obtained. It was found that most of the materials are quite common in the aeronautical industry, including Kevlar, carbon, and glass fibers, an aluminum honeycomb core, epoxy resin, and epoxy adhesive. The stacking sequence and volumetric fiber fraction are critical for the mechanical behavior; an acid digestion method was used to determine them. This also helped to determine the manufacturing technique, which in this case was vacuum bagging. Samples of the material were manufactured and submitted to mechanical and environmental tests. 
These results were compared with those obtained during reverse engineering, allowing the conclusion that the materials and manufacturing were correctly determined. Tooling for the manufacture was designed and produced according to the geometry and the manufacturing process requirements. The part was manufactured, and the required mechanical and environmental tests were also performed. Finally, geometric characterization and non-destructive techniques allowed the quality of the part to be verified.
Keywords: reverse engineering, sandwich-structured composite parts, helicopter, mechanical properties, prototype
Procedia PDF Downloads 419
725 Online Delivery Approaches of Post Secondary Virtual Inclusive Media Education
Authors: Margot Whitfield, Andrea Ducent, Marie Catherine Rombaut, Katia Iassinovskaia, Deborah Fels
Abstract:
Learning how to create inclusive media, such as closed captioning (CC) and audio description (AD), is restricted in North America to proprietary, company-based training in the private sector. We are delivering, through synchronous and asynchronous online learning, the first Canadian post-secondary, practice-based continuing education course package in inclusive media for broadcast production and processes. Despite the prevalence of CC and AD teaching within the field of translation studies in Europe, North America has no comparable field of study. This novel approach to audiovisual translation (AVT) education develops evidence-based methodology innovations, stemming from user-study research with blind/low-vision and Deaf/hard-of-hearing audiences for television and theatre undertaken at Ryerson University. Knowledge outcomes from the courses include: a) understanding how CC/AD fit within disability/regulatory frameworks in Canada; b) knowledge of how CC/AD could be employed in the initial stages of production development within broadcasting; c) writing and/or speaking techniques designed for media; d) hands-on practice in captioning re-speaking techniques and open-source technologies, or in AD techniques; and e) understanding of audio production technologies and editing techniques. The case study of curriculum development and deployment, involving first-time online course delivery by academic and practitioner-based instructors in the introductory Captioning and Audio Description courses (CDIM 101 and 102), will compare the two instructors' approaches to learning design, including the ratio of synchronous to asynchronous classroom time and technological engagement tools on meeting software platforms, such as breakout rooms and polling. Student reception of these two approaches will be analysed using qualitative thematic and quantitative survey analysis. 
Thus far, anecdotal conversations with students suggest that they prefer synchronous to asynchronous learning within our hands-on online course delivery method.
Keywords: inclusive media theory, broadcasting practices, AVT post-secondary education, respeaking, audio description, learning design, virtual education
Procedia PDF Downloads 184
724 Reasons for Lack of an Ideal Disinfectant after Dental Treatments
Authors: Ilma Robo, Saimir Heta, Rialda Xhizdari, Kers Kapaj
Abstract:
Background: The ideal disinfectant for surfaces, instruments, air, and skin, both in dentistry and in the fields of medicine, does not exist. This is for the sole reason that all the characteristics of an ideal disinfectant cannot be combined in one agent: if one characteristic is emphasized, it conflicts with another. A disinfectant must be stable and unaffected by changes in the environmental conditions where it is stored, meaning it should not be affected by an increase in temperature or in the humidity of the environment. Both of these elements contradict other requirements of an ideal disinfectant, as they disrupt the solubility ratios of the disinfectant's base substance to the diluent. Material and methods: The study aims to extract the constant of each disinfectant/antiseptic used in dental disinfection protocols, together with the side effects on the skin or mucosal surface where it is applied as an antiseptic. Finally, conclusions were drawn about the best possible combination of disinfectants after a dental procedure, based on data from the basic literature covered during the pharmacology module of a dentist's training, set against data published in the literature. Results: The sensitivity of a disinfectant to changes in the atmospheric conditions of the environment where it is kept is a known fact. Care over this element is always accompanied by advice on the application of the specific disinfectant, in order to achieve the desired clinical result. The constants of the disinfectants, classified on the basis of the data collected and presented, are: alcohols, 70-120; glycols, 0.2; aldehydes, 30-200; phenols, 15-60; acids, 100; halogens, 5-75 for povidone iodine, 150 for hypochlorous acid, and 30-35 for sodium hypochlorite; oxidants, 18-60; metals, 0.2-10. 
The halogens should be singled out, as specific results were obtained for individual representatives of this class, and it is these representatives that find clinical application in dentistry. Conclusions: The search for the "ideal", even where its defining criteria are established, whether for disinfectants or for any medication or pharmaceutical product, is an ongoing search without definitive results. In this mine of data in the published literature, if something fixed and calculable exists, such as the specific constant for disinfectants, the search for the ideal becomes more concrete. During disinfection protocols, different disinfectants are applied since the fields of action differ, including water, air, aspiration devices, and tools, with disinfectants used in full accordance with the manufacturer's indications.
Keywords: disinfectant, constant, ideal, side effects
Procedia PDF Downloads 71
723 GC-MS-Based Untargeted Metabolomics to Study the Metabolism of Pectobacterium Strains
Authors: Magdalena Smoktunowicz, Renata Wawrzyniak, Malgorzata Waleron, Krzysztof Waleron
Abstract:
Pectobacterium spp. were previously classified in the genus Erwinia, founded in 1917 to unite all Gram-negative, fermentative, non-sporulating, peritrichously flagellated plant-pathogenic bacteria known at that time. Following the work of Waldee (1945), on the Approved Lists of Bacterial Names and in bacteriology manuals in 1980, they were described under either the genus Erwinia or Pectobacterium. The genus Pectobacterium was formally described in 1998 on the basis of 265 Pectobacterium strains. Currently, there are 21 species of Pectobacterium, including, since 2003, Pectobacterium betavasculorum, which causes soft rot on sugar beet tubers. Biochemical experiments have shown that these bacteria are Gram-negative, catalase-positive, oxidase-negative, and facultatively anaerobic, utilize gelatin, and cause symptoms of soft rot on potato and sugar beet tubers. The mere fact of growth on sugar beet may indicate a metabolism characteristic of this species alone. Metabolomics, broadly defined as the biology of metabolic systems, allows comprehensive measurements of metabolites. Metabolomics and genomics are complementary tools for the identification of metabolites and their reactions, and thus for the reconstruction of metabolic networks. The aim of this study was to apply GC-MS-based untargeted metabolomics to study the metabolism of P. betavasculorum under different growing conditions. The metabolomic profiles of the biomass and the culture media were determined. The following sample preparation protocol was used: 900 µl of a methanol:chloroform:water mixture (10:3:1, v:v:v) was added to 900 µl of biomass from the bottom of the tube and to 900 µl of nutrient medium taken from above the bacterial biomass. After centrifugation (13,000 x g, 15 min, 4°C), 300 µl of the obtained supernatants were concentrated in a rotary vacuum concentrator and evaporated to dryness. 
Afterwards, a two-step derivatization procedure was performed before the GC-MS analyses. The obtained results were subjected to statistical calculations using both uni- and multivariate tests and were evaluated against the KEGG database to assess which metabolic pathways are activated, and which genes are responsible for them, during the metabolism of the substrates present in the growth environment. The observed metabolic changes, combined with biochemical and physiological tests, may enable pathway discovery, regulatory inference, and an understanding of the homeostatic abilities of P. betavasculorum.
Keywords: GC-MS chromatography, metabolomics, metabolism, Pectobacterium strains, Pectobacterium betavasculorum
Procedia PDF Downloads 81
722 Strategic Innovation of Nanotechnology: Novel Applications of Biomimetics and Microfluidics in Food Safety
Authors: Boce Zhang
Abstract:
Strategic innovation of nanotechnology to promote food safety has drawn tremendous attention among research groups, including the need for research support during the implementation of the Food Safety Modernization Act (FSMA) in the United States. There are urgent demands and knowledge gaps in the understanding of a) the food-water-bacteria interface and how pathogens persist and transmit during food processing and storage; and b) the minimum processing requirements needed to prevent pathogen cross-contamination in the food system. These knowledge gaps are of critical importance to the food industry, but closing them is largely hindered by the limitations of research tools. Our groups recently developed two novel engineering systems, using biomimetics and microfluidics as a holistic approach to hazard analysis and risk mitigation, which provide unprecedented research opportunities to study pathogen behavior, in particular contamination and cross-contamination, at the critical food-water-pathogen interface. First, biomimetically patterned surfaces (BPS) were developed to replicate the identical surface topography and chemistry of a natural food surface. We demonstrated that BPS is a superior research tool that empowers the study of a) how pathogens persist through sanitizer treatment, and b) how to apply fluidic shear force and surface tension to increase the vulnerability of bacterial cells by detaching them from a protected area, among other questions. Secondly, microfluidic devices were designed and fabricated to study bactericidal kinetics in the sub-second time frame (0.1~1 second). The sub-second kinetics are critical because the cross-contamination process, which includes detachment, migration, and reattachment, can occur in a very short time frame. 
With this microfluidic device, we were able to simulate and study these sub-second cross-contamination scenarios and to further investigate the minimum sanitizer concentration needed to prevent pathogen cross-contamination during food processing. We anticipate that the findings from these studies will provide critical insight into bacterial behavior at the food-water-cell interface and into the kinetics of bacterial inactivation under a broad range of sanitizers and processing conditions, thus facilitating the development and implementation of science-based food safety regulations and practices to mitigate food safety risks.Keywords: biomimetic materials, microbial food safety, microfluidic device, nanotechnology
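The abstract does not specify the inactivation model fitted to the sub-second data; a minimal sketch, assuming simple first-order (Chick-Watson-style) log-linear kinetics and hypothetical parameter values, illustrates how survivor counts over a 0.1-1 second exposure window would be computed:

```python
import math

def log_linear_survivors(n0, k, t):
    """First-order inactivation: N(t) = N0 * exp(-k * t).

    n0: initial load (CFU/mL), k: rate constant (1/s), t: exposure time (s).
    """
    return n0 * math.exp(-k * t)

# Hypothetical values: 1e6 CFU/mL initial load, rate constant k = 5 per second
for t in (0.1, 0.5, 1.0):
    n = log_linear_survivors(1e6, 5.0, t)
    print(f"t = {t:.1f} s -> {n:.3e} CFU/mL survivors")
```

Fitting k across sanitizer concentrations would then let one solve for the minimum concentration achieving a target log reduction within the sub-second contact time.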
Procedia PDF Downloads 359
721 Evaluating the Ability to Cycle in Cities Using Geographic Information Systems Tools: The Case Study of Greek Modern Cities
Authors: Christos Karolemeas, Avgi Vassi, Georgia Christodoulopoulou
Abstract:
Although over the past decades planning a cycle network has become an inseparable part of transportation plans, there is still a lot of room for improvement in the way planning is done, in order to create safe and direct cycling networks that incorporate the parameters that positively influence one's decision to cycle. The aim of this article is to study, evaluate and visualize the bikeability of cities. The term is often used to mean 'the ability of a person to bike'; this study, however, adopts it in the sense of 'the ability of the urban landscape to be biked'. The methodology included assessing cities' accessibility by cycling, based on international literature and corresponding walkability methods, and the creation of a 'bikeability index'. Initially, a literature review was made to identify the factors that positively affect the use of bicycle infrastructure. Those factors were used to create the spatial index and quantitatively compare the city networks. Finally, the bikeability index was applied in two case studies: two Greek municipalities that, although similar in terms of land uses, population density and traffic congestion, are totally different in terms of geomorphology. The factors suggested by the international literature were (a) safety, (b) directness, (c) comfort and (d) the quality of the urban environment. Those factors were quantified through the following parameters: slope, junction density, traffic density, traffic speed, natural environment, built environment, activities coverage, centrality and accessibility to public transport stations. Each road section was graded for the above-mentioned parameters, and the overall grade shows the level of bicycle accessibility (low, medium, high). Each parameter, as well as the overall accessibility levels, was analyzed and visualized through Geographic Information Systems.
This paper presents the bikeability index, its results, the problems that arose, and the conclusions from its implementation through a Strengths-Weaknesses-Opportunities-Threats (SWOT) analysis. The purpose of this index is to make it easy for researchers, practitioners, politicians, and stakeholders to quantify, visualize and understand which parts of the urban fabric are suitable for cycling.Keywords: accessibility, cycling, green spaces, spatial data, urban environment
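The grading scheme above (per-parameter grades combined into a three-level accessibility score per road section) can be sketched as a weighted index. The weights and thresholds below are hypothetical, since the abstract does not publish its weighting scheme; the structure, nine graded parameters mapped to low/medium/high, follows the study's description:

```python
# Hypothetical weights over the nine parameters named in the abstract
WEIGHTS = {
    "slope": 0.15, "junction_density": 0.10, "traffic_density": 0.15,
    "traffic_speed": 0.15, "natural_environment": 0.10,
    "built_environment": 0.10, "activities_coverage": 0.10,
    "centrality": 0.05, "pt_accessibility": 0.10,
}

def bikeability_score(section):
    """Weighted sum of per-parameter grades, each normalized to 0..1."""
    return sum(WEIGHTS[p] * section[p] for p in WEIGHTS)

def accessibility_level(score):
    """Map an overall score to the study's three-level scale (assumed equal bins)."""
    if score < 1 / 3:
        return "low"
    if score < 2 / 3:
        return "medium"
    return "high"

section = {p: 0.8 for p in WEIGHTS}  # a well-performing road section
print(accessibility_level(bikeability_score(section)))  # weights sum to 1 -> score 0.8 -> "high"
```

In a GIS workflow, each road section's attribute table would carry the nine grades, and the score and level fields would drive the map symbology.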
Procedia PDF Downloads 112
720 A Review of Digital Twins to Reduce Emission in the Construction Industry
Authors: Zichao Zhang, Yifan Zhao, Samuel Court
Abstract:
The carbon emission problem of the traditional construction industry has long been a pressing issue. With the growing emphasis on environmental protection and the advancement of science and technology, the organic integration of digital technology and emission reduction has gradually become a mainstream solution. Among various sophisticated digital technologies, digital twins, which involve creating virtual replicas of physical systems or objects, have gained enormous attention in recent years as tools to improve productivity, optimize management and reduce carbon emissions. However, the relatively high implementation costs of digital twins, in finances, time, and manpower, have limited their widespread adoption, and most current applications are concentrated within a few industries. In addition, the creation of digital twins relies on a large amount of data and requires designers to possess exceptional skills in information collection, organization, and analysis; these capabilities are often lacking in the traditional construction industry. Furthermore, as a relatively new concept, digital twins have different expressions and usage methods across different industries, and this lack of standardized practice poses a challenge to creating a high-quality digital twin framework for construction. This paper first reviews the current academic studies and industrial practices focused on reducing greenhouse gas emissions in the construction industry using digital twins. It then identifies the challenges that may be encountered during the design and implementation of a digital twin framework specific to this industry and proposes potential directions for future research. This study shows that digital twins possess substantial potential and significance in enhancing the working environment within the traditional construction industry, particularly in their ability to support decision-making processes.
It shows that digital twins can improve the work efficiency and energy utilization of related machinery while helping the industry save energy and reduce emissions. This work will help scholars in this field better understand the relationship between digital twins and energy conservation and emission reduction, and it also serves as a conceptual reference for practitioners implementing related technologies.Keywords: digital twins, emission reduction, construction industry, energy saving, life cycle, sustainability
Procedia PDF Downloads 105
719 Characterization of the Blood Microbiome in Rheumatoid Arthritis Patients Compared to Healthy Control Subjects Using V4 Region 16S rRNA Sequencing
Authors: D. Hammad, D. P. Tonge
Abstract:
Rheumatoid arthritis (RA) is a disabling and common autoimmune disease in which the body's immune system attacks healthy tissues, mounting the kind of complicated and long-lasting response that normally occurs only when the immune system encounters a foreign object. RA affects millions of people and causes joint inflammation, ultimately leading to the destruction of cartilage and bone. Interestingly, the disease mechanism remains unclear. It is likely that RA occurs as a result of a complex interplay of genetic and environmental factors, including an imbalance in the microorganism population inside our bodies. The human microbiome, or microbiota, is an extensive community of microorganisms in and on the bodies of animals, comprising bacteria, fungi, viruses, and protozoa. Recently, the development of molecular techniques to characterize entire bacterial communities has renewed interest in the involvement of the microbiome in the development and progression of RA. We believe that an imbalance in specific bacterial species in the gut, mouth and other sites may lead to atopobiosis, the translocation of these organisms into the blood, and that this may lead to changes in immune system status. The aim of this study was, therefore, to characterize the microbiome of RA serum samples in comparison to healthy control subjects using 16S rRNA gene amplification and sequencing. Serum samples were obtained from healthy control volunteers and from patients with RA, both prior to and following treatment. The bacterial community present in each sample was identified using V4 region 16S rRNA amplification and sequencing. Bacterial identification, to the lowest taxonomic rank, was performed using a range of bioinformatics tools.
The proportions of the Lachnospiraceae, Ruminococcaceae, and Halomonadaceae families were significantly increased in the serum of RA patients compared with healthy control serum. Furthermore, the abundance of Bacteroides, Lachnospiraceae NK4A136 group, Lachnospiraceae UCG-001, Ruminococcaceae UCG-014, Ruminococcus 1, and Shewanella was also raised in the serum of RA patients relative to healthy control serum. These data support the notion of a blood microbiome and reveal RA-associated changes that may have significant implications for biomarker development and may present much-needed opportunities for novel therapeutic development.Keywords: blood microbiome, gut and oral bacteria, rheumatoid arthritis, 16S rRNA gene sequencing
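The comparison above rests on per-taxon proportions within each sample. A minimal sketch of that step, with hypothetical read counts (the study's actual counts and statistical tests are not given in the abstract):

```python
import math

def relative_abundance(counts):
    """Convert per-taxon 16S read counts to proportions within one sample."""
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}

# Hypothetical V4 16S read counts for one RA and one control serum sample
ra_case = relative_abundance({"Lachnospiraceae": 320, "Ruminococcaceae": 180, "Other": 500})
ra_ctrl = relative_abundance({"Lachnospiraceae": 120, "Ruminococcaceae": 80, "Other": 800})

# Log2 fold change of proportions between the RA and control samples
lfc = math.log2(ra_case["Lachnospiraceae"] / ra_ctrl["Lachnospiraceae"])
print(f"Lachnospiraceae log2 fold change: {lfc:.2f}")
```

In practice such per-sample proportions would be compared across the two groups with an appropriate differential-abundance test rather than a single fold-change value.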
Procedia PDF Downloads 132
718 On Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Primary Distant Metastases Growth
Authors: Ella Tyuryumina, Alexey Neznanov
Abstract:
Finding algorithms to predict the growth of tumors has piqued the interest of researchers ever since the early days of cancer research, and a number of studies have attempted to obtain reliable data on the natural history of breast cancer growth. Mathematical modeling can play a very important role in the prognosis of the tumor process in breast cancer. However, existing mathematical models describe primary tumor growth and metastases growth separately. Consequently, we propose a growth model for the primary tumor (PT) and primary metastases (MTS), which may help to improve the predictive accuracy of breast cancer progression, using an original mathematical model referred to as CoM-IV and corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and primary metastases; 2) developing an adequate and precise CoM-IV which reflects the relations between PT and MTS; 3) analyzing the CoM-IV scope of application; 4) implementing the model as a software tool. The CoM-IV is based on an exponential tumor growth model, consists of a system of determinate nonlinear and linear equations, and corresponds to the TNM classification. It allows calculation of different growth periods of the primary tumor and primary metastases: 1) the 'non-visible period' for the primary tumor; 2) the 'non-visible period' for primary metastases; 3) the 'visible period' for primary metastases. The new predictive tool: 1) is a solid foundation for future studies of breast cancer models; 2) does not require any expensive diagnostic tests; 3) is the first predictor to make a forecast using only current patient data, whereas the others rely on additional statistical data.
Thus, the CoM-IV model and predictive software: a) detect different growth periods of the primary tumor and primary metastases; b) forecast the period of primary metastases appearance; c) have higher average prediction accuracy than the other tools; d) can improve forecasts of breast cancer (BC) survival and facilitate optimization of diagnostic tests. The following are calculated by CoM-IV: the number of doublings for the 'non-visible' and 'visible' growth periods of primary metastases, and the tumor volume doubling time (days) for the 'non-visible' and 'visible' growth periods of primary metastases. The CoM-IV enables, for the first time, prediction of the whole natural history of primary tumor and primary metastases growth at each stage (pT1, pT2, pT3, pT4) relying only on primary tumor sizes. Summarizing: a) CoM-IV correctly describes primary tumor and primary distant metastases growth of stage IV (T1-4N0-3M1) disease with (N1-3) or without (N0) regional metastases in lymph nodes; b) it facilitates the understanding of the period of appearance and manifestation of primary metastases.Keywords: breast cancer, exponential growth model, mathematical modelling, primary metastases, primary tumor, survival
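The quantities CoM-IV reports, numbers of doublings and growth periods under exponential growth, follow directly from the doubling-time relation. A minimal sketch with hypothetical values (CoM-IV's actual equations and parameters are not given in the abstract):

```python
import math

def doublings(v_start, v_end):
    """Number of volume doublings to grow from v_start to v_end
    under exponential growth: n = log2(v_end / v_start)."""
    return math.log2(v_end / v_start)

def growth_period_days(v_start, v_end, dt_days):
    """Duration of that growth given a volume doubling time dt_days."""
    return doublings(v_start, v_end) * dt_days

# Hypothetical: one malignant cell (~1e-9 cm^3) to a 1 cm^3 detectable tumor,
# with an assumed 60-day volume doubling time.
n = doublings(1e-9, 1.0)
days = growth_period_days(1e-9, 1.0, 60)
print(f"{n:.1f} doublings, ~{days / 365.25:.1f} years of 'non-visible' growth")
```

The same relation, run in reverse from observed primary tumor sizes, is what allows a model of this family to estimate when the 'non-visible' metastatic growth began.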
Procedia PDF Downloads 335
717 Assessing Gender Mainstreaming Practices in the Philippine Basic Education System
Authors: Michelle Ablian Mejica
Abstract:
Female drop-outs due to teenage pregnancy and gender-based violence in schools are two of the most contentious current gender-related issues faced by the Department of Education (DepEd) in the Philippines. The country has adopted gender mainstreaming as the main strategy to eliminate gender inequalities in all aspects of society, including education, since 1990. This research examines the extent and magnitude to which gender mainstreaming is implemented in basic education, from the national to the school level. It seeks to discover the challenges faced by the central and field offices, particularly by the principals who serve as decision-makers in the schools, where teaching and learning take place and where opportunities exist that may aggravate, confirm or transform gender inequalities and hierarchies. The author conducted surveys and interviews among 120 elementary and secondary principals in the Division of Zambales, as well as selected gender focal persons at the division and regional levels within Region III (Central Luzon). The study argues that DepEd needs to review, strengthen and revitalize its gender mainstreaming, because current efforts do not penetrate the schools and are not enough to lessen or eliminate gender inequalities within them. The study identified some of the major challenges in the implementation of gender mainstreaming: the absence of a national gender-responsive education policy framework, the lack of gender-responsive assessment and monitoring tools, the poor quality of gender and development (GAD) related training programs, poor data collection and analysis mechanisms, poor coordination among implementing agencies, the lack of a clear implementation strategy, ineffective or poor utilization of the GAD budget, and the lack of teacher- and learner-centered GAD activities.
The paper recommends a review of the department's gender mainstreaming efforts to align them with the mandate of the agency and provide a gender-responsive teaching and learning environment. It suggests that the focus must be on the formulation of gender-responsive policies and programs, improvement of the existing mechanisms, and the conduct of trainings focused on gender analysis, budgeting and impact assessment, not only for principals and the GAD focal point system but also for parents and other school stakeholders.Keywords: curriculum and instruction, gender analysis, gender budgeting, gender impact assessment
Procedia PDF Downloads 348
716 Advanced Magnetic Field Mapping Utilizing Vertically Integrated Deployment Platforms
Authors: John E. Foley, Martin Miele, Raul Fonda, Jon Jacobson
Abstract:
This paper presents the development and implementation of new and innovative data collection and analysis methodologies based on the deployment of total-field magnetometer arrays. Our research has focused on the development of a vertically-integrated suite of platforms, all utilizing common data acquisition, data processing and analysis tools. These survey platforms include low-altitude helicopters and ground-based vehicles, including robots, for terrestrial mapping applications. For marine settings, the sensor arrays are deployed from either a hydrodynamic bottom-following wing towed from a surface vessel or a towed floating platform for shallow-water settings. Additionally, sensor arrays are deployed from tethered remotely operated vehicles (ROVs) for underwater settings where high maneuverability is required. While the primary application of these systems is the detection and mapping of unexploded ordnance (UXO), these systems are also used for various infrastructure mapping and geologic investigations. For each application, success is driven by the integration of magnetometer arrays, accurate geo-positioning, system noise mitigation, and stable deployment of the system in appropriate proximity to expected targets or features. Each of the systems collects geo-registered data compatible with a web-enabled data management system, providing immediate access to data and metadata for remote processing, analysis and delivery of results. This approach allows highly sophisticated magnetic processing methods, including classification based on dipole modeling and remanent magnetization, to be efficiently applied to many projects. This paper also briefly describes the initial development of magnetometer-based detection systems deployed from low-altitude helicopter platforms and the subsequent successful transition of this technology to the marine environment.
Additionally, we present examples from a range of terrestrial and marine settings as well as ongoing research efforts related to sensor miniaturization for unmanned aerial vehicle (UAV) magnetic field mapping applications.Keywords: dipole modeling, magnetometer mapping systems, sub-surface infrastructure mapping, unexploded ordnance detection
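The dipole-modeling classification mentioned above fits the standard point-dipole field to the measured anomaly. A minimal sketch of the forward model, the textbook dipole field magnitude with its 1/r³ falloff, using hypothetical target parameters (the paper's actual inversion and classifier are not described in the abstract):

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (T*m/A)

def dipole_field_magnitude(m, r, theta):
    """|B| of a point magnetic dipole with moment m (A*m^2) at distance r (m),
    at angle theta (rad) from the dipole axis:
    |B| = (mu0 * m) / (4*pi*r^3) * sqrt(1 + 3*cos(theta)^2)."""
    return (MU0 * m / (4 * math.pi * r**3)) * math.sqrt(1 + 3 * math.cos(theta) ** 2)

# Hypothetical UXO-like target: moment 0.5 A*m^2, sensor 2 m away, on-axis
b = dipole_field_magnitude(0.5, 2.0, 0.0)
print(f"{b * 1e9:.1f} nT anomaly")  # 12.5 nT
```

An inversion would fit m (and, for remanent magnetization, its orientation) over the array's geo-registered samples and use the recovered moment to separate ordnance-like targets from clutter.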
Procedia PDF Downloads 465