Search results for: virtual environments computer auditing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5373


2103 Development of Boro-Tellurite Glasses Enhanced with HfO2 for Radiation Shielding: Examination of Optical and Physical Characteristics

Authors: Sleman Yahya Rasul

Abstract:

Due to their transparency, various types of glass are utilized in numerous applications where clear visibility is essential. One such application involves environments where radiography, radiotherapy, and X-ray devices are used, all of which involve exposure to radiation. As is well-known, radiation can be lethal to humans. Consequently, there is a need for glass that can absorb and block these harmful rays in such settings. Effective protection from radiation typically requires materials with high atomic numbers and densities. Currently, lead oxide-infused glasses are commonly used for this purpose, but due to the toxicity of lead oxide, there is a demand for safer alternatives. HfO2 has been selected as an additive for boro-tellurite (M1-M2-M3) glasses intended for radiation shielding because it has a high atomic number, high density, and is non-toxic. In this study, new glasses will be developed as alternatives to leaded glasses by incorporating x mol% HfO2 into the boro-tellurite glass structure. The glass compositions will be melted and quenched using the traditional method in an alumina crucible at temperatures between 900–1100°C. The resulting glasses will be evaluated for their elastic properties (including elastic modulus, shear modulus, bulk modulus, and Poisson ratio), density, hardness, and fracture toughness. X-ray diffraction (XRD) will be used to examine the amorphous nature of the glasses, while Differential Thermal Analysis (DTA) will provide thermal analysis. Optical properties will be assessed through UV-Vis and Photoluminescence Spectroscopy, and structural properties will be studied using Raman spectroscopy and FTIR spectroscopy. Additionally, the radiation shielding capabilities will be investigated by measuring parameters such as mass attenuation coefficient, half-value thickness, mean free path, effective atomic number (Z_eff), and effective electron density (N_e). The aim of this study is to develop new, lead-free glasses with excellent optical properties and high mechanical strength to replace the leaded glasses currently used for radiation shielding.
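For orientation, the shielding parameters listed above follow standard narrow-beam photon-attenuation relations; a minimal sketch of these relations, with μ the measured linear attenuation coefficient and ρ the glass density, is given below (the effective atomic number and electron density require the full elemental composition and are not shown).

```latex
% Standard narrow-beam photon-attenuation relations assumed for the
% shielding parameters named in the abstract.
\begin{align}
  I &= I_0 \, e^{-\mu x}            && \text{Beer--Lambert attenuation through thickness } x \\
  \mu_m &= \frac{\mu}{\rho}         && \text{mass attenuation coefficient} \\
  \mathrm{HVL} &= \frac{\ln 2}{\mu} && \text{half-value thickness} \\
  \mathrm{MFP} &= \frac{1}{\mu}     && \text{mean free path}
\end{align}
```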

Keywords: boro-tellurite glasses, hfo2, radiation shielding, mechanical properties, elastic properties, optical properties

Procedia PDF Downloads 43
2102 A Shared Space: A Pioneering Approach to Interprofessional Education in New Zealand

Authors: Maria L. Ulloa, Ruth M. Crawford, Stephanie Kelly, Joey Domdom

Abstract:

In recent decades health and social service delivery have become more collaborative and interdisciplinary. Emerging trends suggest the need for an integrative and interprofessional approach to meet the challenges faced by professionals navigating the complexities of health and social service practice environments. Terms such as multidisciplinary practice, interprofessional collaboration, interprofessional education and transprofessional practice have become the common language used across a range of social services and health providers in western democratic systems. In Aotearoa New Zealand, one example of an interprofessional collaborative approach to curriculum design and delivery in health and social service is the development of an innovative Masters of Professional Practice programme. This qualification is the result of a strategic partnership between two tertiary institutions – Whitireia New Zealand (NZ) and the Wellington Institute of Technology (Weltec) in Wellington. The Master of Professional Practice programme was designed and delivered from the perspective of a collaborative, interprofessional and relational approach. Teachers and students in the programme come from a diverse range of cultural, professional and personal backgrounds and are engaged in courses using a blended learning approach that incorporates the values and pedagogies of interprofessional education. Students are actively engaged in professional practice while undertaking the programme. This presentation describes the themes of exploratory qualitative formative observations of engagement in class and online, student assessments, student research projects, as well as qualitative interviews with the programme teaching staff. These formative findings reveal the development of critical practice skills around the common themes of the programme: research and evidence based practice, education, leadership, working with diversity and advancing critical reflection of professional identities and interprofessional practice. This presentation will provide evidence of enhanced learning experiences in higher education and learning in multi-disciplinary contexts.

Keywords: diversity, exploratory research, interprofessional education, professional identity

Procedia PDF Downloads 302
2101 Durability Performances of Epoxy Resin/TiO₂ Composited Alkali-Activated Slag/Fly Ash Pastes in Phosphoric Acid Solution

Authors: Jie Ren, Siyao Guo

Abstract:

Sewage wastewater environments, laden with phosphates at low pH values, pose a great threat to concrete-based pipes, which are made of alkaline cementitious materials such as ordinary Portland cement (OPC). As a promising alternative to OPC-based binders, alkali-activated slag/fly ash (AASF) cementitious binders are generally believed to attain similar or better properties than their OPC-based counterparts, especially durability. However, there is limited research on the performance of AASF binders in phosphoric acid solution. Moreover, the behaviour of AASF binders composited with epoxy resin/TiO₂ when exposed to acidic media has rarely been explored. In this study, the performance of an AASF paste with a slag:fly ash precursor ratio of 50:50 (by mass), enhanced with an epoxy resin/TiO₂ composite, was investigated in phosphoric acid solution (pH = 3.0-4.0). The exposure to acid attack lasted 90 days. The same AASF mixture without the resin/TiO₂ composite was used as a reference. The compressive strength and pore-related properties prior to acid immersion were tested. The mass variations and degradation depths of the two binders were also monitored, based on a phenolphthalein-videomicroscope method. The results show that the binder with the epoxy resin/TiO₂ addition gained a higher compressive strength and lower water absorption than the reference. In addition, it displayed a higher resistance to acid attack, indicated by lower mass loss and a smaller degradation depth compared to the control sample. This improvement can be attributed to a denser microstructure, evidenced by the higher compressive strength and the related pore structures. It can be concluded that the microstructure can be improved by adding the epoxy resin/TiO₂ composite in order to enhance the resistance of the AASF binder to acid attacks.

Keywords: alkali-activated paste, epoxy resin/TiO₂, composites, mechanical properties, phosphoric acid

Procedia PDF Downloads 121
2100 DoS and DDoS Attacks

Authors: Amin Hamrahi, Niloofar Moghaddam

Abstract:

DoS is short for denial-of-service attack, a type of attack on a network that is designed to bring the network to its knees by flooding it with useless traffic. Denial of Service (DoS) attacks have become a major threat to current computer networks. Many recent DoS attacks were launched via a large number of distributed attacking hosts on the Internet; these attacks are called distributed denial-of-service (DDoS) attacks. To provide a better understanding of DoS attacks, this article gives an overview of existing DoS and DDoS attacks and the major defense technologies on the Internet.

Keywords: denial of service, distributed denial of service, traffic, flooding

Procedia PDF Downloads 392
2099 Community Structure Detection in Networks Based on Bee Colony

Authors: Bilal Saoud

Abstract:

In this paper, we propose a new method to find the community structure in networks. Our method is based on a bee colony algorithm and the maximization of modularity. We use the bee colony algorithm to find a first community structure that has a good modularity value. To improve the community structure that was found, we then merge communities until we obtain a community structure with a high modularity value. We provide a general framework for implementing our approach. We tested our method on computer-generated and real-world networks, with a comparison to well-known community detection methods. The obtained results show the effectiveness of our proposal.
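The bee colony search itself is not detailed in the abstract, but the modularity-guided merging step can be sketched with networkx; the graph, the initial partition and the helper names below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the refinement step described above: starting from an
# initial partition (in the paper it comes from a bee colony search), greedily
# merge pairs of communities while the merge increases modularity.
import networkx as nx
from networkx.algorithms.community import modularity
from itertools import combinations

def merge_communities(G, partition):
    """partition: list of sets of nodes covering all of G. Returns a refined partition."""
    improved = True
    while improved and len(partition) > 1:
        improved = False
        best_gain, best_pair = 0.0, None
        base = modularity(G, partition)
        for i, j in combinations(range(len(partition)), 2):
            trial = [c for k, c in enumerate(partition) if k not in (i, j)]
            trial.append(partition[i] | partition[j])
            gain = modularity(G, trial) - base
            if gain > best_gain:
                best_gain, best_pair = gain, (i, j)
        if best_pair is not None:
            i, j = best_pair
            merged = partition[i] | partition[j]
            partition = [c for k, c in enumerate(partition) if k not in (i, j)]
            partition.append(merged)
            improved = True
    return partition

# Example on a small real-world network; singleton communities stand in for
# the bee colony's initial output.
G = nx.karate_club_graph()
initial = [{n} for n in G.nodes]
print(len(merge_communities(G, initial)), "communities found")
```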

Keywords: bee colony, networks, modularity, normalized mutual information

Procedia PDF Downloads 406
2098 Organisational Change: The Impact on Employees and Organisational Development

Authors: Maureen Royce, Joshi Jariwala, Sally Kah

Abstract:

Change is inevitable, but the change process is progressive. Organisational change is the process by which an organisation changes its strategies, operational methods, systems, culture, and structure to effect something different in the organisation. This process can be continuous or developed over a period and driven by internal and external factors. Organisational change is essential if organisations are to survive in dynamic and uncertain environments. However, evidence from research shows that many change initiatives fail, leading to severe consequences for organisations and their resources. The complex models of third-sector organisations, i.e., social enterprises, compound the levels of change in these organisations. Interestingly, innovation is associated with change in social enterprises due to the hybridity of product and service development. Furthermore, the creation of social interventions has offered a new process and outcomes to the lifecycle of change. Therefore, different forms of organisational innovation are developed, i.e., total, evolutionary, expansionary, and developmental, which affect the interventions of social enterprises. This raises both theoretical and business concerns about how the competing hybrid nature of social enterprises changes, how change is managed, and the impact on these organisations. These perspectives present critical questions for further investigation. In this study, we investigate the impact of organisational change on employees and organisational development at DaDaFest, a disability arts organisation with a social focus based in Liverpool. The three main objectives are to explore the drivers of change and the implementation process; to examine the impact of organisational change on employees; and to identify barriers to organisational change and development. To address the preceding research objectives, a qualitative research design is adopted using semi-structured interviews. Data is analysed using a six-step thematic analysis framework, which enables the study to develop themes depicting the impact of change on employees and organisational development. This study presents theoretical and practical contributions for academics and practitioners. The knowledge contributions encapsulate the evolution of change and the change cycle in a social enterprise, while the practical implications provide critical insights into the change management process and the impact of change on employees and organisational development.

Keywords: organisational change, change management, organisational change system, social enterprise

Procedia PDF Downloads 126
2097 Developing a Maturity Model of Digital Twin Application for Infrastructure Asset Management

Authors: Qingqing Feng, S. Thomas Ng, Frank J. Xu, Jiduo Xing

Abstract:

Faced with unprecedented challenges, including aging assets, lack of maintenance budget, overtaxed and inefficient usage, and an outcry for better service quality from society, today's infrastructure systems have become the main focus of many metropolises pursuing sustainable urban development and improved resilience. Digital twin, being one of the most innovative enabling technologies nowadays, may open up new ways of tackling various infrastructure asset management (IAM) problems. A digital twin application for IAM, as its name indicates, represents an evolving digital model of the intended infrastructure that possesses functions including real-time monitoring; what-if event simulation; and scheduling, maintenance, and management optimization based on technologies like IoT, big data and AI. Up to now, there are already numerous global initiatives of digital twin applications, like 'Virtual Singapore' and 'Digital Built Britain'. With digital twin technology permeating the IAM field progressively, it is necessary to consider the maturity of the application and how those institutional or industrial digital twin application processes will evolve in the future. In order to address the lack of such a benchmark, a draft maturity model is developed for digital twin application in the IAM field. Firstly, an overview of current smart city maturity models is given, based on which the draft Maturity Model of Digital Twin Application for Infrastructure Asset Management (MM-DTIAM) is developed for multiple stakeholders to evaluate and derive informed decisions. The development process follows a systematic approach with four major procedures, namely scoping, designing, populating and testing. Through in-depth literature review, interviews and focus group meetings, the key domain areas are populated, defined and iteratively tuned. Finally, a case study of several digital twin projects is conducted for self-verification. The findings of the research reveal that: (i) the developed maturity model outlines five maturing levels leading to an optimised digital twin application from the aspects of strategic intent, data, technology, governance, and stakeholders' engagement; (ii) based on the case study, levels 1 to 3 are already partially implemented in some initiatives, while level 4 is on the way; and (iii) more practice is still needed to refine the draft to be mutually exclusive and collectively exhaustive in the key domain areas.

Keywords: digital twin, infrastructure asset management, maturity model, smart city

Procedia PDF Downloads 157
2096 A Longitudinal Exploration into Computer-Mediated Communication Use (CMC) and Relationship Change between 2005-2018

Authors: Laurie Dempsey

Abstract:

Relationships are considered to be beneficial for emotional wellbeing, happiness and physical health. However, they are also complicated: individuals engage in a multitude of complex and volatile relationships during their lifetime, where the change to or ending of these dynamics can be deeply disruptive. As the internet is further integrated into everyday life and relationships are increasingly mediated, Media Studies’ and Sociology’s research interests intersect and converge. This study longitudinally explores how relationship change over time corresponds with the developing UK technological landscape between 2005-2018. Since the early 2000s, the use of computer-mediated communication (CMC) in the UK has dramatically reshaped interaction. Its use has compelled individuals to renegotiate how they consider their relationships: some argue it has allowed for vast networks to be accumulated and strengthened; others contend that it has eradicated the core values and norms associated with communication, damaging relationships. This research collaborated with UK media regulator Ofcom, utilising the longitudinal dataset from their Adult Media Lives study to explore how relationships and CMC use developed over time. This is a unique qualitative dataset covering 2005-2018, where the same 18 participants partook in annual in-home filmed depth interviews. The interviews’ raw video footage was examined year-on-year to consider how the same people changed their reported behaviour and outlooks towards their relationships, and how this coincided with CMC featuring more prominently in their everyday lives. Each interview was transcribed, thematically analysed and coded using NVivo 11 software. This study allowed for a comprehensive exploration into these individuals’ changing relationships over time, as participants grew older, experienced marriages or divorces, conceived and raised children, or lost loved ones. It found that as technology developed between 2005-2018, everyday CMC use was increasingly normalised and incorporated into relationship maintenance. It played a crucial role in altering relationship dynamics, even factoring in the breakdown of several ties. Three key relationships were identified as being shaped by CMC use: parent-child; extended family; and friendships. Over the years there were substantial instances of relationship conflict: for parents renegotiating their dynamic with their child as they tried to both restrict and encourage their child’s technology use; for estranged family members ‘forced’ together in the online sphere; and for friendships compelled to publicly display their relationship on social media, for fear of social exclusion. However, it was also evident that CMC acted as a crucial lifeline for these participants, providing opportunities to strengthen and maintain their bonds via previously unachievable means, both over time and distance. A longitudinal study of this length and nature utilising the same participants does not currently exist, thus provides crucial insight into how and why relationship dynamics alter over time. This unique and topical piece of research draws together Sociology and Media Studies, illustrating how the UK’s changing technological landscape can reshape one of the most basic human compulsions. This collaboration with Ofcom allows for insight that can be utilised in both academia and policymaking alike, making this research relevant and impactful across a range of academic fields and industries.

Keywords: computer mediated communication, longitudinal research, personal relationships, qualitative data

Procedia PDF Downloads 121
2095 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is “Alphabet Recognition Using Pixel Probability Distribution”. The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is employed in banks and other high-security buildings. One popular mobile application reads a visiting card and stores it directly to the contacts. OCR is also known to be used in radar systems for reading speeding vehicles' license plates, among many other uses. The implementation of our project has been done using Visual Studio and OpenCV (Open Source Computer Vision). Our algorithm is based on neural networks (machine learning). The project was implemented in three modules: (1) Training: this module performs database generation. The database was generated using two methods: (a) run-time generation, in which the database is generated using the inbuilt fonts of the OpenCV library; human intervention is not necessary for generating this database; (b) contour detection, in which a ‘jpeg’ template containing different fonts of an alphabet is converted to a weighted matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to store (119 kB precisely). (2) Preprocessing: the input image is pre-processed using image processing concepts such as adaptive thresholding, binarization and dilation, and is made ready for segmentation. Segmentation includes the extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
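A minimal sketch of the pre-processing and contour-based segmentation steps described above, using OpenCV from Python rather than the original Visual Studio project; the file name, threshold parameters and size filter are assumptions, and the neural-network prediction stage is omitted.

```python
# Minimal sketch of the pre-processing and contour-based segmentation steps:
# adaptive thresholding/binarisation, dilation, then contour detection to cut
# out candidate letters. "page.jpg" and the parameter values are illustrative.
import cv2

img = cv2.imread("page.jpg", cv2.IMREAD_GRAYSCALE)

# Adaptive thresholding + binarisation, then dilation to join broken strokes.
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 10)
dilated = cv2.dilate(binary, cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))

# Contour detection to extract candidate letters as bounding boxes.
contours, _ = cv2.findContours(dilated, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
letters = []
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if w * h > 50:                                   # discard speckle noise
        glyph = cv2.resize(binary[y:y + h, x:x + w], (16, 16))
        letters.append(((x, y, w, h), glyph / 255.0))  # weighted matrix for the classifier

print(f"{len(letters)} candidate letters segmented")
```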

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 389
2094 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis

Authors: A. J. S. P. Nileema, S. Kulatunga , S. H. Palihawadana

Abstract:

A computer-aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis (PTB) from digital chest X-rays with MATLAB image processing techniques, using a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases, Welisara, Sri Lanka. Pre-processing was done to remove identification details. Lung fields were segmented and then divided into four quadrants: right upper quadrant, left upper quadrant, right lower quadrant, and left lower quadrant, using image processing techniques in MATLAB. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray-level co-occurrence matrix (GLCM) method. Descriptive statistics and normal distribution analysis were performed using SPSS. Depending on the radiologists' interpretation, chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were analyzed using an independent-sample t-test for PTBP and PTBN chest radiographs. Among the six features tested, the contrast, correlation, energy, entropy, and maximum probability features showed a statistically significant difference between the two classes at a 95% confidence interval and could therefore be used in the classification of chest radiographs for PTB diagnosis. With the resulting value ranges of the five normally distributed texture features, a classification algorithm was then defined to recognize and classify the quadrant images. If the texture feature values of the quadrant image being tested fall within the defined region, it is identified as a PTBP (abnormal) quadrant and labeled ‘Abnormal’ in red, with its border highlighted in red; if the texture feature values fall outside the defined value range, it is identified as PTBN (normal) and labeled ‘Normal’ in blue, with no changes to the image outline. The developed classification algorithm has shown a high sensitivity of 92%, which makes it an efficient CAD system, with a modest specificity of 70%.
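A rough sketch of the texture-feature extraction step described above, using scikit-image in place of the MATLAB toolbox used in the study; the file name, GLCM distance/angle and 8-bit quantisation are assumptions.

```python
# Illustrative sketch: GLCM texture features (contrast, correlation, homogeneity,
# energy, entropy, maximum probability) for one lung-field quadrant.
import numpy as np
from skimage import io
from skimage.feature import graycomatrix, graycoprops

quadrant = io.imread("right_upper_quadrant.png", as_gray=True)
quadrant = (quadrant * 255).astype(np.uint8)

glcm = graycomatrix(quadrant, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

features = {prop: graycoprops(glcm, prop)[0, 0]
            for prop in ("contrast", "correlation", "homogeneity", "energy")}
p = glcm[:, :, 0, 0]
features["entropy"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))
features["max_probability"] = p.max()

# A quadrant would then be flagged PTB-positive if each normally distributed
# feature falls inside the value range defined from the PTBP training set.
print(features)
```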

Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis

Procedia PDF Downloads 126
2093 Computational Assistance of the Research, Using Dynamic Vector Logistics of Processes for Critical Infrastructure Subjects Continuity

Authors: Urbánek Jiří J., Krahulec Josef, Urbánek Jiří F., Johanidesová Jitka

Abstract:

This paper deals with computational assistance for the research and modelling of critical infrastructure subject continuity. It enables the use of the prevailing MS Office environment (SmartArt and similar tools) for mathematical models built with the DYVELOP (Dynamic Vector Logistics of Processes) method. The method serves for the investigation and modelling of crisis situations within the organizations of critical infrastructure. In the first part of the paper, the entities, operators and actors of the DYVELOP method are introduced. The method uses just three operators of Boolean algebra and four types of entities: the Environments, the Process Systems, the Cases and the Controlling. The Process Systems (PrS) have five “brothers”: Management PrS, Transformation PrS, Logistic PrS, Event PrS and Operation PrS. The Cases have three “sisters”: Process Cell Case, Use Case and Activity Case. All of them need special Ctrl actors for the controlling of their functions, except the Environment (ENV), which can do without Ctrl. The model's maps are named Blazons, and they can express mathematically and graphically the relationships among entities, actors and processes. In the second part of this paper, the rich Blazons of the DYVELOP method are used for discovering and modelling the cycling cases and their phases. The Blazons need a live PowerPoint presentation for better comprehension of this paper's mission. The crisis management of an energetic critical infrastructure organization is obliged to use the cycles for successful coping with crisis situations. Cycling these cases several times is a necessary condition for encompassing both the emergency event and the mitigation of the organization's damages. An uninterrupted and continuous cycling process brings fruitfulness to crisis management, and it is a good indicator and controlling actor of organizational continuity and its advanced possibilities for sustainable development. Reliable rules are derived for the safe and reliable continuity of an energetic critical infrastructure organization in a crisis situation.

Keywords: blazons, computational assistance, DYVELOP method, critical infrastructure

Procedia PDF Downloads 382
2092 Detecting Tomato Flowers in Greenhouses Using Computer Vision

Authors: Dor Oppenheim, Yael Edan, Guy Shani

Abstract:

This paper presents an image analysis algorithm to detect and count yellow tomato flowers in a greenhouse with uneven illumination conditions, complex growth conditions and different flower sizes. The algorithm is designed to be employed on a drone that flies in greenhouses to accomplish several tasks such as pollination and yield estimation. Detecting the flowers can provide useful information for the farmer, such as the number of flowers in a row and the number of flowers that were pollinated since the last visit to the row. The developed algorithm is designed to handle the real-world difficulties in a greenhouse, which include varying lighting conditions, shadowing, and occlusion, while considering the computational limitations of the simple processor on the drone. The algorithm identifies flowers using an adaptive global threshold, segmentation over the HSV color space, and morphological cues. The adaptive threshold divides the images into darker and lighter images. Then, segmentation on hue, saturation and value is performed accordingly, and classification is done according to the size and location of the flowers. 1069 images of greenhouse tomato flowers were acquired in a commercial greenhouse in Israel, using two different RGB cameras: an LG G4 smartphone and a Canon PowerShot A590. The images were acquired from multiple angles and distances and were sampled manually at various periods along the day to obtain varying lighting conditions. Ground truth was created by manually tagging approximately 25,000 individual flowers in the images. Sensitivity analyses on the acquisition angle of the images, periods throughout the day, the different cameras and the thresholding types were performed. Precision, recall and their derived F1 score were calculated. Results indicate better performance for the view angle facing the flowers than any other angle. Acquiring images in the afternoon resulted in the best precision and recall. Applying a global adaptive threshold improved the median F1 score by 3%. Results showed no difference between the two cameras used. Using hue values of 0.12-0.18 in the segmentation process provided the best results in precision and recall, and the best F1 score. The precision and recall averages for all the images when using these values were 74% and 75%, respectively, with an F1 score of 0.73. Further analysis showed a 5% increase in precision and recall when analyzing images acquired in the afternoon and from the front viewpoint.
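The thresholding and HSV segmentation stages described above can be sketched with OpenCV; the hue window of 0.12-0.18 reported in the abstract maps to roughly 22-32 on OpenCV's 0-179 hue scale, while the saturation/value bounds, minimum blob area and file name are assumptions.

```python
# Rough sketch of the flower-segmentation stage: global adaptive thresholding,
# HSV segmentation in the reported hue window, morphological clean-up, and
# blob counting. Parameter values are illustrative, not the paper's settings.
import cv2

img = cv2.imread("greenhouse_row.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Adaptive global threshold: split the frame into darker / lighter material first.
_, bright_mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Segment yellow flowers in HSV within the reported hue window.
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
yellow = cv2.inRange(hsv, (22, 80, 80), (32, 255, 255))
yellow = cv2.bitwise_and(yellow, bright_mask)

# Morphological opening, then count blobs above a minimum size as flowers.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
yellow = cv2.morphologyEx(yellow, cv2.MORPH_OPEN, kernel)
n_labels, _, stats, _ = cv2.connectedComponentsWithStats(yellow)
flowers = [i for i in range(1, n_labels) if stats[i, cv2.CC_STAT_AREA] > 100]
print(f"{len(flowers)} candidate flowers detected")
```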

Keywords: agricultural engineering, image processing, computer vision, flower detection

Procedia PDF Downloads 329
2091 Play-Based Early Education and Teachers’ Professional Development: Impact on Vulnerable Children

Authors: Chirine Dannaoui, Maya Antoun

Abstract:

This paper explores the intricate dynamics of play-based early childhood education (ECE) and the impact of professional development on teachers implementing play-based pedagogy, particularly in the context of vulnerable Syrian refugee children in Lebanon. By utilizing qualitative methodologies, including classroom observations and in-depth interviews with five early childhood educators and a field manager, this study delves into the challenges and transformations experienced by teachers in adopting play-based learning strategies. The research unveils the critical role of continuous and context-specific professional development in empowering teachers to implement play-based pedagogies effectively. When appropriately supported, it emphasizes how such educational approaches significantly enhance children's cognitive, social, and emotional development in crisis-affected environments. Key findings indicate that despite diverse educational backgrounds, teachers show considerable growth in their pedagogical skills through targeted professional development. This growth is vital for fostering a learning environment where vulnerable children can thrive, particularly in humanitarian settings. The paper also addresses educators' challenges, including adapting to play-based methodologies, resource limitations, and balancing curricular requirements with the need for holistic child development. This study contributes to the discourse on early childhood education in crisis contexts, emphasizing the need for sustainable, well-structured professional development programs. It underscores the potential of play-based learning to bridge educational gaps and contribute to the healing process of children facing calamity. The study highlights significant implications for policymakers, educators, schools, and not-for-profit organizations engaged in early childhood education in humanitarian contexts, stressing the importance of investing in teacher capacity and curriculum reform to enhance the quality of education for children in general and vulnerable ones in particular.

Keywords: play-based learning, professional development, vulnerable children, early childhood education

Procedia PDF Downloads 59
2090 AI-Powered Conversation Tools - Chatbots: Opportunities and Challenges That Present to Academics within Higher Education

Authors: Jinming Du

Abstract:

With the COVID-19 pandemic beginning in 2020, many higher education institutions and education systems turned to hybrid or fully distance online courses to maintain social distance and provide a safe virtual space for learning and teaching. However, the majority of faculty members were not well prepared for the shift to blended or distance learning, and communication frustrations are prevalent in both hybrid and fully distance courses. A systematic literature review was conducted through a comprehensive analysis of 1688 publications that focused on the adoption of chatbots in education. This study aimed to explore instructors' experiences with chatbots in online and blended undergraduate English courses. Language learners are overwhelmed by the variety of information offered by many online sites. The recently emerged chatbots (e.g., ChatGPT) perform somewhat better than traditional tools based on earlier technologies such as tapes, video recorders, and websites. The field of chatbots has been intensively researched, and new methods have been developed to demonstrate how students can best learn and practice a new language in the target language. However, while chatbots have been used as effective tools for communicating with business customers, in consulting and targeting areas, and in the medical field, they have not yet been fully explored and implemented in the field of language education. This issue is challenging for language teachers, who need to study and conduct research carefully to clarify it. Pedagogical chatbots may alleviate the perception of a lack of communication and feedback from instructors by interacting naturally with students and scaffolding learners' understanding, much as educators do. However, educators and instructors often lack the proficiency to operate this emerging AI chatbot technology effectively and require comprehensive study or structured training to attain competence. There is a gap between language teachers' perceptions and recent advances in the application of AI chatbots to language learning. The results of the study found that although the teachers felt that the chatbots did the best job of giving feedback, the teachers needed additional training to give better instructions and to assist in teaching. Teachers generally perceive the utilization of chatbots to offer substantial assistance to English language instruction.

Keywords: artificial intelligence in education, chatbots, education and technology, education system, pedagogical chatbot, chatbots and language education

Procedia PDF Downloads 66
2089 Molecular Dissection of Late Flowering under a Photoperiod-Insensitive Genetic Background in Soybean

Authors: Fei Sun, Meilan Xu, Jianghui Zhu, Maria Stefanie Dwiyanti, Cheolwoo Park, Fanjiang Kong, Baohui Liu, Tetsuya Yamada, Jun Abe

Abstract:

Reduced or absent sensitivity to long daylengths is a key character for soybean, a short-day crop, to adapt to higher-latitude environments. However, photoperiod insensitivity often results in a reduced duration of vegetative growth and a lower final yield. To overcome this limitation, a photoperiod-insensitive line with delayed flowering (RIL16) was developed in this study from the recombinant inbred population derived from a cross between the photoperiod-insensitive cultivar AGS292 and the late-flowering Thai cultivar K3. Expression analyses under SD and LD conditions revealed that the expression levels of the FLOWERING LOCUS T (FT) orthologues FT2a and FT5a were lower in RIL16 than in AGS292, although the expression of E1, a soybean-specific suppressor of FTs, was inhibited under both conditions. A soybean orthologue of TARGET OF EAT1 (TOE1), another suppressor of FT, showed upregulated expression in RIL16, which appeared to reflect a lower expression of miR172a. Our data suggest that the delayed flowering of RIL16 is most likely controlled by genes involved in an age-dependent flowering pathway. The QTL analysis based on 1,125 SNPs obtained from restriction-site-associated DNA sequencing revealed two major QTLs for flowering date on chromosome 16 and two minor QTLs on chromosome 4, which together accounted for 55% and 48% of the whole variation observed under natural day-length and artificially induced long-day conditions, respectively. The intervals of the major QTLs harbored FT2a and FT5a, respectively, on the basis of annotated genes in the Williams 82 reference genome. Sequencing analysis further revealed a nonsynonymous mutation in FT2a and an SNP in the 3′ UTR region of FT5a. A further study may elucidate the detailed mechanism underlying the QTLs for late flowering. The alleles from K3 at the two QTLs can be used singly or in combination to retain an appropriate duration of vegetative growth and maximize the final yield of photoperiod-insensitive soybeans.

Keywords: FT genes, miR72a, photoperiod-insensitive, soybean flowering

Procedia PDF Downloads 221
2088 Revisiting Historical Illustrations in the Age of Digital Anatomy Education

Authors: Julia Wimmers-Klick

Abstract:

In the contemporary study of anatomy, medical students utilize a diverse array of resources, including lab handouts, lectures, and, increasingly, digital media such as interactive anatomy apps and digital images. Notably, a significant shift has occurred, with fewer students possessing traditional anatomy atlases or books, reflecting a broader trend towards digital approaches like Virtual Reality, Augmented Reality, and web-based programs. This paper seeks to explore the evolution of anatomy education by contrasting current digital tools with historical resources, such as classical anatomical illustrations and atlases, to assess their relevance and potential benefits in modern medical education. Through a comprehensive literature review, the development of anatomical illustrations is traced from the textual descriptions of Galen to the detailed and artistic representations of Da Vinci, Vesalius, and later anatomists. The examination includes how the printing press facilitated the dissemination of anatomical knowledge, transforming covert dissections into public spectacles and formalized teaching practices. Historical illustrations, often influenced by societal, religious, and aesthetic contexts, not only served educational purposes but also reflected the prevailing medical knowledge and ethical standards of their times. Critical questions are raised about the place of historical illustrations in today's anatomy curriculum. Specifically, their potential to teach critical thinking, highlight the history of medicine, and offer unique insights into past societal conditions are explored. These resources are viewed in their context, including the lack of diversity and the presence of ethical concerns, such as the use of illustrations from unethical sources like Pernkopf’s atlas. In conclusion, while digital tools offer innovative ways to visualize and interact with anatomical structures, historical illustrations provide irreplaceable value in understanding the evolution of medical knowledge and practice. The study advocates for a balanced approach that integrates traditional and modern resources to enrich medical education, promote critical thinking, and provide a comprehensive understanding of anatomy. Future research should investigate the optimal combination of these resources to meet the evolving needs of medical learners and the implications of the digital shift in anatomy education.

Keywords: human anatomy, historical illustrations, historical context, medical education

Procedia PDF Downloads 21
2087 Threat Modeling Methodology for Supporting Industrial Control Systems Device Manufacturers and System Integrators

Authors: Raluca Ana Maria Viziteu, Anna Prudnikova

Abstract:

Industrial control systems (ICS) have received much attention in recent years due to the convergence of information technology (IT) and operational technology (OT), which has increased the interdependence of the safety and security issues to be considered. These issues require ICS-tailored solutions. That led to the need to create a methodology for supporting ICS device manufacturers and system integrators in carrying out threat modeling of embedded ICS devices in a way that guarantees the quality of the identified threats and minimizes subjectivity in the threat identification process. To research the possibility of creating such a methodology, a set of existing standards, regulations, papers, and publications related to threat modeling in the ICS sector and other sectors was reviewed to identify the various existing methodologies and methods used in threat modeling. Furthermore, the most popular ones were tested in an exploratory phase on a specific PLC device. The outcome of this exploratory phase was used as a basis for defining the specific characteristics of ICS embedded devices and their deployment scenarios, identifying the factors that introduce subjectivity in the threat modeling process of such devices, and defining metrics for evaluating the minimum quality requirements of identified threats associated with the deployment of the devices in existing infrastructures. The threat modeling methodology was then created based on the results of the previous steps. The usability of the methodology was evaluated through a set of standardized threat modeling requirements and a standardized comparison method for threat modeling methodologies. The outcomes of these verification methods confirm that the methodology is effective. The full paper includes the outcome of research on different threat modeling methodologies that can be used in OT, their comparison, and the results of implementing each of them in practice on a PLC device. This research is further used to build a threat modeling methodology tailored to OT environments; a detailed description is included. Moreover, the paper includes the results of the evaluation of the created methodology based on a set of parameters specifically created to rate threat modeling methodologies.

Keywords: device manufacturers, embedded devices, industrial control systems, threat modeling

Procedia PDF Downloads 80
2086 Network Security Attacks and Defences

Authors: Ranbir Singh, Deepinder Kaur

Abstract:

Network security is an important aspect in every field, such as government offices, educational institutes, and business organizations. Network security consists of the policies adopted to prevent and monitor forbidden access, misuse, modification, or denial of a computer network. Network security is a very complicated subject and has traditionally been dealt with only by well-trained and experienced people. However, as more and more people become wired, an increasing number of people need to understand the basics of security in a networked world. The history of network security includes an introduction to TCP/IP and internetworking. Network security starts with authentication, commonly with a username and a password. In this paper, we study various types of attacks on network security and how to handle or prevent these attacks.

Keywords: network security, attacks, denial, authenticating

Procedia PDF Downloads 404
2085 Supporting Regulation and Shared Attention to Facilitate the Foundations for Development of Children and Adolescents with Complex Individual Profiles

Authors: Patsy Tan, Dana Baltutis

Abstract:

This presentation demonstrates the effectiveness of music therapy in co-treatment with speech pathology and occupational therapy as an innovative way when working with children and adolescents with complex individual differences to facilitate communication, emotional, motor and social skills development. Each child with special needs and their carer has an individual profile which encompasses their visual-spatial, auditory, language, learning, mental health, family dynamic, sensory-motor, motor planning and sequencing profiles. The most common issues among children with special needs, especially those diagnosed with Autism Spectrum Disorder, are in the areas of regulation, communication, and social-emotional development. The ability of children living with challenges to communicate and use language and understand verbal and non-verbal information, as well as move their bodies to explore and interact with their environments in social situations, depends on the children being regulated both internally and externally and trusting their communication partners and understanding what is happening in the moment. For carers, it is about understanding the tempo, rhythm, pacing, and timing of their own individual profile, as well as the profile of the child they are interacting with, and how these can sync together. In this study, music therapy is used in co-treatment sessions with a speech pathologist and/or an occupational therapist using the DIRFloortime approach to facilitate the regulation, attention, engagement, reciprocity and social-emotional capacities of children presenting with complex individual differences. Documented changes in 10 domains of children’s development over a 12-month period using the Individual Music Therapy Assessment Profile (IMTAP) were observed. Children were assessed biannually, and results show significant improvements in the social-emotional, musicality and receptive language domains indicating that co-treatment with a music therapist using the DIRFloortime framework is highly effective. This presentation will highlight strategies that facilitate regulation, social-emotional and communication development for children and adolescents with complex individual profiles.

Keywords: communication, shared attention, regulation, social emotional

Procedia PDF Downloads 256
2084 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning

Authors: Yangzhi Li

Abstract:

Network transfer of information and performance customization are now viable methods of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to autonomous robot recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and deep learning on large datasets, have not yet been fully explored in intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous-recognition visual guidance technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance technology for industrial robots has a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies in industrial production has strict accuracy requirements; visual recognition systems therefore face precision challenges. These challenges directly impact the effectiveness and quality of industrial production, necessitating further study of positioning precision in visual guidance and recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting the information at the boundary and corners of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. Inclination detection on the RGB image and verification against the depth image are used to determine a component's current posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
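As a rough illustration of the aspect-ratio check mentioned above (the boundary/corner detection and posture estimation steps are omitted), an oriented bounding box can be fitted to a component's point cloud with a PCA-style decomposition; the point-cloud source, the tolerance and the helper names are assumptions, not the study's implementation.

```python
# Hypothetical sketch: fit an oriented bounding box to a component's point cloud
# via SVD of the centred points and compare its aspect ratio to a modular target.
import numpy as np

def aspect_ratio(points):
    """points: (N, 3) array of XYZ samples on one building component."""
    centred = points - points.mean(axis=0)
    # Principal axes of the cloud give the oriented bounding box directions.
    _, _, axes = np.linalg.svd(centred, full_matrices=False)
    extents = np.ptp(centred @ axes.T, axis=0)      # side lengths along each axis
    extents = np.sort(extents)[::-1]
    return extents[0] / max(extents[1], 1e-9)

def matches_module(points, target_ratio, tol=0.1):
    return abs(aspect_ratio(points) - target_ratio) < tol

# Example with a synthetic 2:1 plank-like cloud standing in for a scanned part.
rng = np.random.default_rng(0)
plank = rng.uniform([0, 0, 0], [2.0, 1.0, 0.1], size=(5000, 3))
print(aspect_ratio(plank), matches_module(plank, target_ratio=2.0))
```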

Keywords: robotic construction, robotic assembly, visual guidance, machine learning

Procedia PDF Downloads 86
2083 Characterization of Antibiotic Resistance in Cultivable Enterobacteriaceae Isolates from Different Ecological Niches in the Eastern Cape, South Africa

Authors: Martins A. Adefisoye, Mpaka Lindelwa, Fadare Folake, Anthony I. Okoh

Abstract:

The evolution and rapid dissemination of antibiotic resistance from one ecosystem to another have been responsible for wide-scale epidemic and endemic spread of multi-drug-resistant pathogens. This study assessed the prevalence of Enterobacteriaceae in different environmental samples, including river water, hospital effluents, abattoir wastewater, animal rectal swabs and faecal droppings, soil, and vegetables, using standard microbiological procedures. The identity of the isolates was confirmed using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF), the isolates were profiled for resistance against a panel of 16 antibiotics using the disc diffusion (DD) test, and the occurrence of resistance genes (ARG) was determined by polymerase chain reactions (PCR). Enterobacteriaceae counts in the samples ranged as follows: river water 4.0 × 10¹ – 2.0 × 10⁴ cfu/100 ml, hospital effluents 1.5 × 10³ – 3.0 × 10⁷ cfu/100 ml, municipal wastewater 2.3 × 10³ – 9.2 × 10⁴ cfu/100 ml, faecal droppings 3.0 × 10⁵ – 9.5 × 10⁶ cfu/g, animal rectal swabs 3.0 × 10² – 2.9 × 10⁷ cfu/ml, soil 0 – 1.2 × 10⁵ cfu/g and vegetables 0 – 2.2 × 10⁷ cfu/g. Of the 700 randomly selected presumptive isolates subjected to MALDI-TOF analysis, 129 (18.4%), 68 (9.7%), 67 (9.5%) and 41 (5.9%) were E. coli, Klebsiella spp., Enterobacter spp., and Citrobacter spp., respectively, while the remaining isolates belonged to other genera not targeted in the study. The DD test showed resistance ranging between 91.6% (175/191) for cefuroxime and 15.2% (29/191) for imipenem. The predominant multiple antibiotic resistance phenotype (MARP), GM-AUG-AP-CTX-CXM-CIP-NOR-NI-C-NA-TS-T-DXT, occurred in 9 Klebsiella isolates. The multiple antibiotic resistance indices (MARI) of the isolates (range 0.17–1.0) generally showed that >95% had a MARI above the 0.2 threshold, suggesting that most of the isolates originate from high-risk environments with high antibiotic use and high selective pressure for the emergence of resistance. The associated ARGs in the isolates include: bla TEM 61.9 (65), bla SHV 1.9 (2), bla OXA 8.6 (9), CTX-M-2 8.6 (9), CTX-M-9 6.7 (7), sul 2 26.7 (28), tet A 16.2 (17), tet M 17.1 (18), aadA 59.1 (62), strA 34.3 (36), aac(3)A 19.1 (20), (aa2)A 7.6 (8), and aph(3)-1A 10.5 (11). The results underscore the need for preventative measures to curb the proliferation of antibiotic-resistant bacteria, including Enterobacteriaceae, to protect public health.
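The multiple antibiotic resistance index (MARI) referred to above is conventionally computed as the number of antibiotics to which an isolate is resistant divided by the number of antibiotics tested; a minimal sketch, with an invented susceptibility record rather than study data, is:

```python
# Minimal sketch of the MARI computation and the 0.2 high-risk threshold
# mentioned in the abstract; the example profile below is illustrative only.
def mari(resistance_profile):
    """resistance_profile: dict of antibiotic -> 'R' (resistant) or 'S'/'I'."""
    return sum(1 for call in resistance_profile.values() if call == "R") / len(resistance_profile)

isolate = {
    "cefuroxime": "R", "imipenem": "S", "ciprofloxacin": "R",
    "tetracycline": "R", "gentamicin": "S", "ampicillin": "R",
}
index = mari(isolate)
print(f"MARI = {index:.2f}", "high-risk source" if index > 0.2 else "low-risk source")
```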

Keywords: enterobacteriaceae, antibiotic-resistance, MALDI-TOF, resistance genes, MARP, MARI, public health

Procedia PDF Downloads 149
2082 Robustness of the Deep Chroma Extractor and Locally-Normalized Quarter Tone Filters in Automatic Chord Estimation under Reverberant Conditions

Authors: Luis Alvarado, Victor Poblete, Isaac Gonzalez, Yetzabeth Gonzalez

Abstract:

In MIREX 2016 (http://www.music-ir.org/mirex), the deep neural network (DNN)-based Deep Chroma Extractor, proposed by Korzeniowski and Widmer, reached the highest score in an audio chord recognition task. In the present paper, this tool is assessed under acoustically reverberant environments and distinct source-microphone distances. The evaluation dataset comprises The Beatles and Queen datasets. These datasets are sequentially re-recorded with a single microphone in a real reverberant chamber at four reverberation times (0 -anechoic-, 1, 2, and 3 s, approximately), as well as four source-microphone distances (32, 64, 128, and 256 cm). It is expected that the performance of the trained DNN will decrease dramatically under these acoustic conditions, with signals degraded by room reverberation and distance to the source. Recently, the effect of the bio-inspired Locally-Normalized Cepstral Coefficients (LNCC) has been assessed in a text-independent speaker verification task using speech signals degraded by additive noise at different signal-to-noise ratios with variations of recording distance, and it has also been assessed under reverberant conditions with variations of recording distance. LNCC showed a performance as high as that of the state-of-the-art Mel Frequency Cepstral Coefficient filters. Based on these results, this paper proposes a variation of locally-normalized triangular filters called Locally-Normalized Quarter Tone (LNQT) filters. By using the LNQT spectrogram, robustness improvements of the trained Deep Chroma Extractor are expected, compared with classical triangular filters, thus compensating for the music signal degradation and improving the accuracy of the chord recognition system.
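A rough numpy sketch of a quarter-tone triangular filterbank with a simple local-normalisation step, following the idea described above; the exact LNQT normalisation proposed by the authors may differ, and every parameter below (starting frequency, number of filters, neighbourhood size) is an assumption.

```python
# Sketch: quarter-tone (24 per octave) triangular filters applied to a power
# spectrogram, followed by a naive local normalisation of each band by the
# mean energy of its spectral neighbours.
import numpy as np

def quarter_tone_centers(f_min=65.4, n_filters=96):
    # 24 quarter tones per octave, starting from roughly C2.
    return f_min * 2.0 ** (np.arange(n_filters) / 24.0)

def triangular_bank(freqs, centers):
    bank = np.zeros((len(centers), len(freqs)))
    for i in range(1, len(centers) - 1):
        lo, mid, hi = centers[i - 1], centers[i], centers[i + 1]
        up = (freqs - lo) / (mid - lo)
        down = (hi - freqs) / (hi - mid)
        bank[i] = np.clip(np.minimum(up, down), 0.0, None)
    return bank

def lnqt_spectrogram(power_spec, freqs, neighborhood=5, eps=1e-10):
    """power_spec: (n_bins, n_frames) magnitude-squared STFT."""
    bank = triangular_bank(freqs, quarter_tone_centers())
    energies = bank @ power_spec                      # quarter-tone band energies
    local = np.vstack([np.mean(energies[max(0, i - neighborhood):i + neighborhood + 1], axis=0)
                       for i in range(energies.shape[0])])
    return energies / (local + eps)                   # locally normalised bands

# Usage: freqs = np.fft.rfftfreq(n_fft, 1/sr); lnqt = lnqt_spectrogram(np.abs(S)**2, freqs)
```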

Keywords: chord recognition, deep neural networks, feature extraction, music information retrieval

Procedia PDF Downloads 232
2081 Understanding the Information in Principal Component Analysis of Raman Spectroscopic Data during Healing of Subcritical Calvarial Defects

Authors: Rafay Ahmed, Condon Lau

Abstract:

Bone healing is a complex and sequential process involving changes at the molecular level. Raman spectroscopy is a promising technique to study bone mineral and matrix environments simultaneously. In this study, subcritical calvarial defects are used to study bone composition during healing without disturbing the defect site. The model allowed the natural healing of bone to be monitored while avoiding mechanical harm to the callus. Calvarial defects were created using a 1 mm burr drill in the parietal bones of Sprague-Dawley rats (n=8) and served as in vivo defects. After 7 days, the animals were euthanized and their skulls were harvested. One additional defect per sample was then created on the opposite parietal bone, using the same calvarial defect procedure, to serve as a control defect. Raman spectroscopy (785 nm) was used to investigate bone parameters on three different skull surfaces: in vivo defects, control defects and the normal surface. Principal component analysis (PCA) was utilized for data analysis and interpretation of the Raman spectra and helped in the classification of the groups. PCA was able to distinguish the in vivo defects from the normal surface and the control defects. PC1 shows the major variation at 958 cm⁻¹, which corresponds to the ν1 phosphate mineral band. PC2 shows the major variation at 1448 cm⁻¹, which is the characteristic band of CH2 deformation and corresponds to collagens. The Raman parameters, namely the mineral-to-matrix ratio and crystallinity, were found to be significantly decreased in the in vivo defects compared to the normal surface and controls. Scanning electron microscope and optical microscope images show the formation of newly generated matrix by means of bony bridges of collagen. An optical profiler shows that surface roughness increased by 30% from the controls to the in vivo defects after 7 days. These results agree with the Raman assessment parameters and confirm new collagen formation during healing.
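A brief sketch of the analysis described above: PCA over the Raman spectra plus a simple mineral-to-matrix ratio from the 958 and 1448 cm⁻¹ bands; the input arrays `spectra` and `shifts` are assumed to hold baseline-corrected spectra and their Raman shift axis, and are not the study's data.

```python
# Illustrative sketch: PCA scores/loadings over Raman spectra and a
# mineral-to-matrix ratio from the nu1 phosphate and CH2 deformation bands.
import numpy as np
from sklearn.decomposition import PCA

def mineral_to_matrix(spectra, shifts):
    phosphate = spectra[:, np.argmin(np.abs(shifts - 958))]   # nu1 phosphate band
    ch2_def = spectra[:, np.argmin(np.abs(shifts - 1448))]    # CH2 deformation (collagen)
    return phosphate / ch2_def

def pca_scores(spectra, n_components=2):
    # Mean-centring is handled by PCA; the loadings show which bands drive each PC.
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(spectra)
    return scores, pca.components_, pca.explained_variance_ratio_

# Usage (assumed data): group the score plot by surface type (in vivo defect,
# control defect, normal surface) to see the separation reported above.
```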

Keywords: Raman spectroscopy, principal component analysis, calvarial defects, tissue characterization

Procedia PDF Downloads 223
2080 Replication of Meaningful Gesture Study for N400 Detection Using a Commercial Brain-Computer Interface

Authors: Thomas Ousterhout

Abstract:

In an effort to test the ability of a commercial-grade EEG headset to effectively measure the N400 ERP, a replication study was conducted to see whether results similar to those obtained with a medical-grade EEG could be produced. Pictures of meaningful and meaningless hand postures were borrowed from the original author, and subjects were required to perform a semantic discrimination task. The N400 was detected, indicating semantic processing of the meaningfulness of the hand postures. The results corroborate those of the original author and support the use of some commercial-grade EEG headsets for non-critical research applications.
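A hypothetical sketch of the ERP-detection step using MNE-Python: epoch the continuous EEG around stimulus onsets and contrast meaningful versus meaningless gestures in the N400 window; the file name, event codes, channel picks and filter band are assumptions, and the original study's pipeline may well differ.

```python
# Sketch: epoch the raw EEG, average per condition, and inspect the
# meaningless-minus-meaningful difference wave in the ~300-500 ms window
# where the N400 is expected over centro-parietal sites.
import mne

raw = mne.io.read_raw_fif("gesture_task_raw.fif", preload=True)
raw.filter(0.1, 30.0)                                  # typical ERP band-pass

events = mne.find_events(raw)
event_id = {"meaningful": 1, "meaningless": 2}
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

evoked_meaningful = epochs["meaningful"].average()
evoked_meaningless = epochs["meaningless"].average()

diff = mne.combine_evoked([evoked_meaningless, evoked_meaningful], weights=[1, -1])
diff.plot(picks=["Cz", "Pz"])
```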

Keywords: EEG, ERP, N400, semantics, congruency, gestures, emotiv

Procedia PDF Downloads 263
2079 A Case Study: Social Network Analysis of Construction Design Teams

Authors: Elif D. Oguz Erkal, David Krackhardt, Erica Cochran-Hameen

Abstract:

Even though social network analysis (SNA) is an abundantly studied concept for many organizations and industries, a clear SNA approach to project teams has not yet been adopted by the construction industry. The main challenges for performing SNA in construction, and the apparent reason for this gap, are the unique and complex structure of each construction project, the comparatively high turnover of project team members and contributing parties, and the variety of authentic problems in each project. Additionally, stakeholders from a variety of professional backgrounds collaborate in a high-stress environment fueled by time and cost constraints. Within this case study on Project RE, a design & build project performed at the Urban Design Build Studio of Carnegie Mellon University, a social network analysis of the project design team is performed with the main goal of applying social network theory to construction project environments. The research objective is to determine the correlation between the network of how individuals relate to each other based on their perceptions of their own professional strengths and weaknesses, the communication patterns within the team, and the group dynamics. Data is collected through a survey performed over four monthly rounds, detailed follow-up interviews and constant observations, to assess the natural alteration of the network over time. The data collected is processed by means of network analytics and in light of the qualitative data collected through observations and individual interviews. This paper presents a full ethnography of this construction design team of fourteen architecture students, based on an elaborate social network data analysis over time. This study is expected to be used as an initial step towards refined, targeted and large-scale social network data collection in construction projects, in order to deduce the impact of social networks on project performance and to suggest better collaboration structures for construction project teams.
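The round-by-round network analytics described above can be sketched with networkx; the advice-network question, the edge lists and the centrality measures below are illustrative assumptions, not the study's data.

```python
# Illustrative sketch: one directed graph per monthly survey round
# (edge u -> v meaning "u goes to v for advice", an assumed question),
# with per-round centrality to track how the team network changes over time.
import networkx as nx

rounds = {
    1: [("Ana", "Ben"), ("Ben", "Cara"), ("Cara", "Ana"), ("Dev", "Ana")],
    2: [("Ana", "Ben"), ("Dev", "Ben"), ("Cara", "Ben"), ("Ben", "Ana")],
}

for rnd, edges in rounds.items():
    G = nx.DiGraph(edges)
    indeg = nx.in_degree_centrality(G)        # who is sought out
    btw = nx.betweenness_centrality(G)        # who brokers between sub-groups
    btw_rounded = {n: round(v, 2) for n, v in btw.items()}
    top = max(indeg, key=indeg.get)
    print(f"round {rnd}: most sought-out member = {top}, betweenness = {btw_rounded}")
```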

Keywords: construction design teams, construction project management, social network analysis, team collaboration, network analytics

Procedia PDF Downloads 200
2078 Generalized Up-downlink Transmission using Black-White Hole Entanglement Generated by Two-level System Circuit

Authors: Muhammad Arif Jalil, Xaythavay Luangvilay, Montree Bunruangses, Somchat Sonasang, Preecha Yupapin

Abstract:

Black and white holes form the entangled pair ⟨BH│WH⟩, where a white hole occurs when the particle moves at the same speed as light. The entangled black-white hole pair lies at the center, with the radian spacing of the gap between them. When the particle moves more slowly than light, the black hole is gravitational (positive gravity) and the white hole is smaller than the black hole. On the downstream side, the entangled pair appears with a black hole outside the gap that grows until the white hole disappears, which is the emptiness paradox. On the upstream side, when the particle moves faster than light, the white hole forms a time tunnel while the black hole becomes smaller. As the particle continues to move faster and further, the black hole disappears and becomes a wormhole (singularity), leaving only a white hole in emptiness. This research studies the use of black and white holes generated by a two-level system circuit as carriers for communication transmission, from which high data transmission capability and capacity can be obtained. The black and white hole pair can be generated by the two-level system circuit when the speed of a particle on the circuit equals the speed of light. The black hole forms when the particle speed increases from below the speed of light up to the speed of light, while the white hole is established when the particle comes back down from faster than light. They are bound as the entangled pair of signal and idler, ⟨Signal│Idler⟩, with virtual counterparts for the white hole, which has an angular displacement of half a π radian. A two-level system is built from an electronic circuit to create black and white holes bound by entangled bits that are immune to cloning by thieves. The process starts by creating wave-particle behavior; when the speed equals that of light, the black hole is in the middle of the entangled pair, which forms the two-bit gate. The required information can be input into the system and wrapped by the black hole carrier. A time tunnel occurs when the wave-particle speed is faster than light, at which point the entangled pair collapses. The transmitted information is kept safely in the time tunnel. The required time and space can be modulated via the input for the downlink operation. The downlink is established when the particle speed, expressed in frequency (energy) form, is lowered and enters the entangled gap, where this time the white hole is established. The information, tagged with the required destination, is wrapped by the white hole and retrieved by the clients at the destination. The black and white holes then disappear, and the information can be recovered and used.

Keywords: cloning free, time machine, teleportation, two-level system

Procedia PDF Downloads 75
2077 Enhancing Robustness in Federated Learning through Decentralized Oracle Consensus and Adaptive Evaluation

Authors: Peiming Li

Abstract:

This paper presents an innovative blockchain-based approach to enhance the reliability and efficiency of federated learning systems. By integrating a decentralized oracle consensus mechanism into the federated learning framework, we address key challenges of data and model integrity. Our approach utilizes a network of redundant oracles, functioning as independent validators within an epoch-based training system in the federated learning model. In federated learning, data is decentralized, residing on various participants' devices. This scenario often leads to concerns about data integrity and model quality. Our solution employs blockchain technology to establish a transparent and tamper-proof environment, ensuring secure data sharing and aggregation. The decentralized oracles, a concept borrowed from blockchain systems, act as unbiased validators. They assess the contributions of each participant using a Hidden Markov Model (HMM), which is crucial for evaluating the consistency of participant inputs and safeguarding against model poisoning and malicious activities. Our methodology's distinct feature is its epoch-based training. An epoch here refers to a specific training phase where data is updated and assessed for quality and relevance. The redundant oracles work in concert to validate data updates during these epochs, enhancing the system's resilience to security threats and data corruption. The effectiveness of this system was tested using the MNIST dataset, a standard benchmark in machine learning. Results demonstrate that our blockchain-oriented federated learning approach significantly boosts system resilience, addressing the common challenges of federated environments. This paper aims to make these advanced concepts accessible, even to those with a limited background in blockchain or federated learning. We provide a foundational understanding of how blockchain technology can revolutionize data integrity in decentralized systems and explain the role of oracles in maintaining model accuracy and reliability.
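A minimal sketch of how a single redundant oracle might score per-epoch participant updates with a Gaussian hidden Markov model and reject implausible (possibly poisoned) contributions before aggregation. The feature choice, two-state model, acceptance threshold, and placeholder data are assumptions for illustration; they do not reproduce the paper's blockchain integration or its actual HMM formulation.

```python
# Sketch: an oracle scoring participant model updates with a Gaussian HMM and
# flagging low-likelihood (possibly poisoned) contributions before averaging.
# Features, the 2-state model, and the threshold are illustrative assumptions.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(2)

def update_features(weight_update):
    """Summarize a participant's weight update as a small feature vector."""
    return np.array([np.linalg.norm(weight_update), weight_update.mean(), weight_update.std()])

# History of features from updates previously accepted as honest (placeholder data).
history = np.stack([update_features(rng.normal(scale=0.1, size=100)) for _ in range(50)])

oracle = GaussianHMM(n_components=2, covariance_type="diag", n_iter=50, random_state=0)
oracle.fit(history)

def validate(weight_update, threshold=-10.0):
    """Accept the update only if its log-likelihood under the HMM is plausible."""
    feats = update_features(weight_update).reshape(1, -1)
    return oracle.score(feats) > threshold       # threshold is illustrative

honest = rng.normal(scale=0.1, size=100)
poisoned = rng.normal(loc=5.0, scale=2.0, size=100)
print("honest update accepted:", validate(honest))
print("poisoned update accepted:", validate(poisoned))
```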

Keywords: federated learning system, blockchain, decentralized oracles, hidden Markov model

Procedia PDF Downloads 63
2076 Alive Cemeteries with Augmented Reality and Semantic Web Technologies

Authors: Tamás Matuszka, Attila Kiss

Abstract:

Due to the proliferation of smartphones in everyday use, several different outdoor navigation systems have become available. Since these smartphones can connect to the Internet, users can also obtain location-based information during navigation. With the help of the information thus obtained, users can interactively learn the specifics of a particular area (for instance, an ancient cultural site, the Statue Park, or a cemetery). In this paper, we present an Augmented Reality system that uses Semantic Web technologies and is based on the interaction between the user and the smartphone. The system allows users to navigate through a specific area and provides information and details about the sights in an interactive manner.
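As a minimal sketch of how such a system might pull location-based details from a Semantic Web data source, the query below asks a public SPARQL endpoint for labelled resources with coordinates near the user's position, which an AR layer could then render as annotations. The endpoint, vocabulary, coordinates, and bounding-box size are assumptions and do not reflect the authors' actual ontology or data set.

```python
# Sketch: querying a public SPARQL endpoint for labelled points of interest
# near the user's position, to be shown as AR annotations. Endpoint, vocabulary,
# coordinates, and search radius are illustrative assumptions.
from SPARQLWrapper import SPARQLWrapper, JSON

user_lat, user_lon = 47.4979, 19.0402   # hypothetical user position

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery(f"""
PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?place ?label ?lat ?lon WHERE {{
  ?place geo:lat ?lat ; geo:long ?lon ; rdfs:label ?label .
  FILTER (lang(?label) = "en")
  FILTER (?lat > {user_lat - 0.01} && ?lat < {user_lat + 0.01})
  FILTER (?lon > {user_lon - 0.01} && ?lon < {user_lon + 0.01})
}}
LIMIT 20
""")
sparql.setReturnFormat(JSON)

# Each binding holds a nearby resource, its label, and its coordinates.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["label"]["value"], row["lat"]["value"], row["lon"]["value"])
```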

Keywords: augmented reality, semantic web, human computer interaction, mobile application

Procedia PDF Downloads 340
2075 Parallel Asynchronous Multi-Splitting Methods for Differential Algebraic Systems

Authors: Malika Elkyal

Abstract:

We consider an iterative parallel multi-splitting method for differential algebraic equations. The main feature of the proposed idea is the use of the asynchronous form. We prove that the multi-splitting technique can effectively accelerate the convergence of the iterative process. The main characteristic of the asynchronous mode is that a local algorithm does not have to wait at predetermined points for messages to become available. We allow some processors to communicate more frequently than others, and we allow communication delays to be substantial and unpredictable. Accordingly, we note that synchronous algorithms, in the computer science sense, are particular cases of our asynchronous formulation.
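The core idea can be sketched with a small serial simulation: each sub-problem updates its own block of unknowns using possibly outdated values of the other blocks, mimicking the asynchronous mode in which processors do not wait for one another. The linear test system (as might arise from one implicit time step of a linear DAE), the block splitting, and the random delay pattern are illustrative assumptions rather than the authors' formulation.

```python
# Sketch: asynchronous block (multisplitting) iteration on a small linear
# system A x = b, e.g. from one implicit time step of a linear DAE. Each block
# update may read stale values of the other blocks, mimicking asynchronous mode.
import numpy as np

rng = np.random.default_rng(3)
n, n_blocks = 8, 2
A = 4.0 * np.eye(n) + rng.normal(scale=0.3, size=(n, n))   # diagonally dominant
b = rng.normal(size=n)
blocks = np.array_split(np.arange(n), n_blocks)

x = np.zeros(n)
history = [x.copy()]                       # past iterates, used for stale reads
for k in range(200):
    x_new = x.copy()
    for idx in blocks:
        # Each "processor" may only have an outdated copy of the other blocks.
        delay = rng.integers(0, min(3, len(history)))
        stale = history[-1 - delay]
        other = np.setdiff1d(np.arange(n), idx)
        rhs = b[idx] - A[np.ix_(idx, other)] @ stale[other]
        x_new[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)
    x = x_new
    history.append(x.copy())
    if np.linalg.norm(A @ x - b) < 1e-10:
        break

print("iterations:", k + 1, "final residual:", np.linalg.norm(A @ x - b))
```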

Keywords: parallel methods, asynchronous mode, multisplitting, differential algebraic equations

Procedia PDF Downloads 558
2074 Ethical Discussions on Prenatal Diagnosis: Iranian Case of Thalassemia Prevention Program

Authors: Sachiko Hosoya

Abstract:

Objectives: The purpose of this paper is to investigate the social policy of preventive genetic medicine in Iran by following the legalization process of the abortion law and the factors affecting that process in the wider Iranian context. Ethical discussions of prenatal diagnosis and selective abortion in Iran are presented by exploring the Iranian social policy to control genetic diseases, especially a genetic hemoglobin disorder called Thalassemia, with a focus on the ethical dilemmas that arise when genetic medicine is applied in social policy. Method: In order to examine the role of the policy for the prevention of genetic diseases and selective abortion in Iran, various resources have been studied: not only academic articles, but also parliamentary discussions and documents related to a court case, as well as ethnographic data on the living situation of Thalassemia patients. Results: First, the discussion on prenatal diagnosis and selective abortion is reviewed from the viewpoints of ethics, disability rights activists, and public policy for lower-resource countries. It should be noted that the more important point in the discussion on prenatal diagnosis and selective abortion in Iran is the allocation of medical resources. Second, the process of implementing the national Thalassemia screening program and legalizing the 'Therapeutic Abortion Law' is analyzed by scrutinizing documents such as the Majlis record, government documents, and related laws and regulations. Although some Western academics contend that the Iranian policy of selective abortion is akin to a eugenic public policy, the Iranian government carefully avoids any framing of the policy as 'eugenic'. Third, as a comparative example, discussions of an Iranian court case concerning a patient's 'right not to be born' are introduced. Along with that, the restrictive living environments of Thalassemia patients and carriers are depicted, in order to understand some of the disabling social factors for people with genetic diseases in the local context of Iran.

Keywords: abortion, Iran, prenatal diagnosis, public health ethics, Thalassemia prevention program

Procedia PDF Downloads 346