Search results for: technology complexity
5891 Radiation Risks for Nurses: The Unrecognized Consequences of ERCP Procedures
Authors: Ava Zarif Sanayei, Sedigheh Sina
Abstract:
Despite the advancement of radiation-free interventions in the gastrointestinal and hepatobiliary fields, endoscopy and endoscopic retrograde cholangiopancreatography (ERCP) remain indispensable procedures that necessitate radiation exposure. ERCP, in particular, relies heavily on radiation-guided imaging to ensure precise delivery of therapy. Interventional radiology (IR) procedures likewise use imaging modalities such as X-rays and CT scans to guide therapy, often under local anesthesia via small needle insertion. However, the complexity of these procedures raises concerns about radiation exposure to healthcare professionals, including nurses, who play a crucial role in these interventions. This study assesses the radiation exposure to the hands and fingers of two nurses directly involved in ERCP procedures, using thermoluminescent (TLD-100) dosimeters, at the Gastrointestinal Endoscopy department of a clinic in Shiraz, Iran. The dosimeters were first calibrated with various phantoms; a group of dosimeters was then prepared and worn over a two-month period. For personal equivalent dose measurement, two TLD chips were mounted on a finger ring to monitor exposure to the hands and fingers. At the end of the monitoring period, the TLDs were read out with a TLD reader: Nurse 1 received an equivalent dose of 298.26 µSv and Nurse 2 an equivalent dose of 195.39 µSv. The investigation revealed that the nurses' total radiation exposure did not exceed the annual limit for occupational exposure. Nevertheless, radiation protection measures must remain a priority to prevent potential harm. The study also showed that the positioning of the staff members, with the two nurses placed at a specific location, resulted in roughly equal doses. To reduce exposure further, we suggest providing education and training on radiation safety principles, particularly for technologists.
Keywords: dose measurement, ERCP, interventional radiology, medical imaging
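A quick check of how the reported two-month readings compare with an occupational limit can be sketched as follows. The 500 mSv/year extremity (hands/fingers) limit used here is the ICRP value and is an assumption, since the abstract does not state which limit the clinic applies; the two-month doses are scaled linearly to a year.

```python
# Compare the two-month finger-ring readings with an assumed annual limit.
# ASSUMPTION: 500 mSv/yr is the ICRP extremity limit, not stated in the abstract.
EXTREMITY_LIMIT_MSV = 500.0

for nurse, dose_usv in [("Nurse 1", 298.26), ("Nurse 2", 195.39)]:
    annual_msv = dose_usv / 1000 * (12 / 2)   # scale two-month reading to a year
    pct = 100 * annual_msv / EXTREMITY_LIMIT_MSV
    print(f"{nurse}: ~{annual_msv:.2f} mSv/yr, {pct:.3f}% of the limit")
```

Even annualized, both readings sit far below one percent of the assumed extremity limit, consistent with the study's conclusion.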
Procedia PDF Downloads 37

5890 A Compact Via-less Ultra-Wideband Microstrip Filter by Utilizing Open-Circuit Quarter Wavelength Stubs
Authors: Muhammad Yasir Wadood, Fatemeh Babaeian
Abstract:
By developing ultra-wideband (UWB) systems, there is a high demand for UWB filters with low insertion loss, wide bandwidth, and a planar structure compatible with the other components of the UWB system. A microstrip interdigital filter is a great option for designing UWB filters. However, the presence of via holes in this structure creates difficulties in the fabrication of the filter. Especially in the higher frequency band, any misalignment of the drilled via hole with the microstrip stubs causes large discrepancies between measured and desired results. Moreover, in high-frequency designs the line width of the stubs is very narrow, so highly precise, small via holes must be implemented, which increases the fabrication cost significantly and carries a risk of fabrication errors. To combat this issue, this paper proposes a via-less UWB microstrip filter designed as a modification of a conventional interdigital bandpass filter. The novel approaches in this filter design are 1) replacement of each via hole with a quarter-wavelength open-circuit stub to avoid manufacturing complexity, 2) use of a bend structure to reduce unwanted coupling effects, and 3) minimization of the size. Using the proposed structure, a UWB filter operating in the frequency band of 3.9-6.6 GHz (1-dB bandwidth) is designed and fabricated. The promising simulation and measurement results are presented in this paper. The selected substrate for these designs was Rogers RO4003 with a thickness of 20 mils, a substrate common in industrial projects. The compact size of the proposed filter is highly beneficial for applications that require very miniature hardware.
Keywords: band-pass filters, inter-digital filter, microstrip, via-less
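The quarter-wavelength stubs that replace the via holes can be sized from the guided wavelength, lambda_g = c / (f * sqrt(eps_eff)), divided by four. A minimal sketch, where the mid-band frequency comes from the 3.9-6.6 GHz passband in the abstract but the effective permittivity value is an illustrative assumption (it depends on the actual trace width and the 20-mil RO4003 substrate):

```python
# Rough length estimate for a quarter-wavelength open-circuit stub.
# ASSUMPTION: eps_eff = 2.8 is illustrative; the real value depends on geometry.
C0 = 299_792_458.0  # speed of light in vacuum, m/s

def quarter_wave_stub_length(f_hz: float, eps_eff: float) -> float:
    """Physical length (m) of a quarter guided wavelength at f_hz."""
    guided_wavelength = C0 / (f_hz * eps_eff ** 0.5)
    return guided_wavelength / 4.0

f_center = (3.9e9 + 6.6e9) / 2  # mid-band of the 3.9-6.6 GHz filter
length_mm = quarter_wave_stub_length(f_center, eps_eff=2.8) * 1e3
print(f"stub length at {f_center / 1e9:.2f} GHz: {length_mm:.2f} mm")
```

At mid-band this gives a stub on the order of a few millimetres, which illustrates why drilling and aligning vias at these dimensions becomes costly.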
Procedia PDF Downloads 159

5889 Improving Fake News Detection Using K-means and Support Vector Machine Approaches
Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy
Abstract:
Fake news and false information are major challenges for all types of media, especially social media. There is a great deal of false information, fake likes, views, and duplicated accounts, as large social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading. It needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity while obtaining a better result, the dimensions need to be reduced. One of the best techniques for reducing data size is feature selection, which aims to choose a feature subset from the original set to improve classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed a better classification of false information for our work. The detection performance was improved in two respects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because redundant features were eliminated and the dataset dimensions were reduced.
Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine
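The four-step pipeline above can be sketched with scikit-learn on synthetic data. The cluster count and the "representative feature" rule (the feature closest to its cluster centroid) are illustrative choices, not the authors' exact settings.

```python
# Sketch: cluster similar features with K-means, keep one feature per
# cluster, then classify with an SVM on the reduced feature subset.
# ASSUMPTION: synthetic data and cluster count are for illustration only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=40, n_informative=10,
                           random_state=0)

# Steps 1-2: group similar features by clustering the feature columns.
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X.T)

# Step 3: keep one representative feature per cluster (closest to centroid).
selected = []
for c in range(km.n_clusters):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(X.T[members] - km.cluster_centers_[c], axis=1)
    selected.append(members[np.argmin(dists)])

# Step 4: classify on the reduced feature subset with an SVM.
X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, random_state=0)
clf = SVC().fit(X_tr, y_tr)
print(f"accuracy on {len(selected)} of 40 features: {clf.score(X_te, y_te):.2f}")
```

Training on 8 features instead of 40 is what yields the runtime reduction the abstract reports; accuracy depends on how much redundancy the clusters capture.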
Procedia PDF Downloads 178

5888 Adaptive Certificate-Based Mutual Authentication Protocol for Mobile Grid Infrastructure
Authors: H. Parveen Begam, M. A. Maluk Mohamed
Abstract:
Mobile grid computing is an environment that allows sharing and coordinated use of diverse resources in a dynamic, heterogeneous, and distributed environment using different types of portable electronic devices. In a grid environment, security issues such as authentication, authorization, message protection, and delegation are handled by the GSI (Grid Security Infrastructure). Providing better security between mobile devices and the grid infrastructure is a major issue because of the open nature of wireless networks and the heterogeneous, distributed environment. Although the individual computing devices in a mobile grid may be resource-limited in isolation, as an aggregate they have the potential to play a vital role within the mobile grid environment. Some adaptive methodology or solution is needed to address issues such as authentication of a base station, security of information flowing between a mobile user and a base station, prevention of attacks within a base station, hand-over of authentication information, the communication cost of establishing a session key between mobile user and base station, and the computational complexity of achieving authenticity and security. The sharing of device resources can be achieved only through trusted relationships between the mobile hosts (MHs). Before accessing a grid service, the mobile devices must be proven authentic. This paper proposes a dynamic certificate-based mutual authentication protocol between two mobile hosts in a mobile grid environment. The certificate generation process is performed by a CA (Certificate Authority) for all authenticated MHs. Security (through the validity period of the certificate) and dynamicity (through the transmission time) can be achieved via secure service certificates.
The authentication protocol is built on communication services to provide cryptographically secured mechanisms for verifying the identity of users and resources.
Keywords: mobile grid computing, certificate authority (CA), SSL/TLS protocol, secured service certificates
Procedia PDF Downloads 308

5887 Compliance of Systematic Reviews in Ophthalmology with the PRISMA Statement
Authors: Seon-Young Lee, Harkiran Sagoo, Reem Farwana, Katharine Whitehurst, Alex Fowler, Riaz Agha
Abstract:
Background/Aims: Systematic reviews and meta-analyses are an increasingly important way of summarizing research evidence. Research in ophthalmology may present further challenges due to the potential complexity of study designs. The aim of our study was to determine the reporting quality of systematic reviews and meta-analyses in ophthalmology against the PRISMA statement, by assessing articles published between 2010 and 2015 in the five journals with the highest impact factor. Methods: MEDLINE and EMBASE were used to search for systematic reviews published between January 2010 and December 2015 in five major ophthalmology journals: Progress in Retinal and Eye Research, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology, and Journal of the American Optometric Association. Screening, identification, and scoring of articles were performed independently by two teams, followed by statistical analysis including the median, range, and 95% CIs. Results: 115 articles were included. The median PRISMA score was 15 of 27 items (56%), with a range of 5-26 (19-96%) and 95% CI 13.9-16.1 (51-60%). Compliance was highest for items related to the description of rationale (item 3, 100%) and inclusion of a structured summary in the abstract (item 2, 90%), and poorest for indication of a review protocol and registration (item 5, 9%), specification of risk of bias affecting the cumulative evidence (item 15, 24%), and description of clear objectives in the introduction (item 4, 26%). Conclusion: The reporting quality of systematic reviews and meta-analyses in ophthalmology needs significant improvement. While the use of the PRISMA criteria as a guideline before journal submission is recommended, additional research identifying potential barriers may be required to improve compliance with the PRISMA guidelines.
Keywords: systematic reviews, meta-analysis, research methodology, reporting quality, PRISMA, ophthalmology
Procedia PDF Downloads 264

5886 Adopted Method of Information System Strategy for Knowledge Management System: A Literature Review
Authors: Elin Cahyaningsih, Dana Indra Sensuse, Wahyu Catur Wibowo, Sofiyanti Indriasari
Abstract:
The bureaucracy reform program is driving the Indonesian government to change its management and supporting units in order to enhance organizational performance. Information technology, as one of those supporting units, became part of the strategic plan the organization tried to improve, because IT can automate and speed up processes, making the business process life cycle more effective and efficient. A knowledge management system is a technology application supporting knowledge management implementation in government, with requirements based on the problems and potential functionality of each knowledge management process. Defining knowledge management suitable for each organization is difficult, which is why the knowledge management system strategy should be formulated as an alignment of the knowledge management processes in the organization. A knowledge management system is a form of information system development from a people perspective, because this kind of system depends heavily on human interaction and participation. A strategic plan for developing a knowledge management system can be determined using several information system strategy methods. This research was conducted to identify the types of information system strategy methods, the stages of activity in each method, and each method's strengths and weaknesses. The authors used a literature review to identify and classify information system strategy methods, differentiate method types, and categorize common activities, strengths, and weaknesses. As a result, this research identifies and compares six strategic information system methods: Balanced Scorecard, Porter's Five Forces, SWOT analysis, Value Chain Analysis, Risk Analysis, and Gap Analysis. Balanced Scorecard and Risk Analysis are regarded as the most commonly used strategic methods and those with the greatest strengths.
Keywords: knowledge management system, balanced scorecard, five force, risk analysis, gap analysis, value chain analysis, SWOT analysis
Procedia PDF Downloads 481

5885 The Regional Novel in India: Its Emergence and Trajectory
Authors: Aruna Bommareddi
Abstract:
The journey of the novel is well examined in Indian academia as an offshoot of the novel in English. There have been many attempts to understand aspects of the early novel in India that shared a commonality with the English novel. The regional novel has had an entirely different trajectory, which is mapped in this paper. The main focus of the paper is the historical emergence of the genre of the regional novel in Indian literatures, with specific reference to Kannada, Hindi, and Bengali. The selection of these languages is guided not only by familiarity with them but also by the significance these languages enjoy in the subcontinent and by the emergence of the regional novel as a specific category in them. The regional novels under study are Phaneeswarnath Renu's Maila Anchal, Tarashankar Bandopadhyaya's Ganadevata, and Kuvempu's House of Kanuru, examined for the themes of the genre's emergence and for aspects of the regional novel they share with, and in which they differ from, one another. The paper explores the various movements that have shaped the genre of the regional novel in these literatures. Though Phaneeswarnath Renu's Maila Anchal was published in 1956, the novel is set in pre-Independence India and therefore shares a commonality of themes with the other two novels, House of Kanuru and Ganadevata. All three novels explore themes of superstition, ignorance, and poverty, and the interventions of educated youth to salvage the crises in these backward regional worlds. In fact, it was Renu who assertively declared that he was going to write a regional novel; hence the title of the first regional novel in Hindi is Maila Anchal, meaning "the soiled border". In Hindi, anchal also means "region"; the title is therefore suggestive of a dirty region as well. The novel exposes the squalor, ignorance, and conflict-ridden life of the village, or region, as opposed to the rosy image of the village in literature.
With this, all such novels depicting conflicts of the region came to be recognized as regional novels, even those written prior to Renu's declaration. All three novels under study succeed in bringing out the complexity of rural life at a given point in its history.
Keywords: bengali, hindi, kannada, regional novel, telugu
Procedia PDF Downloads 81

5884 A Novel Method for Face Detection
Authors: H. Abas Nejad, A. R. Teymoori
Abstract:
Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for supervised-learning-based facial expression recognition methods, because supervised methods cannot accommodate all the appearance variability across faces with respect to race, pose, lighting, facial biases, etc., within a limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications such as video chat or photo album/web browsing. Detecting the neutral state at an early stage and bypassing those frames during emotion classification would save computational power. In this work, we propose a lightweight neutral-vs-emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of the KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information on the directionality of the specific facial action units acting on that KE point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
Keywords: neutral vs. emotion classification, Constrained Local Model, procrustes analysis, Local Binary Pattern Histogram, statistical model
Procedia PDF Downloads 342

5883 Rights, Differences and Inclusion: The Role of Transdisciplinary Approach in the Education for Diversity
Authors: Ana Campina, Maria Manuela Magalhaes, Eusebio André Machado, Cristina Costa-Lobo
Abstract:
The inclusive school advocates respect for differences, equal opportunities, and quality education for all, including students with special educational needs. In the pursuit of educational equity, guaranteeing equality in access and outcomes, it becomes the school's responsibility to recognize students' needs, adapting to the various styles and rhythms of learning and ensuring the adequacy of curricula, strategies, and resources, both material and human. This paper presents a set of theoretical reflections at the disciplinary interface between legal and education sciences and school administration and management, with the aim of understanding the real characteristics of inclusion in balance with inclusion policies and the need for an education in Human Rights, especially for diversity. Considering today's social complexity alongside the important educational instruments and strategies, mostly embodied in policy, this paper aims to expose the existing contexts that run counter to the laws, policies, and inclusive educational needs. More than a single study, this research aims to develop a map of the current reality and guidelines for implementing action. The results point to the usefulness and pertinence of a school in which educational managers, teachers, parents, and students are involved in the creation, implementation, and monitoring of flexible curricula adapted to students' educational needs, promoting collaborative work among teachers. We are then faced with a scenario that points to the need to reflect on the legislation and curricular management of inclusive classes and to operationalize the processes of elaborating curricular adaptations and differentiation in the classroom.
The transdisciplinary approach is an ideal pedagogic and social education approach, built on the Human Rights binomial of teaching and learning and supported by inclusion laws, in line with the realistic needs of constructing an effective, successful society.
Keywords: rights, transdisciplinary, inclusion policies, education for diversity
Procedia PDF Downloads 392

5882 Digital Portfolio as Mediation to Enhance Willingness to Communicate in English
Authors: Saeko Toyoshima
Abstract:
This research discusses whether performance tasks with technology can enhance students' willingness to communicate. The present study investigated how Japanese learners of English changed their attitude to communication in their target language by experiencing a performance task, called a 'digital portfolio', in the classroom, applying the concepts of action research. The study adopted questionnaires including four-point Likert and open-ended questions, as mixed-methods research. There were 28 students in the class. Many Japanese university students with low proficiency (A1 in the Common European Framework of Reference for Languages) have difficulty communicating in English due to their low proficiency and the lack of practice in and outside the classroom in secondary education. They need to mediate between themselves in the worlds of L1 and L2 by completing a performance task for communication. This paper introduces the practice of a CALL class in which A1-level students made their 'digital portfolio' related to topics from TED® (Technology, Entertainment, Design) Talk materials. The students had a 'Portfolio Session' twice in one term, once in the middle and once at the end of the course, where they introduced their portfolio to their classmates and to international students in English. The present study asked the students to answer a questionnaire about willingness to communicate twice, once at the end of the first term and once at the end of the second term. The four-point Likert questions were statistically analyzed with a t-test, and the answers to the open-ended questions were analyzed to clarify the difference between the two administrations. The results showed that the students developed a more positive attitude to communication in English and enhanced their willingness to communicate through the experience of the task.
The implication of this paper is that making and presenting a portfolio as a performance task leads students to construct themselves in English and enables them to communicate with others enjoyably and autonomously.
Keywords: action research, digital portfolio, computer-assisted language learning, ELT with CALL system, mixed methods research, Japanese English learners, willingness to communicate
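The statistical step described above, a t-test comparing the two questionnaire administrations, can be sketched as follows. The scores below are invented for demonstration only (the study's actual responses are not reported in the abstract), and a paired test is assumed since the same 28 students answered twice.

```python
# Paired t-test on four-point Likert responses from two administrations.
# ASSUMPTION: the score lists are fabricated for illustration; a paired
# (related-samples) test is assumed because the same students answered twice.
from scipy import stats

term1 = [2, 3, 2, 2, 3, 1, 2, 3, 2, 2, 3, 2, 1, 2, 3, 2, 2, 3, 2, 2,
         3, 2, 2, 1, 2, 3, 2, 2]          # 28 students, end of first term
term2 = [3, 3, 3, 2, 4, 2, 3, 3, 3, 2, 4, 3, 2, 3, 3, 3, 2, 4, 3, 3,
         3, 3, 2, 2, 3, 4, 3, 3]          # same students, end of second term

t_stat, p_value = stats.ttest_rel(term2, term1)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A positive t with p below the conventional 0.05 threshold would indicate the second-term scores are significantly higher, which is the pattern the study reports.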
Procedia PDF Downloads 120

5881 Determination of Authorship of the Works Created by the Artificial Intelligence
Authors: Vladimir Sharapaev
Abstract:
This paper seeks to address the question of the authorship of copyrighted works created solely by artificial intelligence or with its use, and proposes possible interpretational or legislative solutions to the problems arising from the plurality of persons potentially involved in the ultimate creation of the work and the division of tasks among them. Based on the commonly accepted assumption that a copyrighted work can only be created by a natural person, the paper does not deal with issues regarding the creativity of artificial intelligence per se (or the lack thereof), and instead focuses on the distribution of the intellectual property rights potentially belonging to the creators of the artificial intelligence and/or the creators of the content used in the formation of the copyrighted work. Moreover, the technical development and rapid improvement of AI-based programmes, which tend to reach ever greater independence from human beings, raise the question of whether the initial creators of the artificial intelligence can be entitled to the intellectual property rights to the works created by such AI at all. As the juridical practice of some European courts and legal doctrine tend to incline to the latter opinion, indicating that works created by AI may not enjoy copyright protection at all, questions of authorship appear to be causing great concern among investors in the development of the relevant technology. Although technology companies have further instruments for protecting their investments at their disposal, the risk of the works in question not being copyrighted, caused by inconsistency in the case law and a certain research gap, constitutes a highly important issue. In order to assess the possible interpretations, the author adopted a doctrinal and analytical approach to the research, systematically analysing European and Czech copyright laws and case law in some EU jurisdictions.
This study aims to contribute to greater legal certainty regarding the issues of the authorship of AI-created works and to define possible clues for further research.
Keywords: artificial intelligence, copyright, authorship, copyrighted work, intellectual property
Procedia PDF Downloads 123

5880 System Analysis of Quality Assurance in Online Education
Authors: Keh-Wen Carin Chuang, Kuan-Chou Chen
Abstract:
Our society is in a constant state of change. Technology advancements continue to affect our daily lives. How we work, communicate, and entertain ourselves has changed dramatically in the past decades. As our society learns to accept and adapt to the many technological advances that seem to inundate every part of our lives, education institutions must migrate from traditional methods of instruction to online education in order to take full advantage of the opportunities these advancements provide. Many benefits can be gained for the university and society by offering online programs that utilize advanced technologies, but such programs must not be implemented carelessly. The key to providing a quality online program is the issue of perceived quality, which takes into account the viewpoints of all stakeholders involved. To truly ensure institutional quality, however, a systemic view of all factors contributing to quality must be analyzed and linked together, allowing education administrators to understand how each factor contributes to the perceived quality of online education. The perceived quality of an online program will be positively reinforced only through an organization-wide effort that focuses on managed administration, online program branding, skilled faculty, supportive alumni, student satisfaction, and effective delivery systems, each of which is vital to a quality online program. This study focuses on the concept of quality assurance in the start-up, implementation, and sustainability of online education. A case study of an online MBA program is analyzed to explore quality assurance. The difficulty in promoting online education quality is that universities are complex networks of disciplinary, social, economic, and political fiefdoms, with factors both internal and external to the institution.
As such, system analysis, a systems-thinking approach, is ideal for investigating the factors of perceived quality and how each factor contributes to it in the online education domain.
Keywords: systems thinking, quality assurance, online education, MBA program
Procedia PDF Downloads 238

5879 A Machine Learning Approach for Detecting and Locating Hardware Trojans
Authors: Kaiwen Zheng, Wanting Zhou, Nan Tang, Lei Li, Yuanhang He
Abstract:
The integrated circuit industry has become a cornerstone of the information society, finding widespread application in areas such as industry, communication, medicine, and aerospace. However, with the increasing complexity of integrated circuits, Hardware Trojans (HTs) implanted by attackers have become a significant threat to their security. In this paper, we propose a hardware Trojan detection method for large-scale circuits. As HTs introduce changes in physical characteristics such as structure, area, and power consumption through additional redundant circuits, we propose a machine-learning-based hardware Trojan detection method based on the physical characteristics of gate-level netlists. This method transforms hardware Trojan detection into a machine-learning binary classification problem over physical characteristics, greatly improving detection speed. To address the problem of imbalanced data, where the number of HT circuit samples is far less than that of pure circuit samples, we used the SMOTETomek algorithm to expand the dataset and further improve the performance of the classifier. We used three machine learning algorithms, K-Nearest Neighbors, Random Forest, and Support Vector Machine, to train and validate on benchmark circuits from Trust-Hub, and all achieved good results. In our case studies based on AES encryption circuits provided by Trust-Hub, the test results showed the effectiveness of the proposed method. To further validate the method's effectiveness for detecting variant HTs, we designed variant HTs using open-source HTs. The proposed method guarantees robust detection accuracy with millisecond-level detection time for IC and FPGA design flows and shows good detection performance for library variant HTs.
Keywords: hardware trojans, physical properties, machine learning, hardware security
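The detection-as-binary-classification idea can be sketched as follows. The paper balances the classes with SMOTETomek (from the imbalanced-learn package); here a plain random oversampler stands in so the example needs only scikit-learn, the features are synthetic rather than real gate-level netlist statistics, and the Trojan class is taken as the rare minority.

```python
# Sketch: imbalanced binary classification with three classifiers, as in
# the abstract. ASSUMPTIONS: synthetic features; random oversampling
# stands in for SMOTETomek; Trojan samples are the minority class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Imbalanced data: ~5% "Trojan" (label 1) samples.
X, y = make_classification(n_samples=2000, n_features=12, weights=[0.95],
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

# Oversample the minority class in the training set only.
rng = np.random.default_rng(1)
minority = np.where(y_tr == 1)[0]
extra = rng.choice(minority, size=(y_tr == 0).sum() - minority.size)
X_bal = np.vstack([X_tr, X_tr[extra]])
y_bal = np.concatenate([y_tr, y_tr[extra]])

for name, clf in [("KNN", KNeighborsClassifier()),
                  ("RF", RandomForestClassifier(random_state=1)),
                  ("SVM", SVC())]:
    score = clf.fit(X_bal, y_bal).score(X_te, y_te)
    print(f"{name}: accuracy {score:.2f}")
```

Resampling is applied to the training split only, so the test set keeps the realistic class ratio; SMOTETomek additionally synthesizes new minority points and removes Tomek links rather than duplicating samples.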
Procedia PDF Downloads 149

5878 Control of a Quadcopter Using Genetic Algorithm Methods
Authors: Mostafa Mjahed
Abstract:
This paper concerns the control of a nonlinear system using two different methods: a reference model and a genetic algorithm. The quadcopter is a nonlinear, unstable system belonging to the family of aerial robots. It consists of four rotors placed at the ends of a cross, with the control circuit occupying the center of the cross. Its motion is governed by six degrees of freedom: three rotations around three axes (roll, pitch, and yaw) and three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters involved. Numerous studies have been developed to model and stabilize such systems. The classical PID and LQ correction methods are widely used. While these have the advantage of simplicity because they are linear, they have the drawback of requiring a linear model for synthesis. This also complicates the resulting control laws, because they must be extended over the quadcopter's entire flight envelope. Note that, while classical design methods are widely used to control aeronautical systems, Artificial Intelligence methods such as genetic algorithms receive little attention. In this paper, we compare two PID design methods. First, the parameters of the PID are calculated according to a reference model. In a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves according to a reference system imposed by certain specifications: settling time, zero overshoot, etc. Inspired by the natural evolution of Darwin's theory advocating the survival of the fittest, John Holland developed this evolutionary algorithm. A genetic algorithm (GA) possesses three basic operators: selection, crossover, and mutation. We start the iterations with an initial population, and each member of this population is evaluated through a fitness function.
Our purpose is to correct the behavior of the quadcopter around three axes (roll, pitch, and yaw) with three PD controllers. For the altitude, we adopt a PID controller.
Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system
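The selection/crossover/mutation loop described above can be sketched as a minimal GA tuning a PD controller on a toy double-integrator model (a crude stand-in for one quadcopter attitude axis, not the paper's actual dynamics). Population size, mutation rate, and the fitness definition (integrated absolute error of a step response) are illustrative choices.

```python
# Minimal GA sketch for PD gain tuning on a toy double-integrator plant.
# ASSUMPTIONS: the plant, GA settings, and fitness function are illustrative.
import numpy as np

def step_response_error(kp: float, kd: float) -> float:
    """Integrated absolute error of a unit-step response, dt = 0.01 s."""
    angle, rate, err_sum, dt = 0.0, 0.0, 0.0, 0.01
    for _ in range(500):                     # simulate 5 seconds
        err = 1.0 - angle
        torque = kp * err - kd * rate        # PD control law
        rate += torque * dt                  # double-integrator plant
        angle += rate * dt
        err_sum += abs(err) * dt
    return err_sum

rng = np.random.default_rng(0)
pop = rng.uniform(0.1, 20.0, size=(30, 2))   # 30 candidate (kp, kd) pairs
for _ in range(40):                          # generations
    fitness = np.array([step_response_error(kp, kd) for kp, kd in pop])
    parents = pop[np.argsort(fitness)[:10]]            # selection: keep best
    children = (parents[rng.integers(0, 10, 20)] +
                parents[rng.integers(0, 10, 20)]) / 2  # crossover: averaging
    children += rng.normal(0, 0.5, children.shape)     # mutation
    pop = np.vstack([parents, np.clip(children, 0.01, 50.0)])

best = pop[np.argmin([step_response_error(kp, kd) for kp, kd in pop])]
print(f"best gains: kp = {best[0]:.2f}, kd = {best[1]:.2f}")
```

Keeping the ten best parents in each generation (elitism) guarantees the fitness of the best individual never worsens, which is why the loop converges toward well-damped gains.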
Procedia PDF Downloads 435

5877 Geospatial Technologies in Support of Civic Engagement and Cultural Heritage: Lessons Learned from Three Participatory Planning Workshops for Involving Local Communities in the Development of Sustainable Tourism Practices in Latiano, Brindisi
Authors: Mark Opmeer
Abstract:
The fruitful relationship between cultural heritage and digital technology is evident. Due to the development of user-friendly software, an increasing amount of heritage scholars use ict for their research activities. As a result, the implementation of information technology for heritage planning has become a research objective in itself. During the last decades, we have witnessed a growing debate and literature about the importance of computer technologies for the field of cultural heritage and ecotourism. Indeed, implementing digital technology in support of these domains can be very fruitful for one’s research practice. However, due to the rapid development of new software scholars may find it challenging to use these innovations in an appropriate way. As such, this contribution seeks to explore the interplay between geospatial technologies (geo-ict), civic engagement and cultural heritage and tourism. In this article, we discuss our findings on the use of geo-ict in support of civic participation, cultural heritage and sustainable tourism development in the southern Italian district of Brindisi. In the city of Latiano, three workshops were organized that involved local members of the community to distinguish and discuss interesting points of interests (POI’s) which represent the cultural significance and identity of the area. During the first workshop, a so called mappa della comunità was created on a touch table with collaborative mapping software, that allowed the participators to highlight potential destinations for tourist purposes. Furthermore, two heritage-based itineraries along a selection of identified POI’s was created to make the region attractive for recreants and tourists. These heritage-based itineraries reflect the communities’ ideas about the cultural identity of the region. 
Both trails were subsequently implemented in a dedicated mobile application (app), which was evaluated with the members of the community during the second workshop using a mixed-method approach. In the final workshop, the findings of the collaboration, the heritage trails and the app were evaluated with all participants. Based on our conclusions, we argue that geospatial technologies have a significant potential for involving local communities in heritage planning and tourism development. The participants of the workshops found it engaging to share their ideas and knowledge using the digital map on the touch table. Secondly, the use of a mobile application as an instrument to test the heritage-based itineraries in the field was broadly considered fun and beneficial for enhancing community awareness of and participation in local heritage. The app furthermore stimulated the community’s awareness of the added value of geospatial technologies for sustainable tourism development in the area. We conclude this article with a number of recommendations in order to provide a best practice for organizing heritage workshops with similar objectives.
Keywords: civic engagement, geospatial technologies, tourism development, cultural heritage
Procedia PDF Downloads 290
5876 A Modular Reactor for Thermochemical Energy Storage Examination of Ettringite-Based Materials
Authors: B. Chen, F. Kuznik, M. Horgnies, K. Johannes, V. Morin, E. Gengembre
Abstract:
Renewable energy has received increased attention since the adoption of the Paris Agreement against climate change. Solar-based technology is considered one of the most promising green energy technologies for residential buildings, given its wide thermal use for hot water and heating. However, the seasonal mismatch between its production and consumption means that buildings need an energy storage system to improve the efficiency of renewable energy use. Different kinds of energy storage systems already exist, using sensible or latent heat. Considering the energy dissipation during storage and the low energy density of these two methods, thermochemical energy storage is recommended instead. Recently, ettringite (3CaO∙Al₂O₃∙3CaSO₄∙32H₂O) based materials have been reported as potential thermochemical storage materials because of their high energy density (~500 kWh/m³), low material cost (700 €/m³) and low storage temperature (~60-70°C), compared to reported salt hydrates like SrBr₂·6H₂O (42 k€/m³, ~80°C), LaCl₃·7H₂O (38 k€/m³, ~100°C) and MgSO₄·7H₂O (5 k€/m³, ~150°C). They could therefore be widely used in the building sector, coupled to ordinary solar panel systems. On the other hand, the lack of extensive examination means that their thermal properties remain poorly known, which limits the maturity of this technology. The aim of this work is to develop a modular reactor suited to the thermal characterization of ettringite-based material particles of different sizes. The filled materials in the reactor can be self-compacted vertically to ensure that hot air or humid air flows through homogeneously. Additionally, quick assembly and modification of the reactor, like LEGO™ plastic blocks, make it suitable for distinct thermochemical energy storage material samples of different weights (from a few grams to several kilograms). 
In our case, the quantity of stored and released energy, the optimal working conditions, and the chemical durability of ettringite-based materials have been investigated.
Keywords: dehydration, ettringite, hydration, modular reactor, thermochemical energy storage
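As a rough illustration of the figures reported above, the energy and material cost of a given storage volume follow directly from the ~500 kWh/m³ and 700 €/m³ values; the 2 m³ volume in this sketch is a hypothetical example, not a value from the study:

```python
# Back-of-the-envelope storage estimate for an ettringite-based material.
# The ~500 kWh/m^3 energy density and 700 EUR/m^3 cost are taken from the
# abstract; the storage volume is a hypothetical example value.
ENERGY_DENSITY_KWH_PER_M3 = 500  # reported energy density
MATERIAL_COST_EUR_PER_M3 = 700   # reported material cost

def storage_estimate(volume_m3):
    """Return (stored energy in kWh, material cost in EUR) for a volume."""
    energy_kwh = ENERGY_DENSITY_KWH_PER_M3 * volume_m3
    cost_eur = MATERIAL_COST_EUR_PER_M3 * volume_m3
    return energy_kwh, cost_eur

energy, cost = storage_estimate(2.0)  # e.g. a 2 m^3 domestic store
print(energy, cost)  # 1000.0 kWh of storage for 1400.0 EUR of material
```

A 2 m³ store would thus hold about 1 MWh of heat, which is the scale of interest for bridging the seasonal mismatch mentioned above.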
Procedia PDF Downloads 139
5875 Saving Energy through Scalable Architecture
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
In this paper, we focus on the importance of scalable architecture for data centers and buildings in general to help an enterprise achieve environmental sustainability. A scalable architecture helps in many ways: it adapts to business and user requirements, and it promotes high-availability and disaster recovery solutions that are cost-effective and low-maintenance. Scalable architecture also plays a vital role in the three core areas of sustainability: economy, environment, and social, also known as the three pillars of a sustainability model. If the architecture is scalable, it has many advantages. For example, scalable architecture helps businesses and industries adapt to changing technology, drive innovation, promote platform independence, and build resilience against natural disasters. Most importantly, a scalable architecture helps industries bring in cost-effective measures for energy consumption, reduce wastage, increase productivity, and enable a robust environment. It also helps reduce carbon emissions through advanced monitoring and metering capabilities. Scalable architectures help reduce waste by optimizing designs to utilize materials efficiently, minimize resource use, and decrease carbon footprints by using low-impact, environmentally friendly materials. In this paper, we also emphasize the importance of a cultural shift towards the reuse and recycling of natural resources to support a balanced ecosystem and maintain a circular economy. Also, since all of us are involved in the use of computers, much of the scalable architecture we have studied is related to data centers.
Keywords: scalable architectures, sustainability, application design, disruptive technology, machine learning and natural language processing, AI, social media platform, cloud computing, advanced networking and storage devices, advanced monitoring and metering infrastructure, climate change
Procedia PDF Downloads 110
5874 Assessing the Financial Impact of Federal Benefit Program Enrollment on Low-income Households
Authors: Timothy Scheinert, Eliza Wright
Abstract:
Background: Link Health is a Boston-based non-profit leveraging in-person and digital platforms to promote health equity. Its primary aim is to financially support low-income individuals through enrollment in federal benefit programs. This study examines the monetary impact of enrollment in several benefit programs. Methodologies: Approximately 17,000 individuals have been screened for eligibility via digital outreach, community events, and in-person clinics. Enrollment and financial distributions are evaluated across programs, including the Affordable Connectivity Program (ACP), Lifeline, LIHEAP, Transitional Aid to Families with Dependent Children (TAFDC), and the Supplemental Nutrition Assistance Program (SNAP). Major Findings: A total of 1,895 individuals have successfully applied, collectively securing an estimated $1,288,152.00 in aid. The largest contributors to this sum include: ACP: 1,149 enrollments, $413,640 distributed annually. Child Care Financial Assistance (CCFA): 15 enrollments, $240,000 distributed annually. Lifeline: 602 enrollments, $66,822 distributed annually. LIHEAP: 25 enrollments, $48,750 distributed annually. SNAP: 41 enrollments, $123,000 distributed annually. TAFDC: 21 enrollments, $341,760 distributed annually. Conclusions: These results highlight the role of targeted outreach and effective enrollment processes in promoting access to federal benefit programs. High enrollment rates in ACP and Lifeline demonstrate a considerable need for affordable broadband and internet services. Programs like CCFA and TAFDC, despite lower enrollment numbers, provide sizable support per individual. This analysis advocates for continued funding of federal benefit programs. Future efforts can be made to develop screening tools that identify eligibility for multiple programs and reduce the complexity of enrollment.
Keywords: benefits, childcare, connectivity, equity, nutrition
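The itemized figures above can be tallied with a short script; note that the listed programs sum to less than the reported overall total of $1,288,152, which presumably also covers programs not itemized here:

```python
# Tally the per-program aid figures listed in the abstract and the
# implied aid per successful enrollment for each program.
programs = {
    # name: (enrollments, dollars distributed annually)
    "ACP":      (1149, 413_640),
    "CCFA":     (15,   240_000),
    "Lifeline": (602,   66_822),
    "LIHEAP":   (25,    48_750),
    "SNAP":     (41,   123_000),
    "TAFDC":    (21,   341_760),
}

listed_total = sum(amount for _, amount in programs.values())
per_enrollee = {name: amount / n for name, (n, amount) in programs.items()}

print(listed_total)                 # 1233972 (itemized programs only)
print(round(per_enrollee["CCFA"]))  # 16000 -> high support per individual
```

The per-enrollee figures make the conclusion concrete: CCFA delivers about $16,000 per enrollment versus roughly $360 for ACP, which is why low-enrollment programs can still be "sizable support per individual".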
Procedia PDF Downloads 30
5873 Sustainable Community Education: Strategies for Long-Term Impact
Authors: Kariman Abdelaziz Ahmed Ali Hamzawy
Abstract:
Amidst the growing global challenges facing communities, from climate change to educational gaps, sustainable community education has emerged as a vital tool for ensuring comprehensive and enduring development. This research aims to explore effective strategies for sustainable community education that can lead to long-term impacts on local communities. The study begins by defining the concept of sustainable education within a community context and reviews the current literature on the topic. It then presents case studies from various communities around the world where sustainable educational strategies have been successfully implemented. These case studies illustrate how sustainable education can enhance community engagement, build local capacities, and improve quality of life in sustainable ways. The findings from these studies are analyzed to identify the key factors contributing to the success of sustainable educational programs. These factors include partnerships between different sectors (governmental, private, and community), the innovative use of technology, and the adaptation of educational curricula to meet the unique needs of the community. The research also offers practical recommendations on designing and implementing sustainable educational programs, emphasizing the integration of formal and informal education, promoting lifelong learning, and developing local resources. It addresses potential challenges and ways to overcome them to ensure the long-term sustainability of these programs. In conclusion, the research provides a future vision of the role of sustainable education in building resilient and prosperous communities and highlights the importance of investing in education as a key driver of sustainable development. 
This study contributes to the ongoing discussion on achieving lasting impact through sustainable community education and offers a practical framework for stakeholders to adopt and implement these strategies.
Keywords: sustainable education, community education, community engagement, local capacity building, educational technology
Procedia PDF Downloads 56
5872 Decision Making in Medicine and Treatment Strategies
Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi
Abstract:
Three reasons justify the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying opportunities for treatment from large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probabilities of success of the treatments involved, and to differing assessments of the value of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The problem is what is meant by "best option", or what criteria should guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice, and facilitates the search for consensus. In this regard, there are three types of situations: certain situations, risky situations, and uncertain situations: 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, every decision can have several consequences, and the probability of each of these consequences is known. 3. In uncertain situations, each decision can have several consequences, whose probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. 
Decision theory can make decisions more transparent: first, by systematically clarifying the data relevant to the problem, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus assists the patient and doctor in their choices.
Keywords: decision making, medicine, treatment strategies, patient
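The "risky situation" described above is the classic setting for expected-utility analysis: each option's expected utility is the probability-weighted sum of its outcome utilities. A minimal sketch follows; the treatment options, probabilities, and utilities are invented for illustration, not drawn from the paper:

```python
# Expected-utility comparison of two hypothetical treatments.
# Each option maps an outcome to (probability, utility); all values
# here are invented example numbers, not clinical data.
def expected_utility(option):
    """Probability-weighted sum of utilities over all outcomes."""
    return sum(p * u for p, u in option.values())

surgery = {"cure": (0.7, 1.0), "complication": (0.2, 0.4), "failure": (0.1, 0.0)}
medication = {"cure": (0.5, 1.0), "partial": (0.4, 0.6), "failure": (0.1, 0.0)}

eu = {"surgery": expected_utility(surgery),
      "medication": expected_utility(medication)}
best = max(eu, key=eu.get)
print(eu, best)  # surgery: 0.78, medication: 0.74 -> surgery preferred
```

This is exactly the kind of "operational tool" the abstract refers to: once probabilities and patient-specific utilities are made explicit, the choice criterion becomes transparent and open to discussion.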
Procedia PDF Downloads 581
5871 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia
Authors: Eyosiyas Aga
Abstract:
The development of information and communication technology has transformed paper-based mapping and land registration processes into computerized and networked systems. The computerization and networking of real property information systems play a vital role in the good governance and sustainable development of emerging countries through cost-effective, easy and accessible service delivery for the customer. An efficient, transparent and sustainable real property system is becoming basic infrastructure for urban development, improving data management and service delivery in organizations. In Ethiopia, real property administration is paper-based; as a result, it is confronted with problems of data management, illegal transactions, corruption, and poor service delivery. To solve these problems and to facilitate the real property market, the implementation of a web-based real property updating system is crucial. A web-based real property updating system is one automation (computerization) method to facilitate data sharing and to reduce the time and cost of service delivery in a real property administration system. In addition, it is useful for the integration of data across different information systems and organizations. The system is designed by combining open source software that supports Open Geospatial Consortium (OGC) standards. OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) are the most widely used standards to support and improve web-based real property updating. These services allow the integration of data from different sources and can be used to maintain consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage real property data and connect it to the flex viewer and user interface. 
The system is designed for both an internal updating system (municipality), which mainly updates spatial and textual information, and an external system (customer), which focuses on serving and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective and secure way. The system is designed by combining and customizing open source software to enhance its efficiency in a cost-effective way. The existing workflow for real property updating was analyzed to identify the bottlenecks, and a new workflow was designed for the system. The requirements were identified through a questionnaire and literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and the contributions of an effective data management system to efficient, transparent and sustainable urban development in Ethiopia.
Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service
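As a small illustration of the OGC services mentioned above, a WFS GetFeature request is just a parameterized HTTP call such as a GeoServer instance might answer; the endpoint URL and layer name below are hypothetical placeholders, not details from the study:

```python
# Sketch of a WFS 1.1.0 GetFeature request as served by e.g. GeoServer.
# The base URL and "cadastre:parcels" layer name are hypothetical.
from urllib.parse import urlencode

def build_getfeature_url(base_url, layer, max_features=50):
    """Assemble the query string for a WFS GetFeature call."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": layer,
        "outputFormat": "application/json",
        "maxFeatures": max_features,
    }
    return f"{base_url}?{urlencode(params)}"

url = build_getfeature_url("http://localhost:8080/geoserver/wfs",
                           "cadastre:parcels")
print(url)
```

Because WFS is a plain standardized HTTP interface, the same request works against any compliant server, which is what makes the cross-organization data integration described above possible.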
Procedia PDF Downloads 268
5870 An Unsupervised Domain-Knowledge Discovery Framework for Fake News Detection
Authors: Yulan Wu
Abstract:
With the rapid development of social media, the issue of fake news has gained considerable prominence, drawing the attention of both the public and governments. The widespread dissemination of false information poses a tangible threat across multiple domains of society, including politics, the economy, and health. However, much research has concentrated on supervised models trained within specific domains, and their effectiveness diminishes when applied to identifying fake news across multiple domains. To address this problem, some approaches based on domain labels have been proposed. By assigning news items to their specific domain in advance, domain-specific judgments about fake news may be more accurate. However, these approaches disregard the fact that news records can pertain to multiple domains, resulting in a significant loss of valuable information. In addition, the datasets used for training must all be domain-labeled, which creates unnecessary complexity. To solve these problems, an unsupervised domain-knowledge discovery framework for fake news detection is proposed. First, to effectively retain the multi-domain knowledge of the text, a low-dimensional vector capturing domain embeddings is generated for each news text. Subsequently, a feature extraction module utilizing the unsupervisedly discovered domain embeddings is used to extract comprehensive features of the news. Finally, a classifier is employed to determine the authenticity of the news. To verify the proposed framework, tests are conducted on existing widely used datasets, and the experimental results demonstrate that this method improves detection performance for fake news across multiple domains. Moreover, even on datasets that lack domain labels, this method can still effectively transfer domain knowledge, which can reduce the time consumed by tagging without sacrificing detection accuracy.
Keywords: fake news, deep learning, natural language processing, multiple domains
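The idea of a soft, low-dimensional domain representation can be sketched as follows; this toy example, with invented vectors and centroids, illustrates only the general principle of assigning a distribution over domains rather than a single hard label, not the paper's actual model:

```python
# Minimal sketch of a soft domain embedding: each news vector gets a
# distribution over (unsupervisedly discovered) domain centroids instead
# of one hard domain label. All vectors here are toy values.
import math

def soft_domain_embedding(text_vec, centroids):
    """Softmax over dot-product similarities to the domain centroids."""
    sims = [sum(t * c for t, c in zip(text_vec, centroid))
            for centroid in centroids]
    exps = [math.exp(s) for s in sims]
    total = sum(exps)
    return [e / total for e in exps]

centroids = [[1.0, 0.0], [0.0, 1.0]]  # e.g. a 'politics' and a 'health' centroid
mixed_item = [0.8, 0.6]               # a story touching both domains
embedding = soft_domain_embedding(mixed_item, centroids)
print(embedding)  # non-trivial weight on both domains, no information lost
```

A hard label would force the mixed item entirely into one domain; the soft embedding keeps its partial membership in both, which is the information the abstract says hard-label approaches throw away.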
Procedia PDF Downloads 101
5869 Efficient Compact Micro Dielectric Barrier Discharge (DBD) Plasma Reactor for Ozone Generation for Industrial Application in Liquid and Gas Phase Systems
Authors: D. Kuvshinov, A. Siswanto, J. Lozano-Parada, W. Zimmerman
Abstract:
Ozone is well known as a powerful oxidant with fast reaction rates. Ozone-based processes leave no by-products, as unreacted ozone decays back to the original oxygen molecule. The application of ozone is therefore widely accepted as one of the main directions for developing sustainable and clean technologies. A number of technologies require ozone to be delivered to specific points of a production network or reactor construction. Due to space constraints, and the high reactivity and short lifetime of ozone, the use of ozone generators, even at bench-top scale, is practically limited. This calls for the development of mini/micro-scale ozone generators that can be directly incorporated into production units. Our report presents a feasibility study of a new micro-scale reactor for ozone generation (MROG). Data on MROG calibration and indigo decomposition at different operating conditions are presented. At the selected operating conditions, with a residence time of 0.25 s, the process of ozone generation is not limited by reaction rate, and the amount of ozone produced is a function of the power applied. It was shown that the MROG is capable of producing ozone from a voltage level of 3.5 kV, with an ozone concentration of 5.28E-6 mol/L at 5 kV. This is in line with data presented in a numerical investigation of the MROG. It was shown that, compared to a conventional ozone generator, the MROG has lower power consumption at low voltages and atmospheric pressure. The MROG construction makes it applicable to submerged and dry systems. With a robust compact design, the MROG can be used as an incorporated unit in production lines of high complexity.
Keywords: dielectric barrier discharge (DBD), micro reactor, ozone, plasma
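For reference, the reported molar concentration converts to mass units using the molar mass of ozone (48 g/mol); a quick sketch of the arithmetic:

```python
# Convert the reported ozone concentration from molar to mass units.
# The 5.28e-6 mol/L value at 5 kV is taken from the abstract; the molar
# mass of ozone (O3) is 48.0 g/mol.
O3_MOLAR_MASS_G_PER_MOL = 48.0

def molar_to_mg_per_l(conc_mol_per_l):
    """mol/L -> mg/L for ozone."""
    return conc_mol_per_l * O3_MOLAR_MASS_G_PER_MOL * 1000.0  # g -> mg

mass_conc = molar_to_mg_per_l(5.28e-6)
print(round(mass_conc, 5))  # ~0.25344 mg/L of ozone at 5 kV
```

Expressing the output as roughly 0.25 mg/L makes the figure easier to compare with the g/h or g/m³ ratings typically quoted for conventional ozone generators.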
Procedia PDF Downloads 339
5868 Interrogating Bishwas: Reimagining a Christian Neighbourhood in Kolkata, India
Authors: Abhijit Dasgupta
Abstract:
This paper explores the everyday lives of the Christians residing in a Bengali Christian neighbourhood in Kolkata, termed here the larger Christian para (para meaning neighbourhood in Bengali). Through ethnography and a reading of secondary sources, the paper discerns how Christians across denominations – Protestants, Catholics and Pentecostals – invoke the role of bishwas (faith and belief) in their interpersonal neighbourhood relations. The paper attempts to capture the role of bishwas in producing, transforming and revising the meaning of 'neighbourhood' and 'neighbours', and puts forward the argument of the neighbourhood as a theological product. By interrogating and interpreting bishwas through everyday theological discussions and reflections, the paper examines and analyses the ways everyday theology becomes an essential source of power and knowledge for the Bengali Christians in reimagining their neighbourhood in contrast to the nearby Hindu neighbourhoods. Borrowing literature on everyday theology, faith and belief, the paper reads and analyses various interpretations of theological knowledge across denominations to probe the prominence of bishwas within the Christian community and its role in creating a difference in their place of dwelling. The paper argues that the meaning of neighbourhood is revisited through prayers, sermons and biblical verses. At the same time, divisions and fissures are seen among Protestants and Catholics, and also between native Bengali Protestants and non-native Protestant pastors, which informs us about the complexity of theology in constituting everyday life. Thus, the paper addresses theology's role in creating an ethical Christian neighbourhood amidst everyday tensions and hostilities of diverse religious persuasions. At the same time, it looks into the processes through which multiple theological knowledges lead to schism and interdenominational hostilities. 
By attempting to answer these questions, the paper brings out the Christians' negotiation with their neighbourhood.
Keywords: anthropology, bishwas, Christianity, neighbourhood, theology
Procedia PDF Downloads 90
5867 Characterization of Chest Pain in Patients Consulting to the Emergency Department of a Health Institution High Level of Complexity during 2014-2015, Medellin, Colombia
Authors: Jorge Iván Bañol-Betancur, Lina María Martínez-Sánchez, María de los Ángeles Rodríguez-Gázquez, Estefanía Bahamonde-Olaya, Ana María Gutiérrez-Tamayo, Laura Isabel Jaramillo-Jaramillo, Camilo Ruiz-Mejía, Natalia Morales-Quintero
Abstract:
Acute chest pain is a distressing sensation between the diaphragm and the base of the neck, and it represents a diagnostic challenge for any physician in the emergency department. Objective: To establish the main clinical and epidemiological characteristics of patients who presented with chest pain to the emergency department of a private clinic in the city of Medellin during 2014-2015. Methods: Cross-sectional retrospective observational study. The population and sample were patients who consulted for chest pain in the emergency department and met the eligibility criteria. The information was analyzed in SPSS v21; qualitative variables were described through relative frequencies, and quantitative variables through the mean and standard deviation or the median, according to their distribution in the study population. Results: A total of 231 patients were evaluated; the mean age was 49.5 ± 19.9 years, and 56.7% were female. The most frequent pathological antecedents were hypertension (35.5%), diabetes (10.8%), dyslipidemia (10.4%) and coronary disease (5.2%). Regarding pain features, in 40.3% of the patients the pain began abruptly, in 38.2% it had a precordial location, in 20% of cases physical activity acted as a trigger, and in 60.6% it was oppressive. Costochondritis was the most common cause of chest pain among patients with an established etiologic diagnosis, representing 18.2%. Conclusions: Although the clinical features of the pain reported coincide with the clinical presentation of an acute coronary syndrome, the most common cause of chest pain in the study population was costochondritis, indicating that it is a differential diagnosis to consider in the approach to patients with acute chest pain.
Keywords: acute coronary syndrome, chest pain, epidemiology, osteochondritis
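The reported percentages can be converted back into approximate patient counts for the n = 231 sample; a quick sketch of the arithmetic:

```python
# Turn the relative frequencies reported in the abstract into approximate
# absolute patient counts for the n = 231 sample.
N = 231
percentages = {
    "female": 56.7,
    "hypertension": 35.5,
    "abrupt onset": 40.3,
    "costochondritis": 18.2,
}

counts = {name: round(N * pct / 100) for name, pct in percentages.items()}
print(counts)  # e.g. about 131 female patients, 42 costochondritis cases
```

Seeing the counts alongside the percentages helps gauge the precision of the figures: with 231 patients, each percentage point corresponds to roughly two patients.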
Procedia PDF Downloads 344
5866 Comparison of Different Artificial Intelligence-Based Protein Secondary Structure Prediction Methods
Authors: Jamerson Felipe Pereira Lima, Jeane Cecília Bezerra de Melo
Abstract:
The difficulty and cost of obtaining protein tertiary structure information through experimental methods, such as X-ray crystallography or NMR spectroscopy, have driven the development of computational methods. One such approach is the prediction of the three-dimensional structure from the residue chain; however, this has been proven to be an NP-hard problem, due to the complexity of the process, as illustrated by the Levinthal paradox. An alternative solution is the prediction of intermediary structures, such as the secondary structure of the protein. Artificial intelligence methods, such as Bayesian statistics, artificial neural networks (ANN) and support vector machines (SVM), among others, have been used to predict protein secondary structure. Due to their good results, artificial neural networks have become a standard method for predicting protein secondary structure. Recently published methods that use this technique generally achieve a Q3 accuracy between 75% and 83%, whereas the theoretical accuracy limit for protein secondary structure prediction is 88%. Alternatively, to achieve better results, SVM-based prediction methods have been developed. The statistical evaluation of methods that use different AI techniques, such as ANNs and SVMs, is not a trivial problem, since different training sets, validation techniques, and other variables can influence the behavior of a prediction method. In this study, we propose a prediction method based on artificial neural networks, which is then compared with a selected SVM method. The chosen SVM protein secondary structure prediction method is the one proposed by Huang in his work Extracting Physicochemical Features to Predict Protein Secondary Structure (2013). 
The developed ANN method uses the same training and testing process that Huang used to validate his method, which comprises the use of the CB513 protein data set and three-fold cross-validation, so that a comparative analysis can be made by directly comparing the statistical results of each method.
Keywords: artificial neural networks, protein secondary structure, protein structure prediction, support vector machines
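For readers unfamiliar with the Q3 metric used above: it is simply the fraction of residues whose three-state label (helix, strand, coil) is predicted correctly. A minimal sketch with toy sequences (not data from the paper):

```python
# Q3 accuracy: fraction of residues whose secondary-structure state
# (H = helix, E = strand, C = coil) is predicted correctly.
# The two sequences below are toy examples, not results from the study.
def q3_accuracy(true_states, predicted):
    """Per-residue three-state accuracy of a secondary-structure prediction."""
    assert len(true_states) == len(predicted)
    correct = sum(t == p for t, p in zip(true_states, predicted))
    return correct / len(true_states)

true_ss = "HHHHEEECCCC"
pred_ss = "HHHHEECCCCC"  # one strand residue mispredicted as coil
print(q3_accuracy(true_ss, pred_ss))  # 10 of 11 residues correct
```

In practice Q3 is accumulated over all residues of all test proteins, which is how the 75-83% figures cited above are computed.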
Procedia PDF Downloads 622
5865 Comparison of Support Vector Machines and Artificial Neural Network Classifiers in Characterizing Threatened Tree Species Using Eight Bands of WorldView-2 Imagery in Dukuduku Landscape, South Africa
Authors: Galal Omer, Onisimo Mutanga, Elfatih M. Abdel-Rahman, Elhadi Adam
Abstract:
Threatened tree species (TTS) play a significant role in ecosystem functioning and services, land use dynamics, and other socio-economic aspects. Such aspects include ecological, economic, livelihood, security-based, and well-being benefits. The development of techniques for mapping and monitoring TTS is thus critical for understanding the functioning of ecosystems. The advent of advanced imaging systems and supervised learning algorithms has provided an opportunity to classify TTS over a fragmenting landscape. Recently, vegetation maps have been produced using advanced imaging systems such as WorldView-2 (WV-2) and robust classification algorithms such as support vector machines (SVM) and artificial neural networks (ANN). However, delineation of TTS in a fragmenting landscape using high-resolution imagery has largely remained elusive due to the complexity of the species structure and their distribution. Therefore, the objective of the current study was to examine the utility of advanced WV-2 data for mapping TTS in the fragmenting Dukuduku indigenous forest of South Africa using the SVM and ANN classification algorithms. The results showed the robustness of the two machine learning algorithms, with an overall accuracy (OA) of 77.00% (total disagreement = 23.00%) for SVM and 75.00% (total disagreement = 25.00%) for ANN using all eight bands of WV-2 (8B). This study concludes that the SVM and ANN classification algorithms with WV-2 8B have the potential to classify TTS in the Dukuduku indigenous forest. This study offers relatively accurate information that is important for forest managers to make informed decisions regarding management and conservation protocols for TTS.
Keywords: artificial neural network, threatened tree species, indigenous forest, support vector machines
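The overall accuracy and total disagreement figures above relate as OA = correct / total and disagreement = 1 - OA; a minimal sketch with a toy confusion matrix chosen to reproduce the 77% figure, not the study's actual matrix:

```python
# Overall accuracy (OA) as used in the abstract: the share of correctly
# classified samples (diagonal of the confusion matrix), with
# total disagreement = 1 - OA. The matrix below is toy data picked to
# reproduce the reported 77% OA, not the study's real confusion matrix.
def overall_accuracy(confusion):
    """trace(confusion) / sum of all entries."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

cm = [[40, 6, 4],   # rows: true class, columns: predicted class
      [5, 22, 3],
      [4, 1, 15]]
oa = overall_accuracy(cm)
print(oa, 1 - oa)  # 0.77 OA, 0.23 total disagreement
```

Framing disagreement as the complement of OA is what lets the abstract report the two numbers as a pair (77.00% / 23.00% for SVM, 75.00% / 25.00% for ANN).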
Procedia PDF Downloads 516
5864 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) Proactively address vulnerabilities and bugs: formal methods and abstract interpretation techniques identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages. 2) Streamline development by focusing on the bugs that matter: with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency and cost savings. 3) Enhance software dependability: combining static analysis using abstract interpretation, with full context sensitivity and hardware memory awareness, allows for a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.
Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
Procedia PDF Downloads 24
5863 Narrative Constructs and Environmental Engagement: A Textual Analysis of Climate Fiction’s Role in Shaping Sustainability Consciousness
Authors: Dean J. Hill
Abstract:
This paper undertakes an in-depth textual analysis of the cli-fi genre, examining how writing in the genre articulates environmental consciousness through narrative form. The paper begins by situating cli-fi within the literary continuum of ecological narratives and identifying the genre's distinctive textual characteristics and thematic preoccupations. It then unfolds how cli-fi transforms the esoteric nature of climate science into credible narrative forms by drawing on language use, metaphorical constructs, and narrative framing. This involves how descriptive and figurative language in the depiction of nature and disaster makes climate change vivid and emotionally resonant. The work also points out the dialogic nature of cli-fi, whereby characters and narrators experience inner disputes regarding the ethical dilemma of environmental destruction, demanding that readers challenge and re-evaluate their standpoints on sustainability and ecological responsibility. The paper proceeds to analyse narrative voice and its role in eliciting empathy and reader involvement with the ecological material. In looking at how different narratorial perspectives shape the reader's emotional and cognitive response to the text, this study demonstrates the power of perspective in developing intimacy with the genre's central concerns. Finally, the emotional arc of cli-fi narratives, running its course over themes of loss, hope, and resilience, is analysed in relation to how these elements function to marshal public feeling and discourse into action around climate change. 
The textual complexity of cli-fi thus not only shows the hard edge of the reality of climate change but also influences public perception and behaviour toward a more sustainable future.
Keywords: cli-fi genre, ecological narratives, emotional arc, narrative voice, public perception
Procedia PDF Downloads 32
5862 Isolation of Soil Thiobacterii and Determination of Their Bio-Oxidation Activity
Authors: A. Kistaubayeva, I. Savitskaya, D. Ibrayeva, M. Abdulzhanova, N. Voronova
Abstract:
Thirty-six strains of sulfur-oxidizing bacteria were isolated from soda-saline soils of Southern Kazakhstan and identified. The strains were screened for bio-oxidation activity (oxidation of thiosulfate to sulfate) and for enzymatic activity (thiosulfate dehydrogenase and thiosulfate reductase). Aeration modes and culture conditions (pH, temperature) providing an optimum harvest of cells were selected. These strains can be used in bio-melioration technology.
Keywords: elemental sulfur, oxidation activity, Thiobacilli, fertilizers, heterotrophic S-oxidizers
Procedia PDF Downloads 385