Search results for: analyzing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2197

67 Early Melt Season Variability of Fast Ice Degradation Due to Small Arctic Riverine Heat Fluxes

Authors: Grace E. Santella, Shawn G. Gallaher, Joseph P. Smith

Abstract:

In order to determine the importance of small-system riverine heat flux on regional landfast sea ice breakup, our study explores the annual spring freshet of the Sagavanirktok River from 2014 to 2019. Seasonal heat cycling ultimately serves as the driving mechanism behind the freshet; however, as an emerging area of study, the extent to which inland thermodynamics influence coastal tundra geomorphology and connected landfast sea ice has not been extensively investigated in relation to small-scale Arctic river systems. The Sagavanirktok River is a small-to-midsized river system that flows south to north on the Alaskan North Slope from the Brooks mountain range to the Beaufort Sea at Prudhoe Bay. Seasonal warming in the spring rapidly melts snow and ice in a northward progression from the Brooks Range and transitional tundra highlands towards the coast and, when coupled with seasonal precipitation, results in a pulsed freshet that propagates through the Sagavanirktok River. The concentrated presence of newly exposed vegetation in the transitional tundra region due to spring melting results in higher absorption of solar radiation due to a lower albedo relative to snow-covered tundra and/or landfast sea ice. This results in spring flood runoff that advances over impermeable early-season permafrost soils with elevated temperatures relative to landfast sea ice and sub-ice flow. We examine the extent to which interannual temporal variability influences the onset and magnitude of river discharge by analyzing field measurements from the United States Geological Survey (USGS) river and meteorological observation sites. Rapid influx of heat to the Arctic Ocean via riverine systems results in a noticeable decay of landfast sea ice independent of ice breakup seaward of the shear zone. Utilizing MODIS imagery from NASA’s Terra satellite, we visualize the interannual variability of river discharge, allowing for optical validation that the discharge flow is interacting with landfast sea ice. Thermal erosion experienced by sediment fast ice at the arrival of warm overflow preconditions the ice regime for rapid thawing. We investigate the extent to which interannual heat flux from the Sagavanirktok River’s freshet significantly influences the onset of local landfast sea ice breakup. The early-season warming of atmospheric temperatures is evidenced by the presence of storms that introduce liquid, rather than frozen, precipitation into the system. The resultant decreased albedo of the transitional tundra supports the positive relationship between early-season precipitation events, inland thermodynamic cycling, and degradation of landfast sea ice. Early removal of landfast sea ice increases coastal erosion in these regions and has implications for coastline geomorphology that place stress on industrial, ecological, and humanitarian infrastructure.
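
For orientation, the advective heat a river delivers to the coastal ocean is commonly estimated from its discharge and the temperature contrast with the receiving water; a generic form of that estimate (the symbols and the choice of reference temperature are illustrative here, not taken from the study) is

```latex
\[
F_H = \rho_w \, c_p \, Q \, (T_w - T_{\mathrm{ref}})
\]
```

where \rho_w (about 1000 kg m^-3) is the density of water, c_p (about 4184 J kg^-1 K^-1) its specific heat, Q the river discharge in m^3 s^-1, T_w the river water temperature, and T_ref a reference temperature such as the freezing point of the receiving seawater; F_H is then expressed in watts.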

Keywords: albedo, freshet, landfast sea ice, riverine heat flux, seasonal heat cycling

Procedia PDF Downloads 105
66 Analyzing the Heat Transfer Mechanism in a Tube Bundle Air-PCM Heat Exchanger: An Empirical Study

Authors: Maria De Los Angeles Ortega, Denis Bruneau, Patrick Sebastian, Jean-Pierre Nadeau, Alain Sommier, Saed Raji

Abstract:

Phase change materials (PCM) present attractive features that make them a passive solution for thermal comfort in buildings during summertime. They show a large storage capacity per unit volume in comparison with other structural materials like bricks or concrete. If their use is matched with the peak load periods, they can contribute to the reduction of the primary energy consumption related to cooling applications. Despite these promising characteristics, they present some drawbacks. Commercial PCMs, such as paraffins, have a low thermal conductivity, which affects the overall performance of the system. In some cases, the material can be enhanced by adding other elements that improve the conductivity, but in general, a design of the unit that optimizes the thermal performance is sought. Material selection is the starting point of the design stage, and it does not leave much room for optimization. The PCM melting point depends strongly on the climatic characteristics of the building location; the selected melting point must lie between the maximum and minimum temperatures reached during the day. The geometry of the PCM containers and their geometrical distribution are design parameters as well. They significantly affect the heat transfer, and therefore the underlying phenomena must be studied exhaustively. During its lifetime, an air-PCM unit in a building must cool the space during the daytime, while the PCM melts. At night, the PCM must be regenerated to be ready for the next use. When the system is not in service, a minimal amount of thermal exchange is desired. The aforementioned functions result in the presence of both sensible and latent heat storage and release; hence, different types of mechanisms drive the heat transfer phenomena. An experimental test was designed to study the heat transfer phenomena occurring in a circular tube bundle air-PCM exchanger. An in-line arrangement was selected as the geometrical distribution of the containers. To allow visual observation, the container material and a section of the test bench were transparent. Instruments were placed on the bench to measure temperature and velocity. The PCM properties were also obtained through differential scanning calorimetry (DSC) tests. The temperature evolution during both cycles, melting and solidification, was obtained. The results showed phenomena at a local level (tubes) and at an overall level (exchanger). Conduction and convection appeared as the main heat transfer mechanisms. From these results, two approaches to analyzing the heat transfer were followed. The first approach described the phenomena in a single tube as a series of thermal resistances, where purely conduction-controlled heat transfer was assumed in the PCM. For the second approach, the temperature measurements were used to find significant dimensionless numbers and parameters, such as the Stefan, Fourier, and Rayleigh numbers and the melting fraction. These approaches allowed us to identify the heat transfer phenomena during both cycles. The presence of natural convection during melting could be inferred from the influence of the Rayleigh number on the correlations obtained.
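
As a rough illustration of the second analysis approach, the dimensionless groups named above can be computed directly from measured temperatures and tabulated PCM properties. The sketch below uses generic paraffin-like property values and a placeholder tube diameter and temperature difference; it is not based on the actual properties or instrumentation of the test bench described in the abstract.

```python
# Illustrative calculation of the dimensionless numbers used to characterize
# an air-PCM exchanger (generic paraffin-like values, not measured data).
g = 9.81            # gravity, m/s^2

# Assumed PCM properties (order-of-magnitude values for a paraffin)
rho = 800.0         # density, kg/m^3
cp = 2000.0         # specific heat, J/(kg K)
k = 0.2             # thermal conductivity, W/(m K)
L_f = 180e3         # latent heat of fusion, J/kg
beta = 1e-3         # thermal expansion coefficient, 1/K
mu = 4e-3           # dynamic viscosity of the liquid PCM, Pa s

alpha = k / (rho * cp)          # thermal diffusivity, m^2/s
nu = mu / rho                   # kinematic viscosity, m^2/s

# Assumed operating conditions
dT = 8.0            # driving temperature difference, K
D = 0.04            # tube (container) diameter, m
t = 3600.0          # elapsed time, s

stefan = cp * dT / L_f                           # sensible vs. latent heat
fourier = alpha * t / D**2                       # dimensionless time
rayleigh = g * beta * dT * D**3 / (nu * alpha)   # buoyancy vs. diffusion

print(f"Ste = {stefan:.3f}, Fo = {fourier:.3f}, Ra = {rayleigh:.3e}")
```

A Rayleigh number several orders of magnitude above unity over the container diameter is the usual indication that natural convection, rather than pure conduction, governs melting, which is consistent with the reported influence of the Rayleigh number on the correlations.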

Keywords: phase change materials, air-PCM exchangers, convection, conduction

Procedia PDF Downloads 154
65 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra

Authors: Bitewulign Mekonnen

Abstract:

Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine-learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine-learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep-learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression, partial least squares regression, extra tree regression, random forest regression, extreme gradient boosting, and principal component analysis-neural network, are employed to predict glucose concentration. The NIR spectra data is randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²)> 0.985. The deep learning model achieves high macro-averaging scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy. Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine-learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding references for glucose concentration are measured in increments of 20 mg/dl. The data is randomly divided into train and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. 
Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels.
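
A minimal sketch of the repeated random train/test evaluation described above, written with scikit-learn, is shown below. The model list matches part of the abstract (SVMR, PLSR, ETR, RFR); the synthetic spectra, hyperparameters, and variable names are placeholders rather than the authors' configuration, and the extreme gradient boosting and PCA-neural-network models are omitted for brevity.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

# Placeholder data: rows = NIR spectra, y = glucose concentration (mg/dl).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 256))        # 200 spectra x 256 wavelengths
y = rng.uniform(0, 400, size=200)      # reference concentrations

models = {
    "SVMR": SVR(kernel="rbf", C=10.0),
    "PLSR": PLSRegression(n_components=10),
    "ETR": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
}

# Repeat the random split ten times, as in the study, and average R and R^2.
for name, model in models.items():
    r_vals, r2_vals = [], []
    for seed in range(10):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.2, random_state=seed)
        model.fit(X_tr, y_tr)
        y_pred = np.ravel(model.predict(X_te))
        r_vals.append(np.corrcoef(y_te, y_pred)[0, 1])
        r2_vals.append(r2_score(y_te, y_pred))
    print(f"{name}: R = {np.mean(r_vals):.3f}, R^2 = {np.mean(r2_vals):.3f}")
```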

Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network

Procedia PDF Downloads 63
64 The Sense of Recognition of Muslim Women in Western Academia

Authors: Naima Mohammadi

Abstract:

The present paper critically reports on the emergence of Iranian international students at a large public university in Italy. Although the most sizeable diaspora of Iranians dates back to the 1979 revolution, a huge wave of Iranian female students travelled abroad after the Iranian Green Movement (2009) due to the intensification of gender discrimination and Islamization. To explore the experience of Iranian female students at an Italian public university, two complementary methods were adopted: a focus group and individual interviews. Focus groups yield detailed collective conversations and provide researchers with an opportunity to observe the data-generating interaction between participants, rather than between participant and researcher. Semi-structured interviews allow participants to share their stories in their own words and speak about personal experiences and opinions. Research participants were invited to participate through a public call in a Telegram group of Iranian students. Theoretical and purposive sampling was applied to select participants. All participants were assured that full anonymity would be ensured, and they consented to take part in the research. A two-hour focus group was held in English, with some participants attending in person and others online. They were asked to share their motivations for studying in Italy and talk about their experiences both within and outside the university context. The individual interviews each lasted from 45 to 60 minutes and were mostly carried out online and in Farsi. The focus group consisted of 8 Iranian female post-graduate students. In analyzing the data, a blended approach was adopted, with a combination of deductive and inductive coding. According to the research findings, although 9/11 marked the beginning of the West’s challenges against Muslims, the nuclear threats of Islamic regimes prompted the toughest international sanctions against Iranians as a nation across the world. Accordingly, carrying an Iranian identity contributes to social, political, and economic exclusion. Research findings show that geopolitical factors such as international sanctions and Islamophobia, and a lack of reciprocity in terms of recognition, have created a sense of stigmatization for veiled and unveiled Iranian female students, who constitute the largest group of ‘non-European Muslim international students’ enrolled in Italian universities. Participants addressed how their nationality has devalued their public image and negatively impacted their self-confidence and self-realization in academia. They highlighted experiences of an unwelcoming atmosphere created by different groups of people and institutions, such as marked student badges, rejected bank account requests, failed visa processes, selection for secondary security screening, and the hyper-visibility of veiled students. This study corroborates the need for institutions to pay attention to geopolitical factors and religious diversity in student recruitment and to provide support mechanisms and access to basic rights. Accordingly, it is suggested that Higher Education Institutions (HEIs) have a social and moral responsibility to address the discrimination and the social and academic exclusion of Iranian students.

Keywords: Iranian diaspora, female students, recognition theory, inclusive university

Procedia PDF Downloads 38
63 Spatio-Temporal Dynamic of Woody Vegetation Assessment Using Oblique Landscape Photographs

Authors: V. V. Fomin, A. P. Mikhailovich, E. M. Agapitov, V. E. Rogachev, E. A. Kostousova, E. S. Perekhodova

Abstract:

Ground-level landscape photos can be used as a source of objective data on woody vegetation and vegetation dynamics. We propose a method for processing, analyzing, and presenting ground photographs that involves the following steps: 1) the researcher forms a holistic representation of the study area in the form of a set of overlapping ground-level landscape photographs; 2) characteristics of the landscape, objects, and phenomena present in the photographs are defined or obtained; 3) new textual descriptions and annotations for the ground-level landscape photographs are created, or existing ones are supplemented; 4) single or multiple ground-level landscape photographs are used to develop specialized geoinformation layers, schematic maps, or thematic maps; 5) quantitative data describing both the images as a whole and the displayed objects and phenomena are determined using algorithms for automated image analysis. It is suggested to match each photo with a polygonal geoinformation layer, which is a sector consisting of areas corresponding to the parts of the landscape visible in the photo. Calculation of visibility areas is performed in a geoinformation system within a sector using a digital relief model of the study area and visibility analysis functions. Superposition of the visibility sectors corresponding to various camera viewpoints allows matching landscape photos with each other to create a complete and coherent representation of the space in question. It is suggested to mark user-defined data or phenomena on the images, with subsequent superposition over the visibility sector in the form of map symbols. The spatial superposition of geoinformation layers over the visibility sector creates opportunities for image geotagging using quantitative data obtained from raster or vector layers within the sector, with the ability to generate annotations in natural language. The proposed method has proven itself well for relatively open and clearly visible areas with well-defined relief, for example, in mountainous areas in the treeline ecotone. When the polygonal layers of visibility sectors for a large number of different camera viewpoints are topologically superimposed, a layer is formed showing which sections of the entire study area are displayed in the photographs. As a result of this overlapping of sectors, areas that do not appear in any photograph are identified as gaps. From the results of this procedure, it becomes possible to determine which photos display a specific area and from which camera viewpoints it is visible. This information may be obtained either as a query on the map or as a query on the attribute table of the layer. The method was tested using repeated photos taken from forty camera viewpoints located on the Ray-Iz mountain massif (Polar Urals, Russia) from 1960 until 2023. It has been successfully used in combination with other ground-based and remote sensing methods of studying the climate-driven dynamics of woody vegetation in the Polar Urals. Acknowledgment: This research was collaboratively funded by the Russian Ministry for Science and Education project No. FEUG-2023-0002 (image representation) and Russian Science Foundation project No. 24-24-00235 (automated textual description).
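
A highly simplified version of the visibility-sector calculation, reduced to a ray-casting viewshed on a digital elevation model, is sketched below. Production GIS viewshed tools additionally account for Earth curvature, atmospheric refraction, and the camera's field of view; the grid, cell size, and observer parameters here are illustrative only.

```python
import numpy as np

def visible_from(dem, cell_size, vx, vy, observer_height=1.7, n_samples=200):
    """Boolean visibility grid for a camera at DEM cell (vy, vx).

    Simplified ray-casting viewshed: a target cell is visible if no terrain
    sampled along the straight line from the observer rises above the line
    of sight. Curvature, refraction and camera orientation are ignored.
    """
    rows, cols = dem.shape
    z0 = dem[vy, vx] + observer_height
    vis = np.zeros(dem.shape, dtype=bool)
    for ty in range(rows):
        for tx in range(cols):
            dist = np.hypot(tx - vx, ty - vy) * cell_size
            if dist == 0:
                vis[ty, tx] = True
                continue
            target_angle = (dem[ty, tx] - z0) / dist
            # Sample the terrain along the sight line (observer excluded).
            ts = np.linspace(0.0, 1.0, n_samples, endpoint=False)[1:]
            xs = np.round(vx + ts * (tx - vx)).astype(int)
            ys = np.round(vy + ts * (ty - vy)).astype(int)
            keep = (np.hypot(xs - vx, ys - vy) > 0) & ~((xs == tx) & (ys == ty))
            if not keep.any():
                vis[ty, tx] = True   # adjacent cell, nothing in between
                continue
            d = np.hypot(xs[keep] - vx, ys[keep] - vy) * cell_size
            angles = (dem[ys[keep], xs[keep]] - z0) / d
            vis[ty, tx] = target_angle >= angles.max()
    return vis

# Toy example: a flat 50 x 50 grid (10 m cells) with a 30 m ridge that blocks
# the view of everything behind it from a camera near the left edge.
dem = np.zeros((50, 50))
dem[:, 25] = 30.0
sector = visible_from(dem, cell_size=10.0, vx=5, vy=25)
print("visible cells:", int(sector.sum()), "of", sector.size)
```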

Keywords: woody vegetation, repeated photographs

Procedia PDF Downloads 29
62 A Constructionist View of Projects, Social Media and Tacit Knowledge in a College Classroom: An Exploratory Study

Authors: John Zanetich

Abstract:

Designing an educational activity that encourages inquiry and collaboration is key to engaging students in meaningful learning. Educational Information and Communications Technology (EICT) plays an important role in facilitating cooperative and collaborative learning in the classroom. EICT also facilitates students’ learning and development of the critical thinking skills needed to solve real-world problems. Projects and activities based on constructivism encourage students to embrace complexity as well as find relevance and joy in their learning. They also enhance the students’ capacity for creative and responsible real-world problem solving. Classroom activities based on constructivism offer students an opportunity to develop the higher-order thinking skills of defining problems and identifying solutions. Participating in a classroom project is an activity for both acquiring experiential knowledge and applying new knowledge to practical situations. It also provides an opportunity for students to integrate new knowledge into a skill set using reflection. Classroom projects can be developed around a variety of learning objects including social media, knowledge management and learning communities. The construction of meaning through project-based learning is an approach that encourages interaction and problem-solving activities. Projects require active participation, collaboration and interaction to reach the agreed upon outcomes. Projects also serve to externalize the invisible cognitive and social processes taking place in the activity itself and in the student experience. This paper describes a classroom project designed to elicit interactions by helping students to unfreeze existing knowledge, to create new learning experiences, and then refreeze the new knowledge. Since constructivists believe that students construct their own meaning through active engagement and participation as well as interactions with others, knowledge management can be used to guide the exchange of both tacit and explicit knowledge in interpersonal interactions between students and guide the construction of meaning. This paper uses an action research approach to the development of a classroom project and describes the use of technology, social media and the active use of tacit knowledge in the college classroom. In this project, a closed-group Facebook page becomes the virtual classroom where interaction is captured and measured using engagement analytics. In the virtual learning community, the principles of knowledge management are used to identify the process and components of the infrastructure of the learning process. The project identifies class members’ interests and measures student engagement in a learning community by analyzing regular posting on the Facebook page. These posts are used to foster and encourage interactions, reflect a student’s interest and serve as reaction points from which viewers of the post convert the explicit information in the post to implicit knowledge. The data was collected over an academic year and was provided, in part, by the Google analytics reports on Facebook and self-reports of posts by members. The results support the use of active tacit knowledge activities, knowledge management and social media to enhance the student learning experience and help create the knowledge that will be used by students to construct meaning.

Keywords: constructivism, knowledge management, tacit knowledge, social media

Procedia PDF Downloads 195
61 Toward the Destigmatizing the Autism Label: Conceptualizing Celebratory Technologies

Authors: LouAnne Boyd

Abstract:

From the perspective of self-advocates, the biggest unaddressed problem is not the symptoms of an autism spectrum diagnosis but the social stigma that accompanies autism. This societal perspective stands in contrast to the focus of the majority of interventions. Autism interventions, and consequently most innovative technologies for autism, aim to improve deficits that occur within the person. For example, the most common Human-Computer Interaction research projects in assistive technology for autism target social skills from a normative perspective. The premise of these autism technologies is that difficulties occur inside the body; hence, the medical model focuses on ways to improve the ailment within the person. However, other technological approaches to support people with autism do exist. In the realm of Human-Computer Interaction, there are other modes of research that provide a critique of the medical model. For example, critical design, whose intended audience is industry or other HCI researchers, provides products that are the opposite of interventionist work to bring attention to the misalignment between the lived experience and the societal perception of autism. For example, parodies of interventionist work exist to provoke change, such as a recent project called Facesavr, a face covering that helps allistic adults be more independent in their emotional processing. Additionally, from a critical disability studies perspective, assistive technologies perpetuate harmful normalizing behaviors. However, these critical approaches can feel far from the frontline in terms of taking direct action to positively impact end users. From a critical yet more pragmatic perspective, projects such as Counterventions list ways to reduce the likelihood of perpetuating ableism in interventionist work by reflectively analyzing a series of evolving assistive technology projects through a societal lens, thus leveraging the momentum of the evolving ecology of technologies for autism. Therefore, all current paradigms fall short of addressing the largest need: the negative impact of social stigma. The current work introduces a new paradigm for technologies for autism, borrowing from a paradigm introduced two decades ago around changing the narrative related to eating disorders. It is the shift from reprimanding poor habits to celebrating positive aspects of eating. This work repurposes Celebratory Technology for Neurodiversity and is intended to reduce social stigma by targeting the public at large. This presentation will review how requirements were derived from current research on autism social stigma as well as from design sessions with autistic adults. Congruence between these two sources revealed three key design implications for technology: provide awareness of the autistic experience; generate acceptance of neurodivergence; and cultivate an appreciation for the talents and accomplishments of neurodivergent people. The current pilot work in Celebratory Technology offers a new paradigm for supporting autism by shifting the burden of change from the person with autism toward changing society’s biases at large. Shifting the focus of research outside of the autistic body creates a new space for design that extends beyond the bodies of a few and calls on all to embrace humanity as a whole.

Keywords: neurodiversity, social stigma, accessibility, inclusion, celebratory technology

Procedia PDF Downloads 44
60 Exploring the Neural Mechanisms of Communication and Cooperation in Children and Adults

Authors: Sara Mosteller, Larissa K. Samuelson, Sobanawartiny Wijeakumar, John P. Spencer

Abstract:

This study was designed to examine how humans are able to teach and learn semantic information as well as cooperate in order to jointly achieve sophisticated goals. Specifically, we are measuring individual differences in how these abilities develop from foundational building blocks in early childhood. The current study adapts a paradigm for novel noun learning developed by Samuelson, Smith, Perry, and Spencer (2011) to a hyperscanning paradigm [Cui, Bryant, and Reiss, 2012]. This project measures coordinated brain activity between a parent and child using simultaneous functional near-infrared spectroscopy (fNIRS) in pairs of 2.5-, 3.5-, and 4.5-year-old children and their parents. We are also separately testing pairs of adult friends. Children and parents, or adult friends, are seated across from one another at a table. The parent (in the developmental study) then teaches their child the names of novel toys. An experimenter then tests the child by presenting the objects in pairs and asking the child to retrieve one object by name. Children are asked to choose from both pairs of familiar objects and pairs of novel objects. In order to explore individual differences in cooperation with the same participants, each dyad plays a cooperative game of Jenga, in which their joint score is based on how many blocks they can remove from the tower as a team. A preliminary analysis of the noun-learning task showed that, when presented with 6 word-object mappings, children learned an average of 3 new words (50%) and that the number of objects learned by each child ranged from 2 to 4. Adults initially learned all of the new words but were variable in their later retention of the mappings, which ranged from 50% to 100%. We are currently examining differences in cooperative behavior during the Jenga game, including time spent discussing each move before it is made. Ongoing analyses are examining the social dynamics that might underlie the differences between words that were successfully learned and unlearned words for each dyad, as well as the developmental differences observed in the study. Additionally, the Jenga game is being used to better understand individual and developmental differences in social coordination during a cooperative task. At a behavioral level, the analysis maps periods of joint visual attention between participants during the word-learning task and the Jenga game, using head-mounted eye trackers to assess each participant’s first-person viewpoint during the session. We are also analyzing the coherence in brain activity between participants during novel word learning and Jenga playing. The first hypothesis is that visual joint attention during the session will be positively correlated both with the number of words learned and with the number of blocks moved during Jenga before the tower falls. The next hypothesis is that successful communication of new words and success in the game will each be positively correlated with synchronized brain activity between the parent and child/the adult friends in cortical regions underlying social cognition, semantic processing, and visual processing. This study probes both the neural and behavioral mechanisms of learning and cooperation in a naturalistic, interactive and developmental context.
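
One common way to quantify synchronized activity between two participants' fNIRS signals is the magnitude-squared coherence between matched channels. The sketch below applies scipy to synthetic oxygenated-hemoglobin time series; the sampling rate, frequency band, and signal model are assumptions chosen for illustration and do not describe the study's actual analysis pipeline.

```python
import numpy as np
from scipy.signal import coherence

fs = 10.0                      # assumed fNIRS sampling rate, Hz
t = np.arange(0, 300, 1 / fs)  # 5 minutes of data

# Synthetic HbO time series for one matched channel pair (parent and child):
# a shared slow component plus independent noise stands in for real data.
shared = np.sin(2 * np.pi * 0.08 * t)
parent = shared + 0.8 * np.random.randn(t.size)
child = 0.7 * shared + 0.8 * np.random.randn(t.size)

# Magnitude-squared coherence as a function of frequency.
f, cxy = coherence(parent, child, fs=fs, nperseg=256)

# Average coherence in a band roughly associated with task-related
# hemodynamics (0.01-0.2 Hz); the band choice is an assumption here.
band = (f >= 0.01) & (f <= 0.2)
print(f"mean coherence 0.01-0.2 Hz: {cxy[band].mean():.3f}")
```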

Keywords: communication, cooperation, development, interaction, neuroscience

Procedia PDF Downloads 229
59 Flood Early Warning and Management System

Authors: Yogesh Kumar Singh, T. S. Murugesh Prabhu, Upasana Dutta, Girishchandra Yendargaye, Rahul Yadav, Rohini Gopinath Kale, Binay Kumar, Manoj Khare

Abstract:

The Indian subcontinent is severely affected by floods that cause intense irreversible devastation to crops and livelihoods. With increased incidences of floods and their related catastrophes, an Early Warning System for Flood Prediction and an efficient Flood Management System for the river basins of India is a must. Accurately modeled hydrological conditions and a web-based early warning system may significantly reduce economic losses incurred due to floods and enable end users to issue advisories with better lead time. This study describes the design and development of an EWS-FP using advanced computational tools/methods, viz. High-Performance Computing (HPC), Remote Sensing, GIS technologies, and open-source tools for the Mahanadi River Basin of India. The flood prediction is based on a robust 2D hydrodynamic model, which solves shallow water equations using the finite volume method. Considering the complexity of the hydrological modeling and the size of the basins in India, it is always a tug of war between better forecast lead time and optimal resolution at which the simulations are to be run. High-performance computing technology provides a good computational means to overcome this issue for the construction of national-level or basin-level flash flood warning systems having a high resolution at local-level warning analysis with a better lead time. High-performance computers with capacities at the order of teraflops and petaflops prove useful while running simulations on such big areas at optimum resolutions. In this study, a free and open-source, HPC-based 2-D hydrodynamic model, with the capability to simulate rainfall run-off, river routing, and tidal forcing, is used. The model was tested for a part of the Mahanadi River Basin (Mahanadi Delta) with actual and predicted discharge, rainfall, and tide data. The simulation time was reduced from 8 hrs to 3 hrs by increasing CPU nodes from 45 to 135, which shows good scalability and performance enhancement. The simulated flood inundation spread and stage were compared with SAR data and CWC Observed Gauge data, respectively. The system shows good accuracy and better lead time suitable for flood forecasting in near-real-time. To disseminate warning to the end user, a network-enabled solution is developed using open-source software. The system has query-based flood damage assessment modules with outputs in the form of spatial maps and statistical databases. System effectively facilitates the management of post-disaster activities caused due to floods, like displaying spatial maps of the area affected, inundated roads, etc., and maintains a steady flow of information at all levels with different access rights depending upon the criticality of the information. It is designed to facilitate users in managing information related to flooding during critical flood seasons and analyzing the extent of the damage.
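
For reference, the depth-averaged two-dimensional shallow water equations that a finite-volume flood model of this kind solves can be written in conservative form as follows; this is a generic textbook statement with bed slope and friction as the source terms, not the specific source-term treatment of the model used in the study.

```latex
\[
\frac{\partial \mathbf{U}}{\partial t}
+ \frac{\partial \mathbf{F}(\mathbf{U})}{\partial x}
+ \frac{\partial \mathbf{G}(\mathbf{U})}{\partial y}
= \mathbf{S}(\mathbf{U}), \qquad
\mathbf{U} = \begin{pmatrix} h \\ hu \\ hv \end{pmatrix}, \quad
\mathbf{F} = \begin{pmatrix} hu \\ hu^2 + \tfrac{1}{2} g h^2 \\ huv \end{pmatrix}, \quad
\mathbf{G} = \begin{pmatrix} hv \\ huv \\ hv^2 + \tfrac{1}{2} g h^2 \end{pmatrix}, \quad
\mathbf{S} = \begin{pmatrix} R \\ g h (S_{0x} - S_{fx}) \\ g h (S_{0y} - S_{fy}) \end{pmatrix}
\]
```

Here h is the water depth, u and v the depth-averaged velocities, g gravity, R a rainfall/runoff source, S_0x and S_0y the bed slopes, and S_fx and S_fy the friction slopes. The finite volume method integrates these equations over grid cells and exchanges fluxes across cell faces, which is what makes such schemes well suited to domain decomposition across many CPU nodes.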

Keywords: flood, modeling, HPC, FOSS

Procedia PDF Downloads 68
58 Modern Hybrid of Older Black Female Stereotypes in Hollywood Film

Authors: Frederick W. Gooding, Jr., Mark Beeman

Abstract:

Nearly a century ago, the groundbreaking 1915 film ‘The Birth of a Nation’ popularized the way Hollywood made movies with its avant-garde, feature-length style. The movie's subjugating and demeaning depictions of African American women (and men) reflected popular racist beliefs held during the time of slavery and the early Jim Crow era. Although much has changed concerning race relations in the past century, American sociologist Patricia Hill Collins theorizes that the disparaging images of African American women originating in the era of plantation slavery are adaptable and endure as controlling images today. In this context, a comparative analysis of the successful contemporary film ‘Bringing Down the House’, starring Queen Latifah, is relevant, as this 2004 film was designed purposely to defy and ridicule classic stereotypes of African American women. However, the film is still tied to the controlling images of the past, although in a modern hybrid form. Scholars of race and film have noted that the pervasive filmic imagery of the African American woman as the loyal mammy stereotype faded from the screen in the post-civil rights era in favor of more sexualized characters (i.e., the Jezebel trope). Analyzing scenes and dialogue through the lens of sociological and critical race theory reveals the troubling persistence of African American controlling images, which stubbornly emerge in a movie like ‘Bringing Down the House.’ Thus, these controlling images, like racism itself, can adapt to new social and economic conditions. Although the classic controlling images appeared in the first feature-length film focusing on race relations a century ago, ‘The Birth of a Nation,’ this black-and-white rendition of the mammy figure was later updated, in living color, in 1939 with the classic hit ‘Gone with the Wind.’ These popular controlling images have loomed quite large in the minds of international audiences, as ‘Gone with the Wind’ is still shown in American theaters today, and experts at the British Film Institute in 2004 rated ‘Gone with the Wind’ as the number one movie of all time in UK movie history based upon the total number of actual viewings. Critical analysis of character patterns demonstrates that images that appear superficially benign contribute to a broader and quite persistent pattern of marginalization in the aggregate. This approach allows experts and viewers alike to detect more subtle and sophisticated strands of racial discrimination that are ‘hidden in plain sight’ despite numerous changes in a Hollywood industry that appears more voluminous and diverse than it was three or four decades ago. In contrast to white characters, non-white or minority characters are likely to be subtly compromised or marginalized if and when seen within mainstream movies, rather than being subjected to obvious and offensive racist tropes. The hybrid form of the older Jezebel and Mammy stereotypes exhibited by lead actress Queen Latifah in ‘Bringing Down the House’ represents a more suave and sophisticated merging of imagery deemed problematic in the past as well as the present.

Keywords: African Americans, Hollywood film, hybrid, stereotypes

Procedia PDF Downloads 154
57 Remote Sensing of Urban Land Cover Change: Trends, Driving Forces, and Indicators

Authors: Wei Ji

Abstract:

This study was conducted in the Kansas City metropolitan area of the United States, which has experienced significant urban sprawl in recent decades. The remote sensing of land cover changes in this area spanned nearly four decades, from 1972 through 2010. The project was implemented in two stages: the first stage focused on detection of long-term trends of urban land cover change, while the second one examined how to detect the coupled effects of human impact and climate change on urban landscapes. For the first-stage study, six Landsat images were used with a time interval of about five years for the period from 1972 through 2001. Four major land cover types, built-up land, forestland, non-forest vegetation land, and surface water, were mapped using supervised image classification techniques. The study found that over the three decades the built-up lands in the study area more than doubled, mainly at the expense of non-forest vegetation lands. Surprisingly and interestingly, the area also saw a significant gain in surface water coverage. This observation raised questions: How have human activities and precipitation variation jointly impacted surface water cover during recent decades? How can we detect such coupled impacts through remote sensing analysis? These questions led to the second stage of the study, in which we designed and developed approaches to detecting fine-scale surface waters and analyzing coupled effects of human impact and precipitation variation on the waters. To effectively detect urban landscape changes that might be jointly shaped by human impact and precipitation variation, our study proposed “urban wetscapes” (loosely-defined urban wetlands) as a new indicator for remote sensing detection. The study examined whether urban wetscape dynamics were a sensitive indicator of the coupled effects of the two driving forces. To better detect this indicator, a rule-based classification algorithm was developed to identify fine-scale, hidden wetlands that could not be appropriately detected based on their spectral differentiability by a traditional image classification. Three SPOT images for the years 1992, 2008, and 2010, respectively, were classified with this technique to generate the four types of land cover described above. The spatial analyses of remotely-sensed wetscape changes were implemented at the scales of the metropolitan area, watersheds, and sub-watersheds, as well as based on the size of surface water bodies, in order to accurately reveal urban wetscape change trends in relation to the driving forces. The study identified that urban wetscape dynamics varied in trend and magnitude from the metropolitan scale through watersheds to sub-watersheds in response to human impacts at different scales. The study also found that increased precipitation in the region in the past decades swelled larger wetlands in particular, while smaller wetlands generally decreased, mainly due to human development activities. These results confirm that wetscape dynamics can effectively reveal the coupled effects of human impact and climate change on urban landscapes. As such, remote sensing of this indicator provides new insights into the relationships between urban land cover changes and driving forces.
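
A rule-based classification of the kind described, in which spectral indices and simple thresholds flag open water and water-associated ('wet') vegetation that a purely spectral classifier may miss, can be sketched as follows. The band combination follows a generic green/red/near-infrared layout, and the index thresholds are illustrative placeholders, not the calibrated rules developed in the study.

```python
import numpy as np

def rule_based_wetscape(green, red, nir):
    """Classify pixels into open water, wet vegetation ("wetscape"), or other.

    Simple illustrative rules on two spectral indices; the thresholds are
    placeholders, not the calibrated rules of the study.
    """
    green = green.astype(float)
    red = red.astype(float)
    nir = nir.astype(float)

    ndwi = (green - nir) / (green + nir + 1e-9)   # open-water index
    ndvi = (nir - red) / (nir + red + 1e-9)       # vegetation index

    classes = np.zeros(green.shape, dtype=np.uint8)   # 0 = other
    classes[ndwi > 0.2] = 1                           # 1 = open water
    # 2 = water-associated ("wet") vegetation: vegetated but with NIR
    # depressed toward the green band, suggesting standing moisture.
    classes[(ndwi > -0.1) & (ndwi <= 0.2) & (ndvi > 0.3)] = 2
    return classes

# Toy 2x2 scene (digital numbers): water, dry vegetation, dry vegetation,
# and one wet-vegetation pixel.
green = np.array([[80, 60], [50, 70]])
red   = np.array([[60, 50], [45, 40]])
nir   = np.array([[30, 120], [140, 80]])
print(rule_based_wetscape(green, red, nir))
```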

Keywords: urban land cover, human impact, climate change, rule-based classification, across-scale analysis

Procedia PDF Downloads 291
56 The Istrian Istrovenetian-Croatian Bilingual Corpus

Authors: Nada Poropat Jeletic, Gordana Hrzica

Abstract:

Bilingual conversational corpora represent a meaningful and the most comprehensive data source for investigating genuine contact phenomena in non-monitored bilingual speech production. They can be particularly useful for bilingual research since some features of bilingual interaction can hardly be accessed with more traditional methodologies (e.g., elicitation tasks). The method of language sampling provides the resources for describing language interaction in a bilingual community and/or in bilingual situations (e.g., code-switching, the amount of each language used, the number of languages used, etc.). To capture these phenomena in genuine communication situations, such sampling should be as close as possible to spontaneous communication. Bilingual spoken corpus design is methodologically demanding. Therefore, this paper aims at describing the methodological challenges that apply to the design of the conversational Istrian Istrovenetian-Croatian Bilingual Corpus. Croatian is the first official language of the Croatian-Italian officially bilingual Istria County, while Istrovenetian is a diatopic subvariety of Venetian, a long-lasting lingua franca in the Istrian peninsula, the mother tongue of the members of the Italian National Community in Istria, and the primary code of informal everyday communication among the Istrian Italophone population. Within the CLARIN infrastructure, TalkBank is being used, as it provides relevant procedures for designing and analyzing bilingual corpora. Furthermore, its public availability allows for easy replication of studies and cumulative progress as a research community builds up around the corpus, while the tools developed within the field of corpus linguistics enable easy retrieval and analysis of information. The method of language sampling employed is kept at the level of spontaneous communication, in order to maximise the naturalness of the collected conversational data. All speakers have provided written informed consent in which they agree to be recorded at a random point within the period of one month after signing the consent. Participants are administered a background questionnaire providing information about their socioeconomic status and about language exposure and usage in the participants' social networks. The recorded data are being transcribed, phonologically adapted within a standardized orthographic form, coded, and segmented (speech streams are being segmented into communication units based on syntactic criteria), and are being marked following the CHAT transcription system and its associated CLAN suite of programmes within the TalkBank toolkit. The corpus currently consists of transcribed sound recordings of 36 bilingual speakers, and the target is to publish the whole corpus by the end of 2020, by sampling spontaneous conversations among approximately 100 speakers from all the bilingual areas of Istria to ensure representativeness (the participants are being recruited across three generations of native bilingual speakers in all the bilingual areas of the peninsula). Conversational corpora are still rare in TalkBank, so the Corpus will contribute to BilingBank as a highly relevant and scientifically reliable resource for an internationally established and active research community.
Research on communities with societal bilingualism will contribute to the growing body of research on bilingualism and multilingualism, especially regarding topics such as language dominance, language attrition and loss, interference, and code-switching.

Keywords: conversational corpora, bilingual corpora, code-switching, language sampling, corpus design methodology

Procedia PDF Downloads 114
55 Official Seals on the Russian-Qing Treaties: Material Manifestations and Visual Enunciations

Authors: Ning Chia

Abstract:

Each of the three different language texts (Manchu, Russian, and Latin) of the 1689 Treaty of Nerchinsk bore official seals from Imperial Russia and Qing China. These seals have received no academic attention, yet they can reveal a site of a layered and shared material, cultural, political, and diplomatic world of the time in Eastern Eurasia. The very different seal selections from both empires while ratifying the Treaty of Beijing in 1860 have obtained no scholarly advertency either; they can also explicate a tremendously changed relationship with visual and material manifestation. Exploring primary sources in Manchu, Russian, and Chinese languages as well as the images of the visual seals, this study investigates the reasons and purposes of utilizing official seals for the treaty agreement. A refreshed understanding of Russian-Qing diplomacy will be developed by pursuing the following aspects: (i) Analyzing the iconographic meanings of each seal insignia and unearthing a competitive, yet symbols-delivered and seal-generated, 'dialogue' between the two empires (ii) Contextualizing treaty seals within the historical seal cultures, and discovering how domestic seal system in each empire’s political institution developed into treaty-defined bilateral relations (iii) Expounding the seal confiding in each empire’s daily governing routines, and annotating the trust in the seal as a quested promise from the opponent negotiator to fulfill the treaty terms (iv) Contrasting the two seal traditions along two civilization-lines, Eastern vs. Western, and dissecting how the two styles of seal emblems affected the cross-cultural understanding or misunderstanding between the two empires (v) Comprehending the history-making events from the substantial resources such as the treaty seals, and grasping why the seals for the two treaties, so different in both visual design and symbolic value, were chosen in the two relationship eras (vi) Correlating the materialized seal 'expression' and the imperial worldviews based on each empire’s national/or power identity, and probing the seal-represented 'rule under the Heaven' assumption of China and Russian rising role in 'European-American imperialism … centered on East Asia' (Victor Shmagin, 2020). In conclusion, the impact of official seals on diplomatic treaties needs profound knowledge in seal history, insignia culture, and emblem belief to be able to comprehend. The official seals in both Imperial Russia and Qing China belonged to a particular statecraft art in a specific material and visual form. Once utilized in diplomatic treaties, the meticulously decorated and politically institutionalized seals were transformed from the determinant means for domestic administration and social control into the markers of an empire’s sovereign authority. Overlooked in historical practice, the insignia seal created a wire of 'visual contest' between the two rival powers. Through this material lens, the scholarly knowledge of the Russian-Qing diplomatic relationship will be significantly upgraded. Connecting Russian studies, Qing/Chinese studies, and Eurasian studies, this study also ties material culture, political culture, and diplomatic culture together. It promotes the study of official seals and emblem symbols in worldwide diplomatic history.

Keywords: Russia-Qing diplomatic relation, Treaty of Beijing (1860), Treaty of Nerchinsk (1689), Treaty seals

Procedia PDF Downloads 189
54 The Knowledge, Attitude, and Practice About Health Information Technology Among First-Generation Muslim Immigrant Women in Atlanta City During the Pandemic

Authors: Awatef Ahmed Ben Ramadan, Aqsa Arshad

Abstract:

Background: There is a huge Muslim migration movement to North America and Europe for several reasons, primarily to seek refuge from war zones and partly to search for better work and educational opportunities. There are ongoing concerns regarding first-generation immigrant women's health and computer literacy, their understanding of the health system, and their ability to use existing healthcare technology and services effectively and efficiently. Language proficiency level, preference for cultural and traditional remedies, socioeconomic factors, fear of stereotyping, limited accessibility to health services, and general unfamiliarity with the existing health services and resources are common variables among these women. Aims: The current study aims to assess the health and digital literacy of first-generation Muslim women in the city of Atlanta. The study also aims to examine how the COVID-19 pandemic has encouraged the use of health information technology and increased technology awareness among the targeted women. Methods: The study uses a cross-sectional correlational design. The study was conducted to produce preliminary results that the investigators need to supplement an NIH grant application about leveraging information technology to reduce health inequalities amongst first-generation immigrant Muslim women in Atlanta. The investigators collected the study data in two phases using different tools. Phase one was conducted in June 2022; the investigators used tools to measure health and digital literacy amongst 42 first-generation immigrant Muslim women. Phase two was conducted in November 2022; the investigators measured the Knowledge, Attitude, and Practice (KAP) of using health information technology, such as telehealth, in a sample of 45 first-generation Muslim immigrant women in Atlanta; in addition, the investigators measured how the current pandemic has affected their KAP regarding the use of telemedicine and telehealth services. Study participants for both phases were recruited using a convenience sampling methodology. The investigators recruited around 40 first-generation Muslim immigrant women aged 18 years or older for each study phase. The study excluded immigrants who hold work visas and second-generation immigrants. Results: At the time of submitting this abstract, the investigators are still analyzing the study data to produce preliminary results to apply for an NIH grant entitled "Leveraging Health Information Technology (Health IT) to Address and Reduce Health Care Disparities (R01 Clinical Trial Optional)". This research will be the first step of a comprehensive research project to assess and measure health and digital literacy amongst a vulnerable community group. The targeted group might have points of view different from those of U.S.-born inhabitants on how to promote their health, adopt healthy lifestyles and habits, screen for diseases, adhere to treatment and follow-up plans, perceive the importance of using available and affordable technology to communicate with their providers and improve their health, and make serious decisions about their health. The investigators aim to develop an educational and instructional health mobile application that considers the language and cultural factors affecting immigrants' ability to access different health and social support sources, know their health rights and obligations in their communities, and improve their health behaviors and lifestyles.

Keywords: first-generation immigrant Muslim women, telehealth, COVID-19 pandemic, health information technology, health and digital literacy

Procedia PDF Downloads 59
53 A Single Cell Omics Experiments as Tool for Benchmarking Bioinformatics Oncology Data Analysis Tools

Authors: Maddalena Arigoni, Maria Luisa Ratto, Raffaele A. Calogero, Luca Alessandri

Abstract:

The presence of tumor heterogeneity, where distinct cancer cells exhibit diverse morphological and phenotypic profiles, including gene expression, metabolism, and proliferation, poses challenges for molecular prognostic markers and patient classification for targeted therapies. Understanding the causes and progression of cancer requires research efforts aimed at characterizing heterogeneity, which can be facilitated by evolving single-cell sequencing technologies. However, analyzing single-cell data necessitates computational methods that often lack objective validation. Therefore, the establishment of benchmarking datasets is necessary to provide a controlled environment for validating bioinformatics tools in the field of single-cell oncology. Benchmarking bioinformatics tools for single-cell experiments can be costly due to the high expense involved. Therefore, datasets used for benchmarking are typically sourced from publicly available experiments, which often lack a comprehensive cell annotation. This limitation can affect the accuracy and effectiveness of such experiments as benchmarking tools. To address this issue, we introduce omics benchmark experiments designed to evaluate bioinformatics tools to depict the heterogeneity in single-cell tumor experiments. We conducted single-cell RNA sequencing on six lung cancer tumor cell lines that display resistant clones upon treatment of EGFR mutated tumors and are characterized by driver genes, namely ROS1, ALK, HER2, MET, KRAS, and BRAF. These driver genes are associated with downstream networks controlled by EGFR mutations, such as JAK-STAT, PI3K-AKT-mTOR, and MEK-ERK. The experiment also featured an EGFR-mutated cell line. Using 10XGenomics platform with cellplex technology, we analyzed the seven cell lines together with a pseudo-immunological microenvironment consisting of PBMC cells labeled with the Biolegend TotalSeq™-B Human Universal Cocktail (CITEseq). This technology allowed for independent labeling of each cell line and single-cell analysis of the pooled seven cell lines and the pseudo-microenvironment. The data generated from the aforementioned experiments are available as part of an online tool, which allows users to define cell heterogeneity and generates count tables as an output. The tool provides the cell line derivation for each cell and cell annotations for the pseudo-microenvironment based on CITEseq data by an experienced immunologist. Additionally, we created a range of pseudo-tumor tissues using different ratios of the aforementioned cells embedded in matrigel. These tissues were analyzed using 10XGenomics (FFPE samples) and Curio Bioscience (fresh frozen samples) platforms for spatial transcriptomics, further expanding the scope of our benchmark experiments. The benchmark experiments we conducted provide a unique opportunity to evaluate the performance of bioinformatics tools for detecting and characterizing tumor heterogeneity at the single-cell level. Overall, our experiments provide a controlled and standardized environment for assessing the accuracy and robustness of bioinformatics tools for studying tumor heterogeneity at the single-cell level, which can ultimately lead to more precise and effective cancer diagnosis and treatment.
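
As a minimal example of how such a benchmark count table could be consumed, the sketch below runs a standard clustering workflow with Scanpy; the resulting clusters can then be compared against the cellplex-derived cell-line labels distributed with the benchmark to score how well a tool recovers the known heterogeneity. The input path and parameter values are placeholders, and this is not the annotation pipeline used to construct the benchmark itself.

```python
import scanpy as sc

# Assumed input: a 10x-style filtered count matrix of the pooled cell lines
# plus the PBMC pseudo-microenvironment (path is a placeholder).
adata = sc.read_10x_mtx("pooled_cell_lines/filtered_feature_bc_matrix/")

# Standard preprocessing: filtering, normalization, log transform.
sc.pp.filter_cells(adata, min_genes=200)
sc.pp.filter_genes(adata, min_cells=3)
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)

# Dimensionality reduction and graph-based clustering; clusters can then be
# compared with the known cell-line labels to benchmark a tool's ability to
# resolve the pooled heterogeneity.
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var.highly_variable].copy()
sc.pp.scale(adata, max_value=10)
sc.tl.pca(adata, n_comps=30)
sc.pp.neighbors(adata, n_neighbors=15)
sc.tl.leiden(adata, key_added="cluster")
print(adata.obs["cluster"].value_counts())
```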

Keywords: single cell omics, benchmark, spatial transcriptomics, CITEseq

Procedia PDF Downloads 78
52 Optimization and Coordination of Organic Product Supply Chains under Competition: An Analytical Modeling Perspective

Authors: Mohammadreza Nematollahi, Bahareh Mosadegh Sedghy, Alireza Tajbakhsh

Abstract:

The last two decades have witnessed substantial attention to organic and sustainable agricultural supply chains. Motivated by real-world practices, this paper aims to address two main challenges observed in organic product supply chains: decentralized decision-making process between farmers and their retailers, and competition between organic products and their conventional counterparts. To this aim, an agricultural supply chain consisting of two farmers, a conventional farmer and an organic farmer who offers an organic version of the same product, is considered. Both farmers distribute their products through a single retailer, where there exists competition between the organic and the conventional product. The retailer, as the market leader, sets the wholesale price, and afterward, the farmers set their production quantity decisions. This paper first models the demand functions of the conventional and organic products by incorporating the effect of asymmetric brand equity, which captures the fact that consumers usually pay a premium for organic due to positive perceptions regarding their health and environmental benefits. Then, profit functions with consideration of some characteristics of organic farming, including crop yield gap and organic cost factor, are modeled. Our research also considers both economies and diseconomies of scale in farming production as well as the effects of organic subsidy paid by the government to support organic farming. This paper explores the investigated supply chain in three scenarios: decentralized, centralized, and coordinated decision-making structures. In the decentralized scenario, the conventional and organic farmers and the retailer maximize their own profits individually. In this case, the interaction between the farmers is modeled under the Bertrand competition, while analyzing the interaction between the retailer and farmers under the Stackelberg game structure. In the centralized model, the optimal production strategies are obtained from the entire supply chain perspective. Analytical models are developed to derive closed-form optimal solutions. Moreover, analytical sensitivity analyses are conducted to explore the effects of main parameters like the crop yield gap, organic cost factor, organic subsidy, and percent price premium of the organic product on the farmers’ and retailer’s optimal strategies. Afterward, a coordination scenario is proposed to convince the three supply chain members to shift from the decentralized to centralized decision-making structure. The results indicate that the proposed coordination scenario provides a win-win-win situation for all three members compared to the decentralized model. Moreover, our paper demonstrates that the coordinated model respectively increases and decreases the production and price of organic produce, which in turn motivates the consumption of organic products in the market. Moreover, the proposed coordination model helps the organic farmer better handle the challenges of organic farming, including the additional cost and crop yield gap. Last but not least, our results highlight the active role of the organic subsidy paid by the government as a means of promoting sustainable organic product supply chains. Our paper shows that although the amount of organic subsidy plays a significant role in the production and sales price of organic products, the allocation method of subsidy between the organic farmer and retailer is not of that importance.
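
To make the structure of such a model concrete, a generic linear-demand formulation with asymmetric brand equity and cross-price effects can be written as follows; the functional forms, symbols, and the placement of the yield gap, cost factor, and subsidy are illustrative assumptions rather than the authors' exact specification.

```latex
\[
q_o = \alpha_o - \beta p_o + \gamma p_c, \qquad
q_c = \alpha_c - \beta p_c + \gamma p_o, \qquad
\alpha_o > \alpha_c,\; 0 < \gamma < \beta,
\]
\[
\pi_o = \Big(w_o + s - \tfrac{k\,c}{\theta} - \eta\, q_o\Big)\, q_o, \qquad
\pi_c = \big(w_c - c - \eta\, q_c\big)\, q_c, \qquad
\pi_r = (p_o - w_o)\, q_o + (p_c - w_c)\, q_c
\]
```

Here \alpha_o > \alpha_c captures the asymmetric brand equity (the organic premium), \theta in (0,1] the organic crop-yield gap, k >= 1 the organic cost factor, s a per-unit organic subsidy, and \eta an economies/diseconomies-of-scale term. In the decentralized scenario the retailer, as Stackelberg leader, sets the wholesale prices w_o and w_c, after which the farmers respond under Bertrand-style competition; the centralized benchmark maximizes the total chain profit \pi_o + \pi_c + \pi_r, and a coordination contract redistributes that surplus so that each member is at least as well off as in the decentralized game.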

Keywords: analytical game-theoretic model, product competition, supply chain coordination, sustainable organic supply chain

Procedia PDF Downloads 87
51 Analyzing the Effectiveness of Elderly Design and the Impact on Sustainable Built Environment

Authors: Tristance Kee

Abstract:

With an unprecedented increase in the elderly population around the world, the severe lack of quality housing and health-and-safety provisions to serve this cohort cannot be ignored any longer. Many elderly citizens, especially singletons, live in unsafe housing conditions with poorly executed planning and design. Some suffer from deteriorating mobility, sight, and general alertness, and their sub-standard living conditions further hinder their daily existence. This research explains how concepts such as Universal Design and Co-Design operate in a high-density city such as Hong Kong, China, where innovative design can become an alternative solution when government and the private sector fail to provide quality elderly-friendly facilities that promote sustainable urban development. Unlike other research on the elderly, which focuses more on housing policies, nursing care, and theories, this research takes a more progressive approach by providing an in-depth impact assessment of how innovative design can offer practical solutions for creating a more sustainable built environment. The research objectives are to: 1) explain the relationship between innovative design for the elderly and a healthier and more sustainable environment; 2) evaluate the impact of human ergonomics with the use of universal design; and 3) explain how innovation can enhance the sustainability of a city by improving citizens’ sight, sound, walkability, and safety within the ageing population. The research adopts both qualitative and quantitative methodologies to examine ways to improve the elderly population’s relationship to the built environment. In particular, the research utilizes data collected from a questionnaire survey and focus group discussions to obtain input from various stakeholders, including designers, operators, and managers related to public housing, community facilities, and overall urban development. In addition to feedback from end-users and stakeholders, a thorough analysis of existing elderly housing facilities and Universal Design provisions is conducted to evaluate their adequacy. To echo the theme of this conference on Innovation and Sustainable Development, this research examines the effectiveness of innovative design in a risk-benefit factor assessment. To test the hypothesis that innovation can support sustainable development, the research evaluated the health improvement of a sample of 150 elderly participants over a period of eight months. Their health performance, including mobility, speech, and memory, was monitored and recorded on a regular basis to assess whether the use of innovation improves health and home safety for an elderly cohort. This study was supported by district community centers under the auspices of the Home Affairs Bureau, which provided respondents for the questionnaire survey, a standardized evaluation mechanism, and professional health care staff for evaluating the performance impact. The research findings will be integrated to formulate design solutions such as innovative home products to improve elderly people’s daily experience and safety, with a particular focus on enhancing sight, sound, and mobility safety. Some policy recommendations and architectural planning recommendations related to Universal Design will also be incorporated into the research output for future planning of elderly housing and amenity provisions.

Keywords: elderly population, innovative design, sustainable built environment, universal design

Procedia PDF Downloads 203
50 Green Building Risks: Limits on Environmental and Health Quality Metrics for Contractors

Authors: Erica Cochran Hameen, Bobuchi Ken-Opurum, Mounica Guturu

Abstract:

The United States (U.S.) populace spends the majority of its time indoors in spaces where building codes and voluntary sustainability standards provide clear Indoor Environmental Quality (IEQ) metrics. The existing sustainable building standards and codes are aimed at improving IEQ and occupant health and reducing the negative impacts of buildings on the environment. While they address the post-occupancy stage of buildings, there are fewer standards on the pre-occupancy stage, thereby placing a large labor population in much less regulated environments. Construction personnel are often exposed to a variety of uncomfortable and unhealthy elements while on construction sites, primarily thermal, visual, acoustic, and air quality related. Construction site power generators, equipment, and machinery generate noise that is on average 9 decibels (dBA) above U.S. OSHA limits, creating uncomfortable noise levels. Research has shown that frequent exposure to high noise levels leads to chronic physiological issues and increases noise-induced stress, yet beyond OSHA no other metric focuses directly on the impacts of noise on contractors’ well-being. Research has also associated natural light with higher productivity and attention span, and lower rates of fatigue in construction workers. However, daylight is not always available, as construction workers often perform tasks in cramped spaces, dark areas, or at nighttime. In these instances, the use of artificial light is necessary, yet lighting standards for lengthy tasks and arduous activities are not specified. Additionally, ambient air, contaminants, and material off-gassing expelled at construction sites are among the causes of serious health effects in construction workers. Coupled with extreme hot and cold temperatures across different climate zones, health and productivity can be seriously compromised. This research evaluates the impact of existing green building metrics on construction and risk management by analyzing two codes and nine standards, including LEED, WELL, and BREEAM. These metrics were chosen based on their relevance to the U.S. construction industry. This research determined that less than 20% of the sustainability content within the standards and codes (texts) is related to the pre-occupancy building stage. The research also investigated the impact of construction personnel’s health and well-being on construction management through two surveys of project managers and on-site contractors regarding their perception of the effect of their work environment on productivity. To fully understand the risks of limited Environmental and Health Quality (EHQ) metrics for contractors, this research evaluated the effects of EHQ factors, such as inefficient lighting, on construction workers and investigated the correlation between various on-site coping strategies, comfort, and productivity. Outcomes from this research are three-pronged. The first includes fostering a discussion about the existing conditions of EHQ elements, i.e., thermal, lighting, ergonomic, acoustic, and air quality, affecting the construction labor force. The second identifies gaps in sustainability standards and codes during the pre-occupancy stage of building construction, from ground-breaking to substantial completion. The third identifies opportunities for improvement and mitigation strategies to improve EHQ, such as increased monitoring of effects on contractors’ productivity and health and greater inclusion of the pre-occupancy stage in green building standards.

Keywords: construction contractors, health and well-being, environmental quality, risk management

Procedia PDF Downloads 108
49 Book Exchange System with a Hybrid Recommendation Engine

Authors: Nilki Upathissa, Torin Wirasinghe

Abstract:

This solution addresses the challenges faced by traditional bookstores and the limitations of digital media, striking a balance between the tactile experience of printed books and the convenience of modern technology. The book exchange system offers a sustainable alternative, empowering users to access a diverse range of books while promoting community engagement. The user-friendly interfaces incorporated into the book exchange system ensure a seamless and enjoyable experience for users. Intuitive features for book management, search, and messaging facilitate effortless exchanges and interactions between users. By streamlining the process, the system encourages readers to explore new books aligned with their interests, enhancing the overall reading experience. Central to the system's success is the hybrid recommendation engine, which leverages advanced technologies such as Long Short-Term Memory (LSTM) models. By analyzing user input, the engine accurately predicts genre preferences, enabling personalized book recommendations. The hybrid approach integrates multiple technologies, including user interfaces, machine learning models, and recommendation algorithms, to ensure the accuracy and diversity of the recommendations. The evaluation of the book exchange system with the hybrid recommendation engine demonstrated exceptional performance across key metrics. The high accuracy score of 0.97 highlights the system's ability to provide relevant recommendations, enhancing users' chances of discovering books that resonate with their interests. The commendable precision, recall, and F1-score values further validate the system's efficacy in offering appropriate book suggestions. Additionally, the classification curves substantiate the system's effectiveness in distinguishing positive and negative recommendations. This metric provides confidence in the system's ability to navigate the vast landscape of book choices and deliver recommendations that align with users' preferences. Furthermore, the implementation of this book exchange system with a hybrid recommendation engine has the potential to revolutionize the way readers interact with printed books. By facilitating book exchanges and providing personalized recommendations, the system encourages a sense of community and exploration within the reading community. Moreover, the emphasis on sustainability aligns with the growing global consciousness towards eco-friendly practices. With its robust technical approach and promising evaluation results, this solution paves the way for a more inclusive, accessible, and enjoyable reading experience for book lovers worldwide. In conclusion, the developed book exchange system with a hybrid recommendation engine represents a progressive solution to the challenges faced by traditional bookstores and the limitations of digital media. By promoting sustainability, widening access to printed books, and fostering engagement with reading, this system addresses the evolving needs of book enthusiasts. The integration of user-friendly interfaces, advanced machine learning models, and recommendation algorithms ensures accurate and diverse book recommendations, enriching the reading experience for users.
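As a rough illustration of the genre-prediction step, the sketch below outlines an LSTM classifier over tokenized user input in Python/Keras. The vocabulary size, sequence length, genre labels, and training data are placeholder assumptions and do not reflect the deployed system's actual configuration.

# Illustrative sketch only: an LSTM that maps a tokenized text sequence describing a
# user's reading history/interests to genre probabilities, which a hybrid engine could
# combine with collaborative-filtering scores. Vocabulary size, sequence length, and
# the genre list are assumed values, not the deployed system's parameters.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20_000     # assumed
MAX_LEN = 60            # assumed
GENRES = ['fantasy', 'mystery', 'romance', 'sci-fi', 'non-fiction']  # assumed labels

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 128),
    layers.LSTM(64),
    layers.Dropout(0.3),
    layers.Dense(len(GENRES), activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Dummy data standing in for tokenized user input and genre labels
x = np.random.randint(1, VOCAB_SIZE, size=(256, MAX_LEN))
y = np.random.randint(0, len(GENRES), size=(256,))
model.fit(x, y, epochs=2, batch_size=32, verbose=0)

# Hybrid step (sketch): blend the LSTM's genre scores with popularity or
# collaborative-filtering scores before ranking candidate books
genre_scores = model.predict(x[:1], verbose=0)[0]
print(dict(zip(GENRES, np.round(genre_scores, 3))))

In a hybrid design, these genre probabilities would typically be one signal among several (e.g., item popularity and user-user similarity) that are weighted together before producing the final ranked list.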

Keywords: recommendation systems, hybrid recommendation systems, machine learning, data science, long short-term memory, recurrent neural network

Procedia PDF Downloads 57
48 A Postmodern Framework for Quranic Hermeneutics

Authors: Christiane Paulus

Abstract:

Post-Islamism assumes that the Quran should not be viewed in terms of what Lyotard identifies as a ‘meta-narrative'. However, its socio-ethical content can be viewed as critical of power discourse (Foucault). Practicing religion seems to be limited to rites and individual spirituality, taqwa. Alternatively, can we build on Muhammad Abduh's classic-modern reform and develop it through a postmodernist frame? This is the main question of this study. Through his general and vague remarks on the context of the Quran, Abduh was the first to refer to the historical and cultural distance of the text as an obstacle to interpretation. His application, however, corresponded to the modern absolute idea of authentic sharia. He was followed by Amin al-Khuli, who hermeneutically linked the content of the Quran to the theory of evolution. Fazlur Rahman and Nasr Hamid Abu Zeid remain reluctant to go beyond the general level in terms of context. The hermeneutic circle therefore persists as a challenge: how to step outside it and overcome one’s own assumptions. The insight into and the acceptance of the lasting ambivalence of understanding can be grasped as a postmodern approach; it is documented in Derrida's discovery of the shifting of textual meanings, différance, as well as in Lyotard's theory of the différend. The resulting mixture of meanings (Wolfgang Welsch) can be read together with the classic ambiguity of the premodern interpreters of the Quran (Thomas Bauer). Confronting hermeneutic difficulties in general, Niklas Luhmann shows every description to be an attribution, a tautology, i.e., something that remains within the circle. ‘De-tautologization’ is possible, namely by analyzing the distinctions in the sense of objective, temporal, and social information that every text contains. This could be expanded with the Kantian aesthetic dimension of reason (Critique of Judgment) corresponding to the iʽgaz of the Quran. Luhmann asks, ‘What distinction does the observer/author make?’ The Quran, as speech from God to its first listeners, could be seen as a discourse responding to the problems of everyday life at that time, which can be viewed as the general aim of the entire Quran. By reconstructing koranic lifeworlds (Alfred Schütz) in detail, the social structure crystallizes: the socio-economic differences, the enormous poverty. The koranic instruction to provide for the basic needs of neglected groups, which often intersect (the old, the poor, slaves, women, children), can be seen immediately in the text. First, the references to lifeworlds/social problems and discourses in longer koranic passages should be hypothesized. Subsequently, information from the classic commentaries could be extracted; the classical Tafseer, in particular, contains rich narrative material for such reconstruction. By selecting and assigning suitable, specific context information, the meaning of the description becomes condensed (Clifford Geertz). In this manner, the text necessarily acquires a certain alienation and becomes newly accessible. The socio-ethical implications can thus be grasped from the difference between the original problem and the revealed/improved order or procedure; this small step can be materialized as such, not as an absolute solution, but as offering plausible patterns for today’s challenges such as the Agenda 2030.

Keywords: postmodern hermeneutics, condensed description, sociological approach, small steps of reform

Procedia PDF Downloads 190
47 How Can Personal Protective Equipment Be Best Used and Reused: A Human Factors based Look at Donning and Doffing Procedures

Authors: Devin Doos, Ashley Hughes, Trang Pham, Paul Barach, Rami Ahmed

Abstract:

Over 115,000 Health Care Workers (HCWs) have died from COVID-19, and millions have been infected while caring for patients. HCWs have filed thousands of safety complaints related to Personal Protective Equipment (PPE) shortages, including concerns around inadequate PPE and PPE reuse. Protocols for donning and doffing PPE remain ambiguous, lack an evidence base, and often result in wide deviations in practice. PPE donning and doffing protocol deviations commonly result in self-contamination but have not been thoroughly addressed. No evidence-driven protocols provide guidance on protecting HCWs during periods of PPE reuse. Objective: The aim of this study was to examine safety-related threats and risks to HCWs due to the reuse of PPE among Emergency Department personnel. Method: We conducted a prospective observational study to examine the risks of reusing PPE. First, ED personnel were asked to don and doff PPE in a simulation lab. Each participant was asked to don and doff PPE five times, according to the maximum reuse recommendation set by the Centers for Disease Control and Prevention (CDC). Each participant was video-recorded; recordings were reviewed and coded independently by at least two of the three trained coders for safety behaviors and riskiness of actions. A third coder was brought in when agreement between the two coders could not be reached. Agreement between coders was high (81.9%), and all disagreements (100%) were resolved via consensus. A bowtie risk assessment chart was constructed to analyze the factors that contribute to the increased risks HCWs face due to PPE use and reuse. Agreement amongst content experts in the fields of Emergency Medicine, Human Factors, and Anesthesiology was used to select aspects of health care that both contribute to and mitigate the risks associated with PPE reuse. Findings: Twenty-eight clinician participants completed five rounds of donning/doffing PPE, yielding 140 PPE donning/doffing sequences. Two emerging threats were associated with behaviors in donning, doffing, and re-using PPE: (i) direct exposure to contaminant, and (ii) transmission/spread of contaminant. Protective behaviors included: hand hygiene, not touching the patient-facing surface of PPE, and ensuring a proper fit and closure of all PPE materials. All participants (100%, n=28) deviated from the CDC-recommended order, and most participants (92.85%, n=26) self-contaminated at least once during reuse. Other frequent errors included failure to tie all ties on the PPE (92.85%, n=26) and failure to wash hands after a contamination event occurred (39.28%, n=11). Conclusions: There are wide variation and regular errors in how HCWs don, doff, and reuse PPE, which led to self-contamination. Some errors were deemed “recoverable”, such as hand washing after touching a patient-facing surface to remove the contaminant. Other errors, such as using a contaminated mask and accidentally spreading contaminant to the neck and face, can lead to compound risks that are unique to repeated PPE use. A more comprehensive understanding of the threats to HCW safety and a complete approach to mitigating the underlying risks, including visualization with risk management tools, may aid future PPE design and workflow and space solutions.
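For readers interested in the inter-coder agreement step, the following Python sketch shows how percent agreement and Cohen's kappa between two coders might be computed and how disagreements could be flagged for third-coder consensus. The labels are fabricated stand-ins, not the study's data, and the study itself reports percent agreement rather than kappa.

# Illustrative sketch of an inter-coder agreement check: percent agreement and
# Cohen's kappa between two independent coders of donning/doffing safety behaviors.
# The labels below are fabricated stand-ins, not the study's data.
from sklearn.metrics import cohen_kappa_score

coder_a = ['safe', 'risk', 'safe', 'safe', 'risk', 'safe', 'risk', 'safe']
coder_b = ['safe', 'risk', 'safe', 'risk', 'risk', 'safe', 'risk', 'safe']

percent_agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
kappa = cohen_kappa_score(coder_a, coder_b)
print(f'percent agreement = {percent_agreement:.1%}, kappa = {kappa:.2f}')

# Disagreements (here, item index 3) would be flagged for resolution by a third coder
disagreements = [i for i, (a, b) in enumerate(zip(coder_a, coder_b)) if a != b]
print('items needing consensus review:', disagreements)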

Keywords: bowtie analysis, health care, PPE reuse, risk management

Procedia PDF Downloads 62
46 Combination of Modelling and Environmental Life Cycle Assessment Approach for Demand Driven Biogas Production

Authors: Juan A. Arzate, Funda C. Ertem, M. Nicolas Cruz-Bournazou, Peter Neubauer, Stefan Junne

Abstract:

One of the biggest challenges the world faces today is global warming, which is caused by greenhouse gases (GHGs) coming from the combustion of fossil fuels for energy generation. In order to mitigate climate change, the European Union has committed to reducing GHG emissions to 80–95% below 1990 levels by the year 2050. Renewable technologies are vital to diminish energy-related GHG emissions. Since water and biomass are limited resources, the largest contributions to renewable energy (RE) systems will have to come from wind and solar power. Nevertheless, high proportions of fluctuating RE will present a number of challenges, especially regarding the need to balance the variable energy demand with the weather-dependent fluctuation of energy supply. Therefore, biogas plants would play an important role in this context, since they are easily adaptable. Feedstock availability varies locally or seasonally; however, there is a lack of knowledge about how biogas plants should be operated in a stable manner with local feedstock. This problem may be prevented through suitable control strategies. Such strategies require the development of suitable mathematical models that adequately describe the main processes. Modelling allows us to predict the system behavior of biogas plants when different feedstocks are used with different loading rates. Life cycle assessment (LCA) is a technique for analyzing a product from an environmental point of view across its life, from production to disposal, and it is highly recommended as a decision-making tool. In order to achieve suitable strategies, flexible energy generation by biogas plants, a secure production process, and maximized environmental benefits can be obtained by combining process modelling and LCA approaches. For this reason, this study focuses on a biogas plant that flexibly generates the required energy from the co-digestion of maize, grass, and cattle manure while emitting the lowest amount of GHGs. To achieve this goal, the AMOCO model was combined with LCA. The program was structured in Matlab to simulate any biogas process based on the AMOCO model and was combined with the equations necessary to obtain the climate change, acidification, and eutrophication potentials of the whole production system based on the ReCiPe midpoint v.1.06 methodology. The developed simulation was optimized using real data from operating biogas plants and the existing literature. The results show that the AMOCO model can successfully reproduce the system behavior of biogas plants and the time required for the process to adapt in order to generate the demanded energy from the available feedstock. Combining it with the LCA approach provided the opportunity to keep the resulting emissions from operation at the lowest possible level. This would allow the process to be predicted when feedstock utilization supports the establishment of closed material cycles within a smart bio-production grid, under the constraint of minimal drawbacks for the environment and maximal sustainability.
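As a rough illustration of coupling a digester model with an emissions indicator, the Python sketch below uses a strongly simplified single-substrate Monod model rather than the full AMOCO model, and an assumed grid emission factor rather than the ReCiPe characterization factors; all parameter values are placeholders chosen only to make the coupling concrete.

# Strongly simplified stand-in for the modelling + LCA coupling described above:
# a two-state anaerobic digestion model (substrate S, biomass X) driven by a feeding
# rate, with the resulting methane proxy credited against an assumed grid emission
# factor. The single-substrate Monod kinetics and all parameter values are
# illustrative assumptions; they do not reproduce AMOCO or ReCiPe.
import numpy as np
from scipy.integrate import solve_ivp

MU_MAX, K_S, Y_X, Y_CH4, K_D = 0.35, 2.0, 0.1, 0.30, 0.02   # assumed kinetics
D, S_IN = 0.05, 25.0              # dilution rate [1/d], feed substrate conc. [g/L]

def digester(t, y):
    S, X = y                      # substrate, biomass
    mu = MU_MAX * S / (K_S + S)   # Monod growth rate
    dS = D * (S_IN - S) - mu * X / Y_X
    dX = (mu - K_D - D) * X
    return [dS, dX]

t_span = (0, 60)                  # days
sol = solve_ivp(digester, t_span, [S_IN, 0.5], dense_output=True, max_step=0.5)

t = np.linspace(*t_span, 200)
S, X = sol.sol(t)
q_ch4 = Y_CH4 * MU_MAX * S / (K_S + S) * X   # methane production rate proxy

# Crude "LCA" step: avoided CO2-eq from substituting grid electricity (assumed factors)
ENERGY_PER_CH4 = 10.0   # kWh per unit CH4 proxy (assumed)
GRID_EF = 0.4           # kg CO2-eq per kWh (assumed)
dt = t[1] - t[0]
avoided = float(np.sum(q_ch4) * dt) * ENERGY_PER_CH4 * GRID_EF
print(f'avoided emissions over {t_span[1]} days ~ {avoided:.0f} kg CO2-eq (illustrative)')

In the study's framework, the digestion model would instead be the AMOCO model of co-digested maize, grass, and manure, and the emission accounting would cover climate change, acidification, and eutrophication potentials rather than a single avoided-electricity credit.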

Keywords: AMOCO model, GHG emissions, life cycle assessment, modelling

Procedia PDF Downloads 168
45 Energy Audit and Renovation Scenarios for a Historical Building in Rome: A Pilot Case Towards the Zero Emission Building Goal

Authors: Domenico Palladino, Nicolandrea Calabrese, Francesca Caffari, Giulia Centi, Francesca Margiotta, Giovanni Murano, Laura Ronchetti, Paolo Signoretti, Lisa Volpe, Silvia Di Turi

Abstract:

The aim of achieving a fully decarbonized building stock by 2050 stands as one of the most challenging issues within the spectrum of energy and climate objectives. Numerous strategies are imperative, particularly emphasizing the reduction and optimization of energy demand. Ensuring the high energy performance of buildings emerges as a top priority, with measures aimed at cutting energy consumption. Concurrently, it is imperative to decrease greenhouse gas emissions by using renewable energy sources for on-site energy production, thereby striving for an energy balance leading towards zero-emission buildings. Italy's predominant building stock comprises ancient buildings, many of which hold historical significance and are subject to stringent preservation and conservation regulations. Attaining high levels of energy efficiency and reducing CO2 emissions in such buildings poses a considerable challenge, given their unique characteristics and the imperative to adhere to principles of conservation and restoration. Additionally, conducting a meticulous analysis of these buildings' current state is crucial for accurately quantifying their energy performance and predicting the potential impacts of proposed renovation strategies on energy consumption reduction. Within this framework, the paper presents a pilot case in Rome, outlining a methodological approach for the renovation of historic buildings towards achieving the Zero Emission Building (ZEB) objective. The building has a mixed function with offices, a conference hall, and an exposition area. The building envelope is made of historical and precious materials used as cladding, which must be preserved. A thorough understanding of the building's current condition serves as a prerequisite for analyzing its energy performance. This involves conducting comprehensive archival research, undertaking on-site diagnostic examinations to characterize the building envelope and its systems, and evaluating actual energy usage data derived from energy bills. An energy audit and energy simulations are the first step of the analysis, assessing the energy performance of the current state. Subsequently, different renovation scenarios are proposed, encompassing advanced building techniques, to pinpoint the key actions necessary for improving mechanical systems, automation and control systems, and the integration of renewable energy production. These scenarios entail different levels of renovation, ranging from meeting minimum energy performance goals to achieving the highest possible energy efficiency level. The proposed interventions are meticulously analyzed and compared to ascertain the feasibility of attaining the Zero Emission Building objective. In conclusion, the paper provides valuable insights that can be extrapolated to inform a broader approach towards energy-efficient refurbishment of historical buildings that may have limited potential for renovation in their building envelopes. By adopting a methodical and nuanced approach, it is possible to reconcile the imperative of preserving cultural heritage with the pressing need to transition towards a sustainable, low-carbon future.

Keywords: energy conservation and transition, energy efficiency in historical buildings, buildings energy performance, energy retrofitting, zero emission buildings, energy simulation

Procedia PDF Downloads 26
44 Transformation of Bangladesh Society: The Role of Religion

Authors: Abdul Wohab

Abstract:

Context: The role of religion in the transformation of Bangladesh society has been significant since 1975. There has been a rise in religious presence, particularly Islam, in both private and public spheres supported by the state apparatuses. In 2009, a 'secular' political party came into power for the second time since independence and initiated the modernization of religious education systems. This research focuses on the transformation observed among the educated middle class who now prefer their children to attend modern, English medium madrasas that offer both religion-based and secular education. Research Aim: This research aims to investigate two main questions: a) what motivates the educated middle class to send their children to madrasa education? b) To what extent can it be argued that Bangladeshi society is transforming from its secular nature to being more religious? Methodology: The research applies a combination of primary and secondary methods. Case studies serve as the primary method, allowing for an in-depth exploration of the motivations of the educated middle class. The secondary method involves analyzing published news articles, op-eds, and websites related to madrasa education, as well as studying the reading syllabus of Aliya and Qwami madrasas in Bangladesh. Findings: Preliminary findings indicate that the educated middle class chooses madrasa education for reasons such as remembering and praying for their departed relatives, keeping their children away from substance abuse, fostering moral and ethical values, and instilling respect for seniors and relatives. The research also reveals that religious education is believed to help children remain morally correct according to the Quran and Hadith. Additionally, the establishment of madrasas in Bangladesh is attributed to economic factors, with demand and supply mechanisms playing a significant role. Furthermore, the findings suggest that government-run primary education institutions in rural areas face more challenges in enrollment compared to religious educational institutions like madrasas. Theoretical Importance: This research contributes to the understanding of societal transformation and the role of religion in this process. By examining the case of Bangladesh, it provides insights into how religion influences education choices and societal values. Data Collection and Analysis Procedures: Data for this research is collected through case studies, including interviews and observations of educated middle-class families who send their children to madrasas. In addition, analysis is conducted on relevant published materials such as news articles, op-eds, and websites. The reading syllabus of Aliya and Qwami madrasas is also analyzed to gain a comprehensive understanding of the education system. Questions Addressed: The research addresses two questions: a) what motivates the educated middle class to choose madrasa education for their children? b) To what extent can it be argued that Bangladeshi society is transforming from its secular nature to being more religious? Conclusion: The preliminary findings of this research highlight the motivations of the educated middle class in opting for madrasa education, including the desire to maintain religious traditions, promote moral values, and provide a strong foundation for their children. It also suggests that Bangladeshi society is experiencing a transformation towards a more religious orientation.
This research contributes to the understanding of societal changes and the role of religion within Bangladesh, shedding light on the complex dynamics between religion and education.

Keywords: madrasa education, transformation, Bangladesh, religion and society, education

Procedia PDF Downloads 36
43 About the State of Students’ Career Guidance in the Conditions of Inclusive Education in the Republic of Kazakhstan

Authors: Laura Butabayeva, Svetlana Ismagulova, Gulbarshin Nogaibayeva, Maiya Temirbayeva, Aidana Zhussip

Abstract:

Over the years of independence, Kazakhstan has not only ratified international documents regulating the rights of children to inclusive education, but also developed its own inclusive educational policy. Along with this, the state pays particular attention to high school students' preparedness for professional self-determination. However, a number of problematic issues in this field have been revealed, such as the lack of systemic mechanisms coordinating stakeholders’ actions in preparing schoolchildren for a conscious choice of an in-demand profession that meets their individual capabilities and special educational needs (SEN). The analysis of the state’s current situation indicates that school graduates’ adaptation to the labor market does not meet the existing demands of society. According to the Ministry of Labor and Social Protection of the Population of the Republic of Kazakhstan, about 70% of Kazakhstani school graduates find it difficult to choose a profession, 87% of schoolchildren make their career choice under the influence of parents and school teachers, and 90% of schoolchildren and their parents have no idea about the most in-demand professions on the market. The results of the study conducted by Korlan Syzdykova in 2016 indicated the urgent need of Kazakhstani school graduates for extensive information about in-demand professions and for professional assistance in choosing a profession in accordance with their individual skills, abilities, and preferences. The results of the survey conducted by the Information and Analytical Center among heads of colleges in 2020 showed that, despite significant steps in creating conditions for students with SEN, these students face challenges in their studies because of the poor career guidance provided to them in schools. The results of the study conducted by the Center for Inclusive Education of the National Academy of Education named after Y. Altynsarin in the state’s general education schools in 2021 demonstrated a lack of career guidance and of pedagogical and psychological support for children with SEN. To investigate these issues, a further study was conducted to examine the state of students’ career guidance and socialization, taking into account their SEN. The hypothesis of this study proposed that, to prepare school graduates for a conscious career choice, school teachers and specialists need to develop their competencies in the early identification of students' interests, inclinations, and SEN, and to ensure the necessary support for them. Five regions of the state were involved in the study according to their geographical location. The triangulation approach was utilized to ensure the credibility and validity of the research findings, including both theoretical (analysis of existing statistical data, legal documents, and results of previous research) and empirical (school survey for students, interviews with parents, teachers, and representatives of school administration) methods. The data were analyzed independently and compared to each other. The survey included questions related to the provision of pedagogical support for school students in making their career choices. Ethical principles were observed in developing the methodology, collecting and analyzing the data, and disseminating the results. Based on the results, methodological recommendations on students’ career guidance for school teachers and specialists were developed, taking into account students’ individual capabilities and SEN.

Keywords: career guidance, children with special educational needs, inclusive education, Kazakhstan

Procedia PDF Downloads 134
42 Particle Size Characteristics of Aerosol Jets Produced by A Low Powered E-Cigarette

Authors: Mohammad Shajid Rahman, Tarik Kaya, Edgar Matida

Abstract:

Electronic cigarettes, also known as e-cigarettes, may have become a tool to improve smoking cessation due to their ability to provide nicotine at a selected rate. Unlike traditional cigarettes, which produce toxic elements from tobacco combustion, e-cigarettes generate aerosols by heating a liquid solution (commonly a mixture of propylene glycol, vegetable glycerin, nicotine, and some flavoring agents). However, caution still needs to be taken when using e-cigarettes due to the presence of addictive nicotine and some harmful substances produced from the heating process. The particle size distribution (PSD) and associated velocities generated by e-cigarettes have a significant influence on aerosol deposition in different regions of the human respiratory tract. On another note, low actuation power is beneficial in aerosol-generating devices, since it results in reduced emission of toxic chemicals. In the case of e-cigarettes, lower heating powers can be considered powers below 10 W, compared to the wide range of powers (0.6 to 70.0 W) studied in the literature. Due to its importance for inhalation risk reduction, a deeper understanding of the particle size characteristics of e-cigarettes demands thorough investigation. However, a comprehensive study of the PSD and velocities of e-cigarettes under standard testing conditions at relatively low heating powers is still lacking. The present study aims to measure the particle number count and size distribution of undiluted aerosols of a recent fourth-generation e-cigarette at low powers, up to 6.5 W, using a real-time particle counter (time-of-flight method). Also, the temporal and spatial evolution of the particle size and velocity distributions of the aerosol jets is examined using the phase Doppler anemometry (PDA) technique. To the authors’ best knowledge, application of PDA in e-cigarette aerosol measurement is rarely reported. In the present study, preliminary results on the particle number count of undiluted aerosols measured by the time-of-flight method showed that an increase in heating power from 3.5 W to 6.5 W resulted in an enhanced asymmetry in the PSD, deviating from a log-normal distribution. This can be considered a consequence of the rapid vaporization, condensation, and coagulation processes acting on the aerosols at higher heating power. A novel mathematical expression, combining exponential, Gaussian, and polynomial (EGP) distributions, was proposed to successfully describe the asymmetric PSD. The count median aerodynamic diameter and geometric standard deviation lay within ranges of about 0.67 μm to 0.73 μm and 1.32 to 1.43, respectively, while the power varied from 3.5 W to 6.5 W. Laser Doppler velocimetry (LDV) and PDA measurements suggested a typical centerline streamwise mean velocity decay of the aerosol jet along with a reduction in particle sizes. In the final submission, a thorough literature review, a detailed description of the experimental procedure, and a discussion of the results will be provided. Particle size and turbulence characteristics of the aerosol jets will be further examined, analyzing the arithmetic mean diameter, volumetric mean diameter, volume-based mean diameter, streamwise mean velocity, and turbulence intensity. The present study has potential implications for PSD simulation and the validation of aerosol dosimetry models, leading to improvements in related aerosol-generating devices.
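As an illustration of the descriptive statistics reported above, the following Python sketch computes a count median aerodynamic diameter (CMAD) and geometric standard deviation (GSD) from binned counts, assuming the usual lognormal description; the bin edges and counts are fabricated stand-ins for the time-of-flight data, and the proposed EGP fit is not reproduced here.

# Illustrative sketch: count median aerodynamic diameter (CMAD) and geometric standard
# deviation (GSD) from binned particle counts, assuming the conventional lognormal
# treatment. Bin edges and counts below are fabricated stand-ins, not measured data.
import numpy as np

bin_edges = np.array([0.3, 0.4, 0.5, 0.65, 0.8, 1.0, 1.3, 1.6, 2.0])   # um, assumed
counts = np.array([120, 480, 950, 1400, 900, 420, 150, 40])            # assumed

mid = np.sqrt(bin_edges[:-1] * bin_edges[1:])      # geometric bin midpoints
ln_d = np.log(mid)

# Count-weighted geometric statistics
ln_cmad = np.average(ln_d, weights=counts)
ln_gsd = np.sqrt(np.average((ln_d - ln_cmad) ** 2, weights=counts))
cmad, gsd = np.exp(ln_cmad), np.exp(ln_gsd)
print(f'CMAD ~ {cmad:.2f} um, GSD ~ {gsd:.2f}')

# Simple asymmetry check: compare the count median with the count mode; a large gap
# suggests departure from a single lognormal, as reported at higher heating powers
cum = np.cumsum(counts) / counts.sum()
median_bin = mid[np.searchsorted(cum, 0.5)]
mode_bin = mid[np.argmax(counts)]
print(f'median bin ~ {median_bin:.2f} um, mode bin ~ {mode_bin:.2f} um')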

Keywords: E-cigarette aerosol, laser doppler velocimetry, particle size distribution, particle velocity, phase Doppler anemometry

Procedia PDF Downloads 17
41 Optimization of Geometric Parameters of Microfluidic Channels for Flow-Based Studies

Authors: Parth Gupta, Ujjawal Singh, Shashank Kumar, Mansi Chandra, Arnab Sarkar

Abstract:

Microfluidic devices have emerged as indispensable tools across various scientific disciplines, offering precise control and manipulation of fluids at the microscale. Their efficacy in flow-based research, spanning engineering, chemistry, and biology, relies heavily on the geometric design of microfluidic channels. This work introduces a novel approach to optimize these channels through Response Surface Methodology (RSM), departing from the conventional practice of addressing one parameter at a time. Traditionally, optimizing microfluidic channels involved isolated adjustments to individual parameters, limiting the comprehensive understanding of their combined effects. In contrast, our approach considers the simultaneous impact of multiple parameters, employing RSM to efficiently explore the complex design space. The outcome is an innovative microfluidic channel that consumes an optimal sample volume and minimizes flow time, enhancing overall efficiency. The relevance of geometric parameter optimization in microfluidic channels extends significantly into biomedical engineering. The flow characteristics of porous materials within these channels depend on many factors, including fluid viscosity, environmental conditions (such as temperature and humidity), and specific design parameters like sample volume, channel width, channel length, and substrate porosity. This intricate interplay directly influences the performance and efficacy of microfluidic devices, which, if not optimized, can lead to increased costs and errors in disease testing and analysis. In the context of biomedical applications, the proposed approach addresses the critical need for precision in fluid flow. It mitigates the manufacturing costs associated with trial-and-error methodologies by optimizing multiple geometric parameters concurrently. The resulting microfluidic channels offer enhanced performance and contribute to a streamlined, cost-effective process for testing and analyzing diseases. A key highlight of our methodology is its consideration of the interconnected nature of geometric parameters. For instance, the volume of the sample, when optimized alongside channel width, length, and substrate porosity, creates a synergistic effect that minimizes errors and maximizes efficiency. This holistic optimization approach ensures that microfluidic devices operate at their peak performance, delivering reliable results in disease testing.
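To illustrate the RSM idea outside of Minitab, the Python sketch below fits a second-order response surface for flow time over two coded channel parameters and locates its minimum; the design, the synthetic "measurements", and the coefficients are assumptions for illustration only, not the study's experimental data.

# Illustrative sketch of the RSM workflow: fit a second-order response surface for
# flow time as a function of two coded channel parameters (width, length) and locate
# the design that minimizes it. The "measured" responses are synthetic stand-ins
# generated from an assumed function; in the study, designed experiments in Minitab
# play this role, typically with more factors (e.g., sample volume, porosity).
import numpy as np
from itertools import product
from scipy.optimize import minimize

# Full factorial face-centered design over coded levels -1, 0, +1 (assumed)
levels = [-1.0, 0.0, 1.0]
design = np.array(list(product(levels, levels)))

def measured_flow_time(w, l, rng):
    # Stand-in "experiment": longer/narrower channels take longer, with noise
    return 30 - 4*w + 6*l + 1.5*w*w + 2.0*l*l - 1.0*w*l + rng.normal(0, 0.3)

rng = np.random.default_rng(0)
y = np.array([measured_flow_time(w, l, rng) for w, l in design])

# Quadratic model terms: 1, w, l, w^2, l^2, w*l
def terms(w, l):
    return np.array([1.0, w, l, w*w, l*l, w*l])

X = np.array([terms(w, l) for w, l in design])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # least-squares fit of the surface

# Optimize the fitted surface inside the coded design region
res = minimize(lambda p: terms(*p) @ beta, x0=[0.0, 0.0],
               bounds=[(-1, 1), (-1, 1)])
print('fitted coefficients:', np.round(beta, 2))
print('optimal coded (width, length):', np.round(res.x, 2),
      '-> predicted flow time:', round(res.fun, 2))

The coded optimum would then be mapped back to physical units (actual channel width and length) and confirmed with a validation run, which is the usual final step of an RSM study.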

Keywords: microfluidic device, minitab, statistical optimization, response surface methodology

Procedia PDF Downloads 30
40 Ethical Considerations of Disagreements Between Clinicians and Artificial Intelligence Recommendations: A Scoping Review

Authors: Adiba Matin, Daniel Cabrera, Javiera Bellolio, Jasmine Stewart, Dana Gerberi (librarian), Nathan Cummins, Fernanda Bellolio

Abstract:

OBJECTIVES: Artificial intelligence (AI) tools are becoming more prevalent in healthcare settings, particularly for diagnostic and therapeutic recommendations, with an expected surge in the coming years. The bedside use of this technology by clinicians opens the possibility of disagreements between the recommendations from AI algorithms and clinicians’ judgment. There is a paucity of literature analyzing the nature and possible outcomes of these potential conflicts, particularly related to ethical considerations. The goal of this scoping review is to identify, analyze, and classify current themes and potential strategies addressing ethical conflicts originating from disagreements between AI and human recommendations. METHODS: A protocol was written prior to the initiation of the study. Relevant literature was searched by a medical librarian for the terms artificial intelligence, healthcare and liability, ethics, or conflict. The search was run in 2021 in the Ovid Cochrane Central Register of Controlled Trials, Embase, Medline, IEEE Xplore, Scopus, and the Web of Science Core Collection. Articles describing the role of AI in healthcare that mentioned conflict between humans and AI were included in the primary search. Two investigators working independently and in duplicate screened titles and abstracts and reviewed the full text of potentially eligible studies. Data were abstracted into tables and reported by theme. We followed the methodological guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR). RESULTS: Of 6846 titles and abstracts, 225 full texts were selected, and 48 articles were included in this review: 23 as original research and review papers and 25 as editorials and commentaries with similar themes. There was a lack of consensus in the included articles on who would be held liable for mistakes incurred by following AI recommendations. It appears that there is a dichotomy in the perceived ethical consequences depending on whether the negative outcome is a result of a human-versus-AI conflict or secondary to a deviation from the standard of care. Themes identified included transparency versus opacity of recommendations, data bias, liability for outcomes, regulatory frameworks, and the overall scope of artificial intelligence in healthcare. A relevant issue identified was the concern among clinicians about the “black box” nature of these recommendations and their ability to judge the appropriateness of AI guidance. CONCLUSION: AI clinical tools are being rapidly developed and adopted, and the use of this technology will create conflicts between AI algorithms and healthcare workers, with various outcomes. In turn, these conflicts may have legal and ethical implications. There is limited consensus about ethical and legal liability for outcomes originating from such disagreements. This scoping review identified the importance of framing the problem in terms of whether or not the conflict involves a deviation from the standard of care, informed by the themes of transparency/opacity, data bias, legal liability, absent regulatory frameworks, and understanding of the technology. Finally, only limited recommendations to mitigate ethical conflicts between AI and humans have been identified. Further work is necessary in this field.

Keywords: ethics, artificial intelligence, emergency medicine, review

Procedia PDF Downloads 69
39 A Foucauldian Analysis of Postcolonial Hybridity in a Kuwaiti Novel

Authors: Annette Louise Dupont

Abstract:

Background and Introduction: Broadly defined, hybridity is a condition of racial and cultural ‘cross-pollination’ which arises as a result of contact between colonized and colonizer. It remains a highly contested concept in postcolonial studies as it is implicitly underpinned by colonial notions of ‘racial purity.’ While some postcolonial scholars argue that individuals exercise significant agency in the construction of their hybrid subjectivities, others underscore associated experiences of exclusion, marginalization, and alienation. Kuwait and the Philippines are among the most disparate of contemporary postcolonial states. While oil resources transformed the former British Mandate of Kuwait into one of the world’s richest countries, enduring poverty in the former US colony of the Philippines drives a global diaspora which produces multiple Filipino hybridities. Although more Filipinos work in the Arabian Gulf than in any other region of the world, scholarly and literary accounts of their experiences of hybridization in this region are relatively scarce when compared to those set in North America, Australia, Asia, and Europe. Study Aims and Significance: This paper aims to address this existing lacuna by investigating hybridity and other postcolonial themes in a novel by a Kuwaiti author which vividly portrays the lives of immigrants and citizens in Kuwait and which gives a rare voice and insight into the struggles of an Arab-Filipino and European-Filipina. Specifically, this paper explores the relationships between colonial discourses of ‘black’ and ‘white’ and postcolonial discourses pertaining to ‘brown’ Filipinos and ‘brown’ Arabs, in order to assess their impacts on the protagonists’ hybrid subjectivities. Methodology: Foucault’s notions of discourse not only provide a conceptual basis for analyzing the colonial ideology of Orientalism, but his theories related to the social exclusion of the ‘mad’ also elucidate the mechanisms by which power can operate to marginalize, alienate and subjectify the Other, therefore a Foucauldian lens is applied to the analysis of postcolonial themes and hybrid subjectivities portrayed in the novel. Findings: The study finds that Kuwaiti and Filipino discursive practices mirror those of former white colonialists and colonized black laborers and that these discursive practices combine with a former British colonial system of foreign labor sponsorship to create a form of governmentality in Kuwait which is based on exclusion and control. The novel’s rich social description and the reflections of the key protagonist and narrator suggest that such fiction has a significant role to play in highlighting the historical and cultural specificities of experiences of postcolonial hybridity in under-researched geographic, economic, social, and political settings. Whereas hybridity can appear abstract in scholarly accounts, the significance of literary accounts in which the lived experiences of hybrid protagonists are anchored to specific historical periods, places and discourses, is that contextual particularities are neither obscured nor dehistoricized. Conclusions: The application of Foucauldian theorizations of discourse, disciplinary, and biopower to the analysis of this Kuwaiti literary text serves to extend an understanding of the effects of contextually-specific discourses on hybrid Filipino subjectivities, as well as a knowledge of prevailing social dynamics in a little-researched postcolonial Arabian Gulf state.

Keywords: Filipino, Foucault, hybridity, Kuwait

Procedia PDF Downloads 102
38 Internet of Things, Edge and Cloud Computing in Rock Mechanical Investigation for Underground Surveys

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Rock mechanical investigation is one of the most crucial activities in underground operations, especially in surveys related to hydrocarbon exploration and production, geothermal reservoirs, energy storage, mining, and geotechnics. There is a wide range of traditional methods for deriving, collecting, and analyzing rock mechanics data. However, these approaches may not be suitable or may not work perfectly in some situations, such as fractured zones. Cutting-edge technologies have been developed to address and optimize these issues. Internet of Things (IoT), Edge Computing (ECt), and Cloud Computing (CCt) technologies are among the newest and most widely used artificial intelligence-related methods employed for geomechanical studies. IoT devices act as sensors and cameras for real-time monitoring and for collecting mechanical-geological data on rocks, such as temperature, movement, pressure, or stress levels. Assessment of structural integrity, especially for cap rocks within hydrocarbon systems, and of rock mass behavior, supporting further activities such as enhanced oil recovery (EOR) and underground gas storage (UGS) or improving safety risk management (SRM) and potential hazard identification (PHI), are other benefits of IoT technologies. Edge computing techniques can process, aggregate, and analyze data collected by IoT devices in real time, providing detailed insights into the behavior of rocks under various conditions (e.g., stress, temperature, and pressure), establishing patterns quickly, and detecting trends. Therefore, this state-of-the-art technology can support autonomous systems in rock mechanical surveys, such as drilling and production (in hydrocarbon wells) or excavation (in the mining and geotechnics industries). In addition, ECt allows rock-related operations to be controlled remotely and enables operators to apply changes or make adjustments; this feature is particularly important for environmental goals. Rock mechanical studies typically draw on different kinds of data, such as laboratory tests, field operations, and indirect information like seismic or well-logging data. CCt provides a useful platform for storing and managing large volumes of heterogeneous information, which is especially valuable in fractured zones. Additionally, CCt supplies powerful tools for predicting, modeling, and simulating rock mechanical behavior, especially in fractured zones within vast areas. Also, it is a suitable platform for sharing extensive rock mechanics information, such as the direction and size of fractures in a large oil field or mine. The comprehensive review findings demonstrate that digital transformation through integrated IoT, Edge, and Cloud solutions is revolutionizing traditional rock mechanical investigation. These advanced technologies have empowered real-time monitoring, predictive analysis, and data-driven decision-making, culminating in noteworthy enhancements in safety, efficiency, and sustainability. Therefore, by employing IoT, CCt, and ECt, underground operations have experienced a significant boost, allowing for timely and informed actions using real-time data insights. The successful implementation of IoT, CCt, and ECt has led to safer operations, optimized processes, and environmentally conscious approaches in underground geological endeavors.
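As a conceptual illustration of the IoT-edge-cloud split described above, the Python sketch below shows an edge node that flags stress exceedances locally and forwards only windowed summaries to a cloud store; the field names, threshold value, and upload stub are assumptions for illustration, not a specific vendor API or the survey's actual configuration.

# Conceptual sketch only: an edge node aggregating raw IoT readings (stress,
# temperature) into short windows, flagging threshold exceedances locally, and
# forwarding compact summaries to a cloud store. Thresholds, field names, and the
# upload stub are assumed for illustration.
import statistics
from collections import deque

STRESS_LIMIT_MPA = 45.0          # assumed alarm threshold
WINDOW = 10                      # readings per summary window

def summarize(window):
    return {
        'n': len(window),
        'stress_mean': statistics.fmean(r['stress_mpa'] for r in window),
        'stress_max': max(r['stress_mpa'] for r in window),
        'temp_mean': statistics.fmean(r['temp_c'] for r in window),
    }

def upload_to_cloud(summary):
    # Placeholder for an HTTPS/MQTT publish call to the cloud platform
    print('uploading summary:', summary)

def edge_loop(readings):
    buf = deque(maxlen=WINDOW)
    for r in readings:
        buf.append(r)
        if r['stress_mpa'] > STRESS_LIMIT_MPA:
            print(f"LOCAL ALERT: stress {r['stress_mpa']:.1f} MPa at sensor {r['id']}")
        if len(buf) == WINDOW:
            upload_to_cloud(summarize(buf))
            buf.clear()

# Fabricated readings standing in for an underground sensor stream
stream = [{'id': 'S1', 'stress_mpa': 40 + i * 0.8, 'temp_c': 55 + 0.1 * i}
          for i in range(25)]
edge_loop(stream)

The design choice mirrors the text: immediate safety decisions stay at the edge, while the cloud receives only reduced summaries for long-term storage, modeling, and sharing across sites.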

Keywords: rock mechanical studies, internet of things, edge computing, cloud computing, underground surveys, geological operations

Procedia PDF Downloads 34