Search results for: media capabilities
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4119

459 Shedding Light on Colorism: Exploring Stereotypes, Influential Factors, and Consequences in African American Communities

Authors: India Sanders, Jeffrey Sherman

Abstract:

Colorism has been a persistent and ingrained issue in the history of the United States, with far-reaching consequences that continue to affect various aspects of daily life, institutional policies, public spaces, economic structures, and social norms. This complex problem has had a particularly profound impact on the African-American community, shaping how they are perceived and treated within society at large. The prevalence of negative stereotypes surrounding African Americans can lead to severe repercussions such as discrimination and mental health disparities. The effects of such biases can also materialize in diverse forms, impacting the well-being and livelihoods of individuals within this community. Current research has examined how people from different racial groups perceive different skin tones of Black people, looking at the cognitive processes that manifest through categorization and stereotypes. Additionally, studies have observed consequences related to colorism and how it directly affects those with darker versus lighter skin tones. However, little research has been conducted on the influence of stereotypes associated with various skin tones. In the present study, it is hypothesized that participants in Group A will rate positive stereotypes associated with lighter skin tones significantly higher than positive stereotypes associated with darker skin tones. It is also hypothesized that participants in Group B will rate negative stereotypes associated with darker skin tones significantly higher than negative stereotypes associated with lighter skin tones. The present study will quantitatively examine stereotypes of skin tone representation within the African-American community. Participants will rate the accuracy of various visual representations within mass media of African Americans with light skin tones and dark skin tones using a Likert scale. Participants will also be provided a questionnaire further examining the perception of stereotypes and how this affects their interactions with African Americans with lighter versus darker skin tones. The purpose of this study is to investigate the impact of skin tone portrayals on African Americans, including associated stereotypes and societal perceptions. It is expected that participants will be more likely to associate negative stereotypes with African Americans who have darker skin tones, as this is a common and reinforced viewpoint in the cultural and social system.

Keywords: colorism, discrimination, racism, stereotype

Procedia PDF Downloads 68
458 The Changes in Consumer Behavior and the Decision-making Process After Covid-19 in Greece

Authors: Markou Vasiliki, Serdaris Panagiotis

Abstract:

Consumer behavior and the consumer decision-making process are affected by uncertainty. The onslaught of the Covid-19 pandemic has changed the consumer decision-making process in many ways. This change can be seen both in the buying process (how and where consumers shop) and in the types of goods and services they are looking for. In addition, the economic uncertainty that came from this event, along with its effects on both society and the economy in general, created new consumer behaviors. Traditional forms of shopping are no longer a primary choice; consumers have turned to digital channels such as e-commerce and social media to fulfill their needs. The purpose of this article is to examine how much the consumer decision-making process has been affected after the pandemic and whether consumer behavior has changed. An online survey was conducted to examine the change in decision making. Essentially, the demographic factors that influence the decision-making process were examined, as well as the social and economic factors. The research is divided into two parts. The first part includes a literature review of the research that has been carried out to identify the factors, and the second part presents the empirical investigation, which was carried out using a questionnaire administered electronically with the help of Google Forms. The questionnaire was divided into several sections. These included questions about consumer behavior, but mainly about how consumers make decisions today, whether those decisions have changed due to the pandemic, and whether those changes are permanent. For decision making, goods were divided into essential products, high-tech products, transactions with the state, and others. About 500 consumers aged between 18 and 75 participated in the research. The data was processed with both descriptive statistics and econometric models. The results showed that consumer behavior and the decision-making process have changed. Consumers now widely use the internet for shopping, and consumer behaviors and consumption patterns have changed. Social and economic factors play an important role. Income, gender and other factors were found to be statistically significant. In addition, it is worth noting that the percentage of respondents who made purchases through the internet for the first time during the pandemic was remarkable and related to age. Essentially, the arrival of the pandemic caused uncertainty for individuals, mainly financial, and this affected the decision-making process. In addition, shopping through the internet is now the first choice, especially among young people, and it appears to be on its way to becoming established.

Keywords: consumer behavior, decision making, COVID-19, Greece, behavior change

Procedia PDF Downloads 46
457 Nuclear Resistance Movements: Case Study of India

Authors: Shivani Yadav

Abstract:

The paper illustrates the dynamics of nuclear resistance movements in India and how people's power rises in response to the subversion of justice and the suppression of human rights. The need for democratizing nuclear policy runs implicit through the demands of the people protesting against nuclear programmes. The paper analyses the rationale behind developing nuclear energy according to the mainstream development model adopted by the state, and discusses whether the prevalent nuclear discourse includes people's ambitions and addresses local concerns. Primarily, the nuclear movements across India comprise two types of actors: the local population and the urban interlocutors. The first type of actor is the local population, comprising the people who reside in the vicinity of a nuclear site and are affected by its construction, presence and operation. They have very immediate concerns about nuclear energy projects but also hold an ideological stand against producing nuclear energy. The other type of actor is the urban interlocutors, the intellectuals and nuclear activists who have a principled stand against nuclear energy and help to aggregate the aims and goals of the movement on various platforms. The paper focuses on the nuclear resistance movements at five sites in India: Koodankulam (Tamil Nadu), Jaitapur (Maharashtra), Haripur (West Bengal), Mithivirdi (Gujarat) and Gorakhpur (Haryana). The origin, development, role of major actors and mass media coverage of all these movements are discussed in depth. Major observations from the Indian case include: first, nuclear policy discussions in India are confined to elite circles; second, concepts like national security and national interest are used to suppress dissent against mainstream policies; and third, India's energy policies focus on economic concerns while ignoring the human implications of such policies. In conclusion, the paper observes that the anti-nuclear movements question not just the feasibility of nuclear power but also its exclusionary nature when it comes to people's participation in policy making, its endangering of the ecology, violations of human rights, etc. The character of these protests is non-violent, with an aim to produce more inclusive policy debates and democratic dialogues.

Keywords: anti-nuclear movements, Koodankulam nuclear power plant, non-violent resistance, nuclear resistance movements, social movements

Procedia PDF Downloads 148
456 Improvements and Implementation Solutions to Reduce the Computational Load for Traffic Situational Awareness with Alerts (TSAA)

Authors: Salvatore Luongo, Carlo Luongo

Abstract:

This paper discusses implementation solutions to reduce the computational load of the Traffic Situational Awareness with Alerts (TSAA) application, which is based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology. In 2008, there were 23 mid-air collisions involving general aviation fixed-wing aircraft, 6 of which were fatal, leading to 21 fatalities. These collisions occurred during visual meteorological conditions, indicating the limitations of the see-and-avoid concept for mid-air collision avoidance as defined by the Federal Aviation Administration (FAA). Commercial aviation aircraft are already equipped with a collision avoidance system, TCAS, based on classic transponder technology, which has dramatically reduced the number of mid-air collisions involving air transport aircraft. In general aviation, the same reduction has not occurred, and achieving it is the main objective of the TSAA application. The major difference between the original conflict detection application and TSAA is that conflict detection focuses on preventing loss of separation in en-route environments, whereas TSAA is devoted to reducing the probability of mid-air collision in all phases of flight. The TSAA application increases flight crew traffic situation awareness by providing alerts on traffic detected in conflict with ownship, in support of the see-and-avoid responsibility. Considerable effort was spent in the design process and code generation to maximize efficiency and performance in terms of computational load and memory consumption. The TSAA architecture is divided into two high-level systems: the "Threats database" and the "Conflict detector". The first receives traffic data from the ADS-B device and stores each target's data history. The conflict detector module estimates ownship and target trajectories in order to detect possible future loss of separation between ownship and each target. Finally, the alerts are checked by additional conflict verification logic to prevent undesirable behavior of the alert flag. To reduce the computational load, a pre-check evaluation module is used. This pre-check is purely a computational optimization, so the performance of the conflict detector is not modified in terms of the number of alerts detected. The pre-check module uses analytical trajectory propagation for both target and ownship. This provides greater accuracy and avoids step-by-step propagation, which requires a larger computational load. Furthermore, the pre-check makes it possible to exclude targets that are certainly not threats using an analytical and efficient geometrical approach, decreasing the computational load of the subsequent modules. This software improvement is not suggested by FAA documents, and so it is the main innovation of this work. The efficiency and efficacy of the enhancement are verified using fast-time and real-time simulations and by execution on a real device in several FAA scenarios. The final implementation also permits FAA software certification in compliance with the DO-178B standard. The computational load reduction allows the TSAA application to be installed on devices hosting multiple applications and/or with limited available memory and computational capability.
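As an illustration of the analytical geometric pre-check described above, the following is a minimal sketch assuming straight-line (constant-velocity) trajectories and illustrative protection thresholds; it is not the authors' certified implementation, and the function and threshold names are hypothetical. It excludes a target only when its horizontal closest point of approach within the look-ahead window cannot violate the protection distance, so no genuine threat is filtered out before full conflict detection.

```python
import numpy as np

# Illustrative values only - not the certified TSAA thresholds.
HORIZ_THRESHOLD_M = 926.0   # ~0.5 NM protection radius
LOOKAHEAD_S = 60.0          # alerting horizon in seconds

def passes_precheck(own_pos, own_vel, tgt_pos, tgt_vel):
    """Conservative analytical pre-check.

    Returns True if the target must be handed to the full conflict detector,
    and False only if the horizontal miss distance within the look-ahead
    window can never drop below the protection radius (constant-velocity
    assumption). Positions in metres (local ENU), velocities in m/s.
    """
    rel_pos = np.asarray(tgt_pos, float)[:2] - np.asarray(own_pos, float)[:2]
    rel_vel = np.asarray(tgt_vel, float)[:2] - np.asarray(own_vel, float)[:2]

    speed_sq = rel_vel @ rel_vel
    if speed_sq < 1e-9:
        # Practically no relative motion: separation stays at its current value.
        t_cpa = 0.0
    else:
        # Time of closest point of approach, clipped to the look-ahead window.
        t_cpa = -(rel_pos @ rel_vel) / speed_sq
        t_cpa = min(max(t_cpa, 0.0), LOOKAHEAD_S)

    miss_distance = np.linalg.norm(rel_pos + rel_vel * t_cpa)
    return miss_distance < HORIZ_THRESHOLD_M

# Example: a head-on target 2 km ahead is kept for full evaluation (True).
print(passes_precheck([0, 0], [60, 0], [2000, 50], [-60, 0]))
```

The full TSAA logic also evaluates vertical geometry and runs the verification stage; the point of this sketch is only that an analytical CPA test is far cheaper than step-by-step propagation of every target.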

Keywords: traffic situation awareness, general aviation, aircraft conflict detection, computational load reduction, implementation solutions, software certification

Procedia PDF Downloads 285
455 Effect of Locally Produced Sweetened Pediatric Antibiotics on Streptococcus mutans Isolated from the Oral Cavity of Pediatric Patients in Syria - in Vitro Study

Authors: Omar Nasani, Chaza Kouchaji, Muznah Alkhani, Maisaa Abd-alkareem

Abstract:

Objective: To evaluate the influence of sweetening agents used in pediatric medications on the growth of Streptococcus mutans colonies and their effect on cariogenic activity in the oral cavity. No previous studies on Syrian children have been registered. Methods: Specimens were isolated from the oral cavity of pediatric patients, and an in-vitro study was then carried out on locally manufactured liquid pediatric antibiotic drugs containing natural or synthetic sweeteners. The selected antibiotics were Ampicillin (sucrose), Amoxicillin (sucrose), Amoxicillin + Flucloxacillin (sorbitol), and Amoxicillin + Clavulanic acid (sorbitol or sucrose). These antibiotics have a known inhibitory effect on gram-positive aerobic/anaerobic bacteria, especially Streptococcus mutans strains in children's oral biofilm. Five colonies were studied with each antibiotic. Saturated antibiotics were spread on 6 mm diameter filter discs. Incubated culture media were compared with each other and with the control antibiotic discs. Results were evaluated by measuring the diameter of the inhibition zones. The control group of antibiotic discs was sourced from Abtek Biologicals Ltd. Results: The diameter of the inhibition zones around discs of antibiotics sweetened with sorbitol was larger than that of discs sweetened with sucrose. The effect was most pronounced when comparing Amoxicillin + Clavulanic acid (sucrose 25 mm versus sorbitol 27 mm). The highest inhibitory effect was observed with Amoxicillin + Flucloxacillin sweetened with sorbitol (38 mm), whereas the lowest inhibitory effects were observed with Amoxicillin and Ampicillin sweetened with sucrose (22 mm and 21 mm). Conclusion: The results of this study indicate that although all selected antibiotics produced an inhibitory effect on S. mutans, sucrose weakened the inhibitory action of the antibiotic to varying degrees, while antibiotic formulations containing sorbitol closely matched the effects of the control antibiotic. This study calls attention to the effects of sweeteners included in pediatric drugs on oral hygiene and tooth decay.

Keywords: pediatric, dentistry, antibiotics, streptococcus mutans, biofilm, sucrose, sugar free

Procedia PDF Downloads 73
454 Data, Digital Identity and Antitrust Law: An Exploratory Study of Facebook’s Novi Digital Wallet

Authors: Wanjiku Karanja

Abstract:

Facebook has monopoly power in the social networking market. It has grown and entrenched its monopoly power through the capture of its users' data value chains. However, antitrust law's consumer welfare roots have prevented it from effectively addressing the role of data capture in Facebook's market dominance. These regulatory blind spots are augmented in Facebook's proposed Diem cryptocurrency project and its Novi digital wallet. Novi, which is Diem's digital identity component, would enable Facebook to collect an unprecedented volume of consumer data. Consequently, Novi has seismic implications for internet identity, as the network effects of Facebook's large user base could establish it as the de facto internet identity layer. Moreover, the large tracts of data Facebook would collect through Novi would further entrench Facebook's market power. As such, the attendant lock-in effects of this project would be very difficult to reverse. Urgent regulatory action is therefore required to prevent this expansion of Facebook's data resources and monopoly power. This research thus highlights the importance of data capture to competition and market health in the social networking industry. It utilizes interviews with key experts to empirically interrogate the impact of Facebook's data capture and control of its users' data value chains on its market power. This inquiry is contextualized against Novi's expansive effect on Facebook's data value chains. It thus addresses the novel antitrust issues arising at the nexus of Facebook's monopoly power and the privacy of its users' data. It also explores the impact of platform design principles, specifically data interoperability and data portability, in mitigating Facebook's anti-competitive practices. As such, this study finds that Facebook is a powerful monopoly that dominates the social media industry to the detriment of potential competitors. Facebook derives its power from its size, its annexation of the consumer data value chain, and its control of its users' social graphs. Additionally, the platform design principles of data interoperability and data portability are not a panacea for restoring competition in the social networking market. Their success depends on the establishment of robust technical standards and regulatory frameworks.

Keywords: antitrust law, data protection law, data portability, data interoperability, digital identity, Facebook

Procedia PDF Downloads 123
453 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators

Authors: K. O'Malley

Abstract:

Background: The emergence and rapid evolution of large language models (LLMs) (i.e., models of generative artificial intelligence, or AI) has been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling of vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A considered search strategy was used to interrogate the PubMed electronic database. The keywords 'ChatGPT' AND 'medical education' OR 'medical school' OR 'medical licensing exam' were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years. The search was limited to publications in the English language only. Eligibility was ascertained based on the study title and abstract and confirmed by consulting the full-text document. Data was extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent. The mean score across all studies was 82.49 percent (SD= 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents several challenges with which educators have not had to contend before. The overall strong performance of ChatGPT, as outlined above, may lend itself to unfair use (such as the plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the aforementioned studies, ChatGPT exhibits a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to effectively address them and mitigate potential disruption in academic fora.
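As an illustration of how the search strategy described above could be executed programmatically, here is a minimal sketch using Biopython's Entrez interface; the abstract does not state which tooling was used, so the library choice, the parenthesised grouping of the boolean operators, and the date window are assumptions.

```python
from Bio import Entrez  # pip install biopython

Entrez.email = "your.name@example.org"  # placeholder contact address required by NCBI

# Terms as reported in the abstract; the grouping is assumed, since the
# abstract lists the boolean operators without brackets.
query = ('ChatGPT AND ("medical education" OR "medical school" '
         'OR "medical licensing exam") AND english[Language]')

# Illustrative five-year publication-date window.
handle = Entrez.esearch(db="pubmed", term=query, retmax=500,
                        datetype="pdat", mindate="2019", maxdate="2024")
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records found; first PMIDs: {record['IdList'][:5]}")
```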

Keywords: artificial intelligence, ChatGPT, generative ai, large language models, licensing exam, medical education, medicine, university

Procedia PDF Downloads 32
452 Piql Preservation Services - A Holistic Approach to Digital Long-Term Preservation

Authors: Alexander Rych

Abstract:

Piql Preservation Services ("Piql") is a turnkey solution designed for secure, migration-free long-term preservation of digital data. Piql sets an open standard for long-term preservation for the future. It consists of the equipment and processes needed for writing and retrieving digital data. Exponentially growing amounts of data demand logistically effective and cost-effective processes. Digital storage media (hard disks, magnetic tape) exhibit a limited lifetime. Repetitive data migration to overcome the rapid obsolescence of hardware and software carries an accelerated risk of data loss, data corruption or even manipulation, and adds significant recurring costs for hardware and software investments. Piql stores any kind of data, in both its digital and analog forms, securely for 500 years. The medium that provides this is a film reel. Using photosensitive film on a polyester base, a very stable material that is known for its immutability over hundreds of years, secure and cost-effective long-term preservation can be provided. The film reel itself is stored in packaging capable of protecting the optical storage medium. These components have undergone extensive testing to ensure longevity of up to 500 years. In addition to its durability, film is a true WORM (write once, read many) medium. It is therefore resistant to editing or manipulation. Being able to store any form of data on the film makes Piql a superior solution for long-term preservation. Paper documents, images, video or audio sequences – all of those file formats and documents can be preserved in their native file structure. In order to restore the encoded digital data, only a film scanner, a digital camera or any appropriate optical reading device will be needed in the future. Every film reel includes an index section describing the data saved on the film. It also contains a content section carrying metadata, enabling users in the future to rebuild software in order to read and decode the digital information.

Keywords: digital data, long-term preservation, migration-free, photosensitive film

Procedia PDF Downloads 392
451 Digital Communication – The Future of Chronic Disease Management Is Healthcare Apps

Authors: Kirstin Griffin

Abstract:

During a period of increased anxiety and stress, communication became the essential tool to help the public stay informed and feel prepared during the Covid-19 pandemic. However, certain groups of patients were not feeling as reassured. The news and media blasted the message that patients with diabetes were "high-risk" with regard to contracting Covid-19 infection. Routine clinics were being cancelled, GP practices were closing their doors, and patients with type 1 diabetes were understandably scared. The influx of calls to diabetes specialist nurses from concerned patients highlighted the need for better and more specialised information. An application specifically for patients with type 1 diabetes was created to deliver this information, and it proved to be the essential communication tool that was desperately needed. The application for patients with type 1 diabetes aimed to deliver specialist information to patients regarding their diagnosis, management, and ongoing follow-up commitments. The application gives practical advice on multiple areas of diabetes management, including sick-day rules and diabetic emergencies, as well as up-to-date information on technology, including setting up Libre devices and downloading glucose meters to facilitate attending virtual clinics. Delivery of this information in an easy-to-understand and comprehensive way is intended to improve patient engagement with diabetes services and ultimately empower patients in the control of their own disease. The application also offers a messaging service to allow the diabetes team to send out alerts to patient groups on specific issues, such as changes to clinics, or respond to recent news updates regarding Covid-19. The app was launched in NHS Fife in June 2020 and has amassed 800 active users so far. There is growing engagement with the app since its launch, with over 1000 user interactions in the last month alone. Feedback shows that 100% of users like the app and have found it useful in the management of their diabetes. The app has proven to be an essential tool in communication with one of the most vulnerable groups during the Covid-19 pandemic, and its ongoing development will continue to increase patient engagement and improve glycaemic control for patients with type 1 diabetes. The future of chronic disease management should involve digital solutions such as apps to further empower patients in their healthcare.

Keywords: diabetes, endocrinology, digital healthcare, medical apps

Procedia PDF Downloads 87
450 Colloid-Based Biodetection at Aqueous Electrical Interfaces Using Fluidic Dielectrophoresis

Authors: Francesca Crivellari, Nicholas Mavrogiannis, Zachary Gagnon

Abstract:

Portable diagnostic methods have become increasingly important for a number of different purposes: point-of-care screening in developing nations, environmental contamination studies, bio/chemical warfare agent detection, and end-user use for commercial health monitoring. The cheapest and most portable methods currently available are paper-based – lateral flow and dipstick methods are widely available in drug stores for use in pregnancy detection and blood glucose monitoring. These tests are successful because they are cheap to produce, easy to use, and require minimally invasive sampling. While adequate for their intended uses, in the realm of blood-borne pathogens and numerous cancers, these paper-based methods become unreliable, as they lack the nM/pM sensitivity currently achieved by clinical diagnostic methods. Clinical diagnostics, however, utilize techniques involving surface plasmon resonance (SPR) and enzyme-linked immunosorbent assays (ELISAs), which are expensive and unfeasible in terms of portability. To develop a better, competitive biosensor, we must reduce the cost of one or increase the sensitivity of the other. Electric fields are commonly utilized in microfluidic devices to manipulate particles, biomolecules, and cells. Applications in this area, however, are primarily limited to interfaces formed between immiscible fluids. Miscible, liquid-liquid interfaces are common in microfluidic devices and are easily reproduced with simple geometries. Here, we demonstrate the use of electric fields at liquid-liquid electrical interfaces, known as fluidic dielectrophoresis (fDEP), for biodetection in a microfluidic device. In this work, we apply an AC electric field across concurrent laminar streams with differing conductivities and permittivities to polarize the interface and induce a discernible, near-immediate, frequency-dependent interfacial tilt. We design this aqueous electrical interface, which becomes the biosensing "substrate," to be intelligent – it "moves" only when a target of interest is present. This motion requires neither labels nor expensive electrical equipment, so the biosensor is inexpensive and portable, yet still capable of sensitive detection. Nanoparticles, due to their high surface-area-to-volume ratio, are often incorporated to enhance detection capabilities of schemes like SPR and fluorimetric assays. Most studies currently investigate binding at an immobilized solid-liquid or solid-gas interface, where particles are adsorbed onto a planar surface, functionalized with a receptor to create a reactive substrate, and subsequently flushed with a fluid or gas containing the relevant analyte. These typically involve many preparation and rinsing steps and are susceptible to surface fouling. Our microfluidic device continuously flows and renews the "substrate," and is thus not subject to fouling. In this work, we demonstrate the ability to electrokinetically detect biomolecules binding to functionalized nanoparticles at liquid-liquid interfaces using fDEP. In biotin-streptavidin experiments, we report binding detection limits on the order of 1-10 pM, without amplifying signals or concentrating samples. We also demonstrate the ability to detect this interfacial motion, and thus the presence of binding, using impedance spectroscopy, allowing this scheme to become non-optical, in addition to being label-free.

Keywords: biodetection, dielectrophoresis, microfluidics, nanoparticles

Procedia PDF Downloads 388
449 Process of the Emergence and Evolution of Socio-Cultural Ideas about the "Asian States" In the Context of the Development of US Cinema in 1941-1945

Authors: Selifontova Darya Yurievna

Abstract:

The study of the process of the emergence and evolution of socio-cultural ideas about the "Asian states" in the context of the development of US cinema in 1941-1945 will both test a new approach to a classical subject and allow the methodological tools of history, political science, philology and sociology to be used to understand modern military-political, historical, ideological and socio-cultural processes through a concrete example. This is especially important for understanding the process of constructing the image of the Japanese Empire in the USA. Assessments and images of China and Japan in World War II created in American cinema had an immediate impact on the media, public sentiment, and opinions. During the war, US cinema created new myths and actively exploited old ones, combining them with traditional Hollywood cliches. All this served as a basis for creating the on-screen images of China and the Japanese Empire, which were needed to solve many foreign policy and domestic political tasks related to the construction of two completely different, yet at the same time similar, images of Asia (China and the Japanese Empire). In modern studies devoted to the history of wars, research into the specifics of the information confrontation between the parties is in demand. A special role in this confrontation is played by propaganda through cinema, which uses images, historical symbols, and stable metaphors, the appeal to which can shape a certain public reaction. Soviet documentaries of the war years are proof of this. The relevance of the topic is due to the fact that cinema as a means of propaganda was very popular and in demand during the Second World War. This period was a time when real masterpieces of propaganda film were created, and the traditions of depicting the Second World War were laid down in the documentary cinema of 1941-1945. The study of the peculiarities of visualization and mythologization of the Second World War in Soviet cinema is an important stage in studying the development of propaganda methods, since the methods and techniques of depicting the war formed in 1941-1945 remain significant at the present stage of the study of society.

Keywords: Asian countries, politics, sociology, domestic politics, USA, cinema

Procedia PDF Downloads 127
448 The Risk and Prevention of Peer-To-Peer Network Lending in China

Authors: Zhizhong Yuan, Lili Wang, Chenya Zheng, Wuqi Yang

Abstract:

How to encourage and support peer-to-peer (P2P) network lending, and how to effectively monitor its risks, has become a focus of Chinese government departments, industrialists, experts and scholars in recent years. The reason is that this convenient online micro-credit service brings a series of credit risks and other issues. If the risks brought by the P2P network lending model are avoided, it can play a benign role and help China's vigorously developing small and medium-sized private enterprises meet their capital needs; otherwise, it will bring confusion to the normal financial order. As a form of financial services, P2P network lending has injected new blood into China's non-government finance over the past ten years, providing an outlet for idle funds and making up for the shortage of traditional financial services in China. However, it lacks feasible measures in credit evaluation and government supervision. This paper collects a large amount of data about P2P network lending in China. The data come from the official media of the Chinese government, the published findings of existing researchers, and the authors' analysis and collation of related data. The research content of this paper includes a literature review; the current situation of P2P network lending development in China; a risk analysis of P2P network lending in China; and risk prevention strategies for P2P network lending in China. The focus of this paper is to find specific measures to strengthen supervision and avoid risks from the perspective of government regulators, operators of P2P network lending platforms, investors and users of funds. The main measures include: China needs to develop a self-discipline organization for the P2P network lending industry and formulate self-discipline norms as soon as possible; establish a regular information disclosure system for P2P network lending platforms; establish a review system for the credit rating of borrowers; and bring P2P network lending platforms into compliance through the implementation of bank custody of funds. The results and solutions will benefit all the P2P network lending platforms, creditors, debtors, bankers, independent auditors and government agencies of China and other countries.

Keywords: peer-to-peer (P2P), regulation, risk prevention, supervision

Procedia PDF Downloads 166
447 Netnography Research in Leisure, Tourism, and Hospitality: Lessons from Research and Education

Authors: Marisa P. De Brito

Abstract:

The internet is affecting the way the industry operates and communicates. It is also becoming a customary means for leisure, tourism, and hospitality consumers to seek and exchange information and views on hotels, destinations, events and attractions, or to develop social ties with other users. On the one hand, the internet is a rich field in which to conduct leisure, tourism, and hospitality research; on the other hand, few researchers formally embrace online methods of research, such as netnography. Within the social sciences, netnography falls under the interpretative/ethnographic research methods umbrella. It is an adaptation of anthropological techniques such as participant and non-participant observation, used to study online interactions happening on social media platforms, such as Facebook. It is, therefore, a research method applied to the study of online communities, the term itself being a contraction of the words network (as in internet) and ethnography. It was developed in the context of marketing research in the nineties, and in the last twenty years it has spread to other contexts such as education, psychology, and urban studies. Since netnography is not universally known, researchers and educators may be discouraged from using it. This work offers guidelines for researchers wanting to apply this method in the field of leisure, tourism, and hospitality, and for educators wanting to teach about it. This is done by means of a double approach: a content analysis of the literature, side by side with educational data, on the use of netnography. The content analysis covers the research that has used netnography in leisure, tourism, and hospitality in the last twenty years. The educational data are the author's and her colleagues' experience in coaching students throughout the process of writing a paper using primary netnographic data - from identifying the phenomenon to be studied, selecting an online community, and collecting and analyzing data, to writing up their findings. In the end, this work puts forward, on the one hand, a research agenda and, on the other hand, an educational roadmap for those wanting to apply netnography in the field or the classroom. The educator's roadmap summarises what can be expected from mini-netnographies conducted by students and how to set them up. The research agenda highlights the issues and research questions for which the method is most suitable, the most common bottlenecks and drawbacks of the method and of its application, and where the greatest knowledge opportunities lie.

Keywords: netnography, online research, research agenda, educator's roadmap

Procedia PDF Downloads 184
446 Atmospheric Polycyclic Aromatic Hydrocarbons (PAHs) in Rural and Urban Areas of Central Taiwan

Authors: Shih Yu Pan, Pao Chen Hung, Chuan Yao Lin, Charles C.-K. Chou, Yu Chi Lin, Kai Hsien Chi

Abstract:

This study analyzed 16 atmospheric PAH species regulated by the USEPA and listed by IARC. To measure the concentrations of PAHs, four rural sampling sites and two urban sampling sites were selected in central Taiwan during spring and summer. The rural sampling stations were located downstream of the Da-An River, Da-Jang River, Wu River and Chuo-shui River, while the urban sampling sites were located in the Taichung district, close to the roadside. Ambient air samples of both the vapor phase and the particle phase of PAH compounds were collected using high-volume sampling trains (Analitica). The sampling media were polyurethane foam (PUF) with XAD-2, and quartz fiber filters. Diagnostic ratios, principal component analysis (PCA) and positive matrix factorization (PMF) models were used to evaluate the apportionment of PAHs in the atmosphere and to estimate the relative contributions of various emission sources. Because of the high temperature and low wind speed, high PAH concentrations were observed in the atmosphere. The total PAH concentration, especially in the vapor phase, changed significantly during summer. During the sampling periods, the total atmospheric PAH concentrations at the four rural and two urban sampling sites were 3.70±0.40, 3.40±0.63, 5.22±1.24, 7.23±0.37, 7.46±2.36 and 6.21±0.55 ng/m3 in spring, and 15.0±0.14, 18.8±8.05, 20.2±8.58, 16.1±3.75, 29.8±10.4 and 35.3±11.8 ng/m3 in summer, respectively. To identify PAH sources, diagnostic ratios were used to classify the emission sources. The potential sources were diesel combustion in spring and gasoline combustion in summer. According to the principal component analysis (PCA), PC1 and PC2 explained 23.8% and 20.4% of the variance in spring and 21.3% and 17.1% of the variance in summer, respectively. High molecular weight PAHs (BaP, IND, BghiP, Flu, Phe, Flt, Pyr) dominated in spring, whereas low molecular weight PAHs (AcPy, Ant, Acp, Flu) dominated in summer because of the prevailing high temperatures. Analysis using the PMF model found that the sources of PAHs in spring were stationary sources (34%), vehicle emissions (24%), coal combustion (23%) and petrochemical fuel gas (19%), while in summer the emission sources were petrochemical fuel gas (34%), volatile organic compounds from the natural environment (29%), coal combustion (19%) and stationary sources (18%).
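As an illustration of the diagnostic-ratio and PCA steps mentioned above, the following is a minimal sketch operating on a made-up samples-by-species concentration matrix; the numbers are not the study's data, and the ratio interpretation thresholds are typical literature values rather than values taken from this abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical concentrations (ng/m3); rows are samples, columns are PAH species.
species = ["Flt", "Pyr", "BaP", "IND", "BghiP", "Phe"]
X = np.array([
    [0.42, 0.35, 0.21, 0.18, 0.22, 0.90],
    [0.55, 0.40, 0.30, 0.25, 0.28, 1.10],
    [0.30, 0.33, 0.12, 0.10, 0.15, 0.70],
    [0.61, 0.45, 0.35, 0.30, 0.31, 1.25],
])

# Diagnostic ratio Flt/(Flt+Pyr): values above ~0.5 are commonly attributed in the
# literature to biomass/coal combustion, 0.4-0.5 to petroleum combustion.
flt, pyr = X[:, species.index("Flt")], X[:, species.index("Pyr")]
print("Flt/(Flt+Pyr):", np.round(flt / (flt + pyr), 2))

# PCA on standardized concentrations; PC1/PC2 loadings are then inspected to
# associate each component with likely emission sources.
pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
print("PC1 loadings:", dict(zip(species, np.round(pca.components_[0], 2))))
```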

Keywords: PAHs, source identification, diagnostic ratio, principal component analysis, positive matrix factorization

Procedia PDF Downloads 267
445 Digital Adoption of Sales Support Tools for Farmers: A Technology Organization Environment Framework Analysis

Authors: Sylvie Michel, François Cocula

Abstract:

Digital agriculture is an approach that exploits information and communication technologies. These encompass data acquisition tools like mobile applications, satellites, sensors, connected devices, and smartphones. Additionally, it involves transfer and storage technologies such as 3G/4G coverage, low-bandwidth terrestrial or satellite networks, and cloud-based systems. Furthermore, embedded or remote processing technologies, including drones and robots for process automation, along with high-speed communication networks accessible through supercomputers, are integral components of this approach. While farm-level adoption studies regarding digital agricultural technologies have emerged in recent years, they remain relatively limited in comparison to other agricultural practices. To bridge this gap, this study delves into understanding farmers' intention to adopt digital tools, employing the technology-organization-environment (TOE) framework. A qualitative research design encompassed fifteen semi-structured interviews conducted with key stakeholders both prior to and following the 2020-2021 COVID-19 lockdowns in France. Subsequently, the interview transcripts underwent thorough thematic content analysis, and the data and verbatim quotes were triangulated for validation. A coding process aimed to systematically organize the data, ensuring an orderly and structured classification. Our research extends its contribution by delineating sub-dimensions within each primary dimension. A total of nine sub-dimensions were identified, categorized as follows: perceived usefulness for communication, perceived usefulness for productivity, and perceived ease of use constitute the first dimension; technological resources, financial resources, and human capabilities constitute the second dimension, while market pressure, institutional pressure, and the COVID-19 situation constitute the third dimension. Furthermore, this analysis enriches the TOE framework by incorporating entrepreneurial orientation as a moderating variable. Managerial orientation emerges as a pivotal factor influencing adoption intention, with producers acknowledging the significance of utilizing digital sales support tools to combat "greenwashing" and elevate their overall brand image. Specifically, it illustrates that producers recognize the potential of digital tools for saving time and streamlining sales processes, leading to heightened productivity. Moreover, it highlights that the intent to adopt digital sales support tools is influenced by a market mimicry effect. Additionally, it demonstrates a negative association between the intent to adopt these tools and the pressure exerted by institutional partners. Finally, this research establishes a positive link between the intent to adopt digital sales support tools and economic fluctuations, notably during the COVID-19 pandemic. The adoption of sales support tools in agriculture is a multifaceted challenge encompassing three dimensions and nine sub-dimensions. The research delves into the adoption of digital farming technologies at the farm level through the TOE framework. This analysis provides significant insights beneficial for policymakers, stakeholders, and farmers. These insights are instrumental in making informed decisions to facilitate a successful digital transition in agriculture, effectively addressing sector-specific challenges.

Keywords: adoption, digital agriculture, e-commerce, TOE framework

Procedia PDF Downloads 60
444 Public Art as Social Critique to Shape Urban-Scape

Authors: Po-Ching Wang

Abstract:

Public art may be regarded as a social agenda. It is assumed that public art acts as an intermediate form that contributes significantly to community resurgence. That is, public art may be regarded as a verb/process or social intervention. It functions as a vanguard form, attacking boundaries and providing a sensibility for social strategy. Traditionally, public art is generally expected to bring aesthetic pleasure to the public. Contemporary public art, however, not only focuses on art installation; it also often offers a process that aims to comment on, question, and challenge the socio-cultural status quo. During the last few decades, accelerated changes in the values and expectations brought to bear on varied urban issues, together with the destruction of the hegemony of traditional art and of museum authorities, have begun to contribute to freer and more democratic representations of public art. It is said that part of a public artwork's role is to ruffle sacred feathers. In many cases, public art is created to address the dynamic social contradictions and mutability of public life, and artists and community participants approach public art from a variety of socially critical perspectives and methodologies. Urban issues, such as social and environmental justice, health problems, violence, and political statements, provide plentiful source materials that fuel the performance of public art in many different settings. Further, public artworks have been extensively adopted to express social identity, make political statements, and/or remedy social and environmental crises. Many murals on urban walls, for instance, reflect social conflicts and address civic rights, and these projects are usually the work of artists who, though denied access to traditional gallery and museum channels, are supported by community engagement and involvement. Public art as a social practice challenges the traditional western view of artistic practice. Art in the public realm creates a new medium that provides a platform for dialogical exchange between diverse social groups. It seems that public art has evolved into an arena for activism that addresses wide-ranging and highly controversial social issues and civilian concerns. The findings of this study indicate that public artworks are capable of playing an activist role in facilitating community evolution via social progress.

Keywords: aesthetics, community regeneration, city development, publicness, public participation, social progress

Procedia PDF Downloads 230
443 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo

Authors: Margaret Boone Rappaport, Christopher J. Corbally

Abstract:

The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes, and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics. And, it emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. (2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but, in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors’ published model of morality's emergence in Homo erectus encompasses a cognitively based, decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50-60,000 years ago, when human ancestors left Africa, they were fully enabled.

Keywords: genetic drift, genomics, parietal expansion, religious capacity

Procedia PDF Downloads 341
442 An International Curriculum Development for Languages and Technology

Authors: Miguel Nino

Abstract:

When considering the challenges of a changing and demanding globalizing world, it is important to reflect on how university students will be prepared for the realities of internationalization, marketization and intercultural conversation. The present study is an interdisciplinary program designed to respond to the needs of the global community. The proposal bridges the humanities and science through three different fields: languages, graphic design and computer science - specifically, fundamentals of programming such as Python, JavaScript and software animation. The goal of the four-year program is therefore twofold. First, to enable students to engage in intercultural communication between English and other languages such as Spanish, Mandarin, French or German. Second, students will acquire knowledge of practical software and relevant employable skills to collaborate in computer-assisted projects that will most probably require an essential programming background in interpreted or compiled languages. In order to be inclusive and constructivist, the cognitive linguistics approach is suggested for the three different fields, particularly for languages that rely on the traditional method of repetition. This methodology will help students develop their creativity and encourage them to become independent problem-solving individuals, as languages enhance their common ground of interaction for culture and technology. Participants in this course of study will be evaluated in their second language acquisition at the Intermediate-High level. For graphic design and computer science, students will apply their creative digital skills, as well as the critical thinking skills learned from the cognitive linguistics approach, to collaborate on a group project designed to find solutions to media and web design problems or marketing experimentation for a company or the community. It is understood that it will be necessary to apply programming knowledge and skills to deliver the final product. In conclusion, the program equips students with the linguistic knowledge and skills to be competent in intercultural communication, where English, the lingua franca, remains the medium for marketing and product delivery. In addition to their employability, students can expand their knowledge and skills in the digital humanities or computational linguistics, or increase their portfolios in advertising and marketing. These students will be the global human capital for a competitive globalizing community.

Keywords: curriculum, international, languages, technology

Procedia PDF Downloads 443
441 Developing an Integrated Clinical Risk Management Model

Authors: Mohammad H. Yarmohammadian, Fatemeh Rezaei

Abstract:

Introduction: Improving patient safety is one of the main priorities in healthcare systems, so clinical risk management in organizations has become increasingly significant. Although several tools have been developed for clinical risk management, each has its own limitations. Aims: This study aims to develop a comprehensive tool that can compensate for the limitations of each risk assessment and management tool with the advantages of the others. Methods: The procedure comprised two main stages: development of an initial model through meetings with the professors and a literature review, followed by implementation and verification of the final model. Subjects and Methods: This is a quantitative-qualitative study. For the qualitative dimension, focus groups with an inductive approach were used. To evaluate the results of the qualitative study, a quantitative assessment of two parts of the fourth phase and of the seven phases of the research was conducted. Purposive and stratified sampling of the teams responsible for the selected process was conducted in the operating room. The final model was verified in eight phases through the application of activity breakdown structure, failure mode and effects analysis (FMEA), healthcare risk priority number (RPN), root cause analysis (RCA), fault tree (FT), and Eindhoven Classification Model (ECM) tools. The model was applied to patients admitted for surgery in a day-clinic ward of a public hospital from October 2012 to June. Statistical Analysis Used: Qualitative data analysis was done through content analysis, and quantitative analysis was done through checklists and edited RPN tables. Results: After verification of the final model in eight steps, the patients' admission process for surgery was developed by focus discussion group (FDG) members in five main phases. Then, with the adopted FMEA methodology, 85 failure modes, along with their causes, effects, and preventability, were set out in the tables. The tables developed to calculate the RPN index contain three criteria for severity, two criteria for probability, and two criteria for preventability. Three failure modes were above the determined significant-risk limit (RPN > 250). After a 3-month period, patient misidentification incidents were the most frequently reported events. Each RPN criterion of the misidentification events was compared, and different RPN values for the three reported misidentification events could be determined against the scores predicted in the previous phase. Root causes identified through the fault tree were categorized with the ECM. A wrong-side surgery event was selected by the focus discussion group to propose improvement actions. The most important causes were a lack of planning for the number and priority of surgical procedures. After prioritization of the suggested interventions, a computerized registration system within the health information system (HIS) was adopted to prepare the action plan in the final phase. Conclusion: The complexity of the healthcare industry requires risk managers to have a multifaceted vision. Applying only retrospective or only prospective risk management tools therefore does not work, and each organization must provide the conditions for applying both types of methods. The results of this study showed that the integrated clinical risk management model can be used in hospitals as an efficient tool to improve clinical governance.
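To illustrate the RPN scoring step described above, here is a minimal sketch that computes a classic FMEA risk priority number per failure mode and flags values above the 250 limit mentioned in the abstract; the failure modes and scores are hypothetical, and the study's own composite severity/probability/preventability criteria are not reproduced here.

```python
from dataclasses import dataclass

RPN_THRESHOLD = 250  # significant-risk limit used in the abstract

@dataclass
class FailureMode:
    description: str
    severity: int        # aggregated severity score (illustrative 1-10 scale)
    probability: int     # aggregated occurrence score (illustrative 1-10 scale)
    preventability: int  # aggregated detectability/preventability score (1-10)

    @property
    def rpn(self) -> int:
        # Classic FMEA risk priority number: S x O x D.
        return self.severity * self.probability * self.preventability

# Hypothetical failure modes from a surgical admission process.
modes = [
    FailureMode("Patient misidentification at admission", 9, 4, 8),
    FailureMode("Incomplete pre-operative checklist", 6, 5, 4),
    FailureMode("Wrong-side surgical site marking", 10, 2, 9),
]

for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    flag = "ACTION REQUIRED" if m.rpn > RPN_THRESHOLD else "monitor"
    print(f"{m.description}: RPN={m.rpn} -> {flag}")
```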

Keywords: failure mode and effects analysis, risk management, root cause analysis, model

Procedia PDF Downloads 249
440 Levels of Heavy Metals and Arsenic in Sediment and in Clarias gariepinus of Lake Ngami

Authors: Nashaat Mazrui, Oarabile Mogobe, Barbara Ngwenya, Ketlhatlogile Mosepele, Mangaliso Gondwe

Abstract:

Over the last several decades, the world has seen a rapid increase in activities such as deforestation, agriculture, and energy use. Consequently, trace elements are being deposited into our water bodies, where they can accumulate to toxic levels in aquatic organisms and can be transferred to humans through fish consumption. Thus, though fish is a good source of essential minerals and omega-3 fatty acids, it can also be a source of toxic elements. Monitoring trace elements in fish is important for the proper management of aquatic systems and the protection of human health. The aim of this study was to determine concentrations of trace elements in sediment and in muscle tissues of Clarias gariepinus at Lake Ngami, in the Okavango Delta in northern Botswana, during low floods. The fish were bought from local fishermen, and samples of muscle tissue were acid-digested and analyzed for iron, zinc, copper, manganese, molybdenum, nickel, chromium, cadmium, lead, and arsenic using inductively coupled plasma optical emission spectroscopy (ICP-OES). Sediment samples were also collected and analyzed for the same elements and for organic matter content. Results show that in all samples, iron was found in the greatest amount, while cadmium was below the detection limit. Generally, the concentrations of elements in sediment were higher than in fish, except for zinc and arsenic. While the concentration of zinc was similar in the two media, arsenic was almost 3 times higher in fish than in sediment. To evaluate the risk to human health from fish consumption, the target hazard quotient (THQ) and cancer risk for an average adult in Botswana, sub-Saharan Africa, and the riparian communities in the Okavango Delta were calculated for each element. All elements were found to be well below regulatory limits and do not pose a threat to human health, except arsenic. The results suggest that other benthic-feeding fish species could potentially have high arsenic levels too. This has serious implications for human health, especially for riparian households, for whom fish is a key component of food and nutrition security.
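For context on the risk calculation mentioned above, the sketch below uses one common formulation of the target hazard quotient following US EPA guidance; the exposure parameters and arsenic concentration are illustrative placeholders rather than the study's values, and the study may have parameterized the calculation differently.

```python
def target_hazard_quotient(c_metal_mg_per_kg, intake_g_per_day, body_weight_kg,
                           rfd_mg_per_kg_day, exposure_freq_days=365,
                           exposure_duration_yr=70):
    """THQ = (EF * ED * FIR * C) / (RfD * BW * AT), with AT = ED * 365 days.
    The 1e-3 factor converts the fish intake from g/day to kg/day."""
    averaging_time_days = exposure_duration_yr * 365
    numerator = (exposure_freq_days * exposure_duration_yr
                 * intake_g_per_day * c_metal_mg_per_kg * 1e-3)
    denominator = rfd_mg_per_kg_day * body_weight_kg * averaging_time_days
    return numerator / denominator

# Illustrative example: arsenic in fish muscle (all values are placeholders).
thq_as = target_hazard_quotient(
    c_metal_mg_per_kg=0.5,   # hypothetical As concentration in fish (mg/kg wet weight)
    intake_g_per_day=60.0,   # hypothetical daily fish consumption
    body_weight_kg=60.0,     # assumed average adult body weight
    rfd_mg_per_kg_day=3e-4,  # US EPA oral reference dose for inorganic arsenic
)
status = "potential concern" if thq_as > 1 else "below concern threshold"
print(f"THQ (As) = {thq_as:.2f} -> {status}")
```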

Keywords: arsenic, African sharptooth catfish, Okavango Delta, trace elements

Procedia PDF Downloads 192
439 Comparison between RILM, JSTOR, and WorldCat Used to Search for Secondary Literature

Authors: Stacy Jarvis

Abstract:

Databases such as JSTOR, RILM, and WorldCat have been the main sources and repositories of literature in the field of music. RILM (Répertoire International de Littérature Musicale) is a bibliographic database of over 2.6 million citations to writings about music from over 70 countries, produced by the Research Institute for the Study of Music at the University of Buffalo. JSTOR is an e-library of academic journals, books, and primary sources that helps scholars find, utilise, and build upon a vast range of literature through a powerful teaching and research platform. Another database, WorldCat, is the world's biggest library catalogue, assisting scholars in finding library materials online. An evaluation of these databases in the music sphere is conducted by examining their descriptions and intended uses and identifying similarities and differences among them. Through comparison, it is found that the three serve different purposes, even though they share the goal of providing and storing literature. Also, since each database focuses on different parts of the literature, the intended use of the three databases is evaluated; this is covered in the section on description, scope, and intended uses, which addresses the functional and literature differences among the three databases. It is also found that the databases have different quantitative potential, determined by the year each database began collecting literature and the number of articles, periodicals, albums, conference proceedings, music, dissertations, digital media, essay collections, journal articles, monographs, online resources, reviews, and reference materials that can be found in each of them. To compare the delivery of services to users, the importance of the databases in identifying literature on different topics is also addressed in a dedicated section. Even though all three databases are used in research, each has advantages and disadvantages, which are addressed in their own section; this is significant in determining which of the three is the best and in showing how the shortcomings of one database can be offset by using two databases together while conducting research, as addressed in the section on combining RILM and JSTOR. All this information revolves around the idea that a huge amount of quantitative and qualitative data on music and digital content can be found in the presented databases; however, each database has a different construction and material features and contributes to musical scholarship in its own way.

Keywords: RILM, JSTOR, WorldCat, database, literature, research

Procedia PDF Downloads 83
438 Enhancement of Morphogenetic Potential to Obtain Elite Varieties of Sauropus androgynus (L.) Merr. through Somatic Embryogenesis

Authors: S. Padma, D. H. Tejavathi

Abstract:

Somatic embryogenesis is a remarkable illustration of the dictum of plant totipotency, in which somatic cells are developmentally redirected towards the embryogenic pathway. It recapitulates the morphological and developmental processes that occur in zygotic embryogenesis. S. androgynus is commonly called the multivitamin plant. Its leaves are consumed as a green leafy vegetable in Southeast Asian communities due to their rich nutritional profile. Although it is a nutritious vegetable rich in proteins, vitamins, minerals, and amino acids, excessive intake is cautioned against because of the presence of the alkaloid papaverine, which at higher concentrations is toxic and leads to a syndrome called bronchiolitis obliterans. In the present study, the morphogenetic potential of shoot tip, leaf, and nodal explants of Sauropus androgynus was investigated to develop and enhance a reliable plant regeneration protocol via somatic embryogenesis. Somatic embryos were derived directly from embryogenic callus obtained from shoot tip, node, and leaf cultures on Phillips and Collins (L2) medium supplemented with NAA at concentrations ranging from 5.3 µM/l to 26.85 µM/l within two months of inoculation. The embryos thus obtained were subcultured onto modified L2 media supplemented with an increased vitamin level for further growth. Somatic embryos with well-developed cotyledons were transferred to normal and modified L2 basal medium for conversion. The plantlets thus obtained were briefly acclimatized before transfer to the field. A survival rate of about 95% was recorded. Culturing various explants through somatic embryogenesis on synthetic medium with various plant growth regulators under controlled conditions has augmented the commercial production of Sauropus, making it more readily available than through conventional propagation methods. In addition, regeneration through somatic embryogenesis has facilitated the development of desired characters in Sauropus, such as low papaverine content, thereby providing a valuable resource to the food and pharmaceutical industries. Based on this research, plant tissue culture techniques show promise for economical and convenient application in Sauropus androgynus breeding.

Keywords: L2 medium, multivitamin plant, NAA, papaverine

Procedia PDF Downloads 207
437 Moodle-Based E-Learning Course Development for Medical Interpreters

Authors: Naoko Ono, Junko Kato

Abstract:

According to the Ministry of Justice, 9,044,000 foreigners visited Japan in 2010, and the number of foreign residents in Japan was over 2,134,000 at the end of 2010. Further, medical tourism has emerged as a new area of business. Against this background, language barriers put the health of foreigners in Japan at risk, because they have difficulty accessing health care and communicating with medical professionals. Medical interpreting training is urgently needed in response to language problems resulting from the rapid increase in the number of foreign workers in Japan over recent decades. In particular, with Tokyo selected as the host city of the 2020 Summer Olympics, there is a growing need in Japanese medical settings to communicate in international languages. Because practical activities in medical interpreting are limited, it is difficult for learners to acquire interpreting skills. To address this shortcoming, a web-based English-Japanese medical interpreting training system was developed. We conducted a literature review using PubMed, PsycINFO, the Cochrane Library, and Google Scholar to identify learning contents and core competencies for medical interpreters. Eleven papers indicating core competencies for medical interpreters were selected. The core competencies abstracted from the literature review were consistent across previous research, whereas the content of domestic and international training programs for medical interpreters varied. The review indicated five core competencies: (a) maintaining accuracy and completeness; (b) medical terminology and understanding the human body; (c) behaving ethically and making ethical decisions; (d) nonverbal communication skills; and (e) cross-cultural communication skills. We then developed a web-based e-learning program for training medical interpreters that covers these competencies. The program included the following: an online word list (Quizlet), allowing students to study online and on their smartphones; a self-study tool (Quizlet) for help with dictation and spelling; a word quiz (Quizlet); a test-generating system (Quizlet); an interactive body game (BBC); an online resource for understanding the code of ethics in medical interpreting; a webinar about non-verbal communication; and a webinar about incompetent vs. competent cultural care. The design of a virtual environment allows learners of medical interpreting to carry out complementary practical exercises and provides an introduction to the theoretical background of medical interpreting. Since this system adopts a self-learning style, it may ease the time constraints and the shortage of teaching materials associated with the classroom method. In addition, as a teaching aid, virtual medical interpreting is a powerful resource for understanding how actual medical interpreting is carried out. The developed e-learning system allows remote access, enabling students to perform exercises at their own place without being physically present in the actual laboratory, and the web-based virtual environment empowers students by granting them access during their free time. A practical example will be presented to show the capabilities of the system. The developed web-based training program for medical interpreters could bridge the gap between medical professionals and patients with limited English proficiency.

Keywords: e-learning, language education, moodle, medical interpreting

Procedia PDF Downloads 366
436 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach

Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika

Abstract:

Both society and education teach the importance of good communication for building up interpersonal skills. Everyone has the capacity to understand something new, whether with good comprehension or poor understanding. Poor understanding produces language errors when people interact for the first time and do not know one another beforehand because of geographical distance. The movie “The Space between Us” tells a love-adventure story between a boy from Mars and a girl from Earth. There are many missed understandings in their conversations because of their different climates and environments. Moviegoers must also focus on the subtitles in order to enjoy the movie fully. Furthermore, the Indonesian subtitles and the English dialogue in the movie still show overlapping understanding in the translation. Translation here consists of the source language (SL; the English dialogue) and the target language (TL; the Indonesian subtitles). This research gap is formulated in the research questions of how language errors occur in the movie and what their effects on translation quality are, analyzed in depth through a translation study with a discourse analysis approach. The research goal is to describe the language errors and their translation quality in order to create a good atmosphere in movie media. The study is an embedded research project with a qualitative design. The research locations consist of setting, participant, and event as the determined focus boundary. The sources of data are the movie “The Space between Us” and informants (translation quality raters). The sampling is criterion-based (purposive) sampling. Data collection techniques use content analysis and questionnaires. Data validation applies data source and method triangulation. Data analysis comprises domain, taxonomic, componential, and cultural theme analysis. The language errors found in the movie are referential, register, society, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors. The discussion of their effects on translation quality concentrates on the translation techniques applied to these findings: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.

Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments

Procedia PDF Downloads 219
435 A Comparative Study of the Alternatives to Land Acquisition: India

Authors: Aparna Soni

Abstract:

The much-celebrated story of Indian cities as engines driving the growth of India has been scrutinized and found to have serious consequences. A wide spectrum of scholarship has brought to light its un-equalizing effects and the need to adopt a rights-based approach to development planning in India. Notably, these concepts and discourses ubiquitously entail the study of land struggles in the making of the urban. In fact, the very progression from the theory of primitive accumulation to accumulation by dispossession, followed by ‘dispossession without development’, thereafter ‘development without dispossession’, and now ‘dispossession by financialization’ (notably, the last three developing in a span of a mere three decades), is evidence enough to trace the centrality and evolving role of land in the making of urban India. In the last decade, India has seen its regional governments actively experimenting with alternative models of land assembly (the loudly advertised Amaravati and Delhi land pooling models among them). These are publicized as a replacement for the land acquisition act of 2013, which is presumed to be costly, time-consuming, and prone to litigation. It has been observed that most of the literature treats these models as one generic bracket of land expropriation and does not try to analyse them differentially to find granular patterns in these alternatives. To address this gap, this research comparatively studies these alternative land assembly models. It categorises them based on their basic architecture, spatial and sectoral application, and governance frameworks. It is found that these alternatives are ad-hoc and fragmented pieces of legislation; they are fit-for-profit models that commodify land to ease its access by the private sector for real-estate-led growth. The research augments the literature on the privatization of land use planning in India. Further, it discusses the increasing role landowners are expected to play in the future and suggests a way forward to safeguard them from market risks. The study involves a thematic analysis of the policy elements contained in legislative and policy documents, notifications, and office orders. The study also draws on widely circulated print media. Given present field-visit limitations, the study relies on open-source documents available in the public domain.

Keywords: commodification, dispossession, land acquisition, landowner

Procedia PDF Downloads 166
434 The Relationship between the Content of Inner Human Experience and Well-Being: An Experience Sampling Study

Authors: Xinqi Guo, Karen R. Dobkins

Abstract:

Background and Objectives: Humans are probably the only animals whose minds are constantly filled with thoughts, feelings, and emotions. Previous studies have investigated the human mind along several dimensions, including the proportion of time spent not being present, its representational format, its personal relevance, its temporal locus, and its affective valence. The current study aims at characterizing the human mind by employing the Experience Sampling Method (ESM), a self-report research procedure for studying daily experience. This study emphasizes the following questions: 1) How does the content of inner experience vary across demographics? 2) Are certain types of inner experience correlated with levels of mindfulness and mental well-being (e.g., are people who spend more time being present happier, and are more mindful people more often present)? 3) Does being prompted to report one's inner experience increase mindfulness and mental well-being? Methods: Participants were recruited from the subject pool of UC San Diego or from social media. They began by filling out two questionnaires, 1) the Five Facet Mindfulness Questionnaire-Short Form and 2) the Warwick-Edinburgh Mental Well-being Scale, along with demographic information. They then participated in the ESM part by responding to prompts containing questions about their real-time inner experience: whether they were 'at-present', 'mind-wandering', or 'zoned-out'. The temporal locus, clarity, affective valence, and personal importance of the thought they had the moment before the prompt were also assessed. A mobile app, 'RealLife Exp', randomly delivered these prompts 3 times/day for 6 days during wake-time. After the 6 days, participants completed questionnaires (1) and (2) again. Their score changes were compared to those of a control group who did not participate in the ESM procedure (but completed (1) and (2) one week apart). Results: Results are currently preliminary as we continue to collect data. So far, there is a trend that participants are present, mind-wandering, and zoned-out about 53%, 23%, and 24% of wake-time, respectively. Participants' thoughts are rated as clearer and more neutral when they are at-present vs. mind-wandering. Mind-wandering thoughts concern the past 66% of the time and consist of inner speech 80% of the time. Discussion and Conclusion: This study investigated the subjective account of the human mind with a tool of high ecological validity, and it broadens the understanding of the relationship between the contents of the mind and well-being.
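
To make the kind of analysis described above concrete, the minimal sketch below shows how ESM responses might be tallied into per-person proportions of each state and related to a well-being score. The data structure, column names, sample values, and the use of a simple correlation are assumptions for illustration only; the study's own analysis pipeline is not described at this level of detail.

```python
# Minimal sketch: per-participant proportions of ESM states and their
# relation to a well-being score. Data and field names are hypothetical.
from collections import Counter
from statistics import correlation  # requires Python 3.10+

# Each participant: list of sampled states plus a well-being score (made-up values).
esm_data = {
    "p01": {"states": ["present", "present", "mind-wandering", "zoned-out"], "wellbeing": 52},
    "p02": {"states": ["mind-wandering", "zoned-out", "present", "present"], "wellbeing": 47},
    "p03": {"states": ["present", "present", "present", "mind-wandering"], "wellbeing": 58},
}

present_props, wellbeing_scores = [], []
for pid, record in esm_data.items():
    counts = Counter(record["states"])
    prop_present = counts["present"] / len(record["states"])
    present_props.append(prop_present)
    wellbeing_scores.append(record["wellbeing"])
    print(f"{pid}: proportion present = {prop_present:.2f}")

# Simple association between being present and well-being (illustrative only).
print(f"r(present, well-being) = {correlation(present_props, wellbeing_scores):.2f}")
```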

Keywords: experience sampling method, meta-memory, mindfulness, mind-wandering

Procedia PDF Downloads 132
433 Regulatory and Economic Challenges of AI Integration in Cyber Insurance

Authors: Shreyas Kumar, Mili Shangari

Abstract:

Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.

Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware

Procedia PDF Downloads 33
432 Limosilactobacillus fermentum from Buffalo Milk Is Suitable for Potential Biotechnological Process Development

Authors: Sergio D’Ambrosio, Azza Dobous, Chiara Schiraldi, Donatella Cimini

Abstract:

Probiotics are living microorganisms that confer beneficial effects when consumed. Lactic acid bacteria and bifidobacteria are among the most representative strains assessed as probiotics and exploited as food supplements. Numerous studies have demonstrated their potential as therapeutic candidates for a variety of conditions (restoring gut flora, lowering cholesterol, enhancing immune response, anti-inflammatory and antioxidant activities). These beneficial actions are also due to biomolecules produced by probiotics, such as exopolysaccharides (EPSs), which demonstrate many beneficial properties such as antimicrobial, antitumor, anti-biofilm, antiviral, and immunomodulatory activities. Limosilactobacillus fermentum is a widely studied probiotic; however, few data are available on the development of fermentation and downstream processes for the large-scale production of viable biomass for industrial applications, or on processes for purifying EPSs at industrial scale. For this purpose, an L. fermentum strain was isolated from buffalo milk and used as a test example for biotechnological process development. The strain was able to produce up to 10⁹ CFU/mL on a (glucose-based) semi-defined medium deprived of animal-derived raw materials up to the pilot scale (150 L), demonstrating improved results compared to media rich in casein and beef extract, which are commonly used although not suitable industrially. Biomass concentration via microfiltration on hollow fibers and subsequent spray-drying allowed the recovery of about 5.7 × 10¹⁰ CFU/g powder of viable cells, indicating strain resistance to harsh processing conditions. Overall, these data demonstrate the possibility of obtaining and maintaining adequate levels of viable L. fermentum cells by using a simple approach that is potentially suitable for industrial development. A downstream EPS purification protocol based on ultrafiltration, precipitation, and activated charcoal treatments yielded recovered polysaccharides with a purity of about 70-80%.

Keywords: probiotics, fermentation, exopolysaccharides (EPSs), purification

Procedia PDF Downloads 83
431 Enhancing Academic and Social Skills of Elementary School Students with Autism Spectrum Disorder by an Intensive and Comprehensive Teaching Program

Authors: Piyawan Srisuruk, Janya Boonmeeprasert, Romwarin Gamlunglert, Benjamaporn Choikhruea, Ornjira Jaraepram, Jarin Boonsuchat, Sakdadech Singkibud, Kusalaporn Chaiudomsom, Chanatiporn Chonprai, Pornchanaka Tana, Suchat Paholpak

Abstract:

Objective: To develop an intensive and comprehensive program (ICP) for the inclusive class teacher (ICPICT) to teach elementary students (ES) with ASD in order to enhance the students' academic and social skills (ASS), and to study the effect of the teaching program. Methods: The purposive sample included 15 Khon Kaen inclusive class teachers and their 15 elementary students. All the students were diagnosed by a child and adolescent psychiatrist with DSM-5 level 1 ASD. The study tools included: 1) an ICP to teach teachers about ASD, a teaching method to enhance the academic and social skills of ES with ASD, and an assessment tool to assess the teachers' knowledge before and after the ICP; 2) an ICPICT to teach ES with ASD in order to enhance their ASS, delivered in 10 sessions of 3 hours each. The ICPICT had its own teaching structure, and its teaching media included pictures, storytelling, songs, and plays. The authors taught and demonstrated to the participating teachers how to teach with the ICPICT until the participants could display the correct teaching method; the teachers then taught the ICPICT at school by themselves; and 3) an assessment tool to assess the students' ASS before and after the completion of the study. The ICP to teach the teachers, the ICPICT, and the relevant assessment tools were developed by the authors and adjusted until three experts in curricula for teaching children with ASD agreed by consensus that they were appropriate for the research. The data were analyzed by descriptive and analytic statistics via SPSS version 26. Results: After the training, the teachers' mean score of knowledge of ASD and of how to teach ASS to ES with ASD increased, though not with statistical significance (p = 0.13). Teaching ES with ASD with the ICPICT increased the mean scores of the students' skills in learning and expressing social emotions, relationships with friends, transitioning, and academic function by 3.33, 2.27, 2.94, and 3.00 points, respectively (full scores were 18, 12, 15, and 12; paired t-test p = 0.007, 0.013, 0.028, and 0.003, respectively). Conclusion: The program to teach academic and social skills simultaneously in an intensive and comprehensive structure could enhance both the academic and social skills of elementary students with ASD.
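
The pre/post comparisons above rest on paired t-tests of the students' skill scores. A minimal sketch of such a comparison is shown below; the score vectors are fabricated placeholders for illustration only, and SciPy's `ttest_rel` stands in for the SPSS paired-samples t-test procedure used by the authors.

```python
# Illustrative paired t-test for pre- vs. post-intervention skill scores.
# The score vectors are made-up placeholders, not the study's data;
# scipy.stats.ttest_rel mirrors a paired-samples t-test as run in SPSS.
from scipy import stats

pre_scores  = [9, 11, 8, 12, 10, 7, 13, 9, 10, 11, 8, 12, 9, 10, 11]        # before ICPICT
post_scores = [12, 14, 11, 15, 13, 10, 16, 12, 13, 14, 11, 15, 12, 13, 14]  # after ICPICT

t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
mean_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```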

Keywords: academic and social skills, students with autism, intensive and comprehensive teaching program

Procedia PDF Downloads 64
430 Transcriptomic Analysis for Differential Expression of Genes Involved in Secondary Metabolite Production in Narcissus Bulb and in vitro Callus

Authors: Aleya Ferdausi, Meriel Jones, Anthony Halls

Abstract:

The Amaryllidaceae genus Narcissus contains secondary metabolites that are important sources of bioactive compounds, such as pharmaceuticals, indicating that their biological activity extends from the native plant to humans. Transcriptome analysis (RNA-seq) is an effective platform for the identification and functional characterization of candidate genes, as well as for identifying genes encoding uncharacterized enzymes. The biotechnological production of secondary metabolites in plant cell or organ cultures has become a tempting alternative to extraction from whole plant material. The biochemical pathways for the production of secondary metabolites require primary metabolites to undergo a series of modifications catalyzed by enzymes such as cytochrome P450s, methyltransferases, glycosyltransferases, and acyltransferases. Differential gene expression analysis of Narcissus was performed for two conditions, i.e., field-grown bulbs and in vitro callus. Callus was obtained from twin-scale explants of Narcissus cv. Carlton bulbs on modified MS (Murashige and Skoog) media supplemented with growth regulators. A total of 2153 differentially expressed transcripts were detected between Narcissus bulb and in vitro callus, and 78.95% of those were annotated. Genes involved in the biosynthesis of alkaloids were expressed in both conditions, i.e., cytochrome P450s, O-methyltransferases (OMTs), NADP/NADPH dehydrogenases or reductases, SAM synthetases or decarboxylases, 3-ketoacyl-CoA, acyl-CoA, cinnamoyl-CoA, cinnamate 4-hydroxylase, alcohol dehydrogenase, caffeic acid, N-methyltransferase, and NADPH-cytochrome P450s. However, the cytochrome P450s and OMTs involved in the later stages of Amaryllidaceae alkaloid biosynthesis were mainly up-regulated in field samples, whereas the enzymes involved in the initial biosynthetic pathways, i.e., fructose bisphosphate aldolase, aminotransferases, dehydrogenases, hydroxyl methyl glutarate, and glutamate synthase, leading to the biosynthesis of the precursors tyrosine, phenylalanine, and tryptophan for secondary metabolites, were up-regulated in callus. Knowledge of the probable genes involved in secondary metabolism and of their regulation in different tissues will provide insight into Narcissus plant biology related to alkaloid production.
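
As an illustration of how differentially expressed transcripts are commonly flagged from RNA-seq comparisons of this kind, the sketch below applies a log2 fold-change and adjusted p-value filter. The transcript identifiers, fold changes, thresholds, and the use of Benjamini-Hochberg correction are assumptions for demonstration; they do not reproduce the pipeline used in this study.

```python
# Illustrative filter for differentially expressed transcripts based on
# log2 fold change and Benjamini-Hochberg adjusted p-values. The records
# and thresholds are hypothetical; the study's actual DE pipeline differs.
from statsmodels.stats.multitest import multipletests

# (transcript_id, log2 fold change bulb vs. callus, raw p-value) - made-up examples
results = [
    ("OMT_like_1",        2.8, 1e-6),
    ("CYP_like_1",        3.1, 4e-7),
    ("fba_aldolase_1",   -2.2, 2e-5),
    ("glutamate_synth_1", -1.9, 8e-4),
    ("unannotated_0042",   0.3, 0.41),
]

pvals = [p for _, _, p in results]
_, padj, _, _ = multipletests(pvals, method="fdr_bh")  # Benjamini-Hochberg adjustment

LFC_CUTOFF, PADJ_CUTOFF = 1.0, 0.05
for (tid, lfc, _), q in zip(results, padj):
    direction = "up in bulb" if lfc > 0 else "up in callus"
    if abs(lfc) >= LFC_CUTOFF and q < PADJ_CUTOFF:
        print(f"{tid}: log2FC={lfc:+.1f}, padj={q:.1e} -> DE ({direction})")
```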

Keywords: narcissus, callus, transcriptomics, secondary metabolites

Procedia PDF Downloads 143