Search results for: repository
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 148

58 Decision Making System for Clinical Datasets

Authors: P. Bharathiraja

Abstract:

Computer-aided decision making systems are used to enhance the diagnosis and prognosis of diseases and to assist clinicians and junior doctors in clinical decision making. Medical data used for decision making should be definite and consistent. Data mining and soft computing techniques are used for cleaning the data and for incorporating human reasoning into decision making systems. Fuzzy rule-based inference can be used for classification in order to incorporate human reasoning into the decision making process. In this work, missing values are imputed using the mean or mode of the attribute. The data are normalized using min-max normalization to improve the design and efficiency of the fuzzy inference system. The fuzzy inference system is used to handle the uncertainties that exist in the medical data. Equal-width partitioning is used to partition the attribute values into appropriate fuzzy intervals. Fuzzy rules are generated using a class-based associative rule mining algorithm. The system is trained and tested using the heart disease data set from the University of California at Irvine (UCI) Machine Learning Repository, split into training and testing data using a hold-out approach. From the experimental results, it can be inferred that classification using the fuzzy inference system performs better than trivial IF-THEN rule-based classification approaches. Furthermore, it is observed that the use of fuzzy logic and the fuzzy inference mechanism handles uncertainty and also resembles human decision making. The system can be used in the absence of a clinical expert to assist junior doctors and clinicians in clinical decision making.
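
The preprocessing steps described above (mean imputation, min-max normalization, and equal-width partitioning into fuzzy intervals) could be sketched roughly as follows; the function names and interval count are illustrative assumptions, not the paper's actual code:

```python
import statistics

def impute(values):
    """Replace missing values (None) with the mean of the observed values.

    The paper uses mean or mode per attribute; mode would apply to
    categorical attributes, mean to numeric ones as shown here.
    """
    observed = [v for v in values if v is not None]
    fill = statistics.mean(observed)
    return [fill if v is None else v for v in values]

def min_max_normalize(values, new_min=0.0, new_max=1.0):
    """Rescale values linearly into [new_min, new_max]."""
    lo, hi = min(values), max(values)
    return [new_min + (v - lo) * (new_max - new_min) / (hi - lo) for v in values]

def equal_width_intervals(values, k):
    """Partition the attribute's range into k equal-width fuzzy intervals,
    returned as (low, high) pairs."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / k
    return [(lo + i * width, lo + (i + 1) * width) for i in range(k)]
```

Each interval would then anchor a fuzzy membership function, and the class-based associative rules are mined over the resulting fuzzy labels.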

Keywords: decision making, data mining, normalization, fuzzy rule, classification

Procedia PDF Downloads 487
57 The Role of the University of Zululand in Documenting and Disseminating Indigenous Knowledge, in KwaZulu-Natal, South Africa

Authors: Smiso Buthelezi, Petros Dlamini, Dennis Ocholla

Abstract:

The study assesses the University of Zululand's practices for documenting, sharing, and accessing indigenous knowledge. Two research objectives guided it: to determine how indigenous knowledge (IK) is developed at the University of Zululand and how it is documented there. The study adopted both interpretive and positivist research paradigms and, accordingly, used both qualitative and quantitative research methods. The qualitative approach collected data from academic and non-academic staff members: interviews were conducted with 18 academic staff members and 5 support staff members. The quantitative approach was used to collect data from IK theses and dissertations in the University of Zululand Institutional Repository between 2009 and 2019. The results revealed that many departments across the University of Zululand were involved in creating IK-related content, with the Department of African Languages the most involved. Moreover, documentation of IK-related content at the University of Zululand happens frequently but is not readily known. The creation and documentation of indigenous knowledge by different departments faced several challenges, most commonly a lack of interest among IK owners in sharing their knowledge, the local language as a barrier, and a shortage of proper tools for recording and capturing IK. One of the study's recommendations is that an indigenous knowledge systems (IKS) policy be put in place at the University of Zululand.

Keywords: knowledge creation, SECI model, information and communication technology, indigenous knowledge

Procedia PDF Downloads 78
56 Social Imagination and History Teaching: Critical Thinking's Possibilities in the Australian Curriculum

Authors: Howard Prosser

Abstract:

This paper examines how critical thinking is framed, especially for primary-school students, in the recently established Australian Curriculum: History. Critical thinking is one of the curriculum's 'general capabilities,' and history provides numerous opportunities for its application in everyday life. The so-called 'history wars' that took place just prior to the curriculum's introduction in 2014 sought to bring to light the limits of a singular historical narrative and reveal that which had been repressed. Consequently, the Australian history curriculum reflects this shifting mindset. Teachers are presented with opportunities to treat history in the classroom as a repository of social possibility, especially related to democratic potential, beyond hackneyed and jingoistic tales of Australian nationhood. Yet such opportunities are not explicit within the document and are up against pre-existing pedagogic practices. Drawing on political thinker Cornelius Castoriadis's rendering of the 'social-historical' and 'paideia,' as well as his mobilisation of psychoanalysis, the study outlines how the curriculum's critical-thinking component opens up possibilities for students and teachers to revise assumptions about how history is understood. This ontological shift is ultimately creative: the teachers' imaginations connect the students' imaginations, and vice versa, to the analysis that is at the heart of historical thinking. The implications of this social imagination add to current discussions about historical consciousness among scholars like Peter Seixas. But, importantly, it has practical application in the primary-school classroom, where history becomes a creative act, like play, that is indeterminate and social rather than fixed and individual.

Keywords: Australia, Castoriadis, critical thinking, history, imagination

Procedia PDF Downloads 281
55 Multiscale Entropy Analysis of Electroencephalogram (EEG) of Alcoholic and Control Subjects

Authors: Lal Hussain, Wajid Aziz, Imtiaz Ahmed Awan, Sharjeel Saeed

Abstract:

Multiscale entropy (MSE) analysis is a useful technique recently developed to quantify the dynamics of physiological signals at different time scales. This study investigates electroencephalogram (EEG) signals to analyze the background activity of alcoholic and control subjects by inspecting the coarse-grained sequences formed at different time scales. EEG recordings of alcoholic and control subjects, acquired using 64 electrodes, were taken from the publicly available machine learning repository of the University of California, Irvine (UCI). The MSE analysis was performed on the EEG data acquired from all electrodes of the alcoholic and control subjects. The Mann-Whitney rank test was used to find significant differences between the groups, and results were considered statistically significant for p-values < 0.05. The area under the receiver operating characteristic curve was computed to measure the degree of separation between the groups. The mean ranks of MSE values at all time scales for all electrodes were higher in control subjects than in alcoholic subjects; higher mean ranks represent higher complexity and vice versa. The findings indicated that EEG signals acquired through electrodes C3, C4, F3, F7, F8, O1, O2, P3, and T7 showed significant differences between alcoholic and control subjects at time scales 1 to 5. Moreover, all electrodes exhibited significance at different time scales. Likewise, the highest accuracy and separation were obtained at the central region (C3 and C4) and the frontal polar regions (P3, O1, F3, F7, F8, and T8), while other electrodes such as Fp1, Fp2, P4, and F4 showed no significant results.
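
The MSE procedure (coarse-graining followed by sample entropy at each scale) can be sketched as follows. This is a minimal, assumed implementation: the embedding dimension m = 2 and tolerance r = 0.15 times the signal's standard deviation are common choices, not values stated in the abstract:

```python
import numpy as np

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (the MSE coarse-graining step)."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.15):
    """SampEn(m, r): negative log of the ratio of (m+1)-length to m-length template matches."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all later templates
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6), m=2, r=0.15):
    """Sample entropy of the coarse-grained series at each time scale."""
    x = np.asarray(x, dtype=float)
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

Per-electrode MSE curves computed this way would then feed the Mann-Whitney comparison between the two groups.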

Keywords: electroencephalogram (EEG), multiscale sample entropy (MSE), Mann-Whitney test, receiver operating characteristic (ROC) curve, complexity analysis

Procedia PDF Downloads 352
54 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed automatically via sensor APIs, or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on GitHub, the project is free to use for the academic purposes of learning and experimenting, or for practical purposes by building on it.
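
The daily-snapshot logic described above might look like the following sketch. In the real system the records would come from MongoDB via PyMongo (for example, `db.health.find({"user": ..., "date": ...})`); here plain dicts stand in for the query result so the formatting logic is self-contained, and all field names and the "goal met if value >= target" rule are assumptions:

```python
from datetime import date

def daily_snapshot(records, goals, user, day):
    """Format one user's metrics for a given day against their target goals.

    `records` mimics documents returned by a PyMongo find(); each has
    user, date (ISO string), metric, and value fields (assumed schema).
    """
    todays = {r["metric"]: r["value"]
              for r in records
              if r["user"] == user and r["date"] == day.isoformat()}
    lines = []
    for metric, target in goals.items():
        value = todays.get(metric)
        if value is None:
            status = "no data"
        elif value >= target:  # simplistic: assumes higher is better
            status = "met"
        else:
            status = "below target"
        lines.append(f"{metric}: {value} / goal {target} ({status})")
    return "\n".join(lines)
```

A metric like blood pressure would need the opposite comparison; a production version would store the goal direction alongside the target.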

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 364
53 The Mashishing Marking Memories Project: A Culture-Centered Approach to Participation

Authors: Nongcebo Ngcobo

Abstract:

This research explores the importance of including a multitude of voices in the cultural heritage narrative, particularly in South Africa. The Mashishing project is an extension of, and builds on, the existing Biesje Poort project, a rock art project funded by the National Heritage Council in 2010-2013. The Mashishing Marking Memories project applies comparable objectives, though in a different geographical area. The wider project objectives are to transfer skills, promote social cohesion and empowerment, and add to the knowledge base of the Mashishing region and the repository of the local museum in Lydenburg. The study is located in the Mashishing area, in Mpumalanga, South Africa, where no multi-vocal heritage projects previously existed. The research assesses the Mashishing Marking Memories project through the culture-centered approach to communication for social change, examining the impact that the diverse participants have on the operations of the project and investigating whether culturally diverse participants facilitate or hinder effective participation within it. Key findings uncovered the significance of participation and diverse voices in the cultural heritage field. Furthermore, the study highlights how unequal power relations affect effective participation. As a result, it stresses the importance of bringing the researcher and the participant into a safe space to facilitate mutual learning. It also encourages an exchange of roles, in which the researcher shifts from being an authoritarian figure to taking the role of a listener.

Keywords: culture, heritage, participation, social change

Procedia PDF Downloads 94
52 Using Visualization Techniques to Support Common Clinical Tasks in Clinical Documentation

Authors: Jonah Kenei, Elisha Opiyo

Abstract:

Electronic health records (EHRs), as a repository of patient information, are nowadays the most commonly used technology to record, store, and review patient clinical records and perform other clinical tasks. However, the accurate identification and retrieval of relevant information from clinical records is difficult because clinical documents are unstructured and lack a clear organization. Medical practice therefore faces a challenge due to the rapid growth of health information in EHRs, mostly in narrative text form, and it is becoming important to manage the growing amount of data for a single patient effectively. There is currently a need to visualize EHRs in a way that aids physicians in clinical tasks and medical decision-making. Applying text visualization techniques to unstructured clinical narrative texts is a new area of research that aims to provide better information extraction and retrieval to support clinical decision support in scenarios where the data generated continues to grow. Clinical datasets in EHRs offer a lot of potential for training accurate statistical models to classify facets of information, which can then be used to improve patient care and outcomes. However, in many clinical note datasets, the unstructured nature of clinical texts is a common problem. This paper examines the issue of taking raw clinical texts and mapping them into meaningful structures that can support healthcare professionals who use narrative texts. Our work is the result of a collaborative design process that was aided by empirical data collected through formal usability testing.

Keywords: classification, electronic health records, narrative texts, visualization

Procedia PDF Downloads 91
51 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

This abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies: implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, 'reskilling and upskilling' employees, and establishing robust data management training programs play an integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in an ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud computing, data optimization

Procedia PDF Downloads 40
50 Ontology-Based Fault Detection and Diagnosis System: Querying and Reasoning Examples

Authors: Marko Batic, Nikola Tomasevic, Sanja Vranes

Abstract:

One of the strongholds in the ubiquitous efforts related to energy conservation and energy efficiency improvement is the retrofit of high-energy consumers in buildings. In general, HVAC systems are the largest energy consumers in buildings; however, they usually suffer from mal-operation and/or malfunction, causing even higher energy consumption than necessary. Various Fault Detection and Diagnosis (FDD) systems can be successfully employed for this purpose, especially at the single device/unit level. In the case of more complex systems, where multiple devices operate in the context of the same building, significant energy efficiency improvements can only be achieved through comprehensive FDD systems relying on additional higher-level knowledge, such as the devices' geographical location, served area, and their intra- and inter-system dependencies. This paper presents a comprehensive FDD system that relies on a common knowledge repository storing all critical information. The system is deployed as a test-bed platform at two airports in Italy, Fiumicino and Malpensa. The paper presents the advantages of implementing the knowledge base through an ontology and demonstrates the improved functionality of such a system through examples of typical queries and reasoning that enable the derivation of high-level energy conservation measures (ECMs). Key SPARQL queries and SWRL rules, based on the two instantiated airport ontologies, are elaborated. The detection of high-level irregularities in the operation of airport heating/cooling plants is discussed, and an estimation of energy savings is reported.

Keywords: airport ontology, knowledge management, ontology modeling, reasoning

Procedia PDF Downloads 488
49 Preserving Egypt's Cultural Heritage Amidst Urban Development: A Case Study of the Historic Cairo Cemetery

Authors: Ali Mahfouz

Abstract:

Egypt's cultural heritage and artistic riches find themselves at a complex intersection of preservation and urban development, facing intricate challenges exacerbated by climate change, pollution, urbanization, and construction activities. This research delves into the multifaceted dynamics involved in conserving Egypt's heritage within urban contexts, spotlighting the historic Cairo cemetery as a poignant and timely case study. The historic Cairo cemetery serves as a repository of priceless cultural assets, housing the final resting places of public figures, artists, historians, politicians, and other luminaries. These graves are adorned with magnificent artworks and rare tombstones, collectively representing an irreplaceable slice of Egypt's history and culture. Yet the looming threat of demolition to make way for new infrastructure projects underscores the delicate equilibrium that preservation efforts must maintain in the face of urban development pressures. This paper illuminates the collaborative efforts of historians, intellectuals, and civil society organizations who are determined to forestall the destruction of this invaluable cultural heritage. Their initiatives, driven by a shared commitment to documenting and safeguarding the cemetery's treasures, underscore the urgent imperative of protecting Egypt's cultural legacy. Through this case study, we gain insights into how Egypt navigates the challenges of preserving its rich heritage amidst urban expansion and a changing climate, emphasizing the broader importance of heritage conservation in an evolving world.

Keywords: Egypt’s cultural heritage, urban development, historic Cairo cemetery, tombstone artworks, demolition threat, heritage conservation, civil society initiatives

Procedia PDF Downloads 47
48 Empathy and Yoga Philosophy: Both Eastern and Western Concepts

Authors: Jacqueline Jasmine Kumar

Abstract:

This paper seeks to challenge the predominant Western-centric paradigm concerning empathy by exploring its presence within both Western and Eastern philosophical traditions. The primary focus of this inquiry is the Indian yogic tradition, encompassing the four yogas: bhakti (love/devotion), karma (action), jnāna (knowledge), and rāja (psychic control). Through this examination, it is demonstrated that empathy does not originate exclusively from Western philosophical thought. Rather than superimposing the Western conceptualization of empathy onto the tenets of Indian philosophy, this study endeavours to unearth a distinct array of ideas and concepts within the four yogas that contribute significantly to our comprehension of empathy as a universally relevant phenomenon. To achieve this objective, an innovative approach is adopted, delving into various facets of empathy, including the propositional, affective/intuitive, perspective-taking, and actionable dimensions. This approach intentionally deviates from conventional Western frameworks, shifting the emphasis towards lived morality as opposed to engagement in abstract theoretical discourse. While it is acknowledged that the explicit term 'empathy' may not be overtly articulated within the yogic tradition, a scrupulous examination reveals the underlying substance and significance of this phenomenon. Throughout this comparative analysis, the paper aims to lay a robust foundation for the discourse of empathy within the context of the human experience. By assimilating insights gleaned from the Indian yogic tradition, it contributes to the expansion of our comprehension of empathy, enabling an exploration of its multifaceted dimensions. Ultimately, this scholarly endeavour facilitates the development of a more comprehensive and inclusive perspective on empathy, transcending cultural boundaries and enriching our collective repository of knowledge.

Keywords: Bhakti, Yogic, Jnana, Karma

Procedia PDF Downloads 43
47 High-Risk Gene Variant Profiling Models Ethnic Disparities in Diabetes Vulnerability

Authors: Jianhua Zhang, Weiping Chen, Guanjie Chen, Jason Flannick, Emma Fikse, Glenda Smerin, Yanqin Yang, Yulong Li, John A. Hanover, William F. Simonds

Abstract:

Ethnic disparities in many diseases are well recognized and reflect the consequences of genetic, behavioral, and environmental factors. However, direct scientific evidence connecting ethnic genetic variation to disease disparities has been elusive, which may have contributed to ethnic inequalities in large-scale genetic studies. Through a genome-wide analysis of data representing 185,934 subjects, including 14,955 from our own African America Diabetes Mellitus studies, we discovered sets of genetic variants either unique to particular ethnicities or conserved across all of them. We further developed a quantitative, gene-function-based high-risk variant index (hrVI) of 20,428 genes to establish profiles that strongly correlate with subjects' self-identified ethnicities. With respect to the ability to detect human essential and pathogenic genes, the hrVI analysis method is both comparable with and complementary to the well-known genetic analysis methods pLI and VIRlof. Application of the ethnicity-specific hrVI analysis to the type 2 diabetes mellitus (T2DM) national repository, containing 20,791 cases and 24,440 controls, identified 114 candidate T2DM-associated genes, 8.8-fold more than ethnicity-blind analysis. All of the identified genes are defined as either pathogenic or likely pathogenic in the ClinVar database, with 33.3% diabetes-associated and 54.4% obesity-associated genes. These results demonstrate the utility of hrVI analysis and provide the first genetic evidence, through clustering patterns, of how genetic variation among ethnicities may impede the discovery of diabetes-associated and, foreseeably, other disease-associated genes.

Keywords: diabetes-associated genes, ethnic health disparities, high-risk variant index, hrVI, T2DM

Procedia PDF Downloads 106
46 Visualization Tool for EEG Signal Segmentation

Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh

Abstract:

This work concerns the development of a tool for the visualization and segmentation of electroencephalograph (EEG) signals based on frequency domain features. Changes in frequency domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm represents changes in mental state using the different frequency band powers in the form of a segmented EEG signal. Many segmentation algorithms with applications in brain-computer interfaces, epilepsy, and cognition studies have been suggested in the literature and used for data classification, but the proposed method focuses mainly on better presentation of the signal, which makes it a useful tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet-transform-based de-noising method. Frequency domain features are used for segmentation, exploiting the fact that the spectral power of different frequency bands describes the mental state of the subject. Two sliding windows are used for segmentation: one provides the time scale and the other applies the segmentation rule. The segmented data are displayed second by second, successively, with different color codes, and the segment length can be selected as needed. The proposed algorithm has been tested on an EEG data set obtained from the University of California, San Diego's online data repository. The tool gives a better visualization of the signal in the form of segmented epochs of desired length representing the power spectrum variation in the data. The algorithm takes data points with respect to the sampling frequency for each time frame, so it can be extended to real-time visualization with a desired epoch length.
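
The basic filtering and band-power segmentation stages might look like the following SciPy-based sketch. The sampling rate, notch frequency, filter order, band edges, and window length are assumptions, and the paper's PCA and wavelet de-noising stage is omitted:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch, welch

FS = 256  # assumed sampling rate (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def basic_filter(x, fs=FS, low=0.1, high=45.0, notch=50.0):
    """Band-pass 0.1-45 Hz plus a mains notch, as in the paper's basic filtering."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    x = filtfilt(b, a, x)
    bn, an = iirnotch(notch, Q=30, fs=fs)
    return filtfilt(bn, an, x)

def band_powers(segment, fs=FS):
    """Spectral power in each clinical band for one windowed segment (Welch PSD)."""
    f, pxx = welch(segment, fs=fs, nperseg=min(len(segment), fs))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (f >= lo) & (f < hi)
        powers[name] = float(pxx[mask].sum())
    return powers

def segment_labels(x, fs=FS, window_s=1.0):
    """Label each one-second window with its dominant band (the color code)."""
    step = int(fs * window_s)
    labels = []
    for i in range(0, len(x) - step + 1, step):
        powers = band_powers(x[i:i + step], fs)
        labels.append(max(powers, key=powers.get))
    return labels
```

A display layer would then map each label to a color and draw the epochs second by second.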

Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation

Procedia PDF Downloads 368
45 Domain-Specific Deep Neural Network Model for Classification of Abnormalities on Chest Radiographs

Authors: Nkechinyere Joy Olawuyi, Babajide Samuel Afolabi, Bola Ibitoye

Abstract:

This study collected a preprocessed dataset of chest radiographs, formulated a deep neural network model for detecting abnormalities, evaluated the performance of the formulated model, and implemented a prototype of it, with a view to developing a deep neural network model that automatically classifies abnormalities in chest radiographs. To achieve this, a large set of chest x-ray images was sourced from the CheXpert dataset, an online repository of annotated chest radiographs compiled by the Machine Learning Research Group at Stanford University. The chest radiographs were preprocessed, using standardization and normalization, into a format that can be fed into a deep neural network. The classification problem was formulated as a multi-label binary classification model, which used a convolutional neural network architecture to decide whether each abnormality was present or not in the chest radiographs. The model was evaluated using specificity, sensitivity, and Area Under Curve (AUC) score as the parameters. A prototype of the classification model was implemented using the Keras open-source deep learning framework in the Python programming language. Based on the ROC AUC, the model was able to classify atelectasis, support devices, pleural effusion, pneumonia, a normal CXR (no finding), pneumothorax, and consolidation. However, lung opacity and cardiomegaly had probabilities of less than 0.5 and were thus classified as absent. Precision, recall, and F1 score values were all 0.78, implying that the numbers of false positives and false negatives are the same and revealing some measure of label imbalance in the dataset. The study concluded that the developed model is sufficient to classify the abnormalities present in chest radiographs as present or absent.
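
The multi-label decision rule described (a probability below 0.5 means the abnormality is classified as absent) reduces to independent per-label sigmoid outputs thresholded at 0.5. The following is a minimal sketch of that rule with an assumed label order and hypothetical logits, not the trained Keras model itself:

```python
import numpy as np

# Assumed label order; the CheXpert task covers these among its findings.
LABELS = ["Atelectasis", "Cardiomegaly", "Consolidation", "Pleural Effusion",
          "Pneumonia", "Pneumothorax", "Support Devices", "No Finding"]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def multilabel_decisions(logits, threshold=0.5):
    """Independent sigmoid per label: 'present' if p >= threshold, else 'absent'.

    Unlike softmax classification, each label is decided on its own, so any
    subset of abnormalities can be present simultaneously.
    """
    probs = sigmoid(np.asarray(logits, dtype=float))
    return {label: ("present" if p >= threshold else "absent")
            for label, p in zip(LABELS, probs)}
```

In the Keras prototype these logits would come from a final Dense layer with one unit per label and sigmoid activation, trained with binary cross-entropy.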

Keywords: transfer learning, convolutional neural network, radiograph, classification, multi-label

Procedia PDF Downloads 86
44 Charter versus District Schools and Student Achievement: Implications for School Leaders

Authors: Kara Rosenblatt, Kevin Badgett, James Eldridge

Abstract:

There is a preponderance of information regarding the overall effectiveness of charter schools and their ability to increase academic achievement compared to traditional district schools. Most research on the topic focuses on comparing long- and short-term outcomes, academic achievement in mathematics and reading, and locale (i.e., urban vs. rural). While lingering unanswered questions regarding effectiveness continue to loom for school leaders, data on charter schools suggest that enrollment increases by 10% annually and that charter schools educate more than 2 million U.S. students across 40 states each year. Given the increasing share of U.S. students educated in charter schools, it is important to better understand possible differences in student achievement, defined in multiple ways, for students in charter schools and for those in Independent School District (ISD) settings in the state of Texas. Data were retrieved from the Texas Education Agency's (TEA) repository, which includes data organized annually and available on the TEA website. Specific data points and definitions of achievement were based on characterizations of achievement found in the relevant literature, and included, but were not limited to, graduation rate, student performance on standardized testing, and teacher-related factors such as experience and longevity in the district. Initial findings indicate some similarities with the current literature on long-term student achievement in English/Language Arts; however, the findings differ substantially from other recent research on long-term student achievement in social studies. There are also a number of interesting findings related to differences in achievement between students in charters and ISDs and within different types of charter schools in Texas. In addition to findings, implications for leadership in different settings will be explored.

Keywords: charter schools, ISDs, student achievement, implications for PK-12 school leadership

Procedia PDF Downloads 108
43 Disentangling an Ethnographic Study of the Imagined Inca: How the Yale-Peruvian Expedition of 1911 Created an Inca Heritage

Authors: Charlotte Williams

Abstract:

Yale University Professor Hiram Bingham’s discovery of Machu Picchu in 1911 spurred an international interest in the Inca Empire, and with it, a dispute with the Peruvian government over who had rightful jurisdiction and curatorship over Inca history. By 2011, the Peruvian government initiated a legal battle for the return of artifacts that Bingham had removed from Machu Picchu, successfully returning them not to the site of Machu Picchu, but to Cusco, employing the rationale that the ancient Inca capital housed descendants of the Inca empire. This conflation of the past and present can be traced to a largely unanalyzed study that accompanied Bingham’s expedition: an ethnographic analysis of Inca descendants, which at the time portrayed indigenous Peruvian Andean peoples as remnants of a lost civilization, using Cusco as an assumed repository for people with 'Inca' characteristics. This study draws from the original Yale Peruvian Expedition archives, the Cusco Library archives, and in-depth interviews with curators of the Inca Museum and Machu Picchu Museum to analyze both the political conflict that emerged as a reaction to the ethnographic study, and how the study articulated with an inflating tourism market attempting to define what it meant to be Inca to an international public. The construction of the modern Inca as both directors of tourism management and purveyors of their archaeological material culture points to a unique case in which modern Peruvian citizens could claim heritage to an Inca past despite a lack of recognition as a legally defined group. The result has far-reaching implications, since Bingham’s artifacts returned not necessarily to a traditional nation-state, but to an imagined one, broadening the conditions under which informal repatriations can occur.

Keywords: archaeology of memory, imagined communities, Incanismo, repatriation

Procedia PDF Downloads 142
42 Development of Scenarios for Sustainable Next Generation Nuclear System

Authors: Muhammad Minhaj Khan, Jaemin Lee, Suhong Lee, Jinyoung Chung, Johoo Whang

Abstract:

The Republic of Korea faces a serious storage crisis from nuclear waste generation, as its At Reactor (AR) temporary storage sites are about to reach saturation. Since the country is densely populated, at 491.78 persons per square kilometer, construction of a high-level waste repository will not be a feasible option. In order to tackle the storage problem, with spent fuel accumulating at 350 tHM/yr from 20 PWRs and 380 tHM/yr from 4 PHWRs, this study focuses on advancing current nuclear power plants to GEN-IV sustainable and ecological nuclear systems by burning TRUs (Pu and MAs). First, calculations were made to estimate the generation of SNF, including Pu and MA, from PWR and PHWR NPPs using the IAEA Nuclear Fuel Cycle Simulation System (NFCSS) code for 2016, 2030 (covering the saturation period of each site from 2024 to 2028), 2089, and 2109, as the number of NPPs will increase owing to the high import cost of non-nuclear energy sources. Second, in order to produce environmentally sustainable nuclear energy systems, four scenarios for burning the plutonium and MAs are analyzed, concentrating on burning MA alone, or MA and Pu together, in SFR, LFR, and KALIMER-600 burner reactors after recycling the spent oxide fuel from PWRs through the pyroprocessing technology developed by the Korea Atomic Energy Research Institute (KAERI), which promises sustainable future benefits by minimizing HLW generation with regard to waste amount, decay heat, and activity. Finally, concentrating on the front- and back-end fuel cycles of the open PWR cycle and the closed pyro-SFR cycle respectively, an overall assessment is made of the quantitative as well as economic competitiveness of SFR metallic fuel against the PWR once-through nuclear fuel cycle.
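The accumulation problem the abstract describes can be illustrated with a back-of-the-envelope calculation, assuming the constant annual arisings quoted above (350 tHM/yr from the PWRs and 380 tHM/yr from the PHWRs). This is only a sketch: real NFCSS runs account for reactor additions and retirements, burnup, and cooling times.

```python
# Illustrative spent-nuclear-fuel (SNF) accumulation under the constant
# generation rates quoted in the abstract. All numbers are the abstract's;
# the simple linear model is an assumption for illustration only.

PWR_RATE_THM_PER_YR = 350.0   # spent fuel arisings from 20 PWRs
PHWR_RATE_THM_PER_YR = 380.0  # spent fuel arisings from 4 PHWRs

def cumulative_snf(years: int, initial_inventory_thm: float = 0.0) -> float:
    """Cumulative SNF inventory (tHM) after `years` of constant generation."""
    return initial_inventory_thm + years * (PWR_RATE_THM_PER_YR + PHWR_RATE_THM_PER_YR)

# SNF added between 2016 and 2030 under these assumptions:
added_2016_2030 = cumulative_snf(2030 - 2016)
print(f"SNF added 2016-2030: {added_2016_2030:.0f} tHM")
```

At 730 tHM per year combined, over 10,000 tHM accumulates in that window alone, which is why the scenarios above focus on transmutation rather than storage expansion.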

Keywords: GEN IV nuclear fuel cycle, nuclear waste, waste sustainability, transmutation

Procedia PDF Downloads 328
41 Ethnomedicinal Plants Used for Gastrointestinal Ailments by the People of Tribal District Kinnaur (Himachal Pradesh) India

Authors: Geeta, Richa, M. L. Sharma

Abstract:

Himachal Pradesh, a hilly State of India located in the Western Himalayas, with varied altitudinal gradients and climatic conditions, is a repository of plant diversity and the traditional knowledge associated with plants. The State is inhabited by various tribal communities who usually depend upon local plants for curing various ailments. Utilization of plant resources in their day-to-day life has been an age-old practice of the people inhabiting this State. The present study pertains to the tribal district Kinnaur of Himachal Pradesh, located between 77°45’ and 79°00’35” east longitudes and between 31°05’50” and 32°05’15” north latitudes. Being a remote area with only very basic medical facilities, local people mostly use traditional herbal medicines for primary healthcare needs. Traditional healers called “Amji” are usually very secretive in revealing their medicinal knowledge to novices and pass on their knowledge to the next generation orally. As a result, no written records of healing herbs are available. The aim of the present study was to collect and consolidate the ethno-medicinal knowledge of local people of the district about the use of plants for treating gastrointestinal ailments. The ethnobotanical information was collected from local practitioners, herbal healers, and elderly people having rich knowledge about medicinal herbs, through semi-structured questionnaires and key informant discussions. A total of 46 plant species belonging to 40 genera and 24 families were identified as cures for gastrointestinal ailments. Among the parts used for gastrointestinal ailments, aerial parts (14%) were used most frequently, followed by the whole plant (13%), root (8%), leaves (6%), flower (5%), fruit and seed (3%), and tuber (1%). These plant species could be prioritized for conservation and subjected to further studies related to phytochemical screening for their authenticity.
Most of the medicinal plants of the region are collected from the wild and are often harvested for trade. Sustainable harvesting and domestication of the highly traded species from the study area is needed.

Keywords: Amji, gastrointestinal, Kinnaur, medicinal plants, traditional knowledge

Procedia PDF Downloads 367
40 The Traditional Roles and Place of Indigenous Musical Practices in Contemporary African Society

Authors: Benjamin Obeghare Izu

Abstract:

In Africa, indigenous musical practices are the focal point around which most cultural practices revolve, and they are the conduit mainly used in transmitting indigenous knowledge and values. They serve as a means of documenting, preserving, and transmitting indigenous knowledge, and of re-enacting historical, social, and cultural affinity. Indigenous musical practices also serve as a repository for indigenous knowledge and artistic traditions. However, these indigenous musical practices and the resulting cultural ideals are confronted with substantial challenges in the twenty-first century from contemporary cultural influence. Additionally, the educational and cultural purposes of indigenous musical practices have been impacted by the broad monetisation of the arts in contemporary society: they are increasingly seen as objects of entertainment. Some young people today are unaware of their cultural roots and are losing their cultural identity due to these influences and challenges. In order to help policymakers raise awareness of and encourage the use of indigenous knowledge and musical practices among African youth and scholars, this study responds to the need to explore the components and functions of the indigenous knowledge system, values, and musical tradition in Africa. The study employed qualitative research methods, utilising interviews, participant observation, and a review of related literature as data collection methods. It examines the indigenous musical practices in the Oba of Benin Royal Igue festival among the Benin people in Edo State, Nigeria, and the Ovwuwve festival observed by the Abraka people in Delta State, Nigeria. The extent to which indigenous musical practices convey and protect indigenous knowledge and cultural values is reflected in the musical practices of these cultural festivals. The study looks at how the indigenous musical arts are related to one another and how that affects the transmission and preservation of indigenous knowledge.
It makes recommendations for how to increase the use of indigenous knowledge and values and their fusion with contemporary culture. The study contributes significantly to ethnomusicology by showing how African traditional music traditions support other facets of culture and how indigenous knowledge might be helpful in contemporary society.

Keywords: African musical practices, African music and dance, African society, indigenous musical practices

Procedia PDF Downloads 84
39 Predicting Radioactive Waste Glass Viscosity, Density and Dissolution with Machine Learning

Authors: Joseph Lillington, Tom Gout, Mike Harrison, Ian Farnan

Abstract:

The vitrification of high-level nuclear waste within borosilicate glass and its incorporation within a multi-barrier repository deep underground is widely accepted as the preferred disposal method. However, for this to happen, any safety case will require validation that the initially localized radionuclides will not be released in significant quantities into the near- and far-field. Therefore, accurate mechanistic models are necessary to predict glass dissolution, and these should be robust to a variety of incorporated waste species and leaching test conditions, particularly given substantial variations across international waste-streams. Here, machine learning is used to predict glass material properties (viscosity, density) and glass leaching model parameters from large-scale industrial data. A variety of different machine learning algorithms have been compared to assess performance. Density was predicted solely from composition, whereas viscosity additionally considered temperature. To predict suitable glass leaching model parameters, a large simulated dataset was created by coupling MATLAB and the chemical reactive-transport code HYTEC, considering the state-of-the-art GRAAL model (glass reactivity with allowance for the alteration layer). The trained models were then applied to the large-scale industrial, experimental data to identify potentially appropriate model parameters. Results indicate that ensemble methods can accurately predict viscosity as a function of temperature and composition across all three industrial datasets. Glass density prediction shows reliable learning performance, with predictions primarily falling within the experimental uncertainty of the test data. Furthermore, machine learning can predict the behavior of glass dissolution model parameters, demonstrating potential value in GRAAL model development and in assessing suitable model parameters for large-scale industrial glass dissolution data.
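The ensemble-regression approach described above can be sketched as follows: a random forest predicting viscosity from composition and temperature. The data here are synthetic stand-ins (the compositional ranges and Arrhenius-like target are assumptions for illustration), since the industrial datasets used in the study are not public.

```python
# Sketch: random-forest regression of glass viscosity from composition and
# temperature, mirroring the ensemble approach described in the abstract.
# Feature ranges and the toy viscosity law are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0.45, 0.60, n),   # SiO2 mass fraction (assumed range)
    rng.uniform(0.10, 0.25, n),   # B2O3 mass fraction (assumed range)
    rng.uniform(1000, 1200, n),   # melt temperature, deg C
])
# Toy target: viscosity falls with temperature (Arrhenius-like) and
# rises with network-former (SiO2) content.
y = np.exp(8000.0 / (X[:, 2] + 273.15)) * (1 + 5 * X[:, 0]) + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)
print(f"R^2 on held-out data: {r2:.3f}")
```

With real industrial data the same pattern applies; only the feature columns (full oxide composition) and target change.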

Keywords: machine learning, predictive modelling, pattern recognition, radioactive waste glass

Procedia PDF Downloads 89
38 Magnitude and Factors of Risky Sexual Practice among Day Laborers in Ethiopia: A Systematic Review and Meta-Analysis, 2023

Authors: Kalkidan Worku, Eniyew Tegegne, Menichil Amsalu, Samuel Derbie Habtegiorgis

Abstract:

Introduction: Because of the seasonal nature of their work, day laborers are exposed to risky sexual practices. Since the majority of them live far away from their birthplace and family, they engage in unplanned and multiple sexual practices. These unplanned and unprotected sexual experiences are a risk factor for different types of sexual-health crises. This study aimed to assess the pooled prevalence of risky sexual practices and its determinants among day laborers in Ethiopia. Methods: Online databases, including PubMed, Google Scholar, Science Direct, African Journals Online, Academia Edu, Semantic Scholar, and university repository sites, were searched from database inception until March 2023. The PRISMA 2020 guideline was used to conduct the review. Among 851 extracted studies, ten articles were retained for the final quantitative analysis. To identify the source of heterogeneity, a sub-group analysis and I² test were performed. Publication bias was assessed using a funnel plot and the Egger and Begg tests. The pooled prevalence of risky sexual practices was calculated. In addition, the association between determinant factors and risky sexual practice was determined using a pooled odds ratio (OR) with a 95% confidence interval. Result: The pooled prevalence of risky sexual practices among day laborers was 46.00% (95% CI: 32.96, 59.03). Being single (OR: 2.49; 95% CI: 1.29 to 4.83), substance use (OR: 1.79; 95% CI: 1.40 to 2.29), alcohol intake (OR: 4.19; 95% CI: 2.19 to 8.04), watching pornography (OR: 5.49; 95% CI: 2.99 to 10.09), discussion about sexual and reproductive health (SRH) (OR: 4.21; 95% CI: 1.34 to 13.21), visiting night clubs (OR: 2.86; 95% CI: 1.79 to 4.57), and risk perception (OR: 0.37; 95% CI: 0.20 to 0.70) were factors associated with risky sexual practice among day laborers in Ethiopia. Conclusions: A large proportion of day laborers engaged in risky sexual practices.
Interventions targeting creating awareness of sexual and reproductive health for day laborers should be implemented. Continuous peer education on sexual health should be given to day laborers. Sexual and reproductive health services should be accessible in their workplaces to maximize condom utilization and to facilitate sexual health education for all day laborers.
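The pooled estimates above come from inverse-variance meta-analysis; a minimal sketch of the standard DerSimonian-Laird random-effects pooling is below. The study-level inputs are illustrative placeholders, not the ten studies actually meta-analysed.

```python
# Sketch: DerSimonian-Laird random-effects pooling, the usual method behind
# pooled prevalence estimates such as the 46.00% (95% CI: 32.96, 59.03)
# reported above. Input estimates/variances are illustrative assumptions.
import math

def pool_random_effects(estimates, variances):
    """Return (pooled estimate, 95% CI) under a DerSimonian-Laird model."""
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))  # Cochran's Q
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Illustrative study-level prevalences (proportions) and sampling variances:
est = [0.33, 0.41, 0.52, 0.58]
var = [0.0008, 0.0010, 0.0009, 0.0012]
pooled, ci = pool_random_effects(est, var)
print(f"pooled prevalence: {pooled:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
```

The same machinery pools log-odds-ratios for the determinant factors; one works on the log scale and exponentiates the pooled result and its CI.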

Keywords: day laborers, sexual health, risky sexual practice, unsafe sex, multiple sexual partners

Procedia PDF Downloads 45
37 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies - it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
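The model-averaging idea at the heart of the framework can be sketched with a soft-voting ensemble over the three named algorithm families. The data here are synthetic stand-ins for the weather/pollutant features and air-quality class labels, and the hyperparameters are illustrative assumptions.

```python
# Sketch: averaging predicted probabilities from logistic regression, a
# random forest and a small neural network, as described in the abstract.
# Synthetic data stand in for the Los Angeles weather/pollutant features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=1000)),
        ("forest", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("mlp", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)),
    ],
    voting="soft",  # average predicted class probabilities across models
).fit(X_tr, y_tr)

acc = ensemble.score(X_te, y_te)
print(f"ensemble accuracy: {acc:.3f}")
```

Soft voting averages class probabilities rather than hard votes, which is what lets the combined model smooth out the weaknesses of any single learner.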

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 93
36 Gene Expression Signature-Based Chemical Genomic to Identify Potential Therapeutic Compounds for Colorectal Cancer

Authors: Yen-Hao Su, Wan-Chun Tang, Ya-Wen Cheng, Peik Sia, Chi-Chen Huang, Yi-Chao Lee, Hsin-Yi Jiang, Ming-Heng Wu, I-Lu Lai, Jun-Wei Lee, Kuen-Haur Lee

Abstract:

There is a wide range of drugs and combinations under investigation and/or approved over the last decade to treat colorectal cancer (CRC), but the 5-year survival rate remains poor at stages II–IV. Therefore, new, more efficient drugs still need to be developed that will hopefully be included in first-line therapy or overcome resistance when it appears, as part of second- or third-line treatments in the near future. In this study, we revealed that heat shock protein 90 (Hsp90) inhibitors have high therapeutic potential in CRC according to a combined analysis of NCBI's Gene Expression Omnibus (GEO) repository and the chemical genomic database of Connectivity Map (CMap). We found that the second-generation Hsp90 inhibitor NVP-AUY922 significantly downregulated the activities of a broad spectrum of kinases involved in regulating cell growth arrest and death of NVP-AUY922-sensitive CRC cells. To overcome NVP-AUY922-induced upregulation of survivin expression, which causes drug insensitivity, we found that combining berberine (BBR), a herbal medicine with potency in inhibiting survivin expression, with NVP-AUY922 resulted in synergistic antiproliferative effects for NVP-AUY922-sensitive and -insensitive CRC cells. Furthermore, we demonstrated that treatment of NVP-AUY922-insensitive CRC cells with the combination of NVP-AUY922 and BBR caused cell growth arrest through inhibiting CDK4 expression and induction of microRNA-296-5p (miR-296-5p)-mediated suppression of the Pin1–β-catenin–cyclin D1 signaling pathway. Finally, we found that the expression level of Hsp90 in tumor tissues of CRC was positively correlated with CDK4 and Pin1 expression levels. Taken together, these results indicate that combined NVP-AUY922 and BBR therapy can inhibit multiple oncogenic signaling pathways of CRC.

Keywords: berberine, colorectal cancer, connectivity map, heat shock protein 90 inhibitor

Procedia PDF Downloads 286
35 A Methodological Approach to Digital Engineering Adoption and Implementation for Organizations

Authors: Sadia H. Syeda, Zain H. Malik

Abstract:

As systems continue to become more complex and the interdependencies of processes and sub-systems continue to grow and transform, the need for a comprehensive method of tracking and linking the lifecycle of systems in a digital form becomes ever more critical. Digital Engineering (DE) provides an approach to managing an authoritative data source that links, tracks, and updates system data as it evolves and grows throughout the system development lifecycle. DE enables developing, tracking, and sharing of system data, models, and other related artifacts in a digital environment accessible to all necessary stakeholders. The DE environment provides an integrated electronic repository that enables traceability between design, engineering, and sustainment artifacts. The primary objective of DE activities is to develop a set of integrated, coherent, and consistent system models for the program. It is envisioned to provide a collaborative information-sharing environment for various stakeholders, including operational users, acquisition personnel, engineering personnel, and logistics and sustainment personnel. Examining the processes that DE can support in the systems engineering life cycle (SELC) is a primary step in the DE adoption and implementation journey. Through an analysis of the U.S. Department of Defense (DoD) Office of the Secretary of Defense (OSD) Digital Engineering Strategy and its implementation, together with examples of DE implementation by industry and technical organizations, this paper will describe current DE processes and best practices for implementing DE across an enterprise. This will help identify the capabilities, environment, and infrastructure needed to develop a potential roadmap for implementing DE practices consistent with an organization's business strategy. A capability maturity matrix will be provided to assess the organization’s DE maturity, emphasizing how all the SELC elements interlink to form a cohesive ecosystem.
If implemented, DE can increase efficiency and improve the systems engineering processes' quality and outcomes.

Keywords: digital engineering, digital environment, digital maturity model, single source of truth, systems engineering life-cycle

Procedia PDF Downloads 68
34 Neonatal Mortality, Infant Mortality, and Under-five Mortality Rates in the Provinces of Zimbabwe: A Geostatistical and Spatial Analysis of Public Health Policy Provisions

Authors: Jevonte Abioye, Dylan Savary

Abstract:

The aim of this research is to present a disaggregated geostatistical analysis of the subnational provincial trends of child mortality variation in Zimbabwe from a child health policy perspective. Soon after gaining independence in 1980, the government embarked on efforts towards promoting equitable health care, namely through the provision of primary health care. Government intervention programmes brought hope and promise, but achieving equity in primary health care coverage was hindered by pre-existing disparities in maternal health care, disproportionately concentrated in urban settings to the detriment of rural communities. The article highlights policies and programs adopted by the government during the Millennium Development Goals period between 1990 and 2015 as a response to the inequities that characterised the country’s maternal health care. A longitudinal comparative method for analysing spatial variation in child mortality rates across provinces is developed, based on geostatistical analysis. Cross-sectional and time-series data were extracted from the World Health Organisation (WHO) global health observatory data repository, demographic health survey reports, and previous academic and technical publications. Results suggest that although health care policy was uniform across provinces, not all provinces received the same antenatal and perinatal services. Accordingly, provincial rates of child mortality change between 1994 and 2015 varied significantly. Evidence on the trends of child mortality rates and maternal health policies in Zimbabwe can be valuable for public child health policy planning and public service delivery design, both in Zimbabwe and across developing countries pursuing the sustainable development agenda.

Keywords: antenatal care, perinatal care, infant mortality rate, neonatal mortality rate, under-five mortality rate, millennium development goals, sustainable development agenda

Procedia PDF Downloads 179
33 Discerning Divergent Nodes in Social Networks

Authors: Mehran Asadi, Afrand Agah

Abstract:

In data mining, partitioning is used as a fundamental tool for classification. With the help of partitioning, we study the structure of data, which allows us to envision decision rules that can be applied to classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of being a divergent node. We used the R statistical computing language to conduct the analyses in this report. The data were obtained from the UC Irvine Machine Learning Repository. This research introduces the basic concepts of classification in online social networks. In this work, we address overfitting and describe different approaches for evaluation and performance comparison of classification methods. In classification, the main objective is to categorize different items and assign them to groups based on their properties and similarities; recursive partitioning probes the structure of a dataset so that decision rules can be envisioned and applied to classify the data into several groups. Estimating densities is hard, especially in high dimensions with limited data; although the true densities are unknown, they can be estimated using classical techniques. First, we calculated the correlation matrix of the dataset to see whether any predictors are highly correlated with one another. The correlation coefficients for the predictor variables show that density is strongly correlated with transitivity. We initialized a data frame to easily compare the quality of the resulting classification methods and utilized decision trees, with k-fold cross-validation to prune the tree.
A decision tree is a non-parametric classification method that uses a set of rules to predict that each observation belongs to the most commonly occurring class label of its partition of the training data. Our method aggregates many decision trees to create an optimized model that is less susceptible to overfitting. When using a single decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
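The pruning step described above can be sketched with cost-complexity pruning, using k-fold cross-validation to choose the pruning strength. The abstract's analyses were done in R (likely `rpart`); this is an equivalent sketch in Python with synthetic data standing in for the social network dataset.

```python
# Sketch: choose a decision tree's pruning strength (ccp_alpha) by 5-fold
# cross-validation, as described in the abstract. Synthetic data stand in
# for the UCI social network dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=1)

# Candidate pruning strengths come from the cost-complexity pruning path
path = DecisionTreeClassifier(random_state=1).cost_complexity_pruning_path(X, y)

best_alpha, best_score = 0.0, -np.inf
for alpha in path.ccp_alphas[:-1]:        # the last alpha prunes to a stump
    tree = DecisionTreeClassifier(ccp_alpha=alpha, random_state=1)
    score = cross_val_score(tree, X, y, cv=5).mean()   # 5-fold CV accuracy
    if score > best_score:
        best_alpha, best_score = alpha, score

print(f"best ccp_alpha={best_alpha:.4f}, CV accuracy={best_score:.3f}")
```

Trees pruned this way retain only splits that pay for their complexity, which is what narrows the model down to the most important variables.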

Keywords: online social networks, data mining, social cloud computing, interaction and collaboration

Procedia PDF Downloads 118
32 Constructing Digital Memory for Chinese Ancient Village: A Case on Village of Gaoqian

Authors: Linqing Ma, Huiling Feng, Jihong Liang, Yi Qian

Abstract:

In China, some villages have survived the long history of changes and remain until today with their unique styles and featured culture developed in the past. Those ancient villages, usually aged for hundreds or thousands of years, are a mirror for traditional Chinese culture, especially the farming-studying culture represented by Confucianism. Gaoqian, an ancient village with a population of 3,000 in Zhejiang province, is such a case. With a history dating back to the Yuan Dynasty, Gaoqian Village has 13 well-preserved traditional Chinese houses with courtyards, which were built in the Ming and Qing Dynasties. It is a fine specimen for studying traditional rural China. A repository for the memory of the Village will then be completed by arranging and describing multimedia resources such as texts, photos, and videos. Production of creative products with digital technologies is also possible, based on a thorough understanding of the cultural features of Gaoqian Village, using research tools for literature and history studies and a method of comparative study. Finally, the project will construct an exhibition platform for the Village and its culture by telling its stories through complete structures and threads.

Keywords: ancient villages, digital exhibition, multimedia, traditional culture

Procedia PDF Downloads 556
31 SIPTOX: Spider Toxin Database Information Repository System of Protein Toxins from Spiders by Using MySQL Method

Authors: Iftikhar Tayubi, Tabrej Khan, Rayan Alsulmi, Abdulrahman Labban

Abstract:

Spiders produce a special kind of substance called a toxin. A toxin is composed of many types of protein, which differ from species to species. Spider toxin comprises several proteins and non-proteins spanning various categories of toxins, including myotoxin, neurotoxin, cardiotoxin, dendrotoxin, haemorrhagins, and fibrinolytic enzymes. Protein sequence information for the toxins, with references, was derived from the literature and public databases. Previous findings suggest that spider toxin is a promising candidate for treating different types of tumors and cancer. Many existing therapeutic regimes cause more side effects than benefit, hence a different approach must be adopted for the treatment of cancer. Combinations of drugs are being encouraged, and dramatic outcomes have been reported. Spider toxin is one of the natural cytotoxic compounds and is therefore used to treat different types of tumors; in particular, its positive effect on breast cancer has been reported over the last few decades. SPIDTOXD provides a single source of information about spider toxins, which will be useful for pharmacologists, neuroscientists, toxicologists, and medicinal chemists. The well-ordered and accessible web interface allows users to explore detailed information on spiders and toxin proteins, including common name, scientific name, entry id, entry name, protein name, and length of the protein sequence. The utility of the database lies in its user-friendly interface for retrieving information about spiders, toxins, and toxin proteins of different spider species. The database interface will satisfy the demands of the scientific community by providing in-depth knowledge about spiders and their toxins.
The database was implemented using MySQL and PHP, and SmartDraw was used for the design. Users can thus navigate from one section to another, depending on their field of interest. The database contains a wealth of information on species, toxins, clinical data, etc. It will be useful for the scientific community, basic researchers, and those interested in pharmaceutical applications.
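The record fields listed above suggest a simple relational layout. The following is a minimal sketch of such a table and a retrieval query, using Python's built-in sqlite3 in place of MySQL so the example is self-contained; the table name, column names, and the sample entry are illustrative assumptions, not the actual SPIDTOXD schema.

```python
import sqlite3

# Hypothetical table mirroring the fields named in the abstract:
# entry id, entry name, common name, scientific name, protein name,
# and length of the protein sequence.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE toxin_protein (
        entry_id        TEXT PRIMARY KEY,
        entry_name      TEXT,
        common_name     TEXT,     -- common name of the spider species
        scientific_name TEXT,     -- scientific (Latin) name
        protein_name    TEXT,
        sequence_length INTEGER   -- length of the protein sequence
    )
""")

# Illustrative sample row (not a real SPIDTOXD entry).
conn.execute(
    "INSERT INTO toxin_protein VALUES (?, ?, ?, ?, ?, ?)",
    ("X00001", "TOX1_EXAMPLE", "funnel-web spider",
     "Hadronyche versuta", "Example delta-toxin", 42),
)

# The kind of lookup the web interface would issue: all toxin proteins
# recorded for a given species.
rows = conn.execute(
    "SELECT protein_name, sequence_length FROM toxin_protein "
    "WHERE scientific_name = ?", ("Hadronyche versuta",)
).fetchall()
print(rows)  # -> [('Example delta-toxin', 42)]
```

In a PHP front end the same parameterized query would be issued through the MySQL driver, with the results rendered as the browsable species and toxin pages the abstract describes.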

Keywords: spidtoxd, php, mysql, toxin

Procedia PDF Downloads 148
30 Digital Twin for University Campus: Workflow, Applications and Benefits

Authors: Frederico Fialho Teixeira, Islam Mashaly, Maryam Shafiei, Jurij Karlovsek

Abstract:

The ubiquity of data gathering and smart technologies, advancements in virtual technologies, and the development of the internet of things (IoT) have created urgent demand for frameworks and efficient workflows for data collection, visualisation, and analysis. A digital twin, at scales ranging from the city down to the building, brings together data from different sources to generate fundamental and illuminating insights for the management of current facilities and the lifecycle of amenities, as well as for improving the performance of current and future designs. Over the past two decades, there has been growing interest in digital twins and their applications at city and building scales. Most such studies view the urban environment through a homogeneous or generalist lens and lack specificity regarding the particular characteristics and identities that define an urban university campus. Bridging this knowledge gap, this paper offers a framework for developing a digital twin for a university campus that, with some modifications, could provide insights for any large-scale digital twin setting, such as towns and cities. It showcases how currently unused data could be purposefully combined, interpolated, and visualised to produce analysis-ready data (such as flood or energy simulations, or functional and occupancy maps), highlighting the potential applications of such a framework for campus planning and policymaking. The research integrates campus-level data layers into one spatial information repository and casts light on critical data clusters for a campus-level digital twin. The paper also raises insightful and directive questions on how a campus digital twin can be extrapolated to a city-scale digital twin.
The outcomes of the paper thus inform future projects for the development of large-scale digital twins, as well as urban and architectural researchers on potential applications of digital twins in future design, management, and sustainable planning: to predict problems, calculate risks, decrease management costs, and improve performance.

Keywords: digital twin, smart campus, framework, data collection, point cloud

Procedia PDF Downloads 43
29 Little Girls and Big Stories: A Thematic Analysis of Gender Representations in Selected Asian Room to Read Storybooks

Authors: Cheeno Marlo Sayuno

Abstract:

Room to Read is an international nonprofit organization aimed at empowering young readers through literature and literacy education. In particular, the organization focuses on girls' education in schools and on improving their social status by crafting stories and ensuring that these stories are accessible to them. In 2019, Room to Read visited the Philippines and partnered with the Philippine children's literature publishers Adarna House, Lampara Books, Anvil Publishing, and OMF-Hiyas to produce contextualized stories that Filipino children can read. The result is a set of 20 storybooks developed by Filipino writers and illustrators, the author of this paper included. The project yielded narratives of experiences in storybook production from conceptualization to publication, through to translations and reimaginings in online repository, storytelling, and audiobook formats. During the production process, we were particularly mindful of gender representations, children's rights, and telling stories that can empower children in vulnerable communities, who are the beneficiaries of the project. The storybooks, along with many others produced in Asia and around the world, are available online through the literacycloud.org website of Room to Read. This study surveys the stories produced in Asia and examines how gender is represented in the storybooks. By analyzing both the texts and the illustrations of storybooks produced across Asian countries, it identifies themes in the portrayal of young boys and girls, their characteristics and narratives, and how they are empowered in the stories, with the goal of mapping how Room to Read addresses the problem of access to literacy among young girls and assures them that they can do anything, the way they are portrayed in the stories. The paper hopes to determine how gender is represented in Asian storybooks produced by the international nonprofit organization Room to Read.
Thematic textual analysis was used as the methodology: the storybooks were analyzed qualitatively to identify emerging themes of gender representation. This study sheds light on the importance of responsible portrayal of gender in storybooks and how it can impact and empower children. The results of the study can also aid writers and illustrators in developing gender-sensitive storybooks.

Keywords: room to read, asian storybooks, young girls, thematic analysis, child empowerment, literacy, education

Procedia PDF Downloads 55