Search results for: data access
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27177

22947 Proposal Method of Prediction of the Early Stages of Dementia Using IoT and Magnet Sensors

Authors: João Filipe Papel, Tatsuji Munaka

Abstract:

With society's aging and the number of elderly people with dementia rising, researchers have been actively studying how to support the elderly in the early stages of dementia, with the objective of allowing them a better quality of life and as much independence as possible. Most researchers in this field use the Internet of Things (IoT) to monitor the activities of the elderly and assist them in performing those activities. The sensor most commonly used for this purpose is the camera, owing to its easy installation and configuration; the sound sensor is another common choice. However, privacy must be considered when using these sensors. This research aims to develop a system capable of predicting the early stages of dementia by monitoring the elderly's Activities of Daily Living (ADLs). Two issues need to be addressed to make this system possible. The first is privacy: detecting and monitoring ADLs without putting the privacy of the elderly at risk is a serious concern and one of the purposes of this research. To achieve this, the study uses magnet sensors that collect only binary data. The second is to use the data collected from ADL monitoring to predict the early stages of dementia; for this, the research team proposes developing a proprietary ontology combined with both data-driven and knowledge-driven approaches.
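The binary nature of magnet-sensor data makes activity detection straightforward to prototype. The sketch below is a hypothetical illustration, not the authors' system: it counts ADL events as rising edges (0 to 1 transitions, e.g. a cupboard door opening) in a binary contact stream, the kind of feature a downstream ontology or prediction model could consume.

```python
# Hypothetical sketch: deriving ADL events from binary magnet-sensor readings.
# A reading of 1 means the magnet contact is open (e.g., a cupboard door opened).

def count_events(readings):
    """Count rising edges (0 -> 1 transitions) in a binary sensor stream."""
    events = 0
    previous = 0
    for r in readings:
        if r == 1 and previous == 0:
            events += 1
        previous = r
    return events

# One day of (simplified) kitchen-cupboard readings sampled once per minute.
stream = [0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 0]
print(count_events(stream))  # 3 opening events
```

A daily count per sensor like this yields a compact activity profile whose long-term drift could feed the kind of data-driven dementia indicators the abstract describes.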

Keywords: dementia, activity recognition, magnet sensors, ontology, data-driven and knowledge-driven, IoT, activities of daily living

Procedia PDF Downloads 108
22946 Overcoming Reading Barriers in an Inclusive Mathematics Classroom with Linguistic and Visual Support

Authors: A. Noll, J. Roth, M. Scholz

Abstract:

The importance of written language in a democratic society is uncontroversial. Students with physical, learning, cognitive or developmental disabilities often have difficulties understanding information that is presented in written language only, and face obstacles in diverse domains. In order to reduce such barriers in educational as well as out-of-school settings, access to written information must be facilitated. Readability can be enhanced by linguistic simplifications such as easy-to-read language, which is intended to help people with disabilities participate socially and politically in society. Its guidelines state, for example, that only short, simple words should be used and that complex sentences should be avoided. So far, these guidelines have not been empirically validated. Another way to reduce reading barriers is visual support, for example, symbols. A symbol conveys, in contrast to a photo, a single idea or concept. Little empirical data exists on the use of symbols to foster the readability of texts. Nevertheless, a positive influence can be assumed, e.g., because of the multimedia principle, which indicates that people learn better from words and pictures than from words alone. A qualitative interview and eye-tracking study conducted by the authors suggests that, besides the illustration of single words, the visualization of complete sentences may be helpful. Thus, the effect of photos that illustrate the content of complete sentences is also investigated in this study. This leads to the main research question: does the use of easy-to-read language and/or enriching text with symbols or photos facilitate pupils' comprehension of learning tasks? The sample consisted of students with learning difficulties (N = 144) and students without SEN (N = 159). The students worked individually on tasks introducing fractions. Experimental group 1 received a linguistically simplified version of the tasks; experimental group 2 worked with a version that was linguistically simplified and in which the keywords of the tasks were additionally visualized by symbols; experimental group 3 worked on exercises simplified by easy-to-read language in which the content of whole sentences was illustrated by photos; and experimental group 4 received a non-simplified version. The participants' reading ability and IQ were assessed beforehand to build four comparable groups. There is a significant effect of the setting on the students' results, F(3, 140) = 2.932, p = .036. A post-hoc analysis with multiple comparisons shows that this significance results from the difference between experimental groups 3 and 4: the students in the easy-to-read language plus photos group worked on the exercises significantly more successfully than the students in the group with no simplifications. Further results, which refer, among others, to the influence of the students' reading ability, will be presented at ICERI 2018.
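For readers unfamiliar with the reported statistic, a four-group comparison of this kind is a one-way ANOVA, which can be run as sketched below. The group scores and group sizes here are synthetic stand-ins, not the study's data; only the test procedure matches the abstract.

```python
# Hypothetical sketch: a one-way ANOVA across four task versions, as reported
# in the study (F(3, 140) = 2.932, p = .036). Scores below are synthetic.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
g1 = rng.normal(10.0, 2.0, 36)  # easy-to-read language only
g2 = rng.normal(10.2, 2.0, 36)  # easy-to-read + symbols
g3 = rng.normal(11.5, 2.0, 36)  # easy-to-read + photos
g4 = rng.normal(9.5, 2.0, 36)   # no simplification

f_stat, p_value = f_oneway(g1, g2, g3, g4)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

A significant omnibus F, as in the study, would then be followed by post-hoc pairwise comparisons to locate which groups differ.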

Keywords: inclusive education, mathematics education, easy-to-read language, photos, symbols, special educational needs

Procedia PDF Downloads 157
22945 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia’s Data

Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone

Abstract:

This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify the environmental and economic factors that contribute to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case counts. To identify relevant variables and avoid overfitting, Poisson regression with regularization penalties such as ridge, lasso, and elastic net was employed, with cross-validation used to select the tuning parameters. The proposed methods can automatically identify relevant disease-count covariates. Their efficacy was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health's Lyme disease data set, the study successfully identified key factors, and the results were consistent with previous studies.
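A minimal sketch of this workflow, penalized Poisson regression with cross-validated tuning, is shown below on simulated counts. This is an illustration of the general technique, not the paper's code: only the ridge penalty is shown (scikit-learn's `PoissonRegressor`), and the covariates and coefficients are invented; lasso and elastic-net penalties would follow the same pattern with other estimators.

```python
# Hypothetical sketch: selecting covariates for count data with a penalized
# Poisson model; cross-validation picks the tuning (penalty) parameter.
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))             # environmental/economic covariates
rate = np.exp(0.8 * X[:, 0] - 0.5 * X[:, 2] + 1.0)
y = rng.poisson(rate)                     # simulated disease case counts

# Cross-validation over the penalty strength, as described in the abstract.
search = GridSearchCV(PoissonRegressor(max_iter=300),
                      {"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X, y)
print(search.best_params_, np.round(search.best_estimator_.coef_, 2))
```

With a well-chosen penalty, the coefficients of irrelevant covariates shrink toward zero, which is the variable-selection effect the study relies on.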

Keywords: Lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression

Procedia PDF Downloads 144
22944 Graph-Based Semantical Extractive Text Analysis

Authors: Mina Samizadeh

Abstract:

In the past few decades, there has been an explosion in the amount of data produced from various sources on different topics. Exploring this enormous volume of data requires effective computational tools, which has led to intense and growing interest in the research community in methods for processing text data. One line of study focuses on condensing text so that a higher level of understanding can be reached in a shorter time; its two key tasks are keyword extraction and text summarization. In keyword extraction, we are interested in finding the most important words in a text, which acquaints us with its general topic. In text summarization, we are interested in producing a short text that includes the important information of the document. The TextRank algorithm, an unsupervised method extending PageRank (the algorithm underlying the Google search engine's ranking of pages), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. It can automatically extract the important parts of a text (keywords or sentences) and return them as results. However, the algorithm neglects the semantic similarity between the different parts. In this work, we improve the results of the TextRank algorithm by incorporating the semantic similarity between parts of the text. Beyond keyword extraction and text summarization, we develop a topic-clustering algorithm based on our framework, which can be used on its own or as part of generating the summary to overcome coverage problems.
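The TextRank idea the abstract builds on can be sketched compactly: score sentences by running PageRank over a sentence-similarity graph. The sketch below is illustrative only; it uses simple word overlap where the paper substitutes a semantic similarity measure, and implements PageRank directly as a damped power iteration so no graph library is needed.

```python
# Hypothetical sketch of TextRank-style sentence ranking: build a similarity
# graph between sentences, then rank nodes with damped power iteration.
import numpy as np

sentences = [
    "keyword extraction finds the key words of a text",
    "text summarization produces a short text with the important information",
    "textrank ranks keywords and sentences with a graph algorithm",
    "the weather was pleasant on sunday",
]

def similarity(a, b):
    """Word-overlap similarity (the paper would use a semantic measure here)."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / (np.log(len(wa) + 1) + np.log(len(wb) + 1))

n = len(sentences)
W = np.array([[similarity(s, t) if i != j else 0.0
               for j, t in enumerate(sentences)]
              for i, s in enumerate(sentences)])
row_sums = W.sum(axis=1, keepdims=True)
P = np.divide(W, row_sums, out=np.full_like(W, 1.0 / n), where=row_sums > 0)

score = np.full(n, 1.0 / n)
for _ in range(50):                       # damped PageRank iteration
    score = 0.15 / n + 0.85 * P.T @ score
print(sentences[int(np.argmax(score))])   # highest-ranked sentence
```

Swapping `similarity` for an embedding-based measure is exactly the kind of change the paper proposes; the ranking machinery stays the same.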

Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis

Procedia PDF Downloads 76
22943 One of the Missing Pieces of Inclusive Education: Sexual Orientations

Authors: Sıla Uzkul

Abstract:

As a requirement of human rights and children's rights, the basic condition of inclusive education is that it covers all children. However, education reforms in Turkey and around the world include only a limited level of inclusiveness; generally, the inclusiveness mentioned is for individuals who need special education. Educational reforms superficially state that differences are tolerated, but these differences are extremely limited and often do not include sexual orientation. The Ministry of National Education's inclusive-education modules in Turkey cover children with special needs, bilingual children, children exposed to violence, children under temporary protection, children affected by migration and terrorism, and children affected by natural disasters; no training modules or inclusion terms regarding sexual orientations could be found. This research aimed to understand the perspectives of research assistants working in a preschool education department regarding sexual orientations within the scope of inclusive education. Six research assistants working in the preschool teaching department at a public university in Ankara (Turkey) participated in this qualitative study. Participants were selected by typical case sampling, one of the purposeful sampling methods. The data were obtained through a survey consisting of open-ended questions, and the raw data were analyzed and interpreted using the content analysis technique (Yıldırım & Şimşek, 2005). During the analysis, the data from the participants were first numbered, then all the data were read, content analysis was performed, and possible themes, categories, and codes were extracted. The participants' opinions regarding sexual orientations in inclusive education are presented under three main headings within the scope of the research questions: (a) their views on inclusive education, (b) their views on sexual orientations, and (c) their views on sexual orientations in the preschool period.

Keywords: sexual orientation, inclusive education, child rights, preschool education

Procedia PDF Downloads 66
22942 Implementation of Digital Technologies in SMEs in Kazakhstan: A Pathway to Sustainable Development

Authors: Toibayeva Shara, Zainolda Fariza, Abylkhassenova Dina, Zholdybaev Baurzhan, Almassov Nurbek, Aldabergenov Ablay

Abstract:

The article explores the opportunities and challenges associated with the adoption of digital technologies and automation in small and medium-sized businesses (SMEs) in Kazakhstan to achieve the Sustainable Development Goals (SDGs). Key aspects such as improving production efficiency, reducing carbon footprint, and resource efficiency are discussed, as well as the challenges faced by companies, including limited access to finance and lack of knowledge about digital solutions. Based on an analysis of existing practices, recommendations are offered to improve digital infrastructure and create an enabling environment for SMEs to increase their competitiveness and adaptability in the face of global change. The introduction of innovative technologies is seen as an important step towards long-term sustainability and successful business development in Kazakhstan. The study was supported by grants from the Ministry of Science and Higher Education of the Republic of Kazakhstan (grant No. AP23488459) ‘Research and development of scientific and methodological foundations of an intelligent system of management of medium and small businesses in Kazakhstan’.

Keywords: small and medium-sized businesses, digitalization, automation, sustainable development, sustainable development goals, innovation, competitiveness

Procedia PDF Downloads 21
22941 Optimizing DWDM Networks with Zero-Touch Provisioning for High-Capacity Data Transmission

Authors: Saqib Warsi

Abstract:

The evolution of optical communication technologies is pivotal in meeting the growing data demand driven by emerging technologies such as 5G, IoT, and upcoming 6G networks. This paper presents advancements in Dense Wavelength Division Multiplexing (DWDM) systems, focusing on the integration of Zero Touch Provisioning (ZTP) for simplified deployment and the ability to scale data transmission over single fiber pairs. The proposed methodology leverages high-capacity DWDM channels capable of supporting data rates exceeding 800G, ensuring future-proof solutions for both residential and enterprise communication infrastructures. Moreover, this paper examines the impact of these technologies on operational efficiency by minimizing the need for manual configuration, leading to reduced costs and faster deployment timelines. We also explore how the integration of optical amplifiers, Optical Line Amplifier (OLA) alternatives, and optical control plane protocols (such as ASON, GMPLS, OpenFlow, and SDN) play a critical role in enhancing the flexibility, scalability, and energy efficiency of optical networks. By focusing on optical solutions, this paper seeks to address the future challenges of reducing fiber pair consumption and improving network performance without compromising on capacity or reliability.

Keywords: zero-touch provisioning (ZTP), dense wavelength division multiplexing (DWDM), optical networks, optical control plane (ASON, GMPLS, OpenFlow, SDN)

Procedia PDF Downloads 9
22940 Discovering the Effects of Meteorological Variables on the Air Quality of Bogota, Colombia, by Data Mining Techniques

Authors: Fabiana Franceschi, Martha Cobo, Manuel Figueredo

Abstract:

Bogotá, the capital of Colombia, is its largest city and one of the most polluted in Latin America, due to fast economic growth over the last ten years. Bogotá has been affected by high-pollution events in which concentrations of PM10 and NO2 exceeded the local 24-hour legal limits (100 and 150 µg/m³, respectively). The most important pollutants in the city are PM10 and PM2.5, which are associated with respiratory and cardiovascular problems, and it is known that their atmospheric concentrations depend on local meteorological factors. Therefore, it is necessary to establish the relationship between the meteorological variables and the concentrations of atmospheric pollutants such as PM10, PM2.5, CO, SO2, NO2 and O3. This study aims to determine the interrelations between meteorological variables and air pollutants in Bogotá using data mining techniques. Data from 13 monitoring stations were collected from the Bogotá Air Quality Monitoring Network for the period 2010-2015. The Principal Component Analysis (PCA) algorithm was applied to obtain primary relations between all the parameters, and afterwards the K-means clustering technique was implemented to corroborate those relations and to find patterns in the data. PCA was also applied per shift (morning, afternoon, night and early morning) to check for variation in the previous trends, and per year to verify that the identified trends remained throughout the study period. Results demonstrated that wind speed, wind direction, temperature, and NO2 are the factors with the most influence on PM10 concentrations. Furthermore, it was confirmed that high-humidity episodes increased PM2.5 levels. It was also found that O3 levels are directly proportional to wind speed and radiation, and inversely related to humidity. Concentrations of SO2 increase with the presence of PM10 and decrease with wind speed and wind direction. The results also showed a decreasing trend in pollutant concentrations over the last five years, and that in rainy periods (March-June and September-December) some precipitation-related trends were stronger. Results obtained with K-means demonstrated that patterns could be found in the data, and showed similar conditions and data distributions among the Carvajal, Tunal and Puente Aranda stations, as well as between the Parque Simón Bolívar and Las Ferias stations. By applying the same technique per year, it was verified that the aforementioned trends prevailed throughout the study period. It was concluded that the PCA algorithm is useful for establishing preliminary relationships among variables, and K-means clustering for finding patterns in the data and understanding its distribution. The discovery of patterns in the data allows these clusters to be used as input to an Artificial Neural Network prediction model.
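The PCA-then-K-means pipeline described above can be sketched in a few lines. The data below are synthetic (an invented wind/PM10/NO2 relationship), not the Bogotá network's measurements; the point is only the order of operations: standardize, extract components, then cluster.

```python
# Hypothetical sketch: PCA to find primary relations between meteorological
# variables and pollutants, then K-means to corroborate patterns in the data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
wind = rng.uniform(0, 10, 500)
pm10 = 80 - 5 * wind + rng.normal(0, 5, 500)   # PM10 falls as wind rises
no2 = 0.4 * pm10 + rng.normal(0, 4, 500)       # NO2 tracks PM10
X = StandardScaler().fit_transform(np.column_stack([wind, pm10, no2]))

pca = PCA(n_components=2).fit(X)
print("explained variance:", np.round(pca.explained_variance_ratio_, 2))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))
</parameter```

A dominant first component, as the strong wind-PM10 coupling produces here, is the kind of "primary relation" PCA surfaces before clustering groups the stations or episodes.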

Keywords: air pollution, air quality modelling, data mining, particulate matter

Procedia PDF Downloads 259
22939 In-situ Oxygen Enrichment for Underground Coal Gasification

Authors: Adesola O. Orimoloye, Edward Gobina

Abstract:

Membrane separation technology is still considered an emerging technology in the mining sector and does not yet have the widespread acceptance it enjoys in other industrial sectors. Underground Coal Gasification (UCG), wherein coal is converted to gas in situ, is a safer alternative to mining that retains pollutants underground, making the process environmentally friendly. In-situ combustion of coal for power generation allows access to more of the physical global coal resource than would be included in current economically recoverable reserve estimates. Where mining is no longer taking place, for economic or geological reasons, controlled gasification permits in-situ exploitation of coal seams by reacting the coal to form a synthesis gas. The oxygen supply stage is one of the most expensive parts of any gasification project, and membranes are a potentially attractive approach for producing oxygen-enriched air. In this study, a variety of cost-effective membrane materials giving optimal oxygen concentrations in the range of interest was designed and tested under diverse operating conditions. An oxygen-enriched atmosphere improves the combustion temperature, but a decline is observed if the oxygen concentration exceeds the optimum. The preparatory method, apparatus, and performance of the fabricated membranes are also reported.

Keywords: membranes, oxygen-enrichment, gasification, coal

Procedia PDF Downloads 467
22938 Family Carers' Experiences in Striving for Medical Care and Finding Their Solutions for Family Members with Mental Illnesses

Authors: Yu-Yu Wang, Shih-Hua Hsieh, Ru-Shian Hsieh

Abstract:

Having one's wishes and choices respected, and the right to be supported rather than coerced, have been internationally recognized as human rights of persons with mental illness. In Taiwan, 'coerced hospitalization' has become difficult since the revision of the mental health legislation in 2007. Despite this trend towards human rights, the real problem families face when their family members are in mental health crisis is the lack of alternative services. This study aims to explore: 1) When is hospitalization seen as the only solution by family members? 2) What are the barriers to arranging hospitalization, and how are they managed? 3) What have family carers learned from their experiences of caring for family members with mental illness? To answer these questions, a qualitative approach was adopted, and focus group interviews were conducted to collect data from 24 family carers. The main findings are as follows. First, the hospital is the last resort for carers in helplessness. Family carers tend to do everything they can to provide care at home for their family members with mental illness, and seek hospitalization only when a patient's behavior is too violent, bizarre, and/or abnormal for them to manage. Hospitalization, nevertheless, is never an easy choice: obstacles emanate from the attitudes of medical doctors, the restricted service areas of ambulances, and insufficient information on the carers' part. On the other hand, with some professionals' proactive assistance, access to medical care while in crisis becomes possible. Some family carers obtained help from medical doctors, nurses, therapists and social workers; some experienced good help from policemen, taxi drivers, and security guards at the hospital. The difficulty of accessing medical care prompts carers to work harder at helping their family members with mental illness stay in stable states.
Carers found different ways of helping the 'person' get along with the 'illness' and have a better quality of life. Many carers' efforts involve taking back 'the right to control' in utilizing medication, moving from passiveness to negotiating with medical doctors and seeking alternative therapies. Trying to maintain regular activities in daily life and to play normal family roles is also experienced as important, as is talking with the patient as a person. The authors conclude that in order to protect the human rights of persons with mental illness, it is crucial to make the medical care system more flexible and the services more humane: sufficient information should be provided and communicated, and efforts should be made to maintain the person's social roles and to support the family.

Keywords: family carers, independent living, mental health crisis, persons with mental illness

Procedia PDF Downloads 315
22937 Axial Load Capacity of Drilled Shafts from In-Situ Test Data at Semani Site, in Albania

Authors: Neritan Shkodrani, Klearta Rrushi, Anxhela Shaha

Abstract:

Generally, the design of the axial load capacity of deep foundations is based on data from field tests such as the SPT (Standard Penetration Test) and CPT (Cone Penetration Test). This paper reports the results of an axial load capacity analysis of drilled shafts at a construction site at Semani, in Fier county, Fier prefecture, Albania. The analyses are based on data from 416 SPT tests and 12 CPTU tests carried out at the site in 12 boreholes (10 borings to a depth of 30.0 m and 2 to a depth of 80.0 m). The considered foundation diameters range from 0.5 m to 2.5 m, and the foundation embedment length is fixed at 25 m. SPT-based analytical methods from Japanese design practice (Building Standard Law of Japan) and the CPT-based analytical method of Eslami and Fellenius are used to obtain the ultimate axial load capacity of the drilled shafts. The considered drilled shaft (25 m long and 0.5-2.5 m in diameter) is analyzed for the soil conditions of each borehole, and the values obtained from the sets of calculations are shown in charts. The axial load capacity values acquired from the SPT and CPTU data are then compared, and conclusions are drawn about the two methods of calculation.
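All SPT-based capacity methods share the same structure: ultimate capacity is tip resistance times tip area plus shaft friction accumulated over the embedded layers. The sketch below shows only that generic sum; the `k_tip` and `k_shaft` correlation factors are illustrative placeholders, not the actual Building Standard Law of Japan coefficients used in the paper.

```python
# Hypothetical sketch of the generic axial-capacity sum behind SPT-based
# methods: Ru = q_tip * A_tip + sum(f_s,i * perimeter * thickness_i).
import math

def ultimate_capacity(diameter, layers, tip_n_value, k_tip=100.0, k_shaft=2.0):
    """layers: list of (thickness_m, SPT_N); unit resistances taken as k * N
    in kN/m2 (illustrative correlations, not code values)."""
    area_tip = math.pi * diameter ** 2 / 4
    perimeter = math.pi * diameter
    q_tip = k_tip * tip_n_value                      # unit tip resistance
    shaft = sum(k_shaft * n * perimeter * t for t, n in layers)
    return q_tip * area_tip + shaft                  # total capacity, kN

# A 25 m shaft of 1.0 m diameter through three soil layers.
layers = [(10.0, 8), (10.0, 15), (5.0, 30)]
print(round(ultimate_capacity(1.0, layers, tip_n_value=30), 1))
```

Repeating this per borehole and per diameter, with the code-specified correlations in place of the placeholders, reproduces the kind of parametric charts the abstract describes.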

Keywords: deep foundations, drilled shafts, axial load capacity, ultimate load capacity, allowable load capacity, SPT test, CPTU test

Procedia PDF Downloads 109
22936 A Grounded Theory on Marist Spirituality/Charism from the Perspective of the Lay Marists in the Philippines

Authors: Nino M. Pizarro

Abstract:

To the author's knowledge, despite the written documents about Marist spirituality/charism, no clear theoretical framework exists that highlights Marist spirituality/charism from the perspective, or lived experience, of the lay Marists of St. Marcellin Champagnat. The participants of the study are lay Marist educators from Marist schools in the Philippines. Since the study seeks the respondents' own concepts of and meanings for Marist spirituality/charism, a qualitative methodology is used, in particular the grounded theory methods of Barney Glaser. The theory will be generated systematically from data collection, coding and analysis through memoing, theoretical sampling, sorting and writing, using the constant comparative method. Data will be collected through in-depth interviews that are semi-structured and participant-driven, with participants recruited through purposive snowball sampling. The study intends to produce a theoretical framework that helps lay Marists deepen their understanding of Marist spirituality/charism and of their vocation as lay partners of the Marist Brothers of the Schools.

Keywords: grounded theory, Lay Marists, lived experience, Marist spirituality/charism

Procedia PDF Downloads 314
22935 Leakage Current Analysis of FinFET Based 7T SRAM at 32nm Technology

Authors: Chhavi Saxena

Abstract:

FinFETs can replace bulk-CMOS transistors in many different designs, and their low leakage/standby power makes them a desirable option for memory subsystems. Memory modules are widely used in most digital and computer systems, and leakage power is particularly important in memory because most applications access only one or very few memory rows at a given time. As technology scales down, leakage current and power analysis become increasingly important for memory design. In this paper, we explore an option for low-power design at the 32 nm node and beyond using Fin-type Field-Effect Transistors (FinFETs), a promising substitute for bulk CMOS at the considered gate lengths. We consider a mechanism for improving FinFET efficiency based on variable supply voltage schemes. We illustrate the design and implementation of a FinFET-based 4x4 SRAM cell array built from one-bit 7T SRAM cells, and analyze its leakage current, dynamic power, and delay. To validate our design approach, the output of the FinFET SRAM array has been compared with a standard CMOS SRAM, and significant improvements are obtained with the proposed model.
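A back-of-the-envelope view of why variable supply voltage schemes matter for standby power: static power scales with the product of total leakage current and supply voltage. The sketch below uses invented numbers, not measured 32 nm values, and models only the linear voltage term (in reality the leakage current itself also drops as VDD falls, so the real saving is larger).

```python
# Hypothetical sketch: standby leakage power of a small SRAM array as
# cell leakage times array size times supply voltage. Numbers illustrative.
def standby_power_nw(cells, leak_per_cell_na, vdd):
    """Static power in nW: total leakage current (nA) times supply (V)."""
    return cells * leak_per_cell_na * vdd

array_cells = 4 * 4                  # the 4x4 array of 7T cells in the paper
for vdd in (0.9, 0.7, 0.5):          # lowering VDD in standby cuts static power
    print(vdd, standby_power_nw(array_cells, leak_per_cell_na=2.0, vdd=vdd))
```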

Keywords: FinFET, 7T SRAM cell, leakage current, delay

Procedia PDF Downloads 459
22934 Bayesian Structural Identification with Systematic Uncertainty Using Multiple Responses

Authors: André Jesus, Yanjie Zhu, Irwanda Laory

Abstract:

Structural health monitoring (SHM) is one of the most promising technologies for averting structural risk and generating economic savings. Analysts often have to deal with a considerable variety of uncertainties that arise during a monitoring process. In particular, the widespread application of numerical (model-based) methods is accompanied by widespread concern about quantifying the uncertainties prevailing in their use. Some of these uncertainties are related to the deterministic nature of the model (code uncertainty), others to the variability of its inputs (parameter uncertainty) and to the discrepancy between model and experiment (systematic uncertainty). The actual process always exhibits random behaviour (observation error), even when conditions are set identically (residual variation). Bayesian inference assumes that the parameters of a model are random variables with an associated PDF, which can be inferred from experimental data; however, in many Bayesian methods the determination of systematic uncertainty can be problematic. In this work, systematic uncertainty is associated with a discrepancy function, and the numerical model and discrepancy function are approximated by Gaussian processes (surrogate models). Finally, to avoid the computational burden of a fully Bayesian approach, the parameters that characterise the Gaussian processes are estimated in a four-stage process (the modular Bayesian approach). This methodology has been successfully applied in fields such as geoscience, biomedicine, and particle physics, but never in the SHM context. The approach considerably reduces the computational burden, although the extent of the considered uncertainties is lower (second-order effects are neglected). To successfully identify the considered uncertainties, the formulation was extended to consider multiple responses. The efficiency of the algorithm has been tested on a small-scale aluminium bridge structure subjected to thermal expansion due to infrared heaters.
Its performance is compared using responses measured at different points of the structure, together with the associated degrees of identifiability. A numerical FEM model of the structure was developed, and the stiffness of its supports is taken as the parameter to calibrate. Results show that the modular Bayesian approach performed best when responses of the same type had the lowest spatial correlation. Based on previous literature, using different types of responses (strain, acceleration, and displacement) should also improve identifiability. Uncertainties due to parametric variability, observation error, residual variability, code variability and systematic uncertainty were all recovered. For this example, the algorithm's performance was stable and considerably quicker than Bayesian methods that account for the full extent of uncertainties. Future research with real-life examples is required to fully assess the advantages and limitations of the proposed methodology.
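The core idea, observations equal the numerical model's prediction at some parameter value plus a systematic discrepancy plus noise, can be illustrated with a toy calibration. The sketch below is not the paper's modular Bayesian / Gaussian-process machinery: it substitutes a grid posterior over a single stiffness-like parameter, with an inflated noise term standing in for the unmodelled discrepancy; all values are synthetic.

```python
# Hypothetical sketch of the calibration setting:
#   y_obs = model(theta) + systematic discrepancy + observation noise.
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 20)                 # sensor positions
true_theta = 2.0
discrepancy = 0.3 * np.sin(3 * x)             # model/experiment mismatch
y_obs = true_theta * x + discrepancy + rng.normal(0, 0.05, x.size)

def model(theta):                             # stand-in for the FEM surrogate
    return theta * x

thetas = np.linspace(1.0, 3.0, 201)
sigma = 0.25   # inflated to absorb the discrepancy we did not model
log_post = np.array([-0.5 * np.sum((y_obs - model(t)) ** 2) / sigma**2
                     for t in thetas])
post = np.exp(log_post - log_post.max())
post /= post.sum()
print("posterior mean theta:", round(float(thetas @ post), 2))
```

Note that the recovered parameter is biased away from `true_theta` by the unmodelled discrepancy; explicitly modelling that discrepancy with a Gaussian process is precisely what the paper's formulation adds.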

Keywords: bayesian, calibration, numerical model, system identification, systematic uncertainty, Gaussian process

Procedia PDF Downloads 330
22933 Developing and Validating an Instrument for Measuring Mobile Government Adoption in Saudi Arabia

Authors: Sultan Alotaibi, Dmitri Roussinov

Abstract:

Many governments have recently started to change the way they provide services, allowing citizens to access them from anywhere without visiting the service provider's location. Mobile government (m-government) is one of the techniques that fulfills that goal, and it has been adopted by many governments. M-government can be defined as an implementation of electronic government (e-government) using mobile technology, with the aim of improving service delivery to citizens, businesses and all government agencies. Several research projects have developed models to understand individuals' behavior towards the adoption of m-government. This paper proposes a model for the adoption of m-government services in Saudi Arabia by extending the Technology Acceptance Model (TAM) with external factors. It also reports on the development of a survey instrument designed to measure user perception of mobile government acceptance. The instrument was developed using existing scales from prior instruments, and a pilot study was conducted by distributing the survey to 33 participants. As a result, the instrument was refined to retain 43 items. The results also showed that the reliabilities of all the scales in the survey instrument are above the levels acceptable in current academic research; thus, the instrument is capable of capturing the factors relevant to m-government adoption.
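Scale reliability in pilot studies of this kind is usually checked with Cronbach's alpha, which is straightforward to compute. The sketch below is a generic illustration, not the authors' analysis: the 33 respondents' 5-point Likert scores are synthetic, generated from a single latent factor so the items cohere.

```python
# Hypothetical sketch: Cronbach's alpha for a multi-item scale, the usual
# basis for statements that reliabilities are "above acceptable levels".
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(0, 1, (33, 1))                    # 33 pilot participants
scores = np.clip(np.round(3 + latent + rng.normal(0, 0.5, (33, 4))), 1, 5)
print(round(cronbach_alpha(scores), 2))
```

A common rule of thumb treats alpha above roughly 0.7 as acceptable for research instruments, which is the kind of threshold the abstract's reliability claim refers to.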

Keywords: TAM, m-government, e-government, model, acceptance, mobile government

Procedia PDF Downloads 253
22932 Annexing the Strength of Information and Communication Technology (ICT) for Real-time TB Reporting Using TB Situation Room (TSR) in Nigeria: Kano State Experience

Authors: Ibrahim Umar, Ashiru Rajab, Sumayya Chindo, Emmanuel Olashore

Abstract:

INTRODUCTION: Kano is the most populous state in Nigeria and one of the two states with the highest TB burden in the country. The state notifies an average of 8,000+ TB cases quarterly and has had the highest yearly notification of all the states in Nigeria from 2020 to 2022. The contribution of the state TB program to the national TB notification varied from 9% to 10% quarterly between the first quarter of 2022 and the second quarter of 2023. The Kano State TB Situation Room is an innovative platform for timely data collection, collation, and analysis for informed decision-making in the health system. During the 2023 second National TB Testing Week (NTBTW), the Kano TB program aimed at early TB detection, prevention, and treatment. The state TB Situation Room provided an avenue for coordination and surveillance through real-time data reporting, review, analysis, and use during the NTBTW. OBJECTIVES: To assess the role of an innovative information and communication technology platform for real-time TB reporting during the second National TB Testing Week in Nigeria, 2023. To showcase the NTBTW data cascade analysis using the TSR as an innovative ICT platform. METHODOLOGY: The State TB program deployed a real-time virtual dashboard for NTBTW reporting, analysis, and feedback. A data room team was set up to receive real-time data through a shared Google link. The data received were analyzed using the Power BI analytic tool with a statistical significance level (alpha) of <0.05. RESULTS: At the end of the week-long activity, and using the real-time dashboard with onsite mentorship of the field workers, the state TB program screened a total of 52,054 people for TB out of 72,112 individuals eligible for screening (72% screening rate). A total of 9,910 presumptive TB clients were identified and evaluated, leading to the diagnosis of 445 patients with TB (5% yield from presumptives) and the placement of 435 TB patients on treatment (98% enrolment).
CONCLUSION: The TB Situation Room (TSR) has been a great asset to the Kano State TB Control Program in meeting the growing demand for timely data reporting in TB and other global health responses. The use of real-time surveillance data during the 2023 NTBTW has in no small measure improved the TB response and feedback in Kano State. Scaling up this intervention to other disease areas, states, and nations is a positive step in the right direction towards global TB eradication.
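The cascade indicators quoted above follow directly by chaining the reported counts; a minimal sketch (the helper name is ours, the figures are from the abstract):

```python
def cascade(eligible, screened, presumptive, diagnosed, enrolled):
    """Return each cascade rate as a fraction of the preceding step."""
    return {
        "screening_rate": screened / eligible,
        "yield_from_presumptives": diagnosed / presumptive,
        "enrolment_rate": enrolled / diagnosed,
    }

rates = cascade(eligible=72_112, screened=52_054,
                presumptive=9_910, diagnosed=445, enrolled=435)
# screening ~72%, yield ~4.5% (reported rounded to 5%), enrolment ~98%
```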

Keywords: tuberculosis (tb), national tb testing week (ntbtw), tb situation room (tsr), information communication technology (ict)

Procedia PDF Downloads 79
22931 Sewage Sludge Management: A Case Study of Monrovia, Montserrado County, Liberia

Authors: Victor Emery David Jr, Md S. Hossain

Abstract:

Sewage sludge management has been a problem faced by most developing cities, as in the case of Monrovia, where it is still in its infant stage. The city is still struggling with poor sanitation, clogged pipes, a shortage of septic tanks, a lack of resources and human capacity, inadequate treatment facilities, open defecation, the absence of clear guidelines, etc. The rapid urban population growth of Monrovia has severely stressed the city's marginally functional urban WSS system, which was damaged by the civil conflict that led to breakdowns in many sectors as well as in infrastructure. The sewerage system, which originally covered 17% of the population of Monrovia, was down to serving about 7% because of bursts and blockages causing backflows in other areas. Prior to the civil war, the average water production for Monrovia was about 68,000 m3/day, but it has now dropped to about 10,000 m3/day. Only small parts of Monrovia currently have direct access to the piped water supply, while most areas depend on trucked water delivered to community collection points or household tanks, and/or on water from unprotected dug wells or hand pumps. There are only two functional treatment plants: the Fiamah Treatment Plant and the White Plains Treatment Plant.

Keywords: Fiamah Treatment plant, management, Monrovia/Montserrado County, sewage, sludge

Procedia PDF Downloads 292
22930 Digital Twins: Towards an Overarching Framework for the Built Environment

Authors: Astrid Bagireanu, Julio Bros-Williamson, Mila Duncheva, John Currie

Abstract:

Digital Twins (DTs) have entered the built environment from more established industries like aviation and manufacturing, although there has never been a common goal for utilising DTs at scale. Defined as the cyber-physical integration of data between an asset and its virtual counterpart, the DT has mostly been identified in the literature from an operational standpoint, in addition to monitoring the performance of a built asset. However, this has never been translated into how DTs should be implemented in a project and what responsibilities each project stakeholder holds in the realisation of a DT. What is needed is an approach to translate these requirements into actionable DT dimensions. This paper presents a foundation for an overarching framework specific to the built environment. For the purposes of this research, the widely used UK Royal Institute of British Architects (RIBA) Plan of Work 2020 is used as a basis for itemising project stages. The RIBA Plan of Work consists of eight stages designed to inform the definition, briefing, design, coordination, construction, handover, and use of a built asset. Similar project stages are utilised in other countries; therefore, the recommendations from the interviews presented in this paper are applicable internationally. Simultaneously, there is no single mainstream software resource that leverages DT abilities. This ambiguity meets an unparalleled ambition from governments and industries worldwide to achieve a national grid of interconnected DTs. For the construction industry to access these benefits, a defined starting point is necessary. This research aims to provide a comprehensive understanding of the potential applications and ramifications of DTs in the context of the built environment. This paper is an integral part of a larger research project aimed at developing a conceptual framework for the Architecture, Engineering, and Construction (AEC) sector following a conventional project timeline.
Therefore, this paper plays a pivotal role in providing practical insights and a tangible foundation for developing a stage-by-stage approach to assimilate the potential of DTs within the built environment. First, the research focuses on a review of relevant literature, albeit acknowledging the inherent constraint of the limited sources available. Secondly, a qualitative study compiling the views of 14 DT experts is presented, concluding with an inductive analysis of the interview findings, ultimately highlighting the barriers and strengths of DTs in the context of framework development. As parallel developments aim to progress net-zero-centred design and improve project efficiencies across the built environment, the limited resources available to support DTs should be leveraged to propel the industry into its digitalisation era, in which AEC stakeholders have a fundamental role to play from the earliest stages of a project.

Keywords: digital twins, decision-making, design, net-zero, built environment

Procedia PDF Downloads 130
22929 Freshwater Pinch Analysis for Optimal Design of the Photovoltaic Powered-Pumping System

Authors: Iman Janghorban Esfahani

Abstract:

Due to the increased use of irrigation in agriculture, the importance of and need for highly reliable water pumping systems have significantly increased. The pumping of groundwater is essential to provide water for both drip and furrow irrigation to increase agricultural yield, especially in arid regions that suffer from scarcities of surface water. The most common irrigation pumping systems (IPS) consume conventional energy through the use of electric motors and generators or by connecting to the electricity grid. Due to the shortage and transportation difficulties of fossil fuels, unreliable access to the electricity grid (especially in rural areas), and the adverse environmental impacts of fossil fuel usage, such as greenhouse gas (GHG) emissions, the need for renewable energy sources such as photovoltaic systems (PVS) as an alternative way of powering irrigation pumping systems is urgent. Integration of photovoltaic systems with irrigation pumping systems as the Photovoltaic Powered-Irrigation Pumping System (PVP-IPS) can avoid fossil fuel dependency and the subsequent greenhouse gas emissions, as well as ultimately lower energy costs and improve efficiency, which makes PVP-IPS systems an environmentally and economically efficient solution for agricultural irrigation in every region. The greatest problem faced by the integration of PVP with IPS systems is matching the intermittence of the energy supply with the dynamic water demand. The best solution to overcome the intermittence is to incorporate a storage system into the PVP-IPS to provide water on demand as a highly reliable stand-alone irrigation pumping system. The water storage tank (WST) is the most common storage device for PVP-IPS systems. In the integrated PVP-IPS with a water storage tank (PVP-IPS-WST), the tank stores the water pumped by the IPS in excess of the water demand and then delivers it when demand is high.
Freshwater pinch analysis (FWaPA), as an alternative to mathematical modeling, was used by other researchers for retrofitting an off-grid batteryless photovoltaic-powered reverse osmosis system. However, freshwater pinch analysis has not previously been used to integrate photovoltaic systems with irrigation pumping systems with water storage tanks. In this study, FWaPA graphical and numerical tools were used for retrofitting an existing PVP-IPS system located in Salahadin, Republic of Iraq. The plant includes a 5 kW submersible water pump and a 7.5 kW solar PV system. The Freshwater Composite Curve (graphical tool) and Freshwater Storage Cascade Table (numerical tool) were constructed to determine the minimum required outsourced water during operation, the optimal amount of electricity delivered to the water pump, and the optimal size of the water storage tank for one year of operation data. The results of implementing FWaPA on the case study show that the PVP-IPS system with a WST, as the reliable system, can reduce outsourced water by 95.41% compared with the PVP-IPS system without a storage tank.
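The cascade idea behind the numerical tool can be illustrated on toy numbers: hourly pumped supply is cascaded against irrigation demand, and the largest cumulative deficit sets the minimum tank size. The flow values and function below are invented for illustration, not the Salahadin plant's data.

```python
import numpy as np

def storage_cascade(supply, demand):
    """Return the tank water-level profile and minimum tank size (m3)."""
    net = np.asarray(supply, float) - np.asarray(demand, float)
    cascade = np.cumsum(net)                 # running surplus/deficit
    tank_size = max(0.0, -cascade.min())     # worst deficit sizes the tank
    profile = cascade + tank_size            # shift so storage never goes negative
    return profile, tank_size

supply = [0, 0, 5, 12, 15, 12, 5, 0]   # m3/h, daylight-driven pumping
demand = [3, 3, 3, 3, 3, 3, 3, 3]      # m3/h, steady irrigation draw
profile, tank = storage_cascade(supply, demand)   # tank = 6 m3 for this toy day
```

A real FWaPA cascade would also track outsourced water when the tank alone cannot cover the deficit; the sketch shows only the sizing step.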

Keywords: irrigation, photovoltaic, pinch analysis, pumping, solar energy

Procedia PDF Downloads 140
22928 Exploring Cannabis for Cancer Symptom Relief: An Australian Perspective

Authors: Jenny Jin

Abstract:

Background: The therapeutic use of cannabis for cancer symptom control in Australia is gaining momentum, reflecting a broader global acceptance of its medicinal potential. Objective: This overview examines the historical context, current regulations, and clinical applications of cannabis in oncology within Australia. Methods: A historical analysis outlines the ancient and 19th-century medicinal uses of cannabis, followed by its prohibition in the early 20th century and subsequent resurgence in the late 20th century. The current legal framework under the Therapeutic Goods Administration (TGA) is discussed. Results: Research indicates that cannabinoids, particularly THC and CBD, effectively alleviate pain, reduce chemotherapy-induced nausea and vomiting, stimulate appetite, and enhance overall quality of life for cancer patients. Despite these benefits, challenges such as dosing standardization, stigma, and access barriers persist. Conclusion: Continued clinical research, policy development, and educational initiatives are essential to optimize the use of cannabis in cancer care. A patient-centred approach, emphasizing interdisciplinary collaboration and informed decision-making, is crucial for improving therapeutic outcomes in this evolving field.

Keywords: historical context of cannabis, symptom control in oncology patients, therapeutic benefits, outcome and future

Procedia PDF Downloads 17
22927 Density Measurement of Mixed Refrigerants R32+R1234yf and R125+R290 from 0°C to 100°C and at Pressures up to 10 MPa

Authors: Xiaoci Li, Yonghua Huang, Hui Lin

Abstract:

Optimization of the concentration of components in mixed refrigerants leads to potential improvement of either the thermodynamic cycle performance or the safety performance of heat pumps and refrigerators. R32+R1234yf and R125+R290 are two promising binary mixed refrigerants for heat pumps working in cold areas. The p-ρ-T data of these mixtures are among the fundamental and necessary properties for design and evaluation of the performance of heat pumps. Although the property data of mixtures can be predicted by mixing models based on the pure substances incorporated in programs such as the NIST database Refprop, direct property measurement is still helpful to reveal the true state behaviors and verify the models. Densities of the mixtures of R32+R1234yf and R125+R290 are measured by an Anton Paar U-shaped oscillating-tube digital densimeter DMA-4500 in the range of temperatures from 0°C to 100°C and pressures up to 10 MPa. The accuracy of the measurement reaches 0.00005 g/cm³. The experimental data are compared with the predictions by Refprop in the corresponding range of pressure and temperature.
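The comparison step reduces to per-point relative deviations between measured and predicted densities. A hedged sketch on invented numbers (not the paper's measurements or Refprop output):

```python
import numpy as np

def relative_deviation(measured, predicted):
    """Percent relative deviation of each measured point from the prediction."""
    measured = np.asarray(measured, float)
    predicted = np.asarray(predicted, float)
    return 100.0 * (measured - predicted) / predicted

rho_meas = [1.0213, 0.9857, 0.9512]    # g/cm3, hypothetical measurements
rho_pred = [1.0200, 0.9860, 0.9500]    # g/cm3, hypothetical model values
dev = relative_deviation(rho_meas, rho_pred)
aad = np.mean(np.abs(dev))             # average absolute deviation, %
```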

Keywords: mixed refrigerant, density measurement, densimeter, thermodynamic property

Procedia PDF Downloads 299
22926 Mobile Systems: History, Technology, and Future

Authors: Shivendra Pratap Singh, Rishabh Sharma

Abstract:

The widespread adoption of mobile technology in recent years has revolutionized the way we communicate and access information. The evolution of mobile systems has been rapid and impactful, shaping our lives and changing the way we live and work. However, despite its significant influence, the history and development of mobile technology are not well understood by the general public. This research paper aims to examine the history, technology and future of mobile systems, exploring their evolution from early mobile phones to the latest smartphones and beyond. The study will analyze the technological advancements and innovations that have shaped the mobile industry, from the introduction of mobile internet and multimedia capabilities to the integration of artificial intelligence and 5G networks. Additionally, the paper will also address the challenges and opportunities facing the future of mobile technology, such as privacy concerns, battery life, and the increasing demand for high-speed internet. Finally, the paper will also provide insights into potential future developments and innovations in the mobile sector, such as foldable phones, wearable technology, and the Internet of Things (IoT). The purpose of this research paper is to provide a comprehensive overview of the history, technology, and future of mobile systems, shedding light on their impact on society and the challenges and opportunities that lie ahead.

Keywords: mobile technology, artificial intelligence, networking, iot, technological advancements, smartphones

Procedia PDF Downloads 98
22925 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting

Authors: Yiannis G. Smirlis

Abstract:

The classification and the prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large scale problems or when new units frequently enter the under-assessment set. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by Interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores, without applying any DEA models.
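The dominance pre-processing idea can be sketched as follows: each unit falls into an interval grid cell (a hyper-rectangle), and one cell dominates another if its worst case is still at least as good as the other's best case (lower inputs, higher outputs). The cell encoding below is our assumption for illustration, not the paper's exact scheme.

```python
def dominates(a, b):
    """True if cell a's worst case beats cell b's best case on every dimension.

    Each cell is a dict of interval bounds: "in_lo"/"in_hi" for inputs,
    "out_lo"/"out_hi" for outputs, one entry per dimension."""
    return (all(ah <= bl for ah, bl in zip(a["in_hi"], b["in_lo"])) and
            all(al >= bh for al, bh in zip(a["out_lo"], b["out_hi"])))

# Hypothetical cells: unit_a uses at most [4, 3] of the inputs for at
# least 10 output; unit_b needs at least [5, 4] input for at most 8 output.
unit_a = {"in_lo": [2, 1], "in_hi": [4, 3], "out_lo": [10], "out_hi": [12]}
unit_b = {"in_lo": [5, 4], "in_hi": [7, 6], "out_lo": [6],  "out_hi": [8]}
# unit_a dominates unit_b, so unit_b can be classified as inefficient
# without running any DEA model.
```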

Keywords: data envelopment analysis, interval DEA, efficiency classification, efficiency prediction

Procedia PDF Downloads 166
22924 Mapping of Traffic Noise in Riyadh City-Saudi Arabia

Authors: Khaled A. Alsaif, Mosaad A. Foda

Abstract:

The present work aims at the development of traffic noise maps for Riyadh City using the software LimA. Road traffic data were estimated or measured as accurately as possible in order to obtain consistent noise maps. The predicted noise levels at selected sites are validated by actual field measurements, which are obtained by a system that consists of a sound level meter, a GPS receiver, and a database to manage the measured data. The maps show that noise levels remain over 50 dBA and can exceed 70 dBA at the nearside of major roads and highways.
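One detail worth noting when aggregating such measurements: sound pressure levels combine on an energy basis, not arithmetically. A small sketch of the log-domain averaging (the readings are invented examples, not the Riyadh data):

```python
import math

def energy_average_dba(levels):
    """Energy (logarithmic) average of a list of sound levels in dBA."""
    mean_energy = sum(10 ** (l / 10) for l in levels) / len(levels)
    return 10 * math.log10(mean_energy)

readings = [68.0, 71.0, 74.0]          # dBA spot measurements
leq = energy_average_dba(readings)     # slightly above the arithmetic mean of 71.0
```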

Keywords: noise pollution, road traffic noise, LimA predictor, GPS

Procedia PDF Downloads 389
22923 The Introduction of a Tourniquet Checklist to Identify and Record Tourniquet Related Complications

Authors: Akash Soogumbur

Abstract:

Tourniquets are commonly used in orthopaedic surgery to provide hemostasis during procedures on the upper and lower limbs. However, there is a risk of complications associated with tourniquet use, such as nerve damage, skin necrosis, and compartment syndrome. The British Orthopaedic Association (BOAST) guidelines recommend the use of tourniquets at a pressure of 300 mmHg or less for a maximum of 2 hours. Research Aim: The aim of this study was to evaluate the effectiveness of a tourniquet checklist in improving compliance with the BOAST guidelines. Methodology: This was a retrospective study of all orthopaedic procedures performed at a single institution over a 12-month period. The study population included patients who had a tourniquet applied during surgery. Data were collected from the patients' medical records, including the duration of tourniquet use, the pressure used, and the method of exsanguination. Findings: The results showed that the use of the tourniquet checklist significantly improved compliance with the BOAST guidelines. Prior to the introduction of the checklist, compliance with the guidelines was 83% for the duration of tourniquet use and 73% for pressure used. After the introduction of the checklist, compliance increased to 100% for both duration of tourniquet use and pressure used. Theoretical Importance: The findings of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use. Data Collection: Data were collected from the patients' medical records. The data included the following information: Patient demographics, procedure performed, duration of tourniquet use, pressure used, method of exsanguination. Analysis Procedures: The data were analyzed using descriptive statistics. 
The compliance with the BOAST guidelines was calculated as the percentage of patients who met the guidelines for the duration of tourniquet use and pressure used. Question Addressed: The question addressed by this study was whether the use of a tourniquet checklist could improve compliance with the BOAST guidelines. Conclusion: The results of this study suggest that the use of a tourniquet checklist can be an effective way to improve compliance with the BOAST guidelines. This is important because it can help to reduce the risk of complications associated with tourniquet use.

Keywords: tourniquet, pressure, duration, complications, surgery

Procedia PDF Downloads 73
22922 Data Analysis for Taxonomy Prediction and Annotation of 16S rRNA Gene Sequences from Metagenome Data

Authors: Suchithra V., Shreedhanya, Kavya Menon, Vidya Niranjan

Abstract:

Skin metagenomics has a wide range of applications with direct relevance to the health of the organism. It gives us insight into the diverse community of microorganisms (the microbiome) harbored on the skin. In recent years, it has become increasingly apparent that the interaction between the skin microbiome and the human body plays a prominent role in immune system development, cancer development, disease pathology, and many other biological processes. Next Generation Sequencing has led to a faster and better understanding of environmental organisms and their mutual interactions. This project studies the human skin microbiome of different individuals having varied skin conditions. Bacterial 16S rRNA data of the skin microbiome are downloaded via the SRA toolkit provided by NCBI to perform metagenomics analysis. Twelve samples are selected, with two controls and three different categories: sex (male/female), skin type (moist/intermittently moist/sebaceous), and occlusion (occluded/intermittently occluded/exposed). Data quality is improved using Cutadapt and assessed using FastQC. USearch, a tool for analyzing NGS data, provides a suitable platform to obtain taxonomy classification and the abundance of bacteria from the metagenome data. The statistical tool used for analyzing the USearch results is METAGENassist. The results revealed that the top three abundant organisms found were Prevotella, Corynebacterium, and Anaerococcus. Prevotella is known to be an infectious bacterium found on wounds, tooth cavities, etc. Corynebacterium and Anaerococcus are opportunistic bacteria responsible for skin odor. This result suggests that Prevotella thrives easily in sebaceous skin conditions. Therefore, it is better to undergo intermittently occluded treatment, such as applying ointments or creams, to treat wounds for the sebaceous skin type. Exposing the wound should be avoided, as it leads to an increase in Prevotella abundance. Moist skin type individuals can opt for occluded or intermittently occluded treatment, as these have been shown to decrease the abundance of bacteria during treatment.
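The abundance-ranking step downstream of taxonomy assignment reduces to counting genus labels and normalizing. A minimal sketch on fabricated read assignments (the counts are not the study's data):

```python
from collections import Counter

def relative_abundance(assignments):
    """Map genus -> fraction of total classified reads, most abundant first."""
    counts = Counter(assignments)
    total = sum(counts.values())
    return {genus: n / total for genus, n in counts.most_common()}

# Fabricated per-read genus assignments for one sample
reads = (["Prevotella"] * 50 + ["Corynebacterium"] * 30 +
         ["Anaerococcus"] * 15 + ["Staphylococcus"] * 5)
abundance = relative_abundance(reads)
top_genus = max(abundance, key=abundance.get)   # "Prevotella" for this toy sample
```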

Keywords: bacterial 16S rRNA, next generation sequencing, skin metagenomics, skin microbiome, taxonomy

Procedia PDF Downloads 176
22921 Development of a Predictive Model to Prevent Financial Crisis

Authors: Tengqin Han

Abstract:

Delinquency has been a crucial factor in economics throughout the years. Commonly seen in credit cards and mortgages, it played one of the crucial roles in causing the most recent financial crisis, in 2008. In each case, a delinquency is a sign of the borrower being unable to pay off the debt, and thus may cause a loss of property in the end. Individually, one case of delinquency seems unimportant compared to the entire credit system. In China, an emerging economic entity, national and economic strength have grown rapidly, and the gross domestic product (GDP) growth rate has remained as high as 8% over the past decades. However, potential risks exist behind the appearance of prosperity, and among them, the credit system is the most significant. Due to the long terms and large balances of mortgages, it is critical to monitor the risk during the performance period. In this project, about 300,000 mortgage account records are analyzed in order to develop a predictive model for the probability of delinquency. Through univariate analysis, the data are cleaned up, and through bivariate analysis, the variables with strong predictive power are detected. The project is divided into two parts. In the first part, the 2005 analysis data are split into two parts: 60% for model development and 40% for in-time model validation. The KS of model development is 31, and the KS for in-time validation is 31, indicating the model is stable. In addition, the model is further validated by out-of-time validation, which uses 40% of the 2006 data, with a KS of 33. This indicates the model is still stable and robust. In the second part, the model is improved by the addition of macroeconomic indexes, including GDP, consumer price index, unemployment rate, inflation rate, etc. The data from 2005 to 2010 are used for model development and validation. Compared with the base model (without macroeconomic variables), KS is increased from 41 to 44, indicating that the macroeconomic variables can be used to improve the separation power of the model and make the prediction more accurate.
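The KS statistic used above measures the separation between the score distributions of delinquent and non-delinquent accounts: the maximum gap between their empirical cumulative distributions, usually quoted as a percentage. A sketch on simulated scores (not the project's data):

```python
import numpy as np

def ks_statistic(scores_bad, scores_good):
    """Max absolute gap between the two empirical CDFs, in percent."""
    thresholds = np.union1d(scores_bad, scores_good)
    cdf_bad = np.searchsorted(np.sort(scores_bad), thresholds, "right") / len(scores_bad)
    cdf_good = np.searchsorted(np.sort(scores_good), thresholds, "right") / len(scores_good)
    return 100.0 * np.max(np.abs(cdf_bad - cdf_good))

rng = np.random.default_rng(1)
bad = rng.normal(560, 50, 2000)    # delinquent accounts tend to score lower
good = rng.normal(605, 50, 8000)
ks = ks_statistic(bad, good)       # roughly mid-30s for these simulated scores
```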

Keywords: delinquency, mortgage, model development, model validation

Procedia PDF Downloads 231
22920 Self-Supervised Learning for Hate-Speech Identification

Authors: Shrabani Ghosh

Abstract:

Automatic offensive language detection in social media has become a stirring task in today's NLP. Manual offensive language detection is tedious and laborious work, for which automatic methods based on machine learning are the only alternative. Previous works have done sentiment analysis over social media in different ways, such as in a supervised, semi-supervised, or unsupervised manner. Domain adaptation in a semi-supervised way has also been explored in NLP, where the source domain and the target domain are different. In domain adaptation, the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers such as BERT and RoBERTa are further pre-trained with masked language modeling (MLM) on unlabeled text and then fine-tuned to perform text classification. In previous work, hate speech detection has been explored on Gab.ai, a free speech platform described as hosting extremism in varying degrees. In the domain adaptation process, Twitter data is used as the source domain, and Gab data is used as the target domain. The performance of domain adaptation also depends on the cross-domain similarity. Different distance measures such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL have been used to estimate domain similarity. Certainly, in-domain distances are small, and between-domain distances are expected to be large. The previous work's finding shows that a pretrained masked language model (MLM) fine-tuned with a mixture of posts from the source and target domains gives higher accuracy. However, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, while its out-of-domain accuracy on Gab data goes down to 56.53%. Recently, self-supervised learning has received a lot of attention, as it is more applicable when labeled data are scarce. A few works have already applied self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs. A self-supervised attention learning approach shows better performance, as it exploits the extracted context words in the training process. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. The framework initially classifies the Gab dataset in an attention-based self-supervised manner. In the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier and also with optimized outcomes obtained from different optimization techniques.
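One of the domain-similarity measures mentioned above, cosine distance, can be sketched on the mean document vectors of the two domains. The embeddings below are randomly generated placeholders, not representations of actual Twitter or Gab posts:

```python
import numpy as np

def cosine_distance(u, v):
    """1 - cosine similarity between two vectors."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    cos_sim = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return 1.0 - cos_sim

rng = np.random.default_rng(2)
src_embeddings = rng.normal(0.0, 1.0, (100, 16))   # stand-in for source (Twitter) posts
tgt_embeddings = rng.normal(0.2, 1.0, (100, 16))   # shifted stand-in for target (Gab) posts
d = cosine_distance(src_embeddings.mean(axis=0),
                    tgt_embeddings.mean(axis=0))
# a small d suggests the domains are close and adaptation should transfer well
```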

Keywords: attention learning, language model, offensive language detection, self-supervised learning

Procedia PDF Downloads 109
22919 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark

Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos

Abstract:

This paper presents an investigation of the performance impacts of varying five factors (input data size, node number, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: The algorithm's performance depends on multiple factors, and knowing the effects of each factor beforehand becomes especially critical, as hardware is priced by time slice in cloud environments. Objectives: To explain the functional relationship between factors and performance, and to develop linear predictor models for time and cost. Methods: We apply the solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measurement of each factor's impact. Results: Our findings include prediction models and show some non-intuitive results about the small influence of cores, the neutrality of memory and disks on total execution time, and the non-significant impact of input data scale on costs, although it notably impacts execution time.
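The screening step on a two-level fractional factorial design reduces to least squares on coded -1/+1 factor levels. A hedged sketch on a fabricated 2^(5-2) design and invented execution times (the real study used 48 clusters with replications):

```python
import numpy as np

# 2^(5-2) resolution-III design: A, B, C full factorial; D = AB, E = AC.
# Columns: data size, nodes, cores, memory, disks (coded levels).
X = np.array([
    [-1, -1, -1,  1,  1],
    [ 1, -1, -1, -1, -1],
    [-1,  1, -1, -1,  1],
    [ 1,  1, -1,  1, -1],
    [-1, -1,  1,  1, -1],
    [ 1, -1,  1, -1,  1],
    [-1,  1,  1, -1, -1],
    [ 1,  1,  1,  1,  1],
], dtype=float)
# Fabricated execution times (min), built so data size dominates
y = np.array([43, 93, 31, 81, 39, 89, 27, 77], dtype=float)

A = np.column_stack([np.ones(len(y)), X])      # intercept + main effects
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
# coef[1] (data size) dominates; coef[3] (cores) is small and coef[4:]
# (memory, disks) are ~0, mirroring the kind of screening reported above.
```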

Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark

Procedia PDF Downloads 123
22918 The Developing of Teaching Materials Online for Students in Thailand

Authors: Pitimanus Bunlue

Abstract:

The objectives of this study were to identify the unique characteristics of Salaya Old Market, Phutthamonthon, Nakhon Pathom, and to develop effective video media to promote homeland awareness among local people. The characteristic features of this community were collectively summarized based on historical data, community observation, and interviews with local people. The acquired data were used to develop a video describing the prominent features of the community. The quality of the video was later assessed by interviewing local people in the old market in terms of content accuracy, video and narration quality, and the sense of homeland awareness felt after watching it. The result is a 6-minute video containing historical data and the outstanding features of this community. Based on the interviews, the content accuracy was good, and the picture quality and narration were very good. Most people developed a sense of homeland awareness after watching the video as well.

Keywords: audio-visual, creating homeland awareness, Phutthamonthon Nakhon Pathom, research and development

Procedia PDF Downloads 295