Search results for: flight test data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30912

23802 Preparation and Characterization of CO-Tolerant Electrocatalyst for PEM Fuel Cell

Authors: Ádám Vass, István Bakos, Irina Borbáth, Zoltán Pászti, István Sajó, András Tompos

Abstract:

Important requirements for the anode-side electrocatalysts of polymer electrolyte membrane (PEM) fuel cells are CO-tolerance, stability and corrosion resistance. Carbon is still the most common material for electrocatalyst supports due to its low cost, high electrical conductivity and high surface area, which can ensure good dispersion of the Pt. However, carbon becomes degraded at higher potentials, which causes problems during application. Therefore, it is important to explore alternative materials with improved stability. Molybdenum oxide can improve the CO-tolerance of Pt/C catalysts, but it is prone to leaching in acidic electrolytes. The Mo was stabilized by isovalent substitution of molybdenum into the rutile-phase titanium-dioxide lattice, achieved by a modified multistep sol-gel synthesis method optimized for preparation of the Ti0.7Mo0.3O2-C composite. A high degree of Mo incorporation into the rutile lattice was achieved. The conductivity and corrosion resistance across the anticipated potential/pH window were ensured by the mixed oxide – activated carbon composite. Platinum loading was carried out using NaBH4 and ethylene glycol; the platinum content was 40 wt%. The electrocatalyst was characterized by both materials characterization methods (XRD, TEM, EDS and XPS) and electrochemical methods (cyclic voltammetry, COads stripping voltammetry, hydrogen oxidation reaction on a rotating disc electrode). The electrochemical activity of the sample was compared to commercial 40 wt% Pt/C (Quintech) and PtRu/C (Quintech, Pt = 20 wt%, Ru = 10 wt%) references. Enhanced CO tolerance of the electrocatalyst prepared using the Ti0.7Mo0.3O2-C composite material was evidenced by the appearance of a CO-oxidation-related 'pre-peak' and by the pronounced shift of the maximum of the main CO oxidation peak towards less positive potentials compared to Pt/C. Fuel cell polarization measurements were also carried out using a Bio-Logic/Paxitech FCT-150S test device. All details on the design, preparation, characterization and testing of the electrocatalyst supported on the Ti0.7Mo0.3O2-C composite material, by both electrochemical measurements and the fuel cell test device, will be presented and discussed.

Keywords: anode electrocatalyst, composite material, CO-tolerance, TiMoOx

Procedia PDF Downloads 296
23801 Seafloor and Sea Surface Modelling in the East Coast Region of North America

Authors: Magdalena Idzikowska, Katarzyna Pająk, Kamil Kowalczyk

Abstract:

Seafloor topography is a fundamental issue in geological, geophysical, and oceanographic studies. Single-beam or multibeam sonars attached to the hulls of ships are used to emit a hydroacoustic signal from transducers and reproduce the topography of the seabed. This solution provides adequate accuracy and spatial resolution. Bathymetric data from ship surveys are provided by the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration. Unfortunately, most of the seabed is still unmapped, as there are still many gaps to be explored between ship survey tracks. Moreover, such measurements are very expensive and time-consuming. An alternative is the raster bathymetric models shared by the General Bathymetric Chart of the Oceans. The products offered are compilations of different data sets, raw or processed. Measurements of gravity anomalies also serve as indirect data for the development of bathymetric models. Some forms of seafloor relief (e.g. seamounts) increase the force of the Earth's pull, leading to changes in the sea surface. Based on satellite altimetry data, Sea Surface Height and marine gravity anomalies can be estimated, and from these anomalies it is possible to infer the structure of the seabed. The main goal of the work is to create regional bathymetric models and models of the sea surface in the area of the east coast of North America – a region of seamounts and undulating seafloor. The research includes an analysis of the methods and techniques used, an evaluation of the interpolation algorithms applied, densification of the models, and the creation of grid models. The input data are raster bathymetric models in NetCDF format, survey data from multibeam soundings in MB-System format, and satellite altimetry data from the Copernicus Marine Environment Monitoring Service. The methodology includes data extraction, processing, mapping, and spatial analysis. Visualization of the obtained results was carried out with Geographic Information System tools. The result is an extension of the state of knowledge about the quality and usefulness of the data used for seabed and sea surface modeling, and about the accuracy of the generated models. Sea level is averaged over time and space (excluding waves, tides, etc.); its changes, together with knowledge of the topography of the ocean floor, indirectly inform us about the volume of the entire ocean. The true shape of the ocean surface is further varied by such phenomena as tides, differences in atmospheric pressure, wind systems, thermal expansion of water, or phases of ocean circulation. Depending on the location of the point, the greater the depth, the smaller the trend of sea level change. Studies show that combining data sets from different sources and with different accuracies can affect the quality of sea surface and seafloor topography models.
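
The interpolation step described above can be illustrated with a short sketch. This is a minimal, hypothetical example (not the authors' code): it grids scattered multibeam soundings onto a regular latitude/longitude grid with SciPy; the variable names and the synthetic "seamount" data are assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def grid_soundings(lon, lat, depth, cell_deg=0.01):
    """Interpolate scattered (lon, lat, depth) soundings onto a regular grid."""
    lon_g = np.arange(lon.min(), lon.max(), cell_deg)
    lat_g = np.arange(lat.min(), lat.max(), cell_deg)
    lon2d, lat2d = np.meshgrid(lon_g, lat_g)
    # Linear interpolation inside the convex hull; cells far from any data stay NaN
    depth_grid = griddata((lon, lat), depth, (lon2d, lat2d), method="linear")
    return lon_g, lat_g, depth_grid

# Synthetic soundings standing in for MB-System survey data
rng = np.random.default_rng(0)
lon = rng.uniform(-70.0, -60.0, 5000)
lat = rng.uniform(35.0, 45.0, 5000)
depth = -4000 + 1500 * np.exp(-((lon + 65.0) ** 2 + (lat - 40.0) ** 2))  # a "seamount"
lon_g, lat_g, z = grid_soundings(lon, lat, depth)
print(z.shape)
```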

Keywords: seafloor, sea surface height, bathymetry, satellite altimetry

Procedia PDF Downloads 76
23800 Mobi-DiQ: A Pervasive Sensing System for Delirium Risk Assessment in Intensive Care Unit

Authors: Subhash Nerella, Ziyuan Guan, Azra Bihorac, Parisa Rashidi

Abstract:

Intensive care units (ICUs) provide care to critically ill patients in severe and life-threatening conditions. However, patient monitoring in the ICU is limited by the time and resource constraints imposed on healthcare providers. Many critical care indices such as mobility are still manually assessed, which can be subjective, prone to human errors, and lack granularity. Other important aspects, such as environmental factors, are not monitored at all. For example, critically ill patients often experience circadian disruptions due to the absence of effective environmental “timekeepers” such as the light/dark cycle and the systemic effect of acute illness on chronobiologic markers. Although the occurrence of delirium is associated with circadian disruption risk factors, these factors are not routinely monitored in the ICU. Hence, there is a critical unmet need to develop systems for precise and real-time assessment through novel enabling technologies. We have developed the mobility and circadian disruption quantification system (Mobi-DiQ) by augmenting biomarker and clinical data with pervasive sensing data to generate mobility and circadian cues related to mobility, nightly disruptions, and light and noise exposure. We hypothesize that Mobi-DiQ can provide accurate mobility and circadian cues that correlate with bedside clinical mobility assessments and circadian biomarkers, ultimately important for delirium risk assessment and prevention. The collected multimodal dataset consists of depth images, Electromyography (EMG) data, patient extremity movement captured by accelerometers, ambient light levels, Sound Pressure Level (SPL), and indoor air quality measured by volatile organic compounds, and the equivalent CO₂ concentration. For delirium risk assessment, the system recognizes mobility cues (axial body movement features and body key points) and circadian cues, including nightly disruptions, ambient SPL, and light intensity, as well as other environmental factors such as indoor air quality. The Mobi-DiQ system consists of three major components: the pervasive sensing system, a data storage and analysis server, and a data annotation system. For data collection, six local pervasive sensing systems were deployed, including a local computer and sensors. A video recording tool with graphical user interface (GUI) developed in python was used to capture depth image frames for analyzing patient mobility. All sensor data is encrypted, then automatically uploaded to the Mobi-DiQ server through a secured VPN connection. Several data pipelines are developed to automate the data transfer, curation, and data preparation for annotation and model training. The data curation and post-processing are performed on the server. A custom secure annotation tool with GUI was developed to annotate depth activity data. The annotation tool is linked to the MongoDB database to record the data annotation and to provide summarization. Docker containers are also utilized to manage services and pipelines running on the server in an isolated manner. The processed clinical data and annotations are used to train and develop real-time pervasive sensing systems to augment clinical decision-making and promote targeted interventions. In the future, we intend to evaluate our system as a clinical implementation trial, as well as to refine and validate it by using other data sources, including neurological data obtained through continuous electroencephalography (EEG).
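
As one concrete illustration of a circadian cue, the sketch below counts night-time light disruptions from an ambient light time series using pandas. It is a hedged example, not the Mobi-DiQ implementation: the column names ("timestamp", "lux"), the 100-lux threshold, and the 22:00-06:00 night window are assumptions.

```python
import pandas as pd

def nightly_light_disruptions(df, lux_threshold=100, night=("22:00", "06:00")):
    """Count dark-to-bright transitions during the night window, per calendar date."""
    df = df.set_index("timestamp").sort_index()
    night_data = df.between_time(*night)              # keep night-time samples only
    bright = night_data["lux"] > lux_threshold        # True while lights are on
    edges = bright & ~bright.shift(fill_value=False)  # rising edges = disruption onsets
    # Grouping by calendar date splits a night at midnight; kept simple for illustration
    return edges.groupby(edges.index.date).sum()

# Synthetic 24-hour recording sampled every minute, with one nightly disruption
idx = pd.date_range("2022-01-01 20:00", periods=24 * 60, freq="min")
lux = pd.Series(5, index=idx)
lux["2022-01-02 02:00":"2022-01-02 02:10"] = 300
print(nightly_light_disruptions(pd.DataFrame({"timestamp": idx, "lux": lux.values})))
```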

Keywords: deep learning, delirium, healthcare, pervasive sensing

Procedia PDF Downloads 88
23799 Residential Satisfaction and Public Perception of Socialized Housing Projects in Davao City, Philippines

Authors: Micah Amor P. Yares

Abstract:

Aside from the provision of adequate housing, the Philippine government faces the challenge of ensuring that the housing units provided conform to the Filipino's aspiration, as manifested by owning a small house on a big lot. The study aimed to explore the levels of satisfaction of end-users and the public perception towards socialized housing in Davao City, Philippines. The residential satisfaction survey covered three types of respondents: end-users of single-detached, duplex, and rowhouse socialized housing units. Respondents were asked to rate their level of satisfaction and perception with the following housing components: Dwelling Unit; Public Facilities; Social Environment; Neighborhood Facilities; Management Systems; and Acquisition and Financing. The data were subjected to Exploratory Factor Analysis to determine if variables could be grouped together, and to Confirmatory Factor Analysis to measure if the model fits the construct. In determining which components affect the levels of perception and satisfaction, a Multiple Linear Regression Analysis was employed. Lastly, an Independent Samples t-test was performed to compare the levels of satisfaction and perception among respondents. Results revealed that residents of socialized housing were highly satisfied with their living conditions despite concerns about management systems and public and neighborhood facilities. Residents' satisfaction is primarily influenced by the Social Environment, Acquisition and Financing, and the Dwelling Unit. However, a significant difference in residential satisfaction level was observed among the different types of housing, with rowhouse residents recording the lowest satisfaction level compared to single-detached and duplex units. Moreover, the general public perceived socialized housing as moderately satisfactory, with the same determinants as the end-users aside from Public Facilities. This study recommends revisiting the current socialized housing policies by considering the feedback from the end-users based on their lived experience and from the public according to their perception.
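
The regression step can be sketched as follows. This is a minimal illustration with hypothetical column names (not the study's data): overall satisfaction is regressed on the six housing components with statsmodels, and the fitted coefficients indicate which components drive satisfaction.

```python
import pandas as pd
import statsmodels.api as sm

def satisfaction_regression(df: pd.DataFrame):
    """Multiple linear regression of overall satisfaction on the housing components."""
    predictors = ["dwelling_unit", "public_facilities", "social_environment",
                  "neighborhood_facilities", "management_systems", "acquisition_financing"]
    X = sm.add_constant(df[predictors])      # add an intercept term
    model = sm.OLS(df["overall_satisfaction"], X).fit()
    return model.summary()                   # coefficient table shows each component's influence
```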

Keywords: public perception, residential satisfaction, rowhouse, socialized housing

Procedia PDF Downloads 224
23798 Delineation of the Geoelectric and Geovelocity Parameters in the Basement Complex of Northwestern Nigeria

Authors: M. D. Dogara, G. C. Afuwai, O. O. Esther, A. M. Dawai

Abstract:

The geology of Northern Nigeria is under intense investigation, particularly that of the northwest, which is believed to be part of the basement complex. The variability of the lithology is consistently inconsistent, hence the need for a close-range study. It is in view of the above that two geophysical techniques, vertical electrical sounding employing the Schlumberger array and the seismic refraction method, were used to delineate the geoelectric and geovelocity parameters of the basement complex of northwestern Nigeria. A total area of 400,000 m² was covered, with sixty geoelectric stations established and sixty sets of seismic refraction data collected using the forward and reverse method. The interpretation of the resistivity data suggests that the area is underlain by not more than five geoelectric layers of varying thicknesses and resistivities when a maximum half-electrode spread of 100 m was used. The interpreted seismic data revealed two geovelocity layers, with velocities ranging from 478 m/s to 1666 m/s for the first layer and from 1166 m/s to 7141 m/s for the second layer. The results of the two techniques suggest that the study area has an undulating bedrock topography, with geoelectric and geovelocity layers composed of weathered rock materials.
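
The velocity estimation behind the quoted figures can be sketched briefly. In a refraction survey, the apparent velocity of each layer is the reciprocal of the slope of its first-arrival travel-time branch; the example below fits the two branches with NumPy. It is an illustration under simplified assumptions (a known crossover point and straight branches), not the authors' processing.

```python
import numpy as np

def layer_velocities(offsets_m, traveltimes_s, crossover_index):
    """Estimate layer velocities (m/s) from two travel-time branches."""
    velocities = []
    for sl in (slice(0, crossover_index), slice(crossover_index, None)):
        slope, _ = np.polyfit(offsets_m[sl], traveltimes_s[sl], 1)  # seconds per metre
        velocities.append(1.0 / slope)
    return velocities

# Synthetic forward-shot data: a 500 m/s layer over a 2000 m/s refractor
offsets = np.array([5, 10, 15, 20, 25, 30, 35, 40, 45, 50], dtype=float)
times = np.concatenate([offsets[:5] / 500.0, 0.03 + offsets[5:] / 2000.0])
print(layer_velocities(offsets, times, 5))   # roughly [500.0, 2000.0]
```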

Keywords: basement complex, delineation, geoelectric, geovelocity, Nigeria

Procedia PDF Downloads 145
23797 The Thinking of Dynamic Formulation of Rock Aging Agent Driven by Data

Authors: Longlong Zhang, Xiaohua Zhu, Ping Zhao, Yu Wang

Abstract:

The construction of mines, railways, highways, water conservancy projects, etc., has formed a large number of high, steep slope wounds in China. Under the premise of slope stability and safety, repairing these wound spaces at minimum cost, in a green manner, and close to the natural state has become a new problem. Nowadays, in situ element testing and analysis, monitoring, field quantitative factor classification, and assignment evaluation produce vast amounts of data. Data processing and analysis inevitably differentiate the morphology, mineral composition, and physicochemical properties of rock wounds, making it possible to dynamically match the appropriate techniques and materials for restoration. In the present research, based on a grid partition of the slope surface, the contents of the combined oxides of the rock minerals (SiO₂, CaO, MgO, Al₂O₃, Fe₃O₄, etc.) were tested, and values were classified and assigned to the hardness and breakage of the rock texture. The data of the essential factors were interpolated and normalized in GIS, which formed a differential zoning map of the slope space. According to the physical and chemical properties and spatial morphology of the rocks in different zones, organic acids (plant waste fruit, fruit residue, etc.), natural mineral powders (zeolite, apatite, kaolin, etc.), a water-retaining agent, and plant gum (melon powder) were mixed in different proportions to form rock aging agents. Spraying aging agents with different formulas on the slopes in different sections can effectively age the fresh rock wound, providing convenience for seed implantation and reducing the transformation of heavy metals in the rocks. Through repeated engineering practice, a dynamic data platform for the rock aging agent formula system is formed, which provides materials for the restoration of different slopes. It will also provide a guideline for the mixed use of various natural materials to solve the complex, non-uniform ecological restoration problem.

Keywords: data-driven, dynamic state, high steep slope, rock aging agent, wounds

Procedia PDF Downloads 108
23796 Adult Language Learning in the Institute of Technology Sector in the Republic of Ireland

Authors: Una Carthy

Abstract:

A recent study of third level institutions in Ireland reveals that both age and aptitude barriers can be overcome by teaching methodologies that motivate second language learners. This PhD investigation gathered quantitative and qualitative data from 14 Institutes of Technology over a three-year period from 2011 to 2014. The fundamental research question was to establish the impact of institutional language policy on attitudes towards language learning. However, other related issues around second language acquisition arose in the course of the investigation. Data were collected from both lecturers and students, allowing interesting points of comparison to emerge from both datasets. Negative perceptions among lecturers regarding language provision were often associated with the view that language learning belongs to primary and secondary level and has no place in third level education. This perception was offset by substantial data showing positive attitudes towards adult language learning. Lenneberg's Critical Age Theory postulated that the optimum age for learning a second language is before puberty. More recently, scholars have challenged this theory in their studies, revealing that mature learners can and do succeed at learning languages. With regard to aptitude, a preoccupation among lecturers regarding poor literacy skills among students emerged and was often associated with resistance to second language acquisition. This was offset by a preponderance of qualitative data from students highlighting the crucial role which teaching approaches play in the learning process. Interestingly, the data collected regarding learning disabilities reveal that, given the appropriate learning environments, individuals can be motivated to acquire second languages, and indeed succeed at learning them. These findings are in keeping with other recent studies regarding attitudes towards second language learning among students with learning disabilities. Both sets of findings reinforce the case for language policies in the Institutes of Technology (IoTs). Supportive and positive learning environments can be created in third level institutions to motivate adult learners, thereby overcoming perceived obstacles relating to age and aptitude.

Keywords: age, aptitude, second language acquisition, teaching methodologies

Procedia PDF Downloads 121
23795 Cloud Monitoring and Performance Optimization Ensuring High Availability

Authors: Inayat Ur Rehman, Georgia Sakellari

Abstract:

Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.
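
A toy example of the auto-scaling strategy mentioned above is sketched below. The CPU thresholds and replica limits are illustrative assumptions, not any particular cloud provider's API or policy.

```python
def desired_replicas(current_replicas, avg_cpu_percent,
                     scale_out_at=75, scale_in_at=30,
                     min_replicas=2, max_replicas=20):
    """Threshold-based auto-scaling: grow under load, shrink when idle, stay bounded."""
    if avg_cpu_percent > scale_out_at:
        target = current_replicas + 1      # scale out under load
    elif avg_cpu_percent < scale_in_at:
        target = current_replicas - 1      # scale in when idle
    else:
        target = current_replicas          # stay within the comfort band
    return max(min_replicas, min(max_replicas, target))

print(desired_replicas(4, 82))   # -> 5
print(desired_replicas(4, 12))   # -> 3
```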

Keywords: cloud computing, cloud monitoring, performance optimization, high availability, scalability, resource allocation, load balancing, auto-scaling, data security, data privacy

Procedia PDF Downloads 53
23794 Effects of Aerobic Dance Circuit Training Programme on Blood Pressure Variables of Obese Female College Students in Oyo State, Nigeria

Authors: Isiaka Oladele Oladipo, Olusegun Adewale Ajayi

Abstract:

Sedentary lifestyles have been implicated in the poor blood pressure fitness of female college students. This study was designed to determine the effects of the Aerobic Dance Circuit Training Programme (ADCT) on blood pressure variables (Diastolic Blood Pressure (DBP) and Systolic Blood Pressure (SBP)). A pretest-posttest control group quasi-experimental design using a 2x2x4 factorial matrix was adopted, while one (1) research question and two (2) research hypotheses were formulated. Seventy (70) untrained obese student volunteers aged 21.10±2.46 years were purposively selected from Oyo town, Nigeria: Emmanuel Alayande College of Education (experimental group) and Federal College of Education (Special) (control group). The participants' BMI, weight (kg), height (m), systolic BP (mmHg), and diastolic BP (mmHg) were measured before and upon completion of ADCT. Data collected were analysed using a pie chart, graph, percentage, mean, frequency, and standard deviation, while a t-test was used to test the stated hypotheses at the critical level of 0.05. There were significant mean differences between baseline and post-treatment values of SBP: experimental group, 136.49 mmHg and 131.66 mmHg; control group, 130.82 mmHg and 130.56 mmHg (crit-t = 2.00, cal-t = 3.02, df = 69, p < .05), so the hypothesis was rejected. For DBP, the values were: experimental group, 88.65 mmHg and 82.21 mmHg; control group, 69.91 mmHg and 72.66 mmHg (crit-t = 2.00, cal-t = 1.437, df = 69, p > .05), so the hypothesis was accepted. The findings revealed that the decrease in participants' SBP from week 4 to week 12 of ADCT indicated an effective reduction in the blood pressure variables of obese female students. Therefore, the study confirmed that the use of ADCT is safe and effective in the management of blood pressure for the health benefit of obese individuals.

Keywords: aerobic dance circuit training, fitness lifestyles, obese college female students, systolic blood pressure, diastolic blood pressure

Procedia PDF Downloads 68
23793 The Use of Artificial Intelligence to Curb Corruption in Brazil

Authors: Camila Penido Gomes

Abstract:

Over the past decade, an emerging body of research has been pointing to artificial intelligence´s great potential to improve the use of open data, increase transparency and curb corruption in the public sector. Nonetheless, studies on this subject are scant and usually lack evidence to validate AI-based technologies´ effectiveness in addressing corruption, especially in developing countries. Aiming to fill this void in the literature, this paper sets out to examine how AI has been deployed by civil society to improve the use of open data and prevent congresspeople from misusing public resources in Brazil. Building on the current debates and carrying out a systematic literature review and extensive document analyses, this research reveals that AI should not be deployed as one silver bullet to fight corruption. Instead, this technology is more powerful when adopted by a multidisciplinary team as a civic tool in conjunction with other strategies. This study makes considerable contributions, bringing to the forefront discussion a more accurate understanding of the factors that play a decisive role in the successful implementation of AI-based technologies in anti-corruption efforts.

Keywords: artificial intelligence, civil society organization, corruption, open data, transparency

Procedia PDF Downloads 199
23792 Performance Study of Classification Algorithms for Consumer Online Shopping Attitudes and Behavior Using Data Mining

Authors: Rana Alaa El-Deen Ahmed, M. Elemam Shehab, Shereen Morsy, Nermeen Mekawie

Abstract:

With the growing popularity and acceptance of e-commerce platforms, users face an ever-increasing burden in actually choosing the right product from the large number of online offers. Thus, techniques for personalization and shopping guides are needed by users. For a pleasant and successful shopping experience, users need to know easily which products to buy with high confidence. Since selling a wide variety of products has become easier due to the popularity of online stores, online retailers are able to sell more products than a physical store. The disadvantage is that customers might not find the products they need. Recommender systems, which are used on some e-commerce websites, address this by helping the customer find the products he or she is searching for. A recommender system learns from information about customers and products and provides appropriate personalized recommendations to customers to find the needed product. In this paper, eleven classification algorithms are comparatively tested to find the classifier that best fits consumer online shopping attitudes and behavior in the experimental dataset. The WEKA knowledge analysis tool, an open-source data mining workbench used for comparing conventional classifiers to identify the best one, was used in this research. Using the data mining tool (WEKA) with the tested classifiers, the results show that Decision Table and Filtered Classifier give the highest accuracy, while Classification Via Clustering and Simple CART give the lowest.
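
The paper performed its comparison in the WEKA workbench; the sketch below is a rough scikit-learn analogue (not the authors' setup) of scoring several classifiers by cross-validated accuracy. The synthetic dataset merely stands in for the shopping attitudes and behavior data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
classifiers = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
    "k_nearest_neighbors": KNeighborsClassifier(),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)          # 10-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```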

Keywords: classification, data mining, machine learning, online shopping, WEKA

Procedia PDF Downloads 347
23791 Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach

Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak

Abstract:

Computer vision systems recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each data very important in learning. As a result, if not handled properly, label noise significantly degrades the performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classification algorithm's performance in the presence of noisy labels.

Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity

Procedia PDF Downloads 155
23790 The Potential of Potato and Maize Based Snacks as Fire Accelerants

Authors: E. Duffin, L. Brownlow

Abstract:

Arson is a crime which can provide exceptional problems to forensic specialists. Its destructive nature makes evidence much harder to find, especially when used to cover up another crime. There is a consistent potential threat of arsonists seeking new and easier ways to set fires. Existing research in this field primarily focuses on the use of accelerants such as petrol, with less attention to other more accessible and harder to detect materials. This includes the growing speculation of potato and maize-based snacks being used as fire accelerants. It was hypothesized that all ‘crisp-type’ snacks in foil packaging had the potential to act as accelerants and would burn readily in the various experiments. To test this hypothesis, a series of small lab-based experiments were undertaken, igniting samples of the snacks. Factors such as ingredients, shape, packaging and calorific value were all taken into consideration. The time (in seconds) spent on fire by the individual snacks was recorded. It was found that all of the snacks tested burnt for statistically similar amounts of time with a p-value of 0.0157. This was followed with a large mock real-life scenario using packets of crisps on fire and car seats to investigate as to the possibility of these snacks being verifiable tools to the arsonist. Here, three full packets of crisps were selected based on variations in burning during the lab experiments. They were each lit with a lighter to initiate burning, then placed onto a car seat to be timed and observed with video cameras. In all three cases, the fire was significant and sustained by the 200-second mark. On the basis of this data, it was concluded that potato and maize-based snacks were viable accelerants of fire. They remain an effective method of starting fires whilst being cheap, accessible, non-suspicious and non-detectable. The results produced supported the hypothesis that all ‘crisp-type’ snacks in foil packaging (that had been tested) had the potential to act as accelerants and would burn readily in the various experiments. This study serves to raise awareness and provide a basis for research and prevention of arson regarding maize and potato-based snacks as fire accelerants.

Keywords: arson, crisps, fires, food

Procedia PDF Downloads 120
23789 Applying Semi-Automatic Digital Aerial Survey Technology and Canopy Characters Classification for Surface Vegetation Interpretation of Archaeological Sites

Authors: Yung-Chung Chuang

Abstract:

The cultural layers of archaeological sites are mainly affected by surface land use, land cover, and root system of surface vegetation. For this reason, continuous monitoring of land use and land cover change is important for archaeological sites protection and management. However, in actual operation, on-site investigation and orthogonal photograph interpretation require a lot of time and manpower. For this reason, it is necessary to perform a good alternative for surface vegetation survey in an automated or semi-automated manner. In this study, we applied semi-automatic digital aerial survey technology and canopy characters classification with very high-resolution aerial photographs for surface vegetation interpretation of archaeological sites. The main idea is based on different landscape or forest type can easily be distinguished with canopy characters (e.g., specific texture distribution, shadow effects and gap characters) extracted by semi-automatic image classification. A novel methodology to classify the shape of canopy characters using landscape indices and multivariate statistics was also proposed. Non-hierarchical cluster analysis was used to assess the optimal number of canopy character clusters and canonical discriminant analysis was used to generate the discriminant functions for canopy character classification (seven categories). Therefore, people could easily predict the forest type and vegetation land cover by corresponding to the specific canopy character category. The results showed that the semi-automatic classification could effectively extract the canopy characters of forest and vegetation land cover. As for forest type and vegetation type prediction, the average prediction accuracy reached 80.3%~91.7% with different sizes of test frame. It represented this technology is useful for archaeological site survey, and can improve the classification efficiency and data update rate.
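
The two statistical steps (choosing the number of canopy-character clusters, then building a discriminant classifier for the chosen categories) can be sketched as below. The feature matrix is synthetic and stands in for landscape indices computed per canopy patch; this is an illustration, not the author's workflow.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(210, 6))    # 210 canopy patches x 6 landscape indices (placeholder)

# Step 1: assess candidate numbers of canopy-character clusters
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k, round(silhouette_score(X, labels), 3))

# Step 2: discriminant analysis for the chosen (here, seven) categories
labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(X)
lda = LinearDiscriminantAnalysis().fit(X, labels)
print("training accuracy:", lda.score(X, labels))
```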

Keywords: digital aerial survey, canopy characters classification, archaeological sites, multivariate statistics

Procedia PDF Downloads 138
23788 Relational Attention Shift on Images Using Bu-Td Architecture and Sequential Structure Revealing

Authors: Alona Faktor

Abstract:

In this work, we present an NN-based computational model that can perform attention shifts according to a high-level instruction. The instruction specifies the type of attentional shift using an explicit geometrical relation. The instruction can also be of a cognitive nature, specifying more complex human-human, human-object, or object-object interactions. Applying this approach sequentially allows obtaining a structural description of an image. A novel dataset of interacting humans and objects is constructed using a computer graphics engine. Using these data, we perform a systematic study of relational segmentation shifts.

Keywords: cognitive science, attention, deep learning, generalization

Procedia PDF Downloads 193
23787 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved. Due to its key design principle, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., content distribution environment, mobility, ubiquity, security, and trust. Users are more interested in contents rather than in their communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not well suited to these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient as it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented approach rather than a sender-oriented one. It introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, there is a lot of criticism of it, mainly concerning how ICN will manage the most relevant content. Here, Web Content Mining (WCM) approaches can help with the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas the agent-based approach from Web Content Mining is selected to find the most appropriate data.

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 470
23786 One-Class Classification Approach Using Fukunaga-Koontz Transform and Selective Multiple Kernel Learning

Authors: Abdullah Bal

Abstract:

This paper presents a one-class classification (OCC) technique based on Fukunaga-Koontz Transform (FKT) for binary classification problems. The FKT is originally a powerful tool to feature selection and ordering for two-class problems. To utilize the standard FKT for data domain description problem (i.e., one-class classification), in this paper, a set of non-class samples which exist outside of positive class (target class) describing boundary formed with limited training data has been constructed synthetically. The tunnel-like decision boundary around upper and lower border of target class samples has been designed using statistical properties of feature vectors belonging to the training data. To capture higher order of statistics of data and increase discrimination ability, the proposed method, termed one-class FKT (OC-FKT), has been extended to its nonlinear version via kernel machines and referred as OC-KFKT for short. Multiple kernel learning (MKL) is a favorable family of machine learning such that tries to find an optimal combination of a set of sub-kernels to achieve a better result. However, the discriminative ability of some of the base kernels may be low and the OC-KFKT designed by this type of kernels leads to unsatisfactory classification performance. To address this problem, the quality of sub-kernels should be evaluated, and the weak kernels must be discarded before the final decision making process. MKL/OC-FKT and selective MKL/OC-FKT frameworks have been designed stimulated by ensemble learning (EL) to weight and then select the sub-classifiers using the discriminability and diversities measured by eigenvalue ratios. The eigenvalue ratios have been assessed based on their regions on the FKT subspaces. The comparative experiments, performed on various low and high dimensional data, against state-of-the-art algorithms confirm the effectiveness of our techniques, especially in case of small sample size (SSS) conditions.

Keywords: ensemble methods, fukunaga-koontz transform, kernel-based methods, multiple kernel learning, one-class classification

Procedia PDF Downloads 12
23785 Smart Books as a Supporting Tool for Developing Skills of Designing and Employing Webquest 2.0

Authors: Huda Alyami

Abstract:

The present study aims to measure the effectiveness of an "Interactive eBook" in order to develop skills of designing and employing webquests for female intern teachers. The study uses descriptive analytical methodology as well as quasi-experimental methodology. The sample of the study consists of (30) female intern teachers from the Department of Special Education (in the tracks of Gifted Education and Learning Difficulties), during the first semester of the academic year 2015, at King Abdul-Aziz University in Jeddah city. The sample is divided into (15) female intern teachers for the experimental group, and (15) female intern teachers for the control group. A set of qualitative and quantitative tools have been prepared and verified for the study, embodied in: a list of the designing webquests' skills, a list of the employing webquests' skills, a webquests' knowledge achievement test, a product rating card, an observation card, and an interactive ebook. The study concludes the following results: 1. After pre-control, there are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and the control groups in the post measurement of the webquests' knowledge achievement test, in favor of the experimental group. 2. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of experimental and control groups in the post measurement of the product rating card in favor of the experimental group. 3. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of experimental and control groups in the post measurement of the observation card for the experimental group. In the light of the previous findings, the study recommends the following: taking advantage of interactive ebooks when teaching all educational courses for various disciplines at the university level, creating educational participative platforms to share educational interactive ebooks for various disciplines at the local and regional levels. The study suggests conducting further qualitative studies on the effectiveness of interactive ebooks, in addition to conducting studies on the use of (Web 2.0) in webquests.

Keywords: interactive eBook, webquest, design, employing, develop skills

Procedia PDF Downloads 179
23784 A Simple Algorithm for Real-Time 3D Capturing of an Interior Scene Using a Linear Voxel Octree and a Floating Origin Camera

Authors: Vangelis Drosos, Dimitrios Tsoukalos, Dimitrios Tsolis

Abstract:

We present a simple algorithm for capturing a 3D scene (focused on the usage of mobile device cameras in the context of augmented/mixed reality) by using a floating origin camera solution and storing the resulting information in a linear voxel octree. Data is derived from cloud points captured by a mobile device camera. For the purposes of this paper, we assume a scene of fixed size (known to us or determined beforehand) and a fixed voxel resolution. The resulting data is stored in a linear voxel octree using a hashtable. We commence by briefly discussing the logic behind floating origin approaches and the usage of linear voxel octrees for efficient storage. Following that, we present the algorithm for translating captured feature points into voxel data in the context of a fixed origin world and storing them. Finally, we discuss potential applications and areas of future development and improvement to the efficiency of our solution.
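
One common way to realize a "linear voxel octree stored in a hashtable" is to key occupied voxels by their Morton (Z-order) codes; the sketch below shows that idea under the paper's fixed-scene-size and fixed-resolution assumptions. It is illustrative only and not the authors' implementation.

```python
def morton_encode(x, y, z):
    """Interleave the bits of three voxel indices into a single octree key."""
    def spread(v):
        out = 0
        for i in range(21):                 # supports indices up to 2**21 - 1
            out |= ((v >> i) & 1) << (3 * i)
        return out
    return spread(x) | (spread(y) << 1) | (spread(z) << 2)

class LinearVoxelOctree:
    def __init__(self, scene_size=10.0, resolution=256):
        self.voxel_size = scene_size / resolution
        self.cells = {}                     # Morton key -> number of points in the voxel

    def insert_point(self, px, py, pz):
        ix, iy, iz = (int(px / self.voxel_size),
                      int(py / self.voxel_size),
                      int(pz / self.voxel_size))
        key = morton_encode(ix, iy, iz)
        self.cells[key] = self.cells.get(key, 0) + 1

tree = LinearVoxelOctree()
tree.insert_point(1.23, 4.56, 7.89)        # a feature point from the camera cloud
print(len(tree.cells))                     # -> 1
```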

Keywords: voxel, octree, computer vision, XR, floating origin

Procedia PDF Downloads 129
23783 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution

Authors: Masomeh Jamshid Nejad

Abstract:

Nowadays, statistical literacy is no longer a necessary skill but an essential skill with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called a bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in the field of economics and business management to visualize and work with data following a normal distribution. Since technology is interconnected with education these days, it is important to teach statistics topics in the context of Python, R-studio, and Microsoft Excel to undergraduate students. This research endeavours to shed light on the effect of Excel-based instruction on learners’ knowledge of statistics, specifically the central concept of normal distribution. As such, two groups of undergraduate students (from the Business Management program) were compared in this research study. One group underwent Excel-based instruction and another group relied only on traditional teaching methods. We analyzed experiential data and BBA participants’ responses to statistic-related questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports learners in comprehending statistical concepts more effectively compared with the other group of learners (teaching with the traditional method). In addition, students in the context of Excel-based instruction showed ability in picturing and interpreting data concentrated on normal distribution.

Keywords: statistics, excel-based instruction, data visualization, pedagogy

Procedia PDF Downloads 48
23782 The Resistance of Fish Outside of Water Medium

Authors: Febri Ramadhan

Abstract:

A water medium is a vital necessity for the survival of fish, yet fish can survive outside of the water medium for a certain time. By knowing the survival level of fish outside of the water medium, one can transport fish more efficiently. Transport of live fish from one place to another can be done with wet or dry media systems. In this experiment, the treatment given was the observed difference between fish species. The experiment aimed to test the degree of resilience of fish out of the water medium. Based on the ANOVA table obtained, it can be concluded that the type of fish affects the level of resilience of fish outside the water (Fhit > Ftab).

Keywords: fish, transport, retention rate, fish resiliance

Procedia PDF Downloads 332
23781 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach

Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik

Abstract:

Background: Human balance control is often studied based on the statokinesigram. In this study, the approach to human postural reaction analysis is based on combining the stabilometry output signal with retroreflective marker data signal processing, analysis, and understanding. The study also shows another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: In this study, the participants maintained quiet bipedal standing for 10 s on a stabilometry platform. Subsequently, bilateral vibration stimuli were applied to the Achilles tendons for a 20 s interval. The vibration stimuli caused the human postural system to take on a new pseudo-steady state. Vibration frequencies were 20, 60 and 80 Hz. Participants' body segments - head, shoulders, hips, knees, ankles and little fingers - were marked with 12 retroreflective markers. Marker positions were scanned by the six-camera BTS SMART DX system. Registration of their postural reaction lasted 60 s. Sampling frequency was 100 Hz. The measured data were processed using the Method of Developed Statokinesigram Trajectory. Regression analysis of developed statokinesigram trajectory (DST) data and retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals. Scaling coefficients (λ) between DST and DMT were also evaluated by linear regression analysis. Results: Scaling coefficients for marker trajectories were identified for all body segments. Head marker trajectories reached the maximal value and ankle marker trajectories had the minimal value of the scaling coefficient. Hip, knee and ankle markers were approximately symmetrical in terms of the scaling coefficient. Notable differences in the scaling coefficient were detected in head and shoulder marker trajectories, which were not symmetrical. The model of postural system behavior was identified by MDST. Conclusion: The value of the scaling factor identifies which body segment is predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall human postural system response to the vibration stimuli, then the marker data represent particular postural responses. It can be assumed that the cumulative sum of the particular marker postural responses is equal to the statokinesigram.
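
The scaling coefficient estimation can be sketched in a few lines: λ is the slope of a linear regression of the marker developed trajectory (DMT) on the developed statokinesigram trajectory (DST). The example below uses synthetic trajectories and is not the authors' MDST code.

```python
import numpy as np

def scaling_coefficient(dst, dmt):
    """Least-squares fit DMT ≈ λ·DST + b; returns (λ, b)."""
    slope, intercept = np.polyfit(dst, dmt, 1)
    return slope, intercept

dst = np.linspace(0.0, 50.0, 1000)                         # synthetic developed trajectory
dmt = 1.8 * dst + np.random.default_rng(1).normal(0, 0.5, dst.size)
lam, b = scaling_coefficient(dst, dmt)
print(round(lam, 2))   # ≈ 1.8; head markers would show the largest λ, ankles the smallest
```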

Keywords: center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data

Procedia PDF Downloads 345
23780 Emergency Multidisciplinary Continuing Care Case Management

Authors: Mekroud Amel

Abstract:

Emergency departments are known for their workload, the variety of pathologies treated, and the difficulty of managing them under a continuous influx of patients. The role of our service in the management of patients with two or three mild to moderate organ failures, involving several disciplines at the same time, as well as the effect of this management on the skills and efficiency of our team, has been demonstrated. Borderline cases between two, three, or even more disciplines, with instability of a vital function, which have been successfully managed in the emergency room, the therapeutic procedures adopted, the consequences on the quality and level of care delivered by our team, as well as the logistical and pedagogical consequences, are demonstrated. The consequences for the emergency teams are positive; in rare situations they are negative. Regarding clinical situations, it is the entanglement of hemodynamic distress - with right, left, or global participation, tamponade, low flow with acute pulmonary edema, and/or a state of shock - with respiratory distress with more or less profound hypoxemia, with a haematosis disorder related to a bacterial or viral lung infection, pleurisy, pneumothorax, or a bronchoconstrictive crisis; with neurological disorders such as a recent stroke, a comatose state, or others; with metabolic disorders such as hyperkalaemia, renal insufficiency, severe ionic disorders, or accidents with anti-vitamin K; with or without a septate effusion of one or more serous membranes, with or without tamponade. This is a retrospective, monocentric, descriptive study covering the period from 05.01.2022 to 10.31.2022. The purpose of our work is to search for a statistically significant link between the type of moderate to severe pathology managed in the emergency room, whose problems are multivisceral, and the efficiency of the healthcare team and the level of care and optional care offered to patients. Statistical test used: the Chi2 test, to prove the significant link between the resolution of serious multidisciplinary cases in the emergency room and the effectiveness of the team in the management of complicated cases. The management of the most difficult clinical cases for organ specialties has given general practitioner emergency teams a great perspective and has improved their efficiency in the face of emergencies received.

Keywords: emergency care teams, management of patients with dysfunction of more than one organ, learning curve, quality of care

Procedia PDF Downloads 78
23779 Behavior Evaluation of an Anchored Wall

Authors: Polo G. Yohn Edison, Rocha F. Pedricto

Abstract:

This work presents a study about a retaining structure designed for the duplication of the rail FEPASA on the 74th km between Santos and São Paulo. This structure, an anchored retaining wall, was instrumented in the anchors heads with strain gauges in order to monitor its loads. The load measurements occurred during the performance test, locking and also after the works were concluded. A decrease on anchors loads is noticed at the moment immediately after the locking, during construction and after the works finished. It was observed that a loss of load in the anchors occurred to a maximum of 54%.

Keywords: instrumentation, strain gauges, retaining wall, anchors

Procedia PDF Downloads 489
23778 Text Emotion Recognition by Multi-Head Attention based Bidirectional LSTM Utilizing Multi-Level Classification

Authors: Vishwanath Pethri Kamath, Jayantha Gowda Sarapanahalli, Vishal Mishra, Siddhesh Balwant Bandgar

Abstract:

Recognition of emotional information is essential in any form of communication. Growing HCI (Human-Computer Interaction) in recent times indicates the importance of understanding of emotions expressed and becomes crucial for improving the system or the interaction itself. In this research work, textual data for emotion recognition is used. The text being the least expressive amongst the multimodal resources poses various challenges such as contextual information and also sequential nature of the language construction. In this research work, the proposal is made for a neural architecture to resolve not less than 8 emotions from textual data sources derived from multiple datasets using google pre-trained word2vec word embeddings and a Multi-head attention-based bidirectional LSTM model with a one-vs-all Multi-Level Classification. The emotions targeted in this research are Anger, Disgust, Fear, Guilt, Joy, Sadness, Shame, and Surprise. Textual data from multiple datasets were used for this research work such as ISEAR, Go Emotions, Affect datasets for creating the emotions’ dataset. Data samples overlap or conflicts were considered with careful preprocessing. Our results show a significant improvement with the modeling architecture and as good as 10 points improvement in recognizing some emotions.
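
A hedged Keras sketch of the described architecture (embedding → bidirectional LSTM → multi-head attention → 8-emotion output) is given below. The hyperparameters are illustrative, and a trainable embedding layer stands in for the pre-trained word2vec vectors used in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_model(vocab_size=20000, max_len=60, embed_dim=300, num_emotions=8):
    inputs = layers.Input(shape=(max_len,), dtype="int32")
    x = layers.Embedding(vocab_size, embed_dim)(inputs)    # word2vec weights would be loaded here
    x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)
    attn = layers.MultiHeadAttention(num_heads=4, key_dim=64)(x, x)
    x = layers.GlobalAveragePooling1D()(attn)
    # Sigmoid outputs give one-vs-all scores for the eight target emotions
    outputs = layers.Dense(num_emotions, activation="sigmoid")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

build_model().summary()
```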

Keywords: text emotion recognition, bidirectional LSTM, multi-head attention, multi-level classification, google word2vec word embeddings

Procedia PDF Downloads 172
23777 In Vitro Propagation of Vanilla Planifolia Using Nodal Explants and Varied Concentrations of Naphthaleneacetic acid (NAA) and 6-Benzylaminopurine (BAP).

Authors: Jessica Arthur, Duke Amegah, Kingsley Akenten Wiafe

Abstract:

Background: Vanilla planifolia is the only edible fruit of the orchid family (Orchidaceae) among the over 35,000 Orchidaceae species found worldwide. In Ghana, Vanilla was discovered in the wild, but it is underutilized for commercial production, most likely due to a lack of knowledge on the best NAA and BAP combinations for in vitro propagation to promote successfully regenerated plant acclimatization. The growing interest and global demand for elite Vanilla planifolia plants and natural vanilla flavour emphasize the need for an effective industrial-scale micropropagation protocol. Tissue culture systems are increasingly used to grow disease-free plants and reliable in vitro methods can also produce plantlets with typically modest proliferation rates. This study sought to develop an efficient protocol for in vitro propagation of vanilla using nodal explants by testing different concentrations of NAA and BAP, for the proliferation of the entire plant. Methods: Nodal explants with dormant axillary buds were obtained from year-old laboratory-grown Vanilla planifolia plants. MS media was prepared with a nutrient stock solution (containing macronutrients, micronutrients, iron solution and vitamins) and semi-solidified using phytagel. It was supplemented with different concentrations of NAA and BAP to induce multiple shoots and roots (0.5mg/L BAP with NAA at 0, 0.5, 1, 1.5, 2.0mg/L and vice-versa). The explants were sterilized, cultured in labelled test tubes and incubated at 26°C ± 2°C with 16/8 hours light/dark cycle. Data on shoot and root growth, leaf number, node number, and survival percentage were collected over three consecutive two-week periods. The data were square root transformed and subjected to ANOVA and LSD at a 5% significance level using the R statistical package. Results: Shoots emerged at 8 days and roots at 12 days after inoculation with 94% survival rate. It was discovered that for the NAA treatments, MS media supplemented with 2.00 mg/l NAA resulted in the highest shoot length (10.45cm), maximum root number (1.51), maximum shoot number (1.47) and the highest number of leaves (1.29). MS medium containing 1.00 mg/l NAA produced the highest number of nodes (1.62) and root length (14.27cm). Also, a similar growth pattern for the BAP treatments was observed. MS medium supplemented with 1.50 mg/l BAP resulted in the highest shoot length (14.98 cm), the highest number of nodes (4.60), the highest number of leaves (1.75) and the maximum shoot number (1.57). MS medium containing 0.50 mg/l BAP and 1.0 mg/l BAP generated a maximum root number (1.44) and the highest root length (13.25cm), respectively. However, the best concentration combination for maximizing shoot and root was media containing 1.5mg/l BAP combined with 0.5mg/l NAA, and 1.0mg/l NAA combined with 0.5mg/l of BAP respectively. These concentrations were optimum for in vitro growth and production of Vanilla planifolia. Significance: This study presents a standardized protocol for labs to produce clean vanilla plantlets, enhancing cultivation in Ghana and beyond. It provides insights into Vanilla planifolia's growth patterns and hormone responses, aiding future research and cultivation.

Keywords: Vanilla planifolia, In vitro propagation, plant hormones, MS media

Procedia PDF Downloads 60
23776 The Mass Attenuation Coefficients, Effective Atomic Cross Sections, Effective Atomic Numbers and Electron Densities of Some Halides

Authors: Shivalinge Gowda

Abstract:

The total mass attenuation coefficients m/r, of some halides such as, NaCl, KCl, CuCl, NaBr, KBr, RbCl, AgCl, NaI, KI, AgBr, CsI, HgCl2, CdI2 and HgI2 were determined at photon energies 279.2, 320.07, 514.0, 661.6, 1115.5, 1173.2 and 1332.5 keV in a well-collimated narrow beam good geometry set-up using a high resolution, hyper pure germanium detector. The mass attenuation coefficients and the effective atomic cross sections are found to be in good agreement with the XCOM values. From these mass attenuation coefficients, the effective atomic cross sections sa, of the compounds were determined. These effective atomic cross section sa data so obtained are then used to compute the effective atomic numbers Zeff. For this, the interpolation of total attenuation cross-sections of photons of energy E in elements of atomic number Z was performed by using the logarithmic regression analysis of the data measured by the authors and reported earlier for the above said energies along with XCOM data for standard energies. The best-fit coefficients in the photon energy range of 250 to 350 keV, 350 to 500 keV, 500 to 700 keV, 700 to 1000 keV and 1000 to 1500 keV by a piecewise interpolation method were then used to find the Zeff of the compounds with respect to the effective atomic cross section sa from the relation obtained by piece wise interpolation method. Using these Zeff values, the electron densities Nel of halides were also determined. The present Zeff and Nel values of halides are found to be in good agreement with the values calculated from XCOM data and other available published values.

Keywords: mass attenuation coefficient, atomic cross-section, effective atomic number, electron density

Procedia PDF Downloads 376
23775 Real-World Comparison of Adherence to and Persistence with Dulaglutide and Liraglutide in UAE e-Claims Database

Authors: Ibrahim Turfanda, Soniya Rai, Karan Vadher

Abstract:

Objectives— The study aims to compare real-world adherence to and persistence with dulaglutide and liraglutide in patients with type 2 diabetes (T2D) initiating treatment in UAE. Methods— This was a retrospective, non-interventional study (observation period: 01 March 2017–31 August 2019) using the UAE Dubai e-Claims database. Included: adult patients initiating dulaglutide/liraglutide 01 September 2017–31 August 2018 (index period) with: ≥1 claim for T2D in the 6 months before index date (ID); ≥1 claim for dulaglutide/liraglutide during index period; and continuous medical enrolment for ≥6 months before and ≥12 months after ID. Key endpoints, assessed 3/6/12 months after ID: adherence to treatment (proportion of days covered [PDC; PDC ≥80% considered ‘adherent’], per-group mean±standard deviation [SD] PDC); and persistence (number of continuous therapy days from ID until discontinuation [i.e., >45 days gap] or end of observation period). Patients initiating dulaglutide/liraglutide were propensity score matched (1:1) based on baseline characteristics. Between-group comparison of adherence was analysed using the McNemar test (α=0.025). Persistence was analysed using Kaplan–Meier estimates with log-rank tests (α=0.025) for between-group comparisons. This study presents 12-month outcomes. Results— Following propensity score matching, 263 patients were included in each group. Mean±SD PDC for all patients at 12 months was significantly higher in the dulaglutide versus the liraglutide group (dulaglutide=0.48±0.30, liraglutide=0.39±0.28, p=0.0002). The proportion of adherent patients favored dulaglutide (dulaglutide=20.2%, liraglutide=12.9%, p=0.0302), as did the probability of being adherent to treatment (odds ratio [97.5% CI]: 1.70 [0.99, 2.91]; p=0.03). Proportion of persistent patients also favoured dulaglutide (dulaglutide=15.2%, liraglutide=9.1%, p=0.0528), as did the probability of discontinuing treatment 12 months after ID (p=0.027). Conclusions— Based on the UAE Dubai e-Claims database data, dulaglutide initiators exhibited significantly greater adherence in terms of mean PDC versus liraglutide initiators. The proportion of adherent patients and the probability of being adherent favored the dulaglutide group, as did treatment persistence.
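
The two endpoint definitions can be illustrated with a short sketch: proportion of days covered (PDC) over a fixed follow-up window, and persistence with discontinuation defined as a gap of more than 45 days. The dates and data layout are hypothetical; this is not the study's claims-processing code.

```python
import pandas as pd

def pdc(fill_dates, days_supply, window_days=365):
    """Fraction of the follow-up window covered by dispensed supply."""
    start = min(fill_dates)
    covered = set()
    for d, supply in zip(fill_dates, days_supply):
        for offset in range(supply):
            day = (d - start).days + offset
            if 0 <= day < window_days:
                covered.add(day)
    return len(covered) / window_days

def persistence_days(fill_dates, days_supply, max_gap=45):
    """Days on therapy from index until the first gap longer than max_gap days."""
    fills = sorted(zip(fill_dates, days_supply))
    run_end = fills[0][0] + pd.Timedelta(days=fills[0][1])
    for d, supply in fills[1:]:
        if (d - run_end).days > max_gap:
            break                                    # treatment discontinued
        run_end = max(run_end, d + pd.Timedelta(days=supply))
    return (run_end - fills[0][0]).days

dates = list(pd.to_datetime(["2018-01-01", "2018-02-01", "2018-05-01"]))
print(round(pdc(dates, [30, 30, 30]), 2))            # -> 0.25 (non-adherent, PDC < 80%)
print(persistence_days(dates, [30, 30, 30]))         # -> 61
```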

Keywords: adherence, dulaglutide, effectiveness, liraglutide, persistence

Procedia PDF Downloads 120
23774 The Application of Whole-Cell Luminescent Biosensors for Assessing Bactericidal Properties of Medicinal Plants

Authors: Yuliya Y. Gavrichenko

Abstract:

Background and Aims: The increasing resistance of bacteria to almost all available antibiotics has encouraged scientists to search for alternative sources of antibacterial agents. Many plant secondary metabolites are now known to have diverse biological activity, and these compounds can potentially be active against human bacterial and viral infections. Extensive research has been carried out to explore the luminescent bacterial test as a rapid, accurate and inexpensive method for assessing antibacterial properties and predicting the biological activity spectra of plant-derived substances. Method: Botanical material of fifteen species was collected from natural and cultivated habitats on the Crimean peninsula. Aqueous extracts of the following plants were tested: Robinia pseudoacacia L., Sideritis comosa, Cotinus coggygria Scop., Thymus serpyllum L., Juglans regia L., Securigera varia L., Achillea millefolium L., Phlomis taurica, Corylus avellana L., Sambucus nigra L., Helichrysum arenarium L., Glycyrrhiza glabra L., Elytrigia repens L., Echium vulgare L., Conium maculatum L. The test was carried out using a luminous strain of the marine bacterium Photobacterium leiognathi isolated from the Sea of Azov, as well as four recombinant Escherichia coli MG1655 strains harbouring Vibrio fischeri luxCDABE genes. Results: The bactericidal capacity of the plant extracts differed markedly. Cotinus coggygria, Phlomis taurica and Juglans regia L. proved to be the most toxic to P. leiognathi (EC50 = 0.33 g dried plant/l). Glycyrrhiza glabra L., Robinia pseudoacacia L., Sideritis comosa and Helichrysum arenarium L. had moderate inhibitory effects (EC50 = 3.3 g dried plant/l). The remaining aqueous extracts decreased luminescence by no more than 50%, even at the highest concentration tested (16.5 g dried plant/l). The antibacterial activity of the herbal extracts against the constitutively luminescent E. coli MG1655 (pXen7-lux) strain was at approximately the same level as against P. leiognathi. Cotinus coggygria and Conium maculatum L. extracts increased light emission in the E. coli MG1655 (pFabA-lux) reporter strain, which is associated with cell membrane damage. Sideritis comosa, Phlomis taurica and Juglans regia induced an SOS response in the E. coli (pColD-lux) strain. Glycyrrhiza glabra L. induced a protein damage response in the E. coli MG1655 (pIbpA-lux) strain. Conclusion: The results show that the plant extracts had nonspecific antimicrobial effects against both the E. coli (pXen7-lux) and P. leiognathi biosensors, and mutagenic, cytotoxic and protein damage effects were observed. In general, the bioluminescence inhibition test results correlated with the traditional uses of the screened plants, leading to the conclusion that whole-cell luminescent biosensors can serve as an indicator of the overall antibacterial capacity of plants. The findings also demonstrate the potential of the bioluminescent method in medicine and pharmacy as an approach to investigating the antibacterial properties of medicinal plants.
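The EC50 figures quoted above can be related to the raw assay readings with a short sketch that converts luminescence into fractional inhibition and interpolates, on a log-concentration scale, the concentration giving 50% inhibition. The dilution series, readings and helper names below are made up for illustration and represent only one plausible way of evaluating such a bioluminescence inhibition test.

```python
import numpy as np

def luminescence_inhibition(control_rlu, sample_rlu):
    """Fractional loss of light emission relative to the untreated control."""
    return 1.0 - sample_rlu / control_rlu

def ec50(concentrations, inhibitions):
    """Log-linear interpolation of the concentration producing 50% inhibition.

    Assumes concentrations are ascending and inhibition increases monotonically with dose.
    """
    log_c = np.log10(concentrations)
    return 10 ** np.interp(0.5, inhibitions, log_c)

# Illustrative (made-up) dilution series for one extract, in g dried plant / l
conc = np.array([0.33, 1.0, 3.3, 10.0, 16.5])
control = 100_000
treated = np.array([92_000, 78_000, 55_000, 28_000, 12_000])
inh = luminescence_inhibition(control, treated)
print(f"EC50 ≈ {ec50(conc, inh):.2f} g dried plant/l")
```

The same calculation, repeated for each extract and each reporter strain, allows the extracts to be ranked by their overall toxicity to the biosensor.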

Keywords: antibacterial property, bioluminescence, medicinal plants, whole-cell biosensors

Procedia PDF Downloads 118
23773 Developing a Culturally Acceptable End of Life Survey (the VOICES-ESRD/Thai Questionnaire) for Evaluating Health Services Provision for Older Persons with End-Stage Renal Disease (ESRD) in Thailand

Authors: W. Pungchompoo, A. Richardson, L. Brindle

Abstract:

Background: A culturally acceptable end-of-life survey (the VOICES-ESRD/Thai questionnaire) is an essential instrument for evaluating health services provision for older persons with ESRD in Thailand. The focus of the questionnaire was on symptoms, symptom control and the health care needs of older people with ESRD who are managed without dialysis. Objective: The objective of this study was to develop and adapt VOICES to make it suitable for use in a population survey in Thailand. Methods: A mixed-methods exploratory sequential design was used, focused on modifying an existing instrument. Data collection: A cognitive interviewing technique was implemented, using two cycles of data collection with a sample of 10 bereaved carers and a prototype of the Thai VOICES questionnaire. The qualitative findings were used to modify the prototype and develop a culturally acceptable end-of-life survey (the VOICES-ESRD/Thai questionnaire). Data analysis: The data were analysed using content analysis. Results: Revisions were made to the prototype questionnaire, and the results were used to adapt the VOICES questionnaire for use in a population-based survey with older ESRD patients in Thailand. Conclusions: A culturally specific questionnaire was generated during this second phase, and issues with questionnaire design were rectified.

Keywords: VOICES-ESRD/Thai questionnaire, cognitive interviewing, end of life survey, health services provision, older persons with ESRD

Procedia PDF Downloads 279