Search results for: web based instruction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28579

26449 Polymer Spiral Film Gas-Liquid Heat Exchanger for Waste Heat Recovery in Exhaust Gases

Authors: S. R. Parthiban, C. Elajchet Senni

Abstract:

Spiral heat exchangers are known as excellent heat exchangers because of their compactness and high heat transfer efficiency. An innovative spiral heat exchanger based on polymer materials is designed for the waste heat recovery process. The polymer film design provides better corrosion and chemical resistance than conventional metal heat exchangers, and the smooth surface of the polymer film reduces fouling. A new flow arrangement is employed in the design: the hot flue gas flows along an axial path, while the cold fluid flows along a spiral path. The heat load recovered with the presented heat exchanger is in the range of 1.5 kW thermal, but a potential heat recovery of about 3.5 kW might be achievable. To measure the performance of the spiral tube heat exchanger, a model was suitably designed and fabricated for experimental tests. The paper gives an analysis of the spiral tube heat exchanger.
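As a rough sanity check on duty figures like these, the recovered heat load can be estimated with Q = ṁ·cp·ΔT; the flue-gas flow rate, specific heat, and temperature drop in the sketch below are assumed values, not measurements from the paper.

```python
# Back-of-the-envelope heat-duty check, Q = m_dot * cp * dT.
# The flow rate, specific heat (air-like flue gas), and temperature
# drop are assumed values, not data from the study.
def heat_load_kw(m_dot_kg_s, cp_kj_per_kg_k, dt_k):
    # mass flow [kg/s] * specific heat [kJ/(kg*K)] * temperature drop [K] -> kW
    return m_dot_kg_s * cp_kj_per_kg_k * dt_k

q = heat_load_kw(0.05, 1.005, 30.0)   # roughly the 1.5 kW recovery reported
```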

Keywords: spiral heat exchanger, polymer based materials, fouling factor, heat load

Procedia PDF Downloads 368
26448 Chemical Sensing Properties of Self-Assembled Film Based on an Amphiphilic Ambipolar Triple-Decker (Phthalocyaninato) (Porphyrinato) Europium Semiconductor

Authors: Kiran Abdullah, Yanli Chen

Abstract:

An amphiphilic mixed (phthalocyaninato) (porphyrinato) europium triple-decker complex, Eu₂(Pc)₂(TPyP), has been synthesized and characterized. Electron-withdrawing pyridyl substituents were introduced at the meso-positions of the porphyrin ring in the triple-decker to ensure sufficient hydrophilicity and suitable HOMO and LUMO energy levels, thus successfully realizing an amphiphilic ambipolar organic semiconductor. Importantly, highly sensitive, reproducible p-type and n-type responses towards NH₃ and NO₂, respectively, based on a self-assembled film of Eu₂(Pc)₂(TPyP) fabricated by a simple solution-based Quasi-Langmuir-Shäfer (QLS) method, have been revealed for the first time. The good conductivity and crystallinity of the QLS film of Eu₂(Pc)₂(TPyP) give it excellent sensing properties. The complex is sensitive to electron-donating NH₃ gas in the 5–30 ppm range and to electron-accepting NO₂ gas in the 400–900 ppb range. Owing to the uniform nanoparticles, there is effective intermolecular interaction between the triple-decker molecules. This is the best result reported for phthalocyanine-based chemical sensors at room temperature. Furthermore, the responses of the QLS film are linearly correlated with both NH₃ and NO₂ concentrations, with excellent sensitivities of 0.04% ppm⁻¹ and 31.9% ppm⁻¹, respectively, indicating the great potential of semiconducting tetrapyrrole rare earth triple-decker compounds in the field of chemical sensors.

Keywords: ambipolar semiconductor, gas sensing, mixed (phthalocyaninato) (porphyrinato) rare earth complex, Self-assemblies

Procedia PDF Downloads 198
26447 Developing Serious Games to Improve Learning Experience of Programming: A Case Study

Authors: Shan Jiang, Xinyu Tang

Abstract:

Game-based learning is an emerging pedagogy that makes the learning experience more effective, enjoyable, and fun. However, most games used in classroom settings have been overly simplistic. This paper presents a case study of a Python-based online game designed to improve effectiveness in both teaching and research in higher education. The proposed game system not only creates a fun and enjoyable experience for students learning various programming topics but also improves the effectiveness of teaching in several respects, including material presentation, helping students recognize the importance of the subjects, and linking theoretical concepts to practice. The proposed game system also serves as an information cyberinfrastructure that automatically collects and stores data from players. These data could be useful in research areas including human-computer interaction, decision making, opinion mining, and artificial intelligence, and the customizable nature of the game provides further possibilities beyond these areas.

Keywords: game-based learning, programming, research-teaching integration, Hearthstone

Procedia PDF Downloads 165
26446 Chatbots in Education: Case of Development Using a Chatbot Development Platform

Authors: Dulani Jayasuriya

Abstract:

This study outlines the steps in developing a chatbot for the administrative needs of a large undergraduate course. The chatbot can handle student queries about administrative details, including assessment deadlines, course documentation, how to navigate the course, and group formation. The development screenshots are from a free account on the SnatchBot platform, so the approach can be adopted by the wider public. While only one connection from possible keywords to an answer is shown here, multiple connections leading to different answers for different keywords must be developed for the actual chatbot to function. The overall flow of the chatbot, showing the connections between different interactions, is depicted at the end.
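The keyword-to-answer routing described above can be sketched in a few lines; the keywords and canned answers below are invented examples, not the course's actual content or the platform's API.

```python
# Toy keyword-to-answer routing in the spirit of the described chatbot flow.
# The keyword sets and answers are invented placeholders.
FAQ = {
    ("deadline", "due"): "Assignment 1 is due at the end of week 6.",
    ("group", "team"): "Groups of four are formed via the course page.",
}

def reply(message):
    text = message.lower()
    for keywords, answer in FAQ.items():
        if any(k in text for k in keywords):   # first matching keyword set wins
            return answer
    return "Sorry, I did not understand. Try rephrasing your question."
```

In a platform like SnatchBot, each keyword set corresponds to one of the "connections" the abstract mentions, each leading to its own answer node.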

Keywords: chatbots, education, technology, SnatchBot, artificial intelligence

Procedia PDF Downloads 104
26445 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm, applied to high-frequency coefficients for compression/encoding. The algorithm starts by converting every three coefficients into a single value, accomplished using three different keys. The decoding/decompression uses a search method called the Quick Sequential Search (QSS) decoding algorithm, presented in this research, which is based on sequential search to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients, so that another algorithm, such as conventional sequential search, could retrieve the encoded/compressed data independently of the proposed algorithm. The experimental results showed that the proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.
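The "three coefficients to one value" encoding, together with the conventional sequential-search decoder that QSS is benchmarked against, might look like the following sketch; the key values and coefficient bound are illustrative assumptions, not the paper's actual parameters.

```python
# Hypothetical sketch of the three-coefficients-to-one-value encoding and a
# conventional sequential-search decoder (the baseline QSS improves on).
KEYS = (1, 61, 3721)   # powers of a base wider than the coefficient range
BOUND = 30             # assume high-frequency coefficients lie in [-30, 30]

def encode(a, b, c):
    # fold three coefficients into a single value using the three keys
    return KEYS[0] * a + KEYS[1] * b + KEYS[2] * c

def decode(value):
    # brute-force sequential search over all candidate triples
    for a in range(-BOUND, BOUND + 1):
        for b in range(-BOUND, BOUND + 1):
            for c in range(-BOUND, BOUND + 1):
                if encode(a, b, c) == value:
                    return (a, b, c)
    return None
```

Because the keys are powers of a base (61) larger than the coefficient range, each triple maps to a unique value, so the search always recovers the original coefficients.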

Keywords: matrix minimization algorithm, decoding sequential search algorithm, image compression, DCT, DWT

Procedia PDF Downloads 150
26444 Influence of the Refractory Period on Neural Networks Based on the Recognition of Neural Signatures

Authors: José Luis Carrillo-Medina, Roberto Latorre

Abstract:

Experimental evidence has revealed that different living neural systems can sign their output signals with a specific neural signature. Although experimental and modeling results suggest that neural signatures can play an important role in the activity of neural networks, identifying the source of information or contextualizing a message, the functional meaning of these neural fingerprints is still unclear. The existence of cellular mechanisms to identify the origin of individual neural signals could be a powerful information processing strategy for the nervous system. We have recently built different models to study the ability of a neural network to process information based on the emission and recognition of specific neural fingerprints. In this paper we further analyze the features that can influence the information processing ability of this kind of network. In particular, we focus on the role that the duration of the refractory period of each neuron, after it emits a signed message, can play in the network's collective dynamics.

Keywords: neural signature, neural fingerprint, processing based on signal identification, self-organizing neural network

Procedia PDF Downloads 492
26443 Conceptual Modeling of the Relationship between Project Management Practices and Knowledge Absorptive Capacity Using Interpretive Structural Modeling Method

Authors: Seyed Abdolreza Mosavi, Alireza Babakhan, Elham Sadat Hoseinifard

Abstract:

Knowledge-based firms need to design mechanisms for the continuous absorption and creation of knowledge in order to ensure their survival in the competitive arena and to follow the path of development. Considering the project-oriented nature of product development activities in knowledge-based firms on the one hand, and the importance of analyzing the factors affecting knowledge absorptive capacity in these firms on the other, the purpose of this study is to identify and classify the effects of project management practices on knowledge absorptive capacity. For this purpose, we reviewed the theoretical literature in the fields of project management and knowledge absorptive capacity to clarify their dimensions and indexes, and then studied the relationships between them using the interpretive structural modeling (ISM) method. To collect data, 21 questionnaires were distributed in project-oriented knowledge-based companies. The ISM analysis provides a model of the relationship between project management activities and knowledge absorptive capacity that includes knowledge acquisition capacity, scope management, time management, cost management, quality management, human resource management, communications management, procurement management, risk management, stakeholder management, and integration management. Having conducted the MICMAC analysis, we divided the variables into three groups of independent, relational, and dependent variables, with no variables falling into the group of autonomous variables.
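The MICMAC step mentioned above can be sketched directly: driving power is the row sum and dependence power the column sum of the final reachability matrix, and each variable is classified by comparing the two against a cutoff. The 4x4 matrix and cutoff below are a made-up example, not the study's data.

```python
# Illustrative MICMAC classification from a (made-up) reachability matrix.
# R[i][j] = 1 means variable i reaches (drives) variable j.
R = [[1, 1, 1, 0],
     [0, 1, 1, 0],
     [0, 0, 1, 0],
     [1, 1, 1, 1]]

driving = [sum(row) for row in R]                                  # row sums
dependence = [sum(R[i][j] for i in range(len(R))) for j in range(len(R))]

def classify(drv, dep, cut):
    # the four MICMAC quadrants
    if drv > cut and dep > cut:
        return "linkage"
    if drv > cut:
        return "independent (driver)"
    if dep > cut:
        return "dependent"
    return "autonomous"

groups = [classify(d, p, cut=2) for d, p in zip(driving, dependence)]
```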

Keywords: knowledge absorptive capacity, project management practices, knowledge-based firms, interpretive structural modeling

Procedia PDF Downloads 197
26442 Expert Based System Design for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

Recently, an increasing number of researchers have been focusing on working out realistic solutions to sustainability problems. As sustainability issues gain higher importance for organisations, the management of such decisions becomes critical. Knowledge representation is a fundamental issue of complex knowledge-based systems, and many types of sustainability problems would benefit from models based on experts' knowledge. Cognitive maps have been used for analyzing and aiding decision making, and a cognitive map can be made of almost any system or problem. A fuzzy cognitive map (FCM) can successfully represent knowledge and human experience, introducing concepts to represent the essential elements and the cause-and-effect relationships among them in order to model the behavior of any system. Integrated waste management systems (IWMS) are complex systems that can be decomposed into related and non-related subsystems and elements, where many factors have to be taken into consideration that may be complementary, contradictory, or competitive; these factors influence each other and determine the overall decision process of the system. The goal of the present paper is to construct an efficient IWMS that considers these various factors. The authors' intention is to propose an expert-based system design approach for implementing expert decision support in the area of IWMSs and to introduce an appropriate methodology for the development and analysis of group FCMs. A framework for such a methodology, consisting of development and application phases, is presented.
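A minimal FCM iteration, assuming a sigmoid squashing function and an invented three-concept weight matrix (not the paper's waste-management model), can be sketched as follows:

```python
import numpy as np

# W[i, j] is the causal influence of concept i on concept j; the three
# concepts and their weights are invented for illustration only.
W = np.array([[0.0, 0.6, -0.3],
              [0.0, 0.0,  0.8],
              [0.5, 0.0,  0.0]])

def step(state, W, lam=1.0):
    # sigmoid squashing keeps all concept activations in (0, 1)
    return 1.0 / (1.0 + np.exp(-lam * (state @ W)))

state = np.array([0.8, 0.2, 0.5])
for _ in range(30):          # iterate towards a fixed point
    state = step(state, W)
```

With small weights like these the iteration is a contraction, so the activations settle to a fixed point that can then be read as the system's inferred steady state.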

Keywords: factors, fuzzy cognitive map, group decision, integrated waste management system

Procedia PDF Downloads 276
26441 Community Benefitting through Tourism: DASTA-Thailand Model

Authors: Jutamas Wisansing, Thanakarn Vongvisitsin, Udom Hongchatikul

Abstract:

Designated Areas for Sustainable Tourism Administration (DASTA) is a public organization dedicated to sustainable tourism development in six designated areas in Thailand. This paper draws rich reflections from a decade of DASTA's work, formulating an advanced model to deepen our understanding of two intertwined questions: 1) what is the new landscape of actors for community based tourism, and 2) who are the benefactors and beneficiaries of tourism development within the community? An action research approach was used, enabling the process and evidence-based cases to be better captured. The aim is to build a theoretical foundation through 13 communities/cases that have engaged in community based tourism pilot projects. Drawing on emic and qualitative research, this specific and contextual phenomenon yields the succinct patterns of the 'Community Benefitting through Tourism (CbtT)' model. The re-definition of the two key issues helps shape the interlinking of actors; the practicalities of inclusive tourism, the inter-sectoral framework, and its value chain are also set forth. In the tourism sector, community members can be active primarily on the supply side as employees, entrepreneurs, and local heritage experts. CbtT, when well defined, stimulates the entire value chain of the local economy while promoting social innovation through positive dialogue with wider actors. Collaboration with a new set of actors, from both tourism-related and non-tourism-related businesses, creates better impacts and mutual benefits.

Keywords: community based tourism, community benefitting through tourism (CbtT) DASTA model, sustainable tourism in Thailand, value chain and inclusive business

Procedia PDF Downloads 299
26440 Latest Finding about Copper Sulfide Biomineralization and General Features of Metal Sulfide Biominerals

Authors: Yeseul Park

Abstract:

Biopolymers produced by organisms contribute greatly to the production of metal sulfides in both extracellular and intracellular biomineralization. We discovered a new type of intracellular biomineral composed of copper sulfide in the periplasm of a sulfate-reducing bacterium. We suggest that the structural features of this biomineral, which is composed of 1-2 nm subgrains, are based on biopolymer-based capping agents and an organic compartment. We further compare it with other types of metal sulfide biominerals.

Keywords: biomineralization, copper sulfide, metal sulfide, biopolymer, capping agent

Procedia PDF Downloads 112
26439 Spatial Relationship of Drug Smuggling Based on Geographic Information System Knowledge Discovery Using Decision Tree Algorithm

Authors: S. Niamkaeo, O. Robert, O. Chaowalit

Abstract:

In this investigation, we focus on discovering the spatial relationships of drug smuggling along the northern border of Thailand. Thailand is no longer a drug production site, but it is still one of the major drug trafficking hubs because its topographic characteristics facilitate drug smuggling from neighboring countries. Our study areas cover three districts (Mae-jan, Mae-fahluang, and Mae-sai) in Chiangrai province and four districts (Chiangdao, Mae-eye, Chaiprakarn, and Wienghang) in Chiangmai province, where smuggling of crystal methamphetamine and amphetamine occurs most frequently. Data on drug smuggling incidents from 2011 to 2017 were collected from several national and local news reports, and a geospatial drug smuggling database was prepared. A decision tree algorithm was applied to discover the spatial relationships among factors related to drug smuggling, which were converted into rules using a rule-based system. The factors, including land use type, smuggling route, season, and distance within 500 meters of checkpoints, were found to be related to drug smuggling in terms of rule-based relationships. Drug smuggling occurred mostly in forest areas in winter, and mainly along mountainous roads where checkpoints were not reachable. These spatial relationships could support the Thai Office of the Narcotics Control Board in the surveillance of drug smuggling.
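The rule-derivation idea behind the decision tree can be illustrated with a minimal decision-stump induction over mock incident records; the records below are invented, not the study's data.

```python
# Minimal decision-stump induction: pick the single attribute test that best
# separates smuggling from non-smuggling records, i.e. the root of a decision
# tree. The records are invented placeholders.
RECORDS = [
    # ({land_use, season, near_cp (within 500 m of a checkpoint)}, smuggling?)
    ({"land_use": "forest", "season": "winter", "near_cp": False}, True),
    ({"land_use": "forest", "season": "winter", "near_cp": True},  False),
    ({"land_use": "farm",   "season": "summer", "near_cp": False}, False),
    ({"land_use": "forest", "season": "summer", "near_cp": True},  False),
    ({"land_use": "forest", "season": "winter", "near_cp": False}, True),
]

def gini(labels):
    # Gini impurity of a binary label list
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(records):
    # choose the attribute/value test that minimises weighted Gini impurity
    best = None
    for attr in records[0][0]:
        for value in {feat[attr] for feat, _ in records}:
            left = [lab for feat, lab in records if feat[attr] == value]
            right = [lab for feat, lab in records if feat[attr] != value]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(records)
            if best is None or score < best[0]:
                best = (score, attr, value)
    return best[1], best[2]

attr, value = best_split(RECORDS)
```

With these mock records the stump splits first on season, echoing the study's finding that incidents cluster in winter; repeating the split on each branch grows the full tree, and each root-to-leaf path reads off as one rule.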

Keywords: decision tree, drug smuggling, Geographic Information System, GIS knowledge discovery, rule-based system

Procedia PDF Downloads 169
26438 Glaucoma Detection in Retinal Tomography Using the Vision Transformer

Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan

Abstract:

Glaucoma is a chronic eye condition that causes irreversible vision loss. Because it can be asymptomatic, early detection and treatment are critical to prevent vision loss. Multiple deep learning algorithms have been used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire highly expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to investigate the viability of adopting transformer-based network designs for glaucoma detection. Developing a viable algorithm to assess the severity of glaucoma from retinal fundus images of the optic nerve head necessitates a large number of well-curated images. Initially, data is generated by augmenting the ocular images. The ocular images are then pre-processed to make them ready for further processing. The system is trained on the pre-processed images, and it classifies input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this task, as its self-attention mechanism can model structural relationships across the whole image. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
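The patch-embedding step that feeds a ViT can be sketched as follows; the image size, patch size, and embedding dimension are illustrative choices, not necessarily those used in the thesis.

```python
import numpy as np

# ViT front end: split the image into non-overlapping patches and project
# each flattened patch to an embedding vector (the "tokens" that the
# self-attention layers then operate on). Sizes are illustrative.
rng = np.random.default_rng(0)
img = rng.random((224, 224, 3))          # stand-in for a fundus RGB image
P, D = 16, 64                            # patch size, embedding dimension

patches = img.reshape(224 // P, P, 224 // P, P, 3).transpose(0, 2, 1, 3, 4)
patches = patches.reshape(-1, P * P * 3)   # 196 patches of 768 values each
W = rng.random((P * P * 3, D))             # learned projection in a real ViT
tokens = patches @ W                       # 196 patch tokens of dimension 64
```

A real ViT prepends a class token and positional embeddings to `tokens` before the transformer encoder; the classifier head then reads the class token to decide normal versus glaucoma.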

Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning

Procedia PDF Downloads 191
26437 Synchronous Reference Frame and Instantaneous P-Q Theory Based Control of Unified Power Quality Conditioner for Power Quality Improvement of Distribution System

Authors: Ambachew Simreteab Gebremedhn

Abstract:

Context: The paper explores the use of synchronous reference frame theory (SRFT) and instantaneous reactive power theory (IRPT) based control of Unified Power Quality Conditioner (UPQC) for improving power quality in distribution systems. Research Aim: To investigate the performance of different control configurations of UPQC using SRFT and IRPT for mitigating power quality issues in distribution systems. Methodology: The study compares three control techniques (SRFT-IRPT, SRFT-SRFT, IRPT-IRPT) implemented in series and shunt active filters of UPQC. Data is collected under various control algorithms to analyze UPQC performance. Findings: Results indicate the effectiveness of SRFT and IRPT based control techniques in addressing power quality problems such as voltage sags, swells, unbalance, harmonics, and current harmonics in distribution systems. Theoretical Importance: The study provides insights into the application of SRFT and IRPT in improving power quality, specifically in mitigating unbalanced voltage sags, where conventional methods fall short. Data Collection: Data is collected under various control algorithms using simulation in MATLAB Simulink and real-time operation executed with experimental results obtained using RT-LAB. Analysis Procedures: Performance analysis of UPQC under different control algorithms is conducted to evaluate the effectiveness of SRFT and IRPT based control techniques in mitigating power quality issues. Questions Addressed: How do SRFT and IRPT based control techniques compare in improving power quality in distribution systems? What is the impact of using different control configurations on the performance of UPQC? Conclusion: The study demonstrates the efficacy of SRFT and IRPT based control of UPQC in mitigating power quality issues in distribution systems, highlighting their potential for enhancing voltage and current quality.
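The synchronous-reference-frame extraction at the heart of SRFT control starts from the abc-to-dq0 (Park) transformation; below is a minimal sketch with a synthetic balanced three-phase input, not the paper's system data.

```python
import math

# abc -> dq0 transform used by synchronous reference frame (SRFT) control.
# For a balanced set rotating at theta, d locks onto the amplitude and q -> 0.
def abc_to_dq0(a, b, c, theta):
    k = 2.0 * math.pi / 3.0
    d = (2.0 / 3.0) * (a * math.cos(theta)
                       + b * math.cos(theta - k)
                       + c * math.cos(theta + k))
    q = -(2.0 / 3.0) * (a * math.sin(theta)
                        + b * math.sin(theta - k)
                        + c * math.sin(theta + k))
    z = (a + b + c) / 3.0
    return d, q, z

# synthetic balanced unit-amplitude three-phase sample at angle theta
theta = 0.7
a = math.cos(theta)
b = math.cos(theta - 2.0 * math.pi / 3.0)
c = math.cos(theta + 2.0 * math.pi / 3.0)
d, q, z = abc_to_dq0(a, b, c, theta)
```

In the UPQC controllers compared in the paper, the DC components of d and q (after low-pass filtering) give the fundamental positive-sequence content from which the compensation references are built.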

Keywords: power quality, UPQC, shunt active filter, series active filter, non-linear load, RT-LAB, MATLAB

Procedia PDF Downloads 9
26436 Towards a Robust Patch Based Multi-View Stereo Technique for Textureless and Occluded 3D Reconstruction

Authors: Ben Haines, Li Bai

Abstract:

Patch based reconstruction methods have been, and still are, among the top-performing approaches to 3D reconstruction. Their local approach to refining the position and orientation of a patch, free of global minimisation and independent of surface smoothness, makes patch based methods extremely powerful in recovering fine-grained detail of an object's surface. However, patch based approaches still fail to faithfully reconstruct textureless or highly occluded surface regions; thus, though performing well under lab conditions, they deteriorate in industrial or real-world situations. They are also computationally expensive. Current patch based methods generate point clouds with holes in textureless or occluded regions, which require expensive energy minimisation techniques to fill and interpolate into a high-fidelity reconstruction. Such shortcomings hinder the adoption of these methods for industrial applications, where object surfaces are often highly textureless and the speed of reconstruction is an important factor. This paper presents ongoing work towards a multi-resolution approach to address these problems, utilising particle swarm optimisation to reconstruct high-fidelity geometry and increasing robustness to textureless features through an adapted approach to normalised cross correlation. The work also aims to speed up the reconstruction using advances in GPU technologies and to remove the need for costly initialisation and expansion steps. Through the combination of these enhancements, this work intends to create denser patch clouds, even in textureless regions, within a reasonable time. Initial results show the potential of such an approach to construct denser point clouds with accuracy comparable to that of the current top-performing algorithms.
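The photo-consistency score that patch refinement maximises is typically the normalised cross correlation (NCC) between reprojected patches; a minimal sketch with mock patch arrays follows (the patches are synthetic, not real image data).

```python
import numpy as np

# Normalised cross correlation between two image patches: the photo-
# consistency score a patch-based MVS method maximises when refining a
# patch's position and normal. Patches here are mock arrays.
def ncc(p, q):
    p = (p - p.mean()) / (p.std() + 1e-9)   # zero-mean, unit-variance
    q = (q - q.mean()) / (q.std() + 1e-9)
    return float((p * q).mean())            # in [-1, 1]; 1 = perfect match

a = np.arange(25, dtype=float).reshape(5, 5)
b = 2.0 * a + 3.0        # affine intensity change: NCC is invariant to it
score = ncc(a, b)        # close to 1.0 despite the gain/offset change
```

The invariance to gain and offset is exactly why NCC is preferred over raw intensity differences across views with different exposure; in textureless regions, however, `p.std()` approaches zero and the score becomes unreliable, which is the failure mode the paper's adapted NCC targets.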

Keywords: 3D reconstruction, multiview stereo, particle swarm optimisation, photo consistency

Procedia PDF Downloads 203
26435 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average model of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) model are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties, and the forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrols, speed cameras, traffic lights, and roundabouts. The CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius, and the forecasting equations also provide reliable one-step-ahead forecasts.
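The binomial thinning operator, and a univariate Poisson-innovation INAR(1) path built from it (the simpler building block underlying BINARMA-type models), can be sketched as follows; the parameter values are arbitrary, not estimates from the accident data.

```python
import math
import random

# alpha ∘ x: each of the x counts survives independently with probability alpha
def thin(alpha, x, rng):
    return sum(1 for _ in range(x) if rng.random() < alpha)

def poisson(lam, rng):
    # Knuth's inversion-by-multiplication sampler (fine for small lambda)
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def inar1_path(alpha, lam, n, rng):
    # X_t = alpha ∘ X_{t-1} + eps_t,  eps_t ~ Poisson(lam)
    path, x = [], 0
    for _ in range(n):
        x = thin(alpha, x, rng) + poisson(lam, rng)
        path.append(x)
    return path

path = inar1_path(alpha=0.4, lam=2.0, n=50, rng=random.Random(42))
```

The bivariate BINARMA(1,1) of the paper adds a moving-average thinning term and draws the two innovation series from a correlated (multivariate) Poisson distribution, which is what induces the cross-correlation between the series.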

Keywords: non-stationary, BINARMA(1, 1) model, Poisson innovations, conditional maximum likelihood, CML

Procedia PDF Downloads 129
26434 Novel Nanomagnetic Beads-Based Latex Agglutination Assay for Rapid Diagnosis of Human Schistosomiasis Haematobium

Authors: Ibrahim Aly, Rabab Zalat, Bahaa EL Deen W. El Aswad, Ismail M. Moharm, Basam M. Masoud, Tarek Diab

Abstract:

The objective of the present study was to evaluate the novel nanomagnetic bead-based latex agglutination assay (NMB-LAT) as a simple test for the diagnosis of S. haematobium, as well as to standardize the novel nanomagnetic bead-based ELISA (NMB-ELISA). Based on urine examination, this study included 85 S. haematobium-infected patients, 30 patients infected with other parasites, and 25 negative control samples. The sensitivity of the novel NMB-LAT was 82.4%, versus 96.5% for NMB-ELISA and 88.2% for the currently used sandwich ELISA. The specificity of NMB-LAT was 83.6%, versus 96.3% and 87.3% for NMB-ELISA and the currently used sandwich ELISA, respectively. In conclusion, the novel NMB-ELISA is a valuable, applicable technique for the diagnosis of human schistosomiasis haematobium, suitable for field surveys, especially when followed by ELISA as a confirmatory test for querying false negative results. Trials are required to increase the sensitivity and specificity of the NMB-ELISA assay.

Keywords: diagnosis, latex agglutination, nanomagnetic beads, sandwich ELISA

Procedia PDF Downloads 382
26433 Relative Navigation with Laser-Based Intermittent Measurement for Formation Flying Satellites

Authors: Jongwoo Lee, Dae-Eun Kang, Sang-Young Park

Abstract:

This study presents a precise relative navigation method for satellites flying in formation using laser-based intermittent measurement data. The measurement data for the relative navigation between two satellites consist of a relative distance measured by a laser instrument and relative attitude angles obtained from attitude determination. The relative navigation solutions are estimated by both the extended Kalman filter (EKF) and the unscented Kalman filter (UKF). The solutions estimated by the EKF may become inaccurate or even diverge as the measurement outage time gets longer, because the EKF relies on a linearization approach. However, this study shows that the UKF, with appropriate scaling parameters, provides stable and accurate relative navigation solutions despite long measurement outage times and large initial errors, as compared to the relative navigation solutions of the EKF. Various navigation results are analyzed by adjusting the scaling parameters of the UKF.
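The unscented transform at the heart of the UKF propagates 2n+1 sigma points instead of linearizing; a minimal sketch of their generation, with illustrative state, covariance, and scaling parameters (not the study's values), follows.

```python
import numpy as np

# Generate the 2n+1 sigma points and mean weights of the unscented transform.
# alpha, beta, kappa are the scaling parameters the paper tunes.
def sigma_points(x, P, alpha=1e-3, kappa=0.0):
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)    # matrix square root of (n+lam)P
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    Wm = np.full(2 * n + 1, 0.5 / (n + lam)) # mean weights
    Wm[0] = lam / (n + lam)
    return np.array(pts), Wm

x = np.array([0.0, 1.0])     # toy 2-state relative-navigation state
P = 0.1 * np.eye(2)
pts, Wm = sigma_points(x, P)
```

Each sigma point is pushed through the full nonlinear dynamics during a measurement outage, and the weighted statistics of the transformed points replace the EKF's linearized propagation, which is why the UKF degrades more gracefully as outages lengthen.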

Keywords: satellite relative navigation, laser-based measurement, intermittent measurement, unscented Kalman filter

Procedia PDF Downloads 357
26432 Cryptocurrency-Based Mobile Payments with Near-Field Communication-Enabled Devices

Authors: Marko Niinimaki

Abstract:

Cryptocurrencies are getting increasingly popular, but very few of them can be conveniently used in daily mobile phone purchases. To address this problem, we demonstrate how to build a functional prototype of a mobile cryptocurrency-based e-commerce application that communicates with Near-Field Communication (NFC) tags. Using the system, users are able to purchase physical items via an NFC tag that contains an e-commerce URL. The payment is made simply by touching the tag with a mobile device and accepting the payment. Our method is constructive: we describe the design and technologies used in the implementation and evaluate the security and performance of the solution. Our main finding is that the analysis and measurements show our solution to be feasible for e-commerce.
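A URL stored on an NFC tag is conventionally written as an NDEF URI record; below is a sketch of the short-record byte layout (flags, type length, payload length, well-known type 'U', URI-prefix code, remaining URI) using a placeholder shop URL, not the paper's actual tag content.

```python
# Build a minimal NDEF URI record for an NFC tag. The shop URL is a
# placeholder; the byte layout follows the NFC Forum URI record type.
def ndef_uri_record(url):
    prefixes = {"https://": 0x04, "http://": 0x03}   # common abbreviation codes
    for prefix, code in prefixes.items():
        if url.startswith(prefix):
            payload = bytes([code]) + url[len(prefix):].encode("ascii")
            break
    else:
        payload = bytes([0x00]) + url.encode("ascii")  # 0x00 = no abbreviation
    # 0xD1 = MB|ME|SR flags with TNF=0x01 (well-known type); type 0x55 = 'U'
    return bytes([0xD1, 0x01, len(payload), 0x55]) + payload

record = ndef_uri_record("https://shop.example.com/item/42")
```

Touching the tag makes the phone resolve this record to the e-commerce URL, after which the app takes over to confirm and sign the cryptocurrency payment.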

Keywords: cryptocurrency, e-commerce, NFC, mobile devices

Procedia PDF Downloads 184
26431 Capability of Available Seismic Soil Liquefaction Potential Assessment Models Based on Shear-Wave Velocity Using the Bachu Case History

Authors: Nima Pirhadi, Yong Bo Shao, Xusheng Wa, Jianguo Lu

Abstract:

Several models based on the simplified method introduced by Seed and Idriss (1971) have been developed to assess the liquefaction potential of saturated sandy soils. The procedure involves determining the cyclic resistance of the soil as the cyclic resistance ratio (CRR) and comparing it with the earthquake load expressed as the cyclic stress ratio (CSR). Of all the methods for determining CRR, those using shear-wave velocity (Vs) are common because of their low sensitivity to the penetration resistance reduction caused by fine content (FC). To evaluate the capability of these Vs-based models, new data from the Bachu-Jiashi earthquake case history were collected, and the predictions of the models were compared with the measured results; the accuracy of the models is then discussed via three criteria and graphs. The evaluation demonstrates reasonable accuracy of the models in the Bachu region.
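The simplified-procedure comparison reduces to a factor of safety FS = CRR/CSR, with CSR from the Seed-Idriss expression; the sketch below uses hypothetical soil and ground-motion values, not data from the case history.

```python
# Seed-Idriss simplified cyclic stress ratio and the liquefaction factor of
# safety FS = CRR / CSR. All input values below are hypothetical.
def csr(a_max_g, sigma_v, sigma_v_eff, rd):
    # 0.65 * (a_max/g) * (total / effective vertical stress) * depth factor
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

crr = 0.25                                  # assumed Vs-based cyclic resistance
load = csr(a_max_g=0.3, sigma_v=100.0, sigma_v_eff=60.0, rd=0.95)
fs = crr / load                             # FS < 1 flags liquefaction
```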

Keywords: seismic liquefaction, Bachu-Jiashi earthquake, shear-wave velocity, liquefaction potential evaluation

Procedia PDF Downloads 239
26430 Dosimetric Comparison of Conventional Optimization Methods with Inverse Planning Simulated Annealing Technique

Authors: Shraddha Srivastava, N. K. Painuly, S. P. Mishra, Navin Singh, Muhsin Punchankandy, Kirti Srivastava, M. L. B. Bhatt

Abstract:

Various optimization methods used in interstitial brachytherapy are based on dwell positions and dwell weights alteration to produce dose distribution based on the implant geometry. Since these optimization schemes are not anatomy based, they could lead to deviations from the desired plan. This study was henceforth carried out to compare anatomy-based Inverse Planning Simulated Annealing (IPSA) optimization technique with graphical and geometrical optimization methods in interstitial high dose rate brachytherapy planning of cervical carcinoma. Six patients with 12 CT data sets of MUPIT implants in HDR brachytherapy of cervical cancer were prospectively studied. HR-CTV and organs at risk (OARs) were contoured in Oncentra treatment planning system (TPS) using GYN GEC-ESTRO guidelines on cervical carcinoma. Three sets of plans were generated for each fraction using IPSA, graphical optimization (GrOPT) and geometrical optimization (GOPT) methods. All patients were treated to a dose of 20 Gy in 2 fractions. The main objective was to cover at least 95% of HR-CTV with 100% of the prescribed dose (V100 ≥ 95% of HR-CTV). IPSA, GrOPT, and GOPT based plans were compared in terms of target coverage, OAR doses, homogeneity index (HI) and conformity index (COIN) using dose-volume histogram (DVH). Target volume coverage (mean V100) was found to be 93.98 ± 0.87%, 91.34 ± 1.02% and 85.05 ± 2.84% for IPSA, GrOPT and GOPT plans respectively. Mean D90 (minimum dose received by 90% of HR-CTV) values for IPSA, GrOPT and GOPT plans were 10.19 ± 1.07 Gy, 10.17 ± 0.12 Gy and 7.99 ± 1.0 Gy respectively, while D100 (minimum dose received by 100% volume of HR-CTV) for IPSA, GrOPT and GOPT plans was 6.55 ± 0.85 Gy, 6.55 ± 0.65 Gy, 4.73 ± 0.14 Gy respectively. IPSA plans resulted in lower doses to the bladder (D₂

Keywords: cervical cancer, HDR brachytherapy, IPSA, MUPIT

Procedia PDF Downloads 187
26429 Constructions of Linear and Robust Codes Based on Wavelet Decompositions

Authors: Alla Levina, Sergey Taranov

Abstract:

The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but they concentrate their detection and correction abilities on certain error configurations. Robust codes, by contrast, can protect against any configuration of errors at a predetermined probability. This is accomplished by using perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. Wavelet transforms are applied in various fields of science; some of their applications are cleaning a signal of noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on these linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust codes: the first class is based on the multiplicative inverse in a finite field, while in the second construction the redundancy part is the cube of the information part. The paper also investigates the characteristics of the proposed robust and linear codes.
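The two redundancy constructions can be illustrated over a small prime field; the paper works in a binary extension field, so the prime field GF(257) below is only an illustrative analogue.

```python
# Toy versions of the two robust-code redundancy rules over the prime field
# GF(257) (an illustrative stand-in for the paper's binary extension field).
P = 257  # prime, so every non-zero element has a multiplicative inverse

def encode_inverse(x):
    # construction 1: redundancy = multiplicative inverse of the information
    # part; Fermat's little theorem gives x^(P-2) = x^-1 mod P
    return (x, pow(x, P - 2, P) if x else 0)

def encode_cube(x):
    # construction 2: redundancy = cube of the information part
    return (x, pow(x, 3, P))

def check_cube(word):
    x, r = word
    return pow(x, 3, P) == r

codeword = encode_cube(9)
corrupted = (codeword[0] ^ 4, codeword[1])   # flip a bit of the information part
```

Because the redundancy is a nonlinear function of the information part, an error in either half is caught with high probability regardless of its pattern, which is the "uniform protection" property linear checks lack.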

Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability

Procedia PDF Downloads 489
26428 Syllogistic Reasoning with 108 Inference Rules While Case Quantities Change

Authors: Mikhail Zarechnev, Bora I. Kumova

Abstract:

A syllogism is a deductive inference scheme used to derive a conclusion from a set of premises. In a categorical syllogism there are only two premises, and every premise and conclusion is given in the form of a quantified relationship between two objects. The different orderings of objects in the premises give a classification known as figures. We have shown that the ordered combinations of 3 generalized quantifiers with a certain figure provide a total of 108 syllogistic moods, which can be considered as different inference rules. The classical syllogistic system makes it possible to model human thought, and reasoning with syllogistic structures has always attracted the attention of cognitive scientists. Since automated reasoning is considered part of the learning subsystem of AI agents, the syllogistic system can be applied in this setting. Another application of the syllogistic system relates to inference mechanisms in Semantic Web applications. In this paper we propose a mathematical model and algorithm for syllogistic reasoning. A model of iterative syllogistic reasoning for continuous flows of incoming data, based on case-based reasoning, and possible applications of the proposed system are also discussed.
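The count of 108 moods above follows from simple enumeration: each of the two premises and the conclusion carries one of 3 quantifiers, and the premises can be arranged in one of the 4 classical figures. A minimal sketch (the quantifier labels A/E/I are illustrative stand-ins for the paper's generalized quantifiers):

```python
# Enumerate syllogistic moods: 3 quantifier choices for premise 1,
# premise 2, and the conclusion, crossed with the 4 figures.
from itertools import product

quantifiers = ["A", "E", "I"]   # 3 generalized quantifiers
figures = [1, 2, 3, 4]          # orderings of terms in the premises

moods = [(q1, q2, qc, fig)
         for (q1, q2, qc) in product(quantifiers, repeat=3)
         for fig in figures]

print(len(moods))  # 3^3 quantifier combinations x 4 figures = 108
```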

Keywords: categorical syllogism, case-based reasoning, cognitive architecture, inference on the semantic web, syllogistic reasoning

Procedia PDF Downloads 411
26425 Managing an Acute Pain Unit Based on the Balanced Scorecard

Authors: Helena Costa Oliveira, Carmem Oliveira, Rita Moutinho

Abstract:

The Balanced Scorecard (BSC) is a continuous strategic monitoring model focused not only on financial issues but also on internal processes, patients/users, and learning and growth. Initially dedicated to business management, it currently serves organizations of other natures, such as hospitals. This paper presents a BSC designed for a Portuguese Acute Pain Unit (APU). The study is qualitative and based on the experience of collaborators at the APU. The management of the APU is based on four perspectives: users, internal processes, learning and growth, and financial and legal. For each perspective, strategic objectives, critical factors, lead indicators and initiatives were identified. The strategic map of the APU outlines the sustained strategic relations among the strategic objectives. This study contributes to the development of research in the health management area, as it explores how organizational insufficiencies and inconsistencies, in this particular case, can be addressed through the identification of critical factors, so as to clearly establish core outcomes and the initiatives to set up.

Keywords: acute pain unit, balanced scorecard, hospital management, organizational performance, Portugal

Procedia PDF Downloads 148
26426 Multiple Medical Landmark Detection on X-Ray Scans Using Reinforcement Learning

Authors: Vijaya Yuvaram Singh V M, Kameshwar Rao J V

Abstract:

The challenge in developing neural-network-based methods for the medical domain is the availability of data. Anatomical landmark detection in the medical domain is the process of finding points on a patient's x-ray scan. Most of the time this task is done manually by trained professionals, as it requires precision and domain knowledge. Traditionally, object detection based methods are used for landmark detection. Here, we utilize reinforcement learning and a query-based method to train a single agent capable of detecting multiple landmarks. A deep Q network agent is trained to detect single and multiple landmarks present on the hip and shoulder in x-ray scans of a patient. A single agent is trained to find multiple landmarks, making it superior to having individual agents per landmark. For the initial study, five images of different patients were used as the environment, and the agent's performance was tested on two unseen images.
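The search formulation behind such an agent can be illustrated with a toy stand-in: an agent moves over a grid and is rewarded for stepping closer to a landmark. The paper trains a deep Q network on x-ray images; the tabular Q-learning below, with a hypothetical 8x8 grid and a fixed landmark position, only sketches the state/action/reward loop:

```python
# Toy tabular Q-learning for landmark search: states are grid positions,
# actions are unit moves, reward is the decrease in distance to the landmark.
import random

random.seed(0)
SIZE, LANDMARK = 8, (6, 5)
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # right, left, down, up

def dist(p):
    # Manhattan distance to the landmark
    return abs(p[0] - LANDMARK[0]) + abs(p[1] - LANDMARK[1])

def step(pos, action):
    nxt = (min(max(pos[0] + action[0], 0), SIZE - 1),
           min(max(pos[1] + action[1], 0), SIZE - 1))
    return nxt, dist(pos) - dist(nxt)   # reward > 0 when moving closer

Q = {}
for episode in range(300):
    pos = (random.randrange(SIZE), random.randrange(SIZE))
    for _ in range(30):
        if random.random() < 0.2:       # epsilon-greedy exploration
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q.get((pos, a), 0.0))
        nxt, r = step(pos, action)
        best_next = max(Q.get((nxt, a), 0.0) for a in ACTIONS)
        q = Q.get((pos, action), 0.0)
        Q[(pos, action)] = q + 0.5 * (r + 0.9 * best_next - q)
        pos = nxt
        if pos == LANDMARK:
            break

# A greedy rollout from a corner typically reaches the landmark.
pos = (0, 0)
for _ in range(2 * SIZE):
    if pos == LANDMARK:
        break
    pos, _ = step(pos, max(ACTIONS, key=lambda a: Q.get((pos, a), 0.0)))
print(pos)
```

In the multi-landmark setting of the paper, the landmark identity becomes part of the query given to the single agent, rather than requiring one agent per landmark.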

Keywords: reinforcement learning, medical landmark detection, multi target detection, deep neural network

Procedia PDF Downloads 142
26425 A Two-Step Framework for Unsupervised Speaker Segmentation Using BIC and Artificial Neural Network

Authors: Ahmad Alwosheel, Ahmed Alqaraawi

Abstract:

This work proposes a new speaker segmentation approach for two speakers. It is an online approach that does not require prior information about speaker models. It has two phases: a conventional unsupervised BIC-based approach is utilized in the first phase to detect speaker changes and train a neural network, while in the second phase the trained parameters of the neural network are used to predict the next incoming audio stream. Using this approach, accuracy comparable to similar BIC-based approaches is achieved, with a significant improvement in computation time.
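The BIC-based change detection of the first phase compares modelling an audio window with one Gaussian versus two, at a candidate boundary. A minimal sketch with one-dimensional features and a unit penalty weight (both simplifying assumptions; real systems use multivariate cepstral features):

```python
# Delta-BIC for a candidate speaker-change point: positive values favour
# splitting the window into two segments (i.e. a speaker change).
import math

def delta_bic(x, split, lam=1.0):
    def neg_loglik(seg):
        # Gaussian fit of a segment, up to additive constants
        m = sum(seg) / len(seg)
        var = max(sum((v - m) ** 2 for v in seg) / len(seg), 1e-12)
        return 0.5 * len(seg) * math.log(var)
    n = len(x)
    penalty = lam * 0.5 * 2 * math.log(n)   # extra mean + variance parameters
    return (neg_loglik(x) - neg_loglik(x[:split])
            - neg_loglik(x[split:]) - penalty)

same = [0.1, -0.2, 0.05, 0.0, -0.1, 0.15, -0.05, 0.1]
shift = [0.0, 0.1, -0.1, 0.05, 5.0, 5.1, 4.9, 5.05]   # level jump mid-way
print(delta_bic(same, 4) < 0, delta_bic(shift, 4) > 0)
```

The approach described above then uses such detected boundaries as training targets for the neural network, which takes over prediction in the second phase.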

Keywords: artificial neural network, diarization, speaker indexing, speaker segmentation

Procedia PDF Downloads 502
26424 Value of Willingness to Pay for a Quality-Adjusted Life Years Gained in Iran; A Modified Chained-Approach

Authors: Seyedeh-Fariba Jahanbin, Hasan Yusefzadeh, Bahram Nabilou, Cyrus Alinia

Abstract:

Background: Due to the lack of a constant willingness to pay per one additional quality-adjusted life year gained based on the preferences of Iran's general public, the cost-effectiveness of health system interventions is unclear, making it challenging to apply economic evaluation to health resource priority setting. Methods: We measured this cost-effectiveness threshold with the participation of 2854 individuals from five provinces, each representing an income quintile, using a modified Time Trade-Off-based Chained Approach. In this online empirical survey, to extract the health utility value, participants were randomly assigned to one of two health scenarios, green (21121) and yellow (22222), designed based on the previously validated EQ-5D-3L questionnaire. Results: Across the two health state versions, mean values for one QALY gained (rounded) ranged from $6740-$7400 and $6480-$7120, respectively, for the aggregate and trimmed models, which is equivalent to 1.35-1.18 times the GDP per capita. Log-linear multivariate OLS regression analysis confirmed that respondents were more likely to pay if their income, disutility, and education level were higher than those of their counterparts. Conclusions: In the health system of Iran, any intervention with an incremental cost-effectiveness ratio equal to or less than 7402.12 USD will be considered cost-effective.
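Applying the estimated threshold is a one-line comparison: an intervention is judged cost-effective when its incremental cost-effectiveness ratio (ICER) does not exceed the WTP-per-QALY value from the conclusion. The costs and QALY gains in the example below are hypothetical:

```python
# Cost-effectiveness decision rule using the study's reported threshold.
THRESHOLD_USD_PER_QALY = 7402.12   # from the conclusions above

def is_cost_effective(delta_cost_usd, delta_qaly):
    """ICER = incremental cost / incremental QALYs, compared to threshold."""
    icer = delta_cost_usd / delta_qaly
    return icer <= THRESHOLD_USD_PER_QALY

print(is_cost_effective(12000, 2.0))   # ICER = 6000 USD/QALY  → True
print(is_cost_effective(12000, 1.0))   # ICER = 12000 USD/QALY → False
```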

Keywords: willingness to pay, QALY, chained approach, cost-effectiveness threshold, Iran

Procedia PDF Downloads 85
26423 A Location-based Authentication and Key Management Scheme for Border Surveillance Wireless Sensor Networks

Authors: Walid Abdallah, Noureddine Boudriga

Abstract:

Wireless sensor networks have shown their effectiveness in the deployment of many critical applications, especially in the military domain. Border surveillance is one of these applications, where a set of wireless sensors is deployed along a country's border line to detect illegal intrusion attempts into the national territory and report them to a control center so that the necessary measures can be taken. Given its nature, this wireless sensor network can be the target of many security attacks trying to compromise its normal operation. Particularly in this application, the deployment and location of sensor nodes are of great importance for detecting and tracking intruders. This paper proposes a location-based authentication and key distribution mechanism to secure wireless sensor networks intended for border surveillance, where key establishment is performed using elliptic curve cryptography and an identity-based public key scheme. In this scheme, the public key of each sensor node is authenticated by keys that depend on its position in the monitored area. Before establishing a pairwise key, two nodes must each verify the neighborhood location of the other node using a message authentication code (MAC) calculated on the corresponding public key and keys derived from encrypted beacon messages broadcast by anchor nodes. We show that our proposed public key authentication and key distribution scheme is more resilient to node capture and node replication attacks than currently available schemes. Also, key distribution between nodes in our scheme generates less communication overhead and hence improves network performance.
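The neighborhood-verification step can be sketched with standard primitives: a node proves knowledge of a location-dependent key by presenting a MAC over its public key. In the sketch below, deriving the location key by hashing a cell identifier with a pre-shared secret is a simplifying stand-in; the scheme above derives these keys from encrypted anchor beacons, and the public key is a placeholder byte string rather than a real elliptic-curve point:

```python
# Sketch: MAC-based verification of a node's claimed location.
import hashlib
import hmac

def location_key(master_secret: bytes, cell_id: str) -> bytes:
    """Key tied to a region of the monitored area (simplified derivation)."""
    return hmac.new(master_secret, cell_id.encode(), hashlib.sha256).digest()

def authenticate(public_key: bytes, loc_key: bytes) -> bytes:
    """MAC over the node's public key under the location-derived key."""
    return hmac.new(loc_key, public_key, hashlib.sha256).digest()

master = b"network-master-secret"     # hypothetical pre-shared secret
pub = b"\x04" + bytes(64)             # placeholder for an EC public key

tag = authenticate(pub, location_key(master, "cell-12"))
ok = hmac.compare_digest(tag, authenticate(pub, location_key(master, "cell-12")))
spoofed = hmac.compare_digest(tag, authenticate(pub, location_key(master, "cell-13")))
print(ok, spoofed)  # True False: a node claiming the wrong cell fails
```

Tying the key to position is what makes captured nodes useless when replicated elsewhere: the replica cannot produce a valid MAC for its new location.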

Keywords: wireless sensor networks, border surveillance, security, key distribution, location-based

Procedia PDF Downloads 660
26422 N-Type GaN Thinning for Enhancing Light Extraction Efficiency in GaN-Based Thin-Film Flip-Chip Ultraviolet (UV) Light Emitting Diodes (LED)

Authors: Anil Kawan, Soon Jae Yu, Jong Min Park

Abstract:

GaN-based 365 nm wavelength ultraviolet (UV) light emitting diodes (LEDs) have various applications: curing, molding, purification, deodorization, disinfection, etc. However, their usage is limited by very low output power because of light absorption in the GaN layers. In this study, we demonstrate a method that removes the buffer GaN layer, which absorbs at 365 nm, and thins the n-type GaN so as to improve the light extraction efficiency of the GaN-based 365 nm UV LED. UV flip-chip LEDs of chip size 1.3 mm x 1.3 mm were fabricated using GaN epilayers on a sapphire substrate. Via-hole n-type contacts and highly reflective Ag metal were used for efficient light extraction. The LED wafer was aligned and bonded to an AlN carrier wafer. To improve the extraction efficiency of the flip-chip LED, the sapphire substrate and the absorbing buffer GaN layer were removed using laser lift-off and dry etching, respectively. To further increase the extraction efficiency, the exposed n-type GaN thickness was reduced by inductively coupled plasma etching.
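A back-of-the-envelope view of why thinning helps: single-pass transmission through an absorbing layer follows the Beer-Lambert law, T = exp(-alpha * d). The absorption coefficient below is an assumed order-of-magnitude figure for near-band-edge GaN at 365 nm, not a value reported in the abstract:

```python
# Beer-Lambert transmission vs. absorbing-layer thickness.
import math

def transmission(alpha_per_cm, thickness_um):
    return math.exp(-alpha_per_cm * thickness_um * 1e-4)  # um -> cm

ALPHA = 1e4   # cm^-1, hypothetical absorption coefficient at 365 nm
for d_um in (4.0, 2.0, 1.0):
    print(d_um, round(transmission(ALPHA, d_um), 3))
```

Even modest thinning changes the transmitted fraction exponentially, which is why removing the buffer GaN and reducing the n-GaN thickness both act directly on extraction efficiency.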

Keywords: extraction efficiency, light emitting diodes, n-GaN thinning, ultraviolet

Procedia PDF Downloads 426
26421 Real-Time Demonstration of Visible Light Communication Based on Frequency-Shift Keying Employing a Smartphone as the Receiver

Authors: Fumin Wang, Jiaqi Yin, Lajun Wang, Nan Chi

Abstract:

In this article, we demonstrate a visible light communication (VLC) system over an 8-meter free-space transmission link based on a commercial LED and a receiver connected to the audio interface of a smartphone. The signal is in FSK modulation format. The successful experimental demonstration validates the feasibility of the proposed system for future wireless communication networks.
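Binary FSK, as used here, maps each bit to one of two tones that an audio-interface receiver can distinguish. A minimal modulator sketch; the frequencies and bit rate below are assumed values within a typical smartphone audio band, not the paper's settings:

```python
# Minimal binary-FSK modulator: one tone per bit value.
import math

FS = 44100            # audio sample rate, Hz
F0, F1 = 4000, 6000   # tone frequencies for bit 0 / bit 1 (hypothetical)
BIT_SAMPLES = 441     # 10 ms per bit -> 100 bit/s

def modulate(bits):
    samples = []
    for b in bits:
        f = F1 if b else F0
        for n in range(BIT_SAMPLES):
            samples.append(math.sin(2 * math.pi * f * n / FS))
        # Note: the phase restarts at each bit boundary; a continuous-phase
        # variant would carry the phase across bits to avoid clicks.
    return samples

sig = modulate([1, 0, 1, 1])
print(len(sig))  # 4 bits x 441 samples per bit
```

On the receive side, the smartphone can recover bits by comparing per-bit energy at the two tone frequencies (e.g. with a Goertzel filter), which is what makes an audio-jack receiver practical.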

Keywords: visible light communication, smartphone communication, frequency shift keying, wireless communication

Procedia PDF Downloads 391
26420 Clustering Performance Analysis Using New Correlation-Based Cluster Validity Indices

Authors: Nathakhun Wiroonsri

Abstract:

There are various cluster validity measures used for evaluating clustering results. One of the main objectives of using these measures is to seek the optimal unknown number of clusters. Some measures work well for clusters with different densities, sizes and shapes. Yet, one weakness that those validity measures share is that they sometimes provide only one clear optimal number of clusters. That number is actually unknown, and there might be more than one potential sub-optimal option that a user may wish to choose, depending on the application. We develop two new cluster validity indices based on the correlation between the actual distance between a pair of data points and the centroid distance of the clusters in which the two points are located. Our proposed indices consistently yield several peaks at different numbers of clusters, which overcomes the weakness stated above. Furthermore, the introduced correlation can also be used for evaluating the quality of a selected clustering result. Several experiments in different scenarios, including the well-known iris data set and a real-world marketing application, have been conducted to compare the proposed validity indices with several well-known ones.
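The core quantity described above can be sketched directly: the Pearson correlation between actual pairwise distances and the distances between the centroids of the clusters each pair belongs to. The data and labelings below are toy values; the paper's full indices build further statistics on top of this correlation:

```python
# Correlation between actual pairwise distances and centroid distances
# of the clusters containing each pair of points.
import math

def centroid(points):
    return [sum(c) / len(points) for c in zip(*points)]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def distance_correlation_index(points, labels):
    cents = {k: centroid([p for p, l in zip(points, labels) if l == k])
             for k in set(labels)}
    actual, between = [], []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            actual.append(math.dist(points[i], points[j]))
            between.append(math.dist(cents[labels[i]], cents[labels[j]]))
    return pearson(actual, between)

points = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
good = distance_correlation_index(points, [0, 0, 0, 1, 1, 1])   # true split
bad = distance_correlation_index(points, [0, 1, 0, 1, 0, 1])    # mixed split
print(round(good, 3), round(bad, 3))  # good partition scores much higher
```

Scanning this index over candidate numbers of clusters yields the multiple peaks described above, each peak marking a plausible partition granularity.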

Keywords: clustering algorithm, cluster validity measure, correlation, data partitions, iris data set, marketing, pattern recognition

Procedia PDF Downloads 103