Search results for: efficiency classification
5717 Syndromic Surveillance Framework Using Tweets Data Analytics
Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden
Abstract:
Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance has become increasingly popular, aided by open platforms for data collection and the advantages of microblogging text and mobile geolocation features. In this paper, a syndromic surveillance framework with a machine learning kernel is presented, based on tweet data analytics. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted to perform supervised classification, and N-fold cross-validation confusion matrices are given as simulation results, with an overall system recall of 85.595%.
Keywords: syndromic surveillance, tweets, machine learning, data mining, Latent Dirichlet allocation (LDA), influenza
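As a rough, hypothetical sketch of such a pipeline (not the authors' implementation), the fragment below chains a bag-of-words vectorizer, an LDA topic model, and a logistic-regression classifier, then reports an N-fold cross-validated confusion matrix and recall; the tweets and labels are placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, recall_score

# Placeholder tweets and labels (1 = influenza-related, 0 = unrelated).
tweets = [
    "feeling feverish and coughing all day",
    "sore throat chills and body aches",
    "flu season hit me hard this week",
    "great weather in Dubai today",
    "traffic on the way to Al Ain was heavy",
    "enjoying coffee by the corniche",
]
labels = [1, 1, 1, 0, 0, 0]

pipeline = make_pipeline(
    CountVectorizer(stop_words="english"),
    LatentDirichletAllocation(n_components=5, random_state=0),  # topic features
    LogisticRegression(max_iter=1000),  # supervised step on topic proportions
)

# N-fold cross-validated predictions, summarized as a confusion matrix.
pred = cross_val_predict(pipeline, tweets, labels, cv=3)
print(confusion_matrix(labels, pred))
print("recall:", recall_score(labels, pred))
```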
Procedia PDF Downloads 116
5716 Key Principles and Importance of Applied Geomorphological Maps for Engineering Structure Placement
Authors: Sahar Maleki, Reza Shahbazi, Nayere Sadat Bayat Ghiasi
Abstract:
Applied geomorphological maps are crucial tools in engineering, particularly for the placement of structures. These maps provide precise information about the terrain, including landforms, soil types, and geological features, which is essential for making informed decisions about construction sites. Their importance is evident in risk assessment: they help identify potential hazards such as landslides, erosion, and flooding, enabling better risk management. They also assist in selecting the most suitable locations for engineering projects. Cost efficiency is another significant benefit, as proper site selection and risk assessment can lead to substantial savings by avoiding unsuitable areas and minimizing the need for extensive ground modifications. Ensuring the maps are accurate and up to date is crucial for reliable decision-making, and detailed information about the various geomorphological features is necessary to provide a comprehensive overview. Integrating geomorphological data with other environmental and engineering data to create a holistic view of the site is one of the most fundamental steps in engineering. In summary, the preparation of applied geomorphological maps is a vital step in the planning and execution of engineering projects, ensuring safety, efficiency, and sustainability. At the Geological Survey of Iran, the preparation of these applied maps has enabled the identification of areas prone to geological hazards such as landslides, subsidence, and earthquakes. Additionally, areas with problematic soils, potential groundwater zones, and safe construction sites are identified and made available to the public.
Keywords: geomorphological maps, geohazards, risk assessment, decision-making
Procedia PDF Downloads 23
5715 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
To address memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method for generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutually exclusive (mutex) task by mapping one feature of the data to multiple labels, so that the generated task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key-data extraction method that extracts only part of the data to generate mutex tasks. Experiments show that the mutually exclusive task generation method effectively mitigates memorization overfitting in MAML.
Keywords: mutex task generation, data augmentation, meta-learning, text classification
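A minimal sketch of the mutex-task idea, under the assumption that "mapping one feature to multiple labels" can be illustrated by re-labeling each example away from its true class; the paper's exact mapping may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mutex_task(features, labels, n_classes):
    """Re-map each example to a label different from its true one, so the
    generated task contradicts the original data distribution (a sketch of
    the mutex-task idea; the paper's exact construction may differ)."""
    shift = rng.integers(1, n_classes, size=labels.shape)
    return features, (labels + shift) % n_classes

X = rng.normal(size=(8, 5))        # toy feature vectors
y = rng.integers(0, 3, size=8)     # toy labels, 3 classes
X_mutex, y_mutex = make_mutex_task(X, y, n_classes=3)
assert not np.any(y_mutex == y)    # every label was changed
```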
Procedia PDF Downloads 143
5714 A Weighted Approach to Unconstrained Iris Recognition
Authors: Yao-Hong Tsai
Abstract:
This paper presents a weighted approach to unconstrained iris recognition. Commercial systems are usually characterized by strong acquisition constraints that rely on the subject's cooperation, which is not always achievable in real scenarios of daily life. Researchers have therefore focused on reducing these constraints while maintaining system performance through new techniques. To cope with large environmental variation, the proposed iris recognition system introduces two main improvements. First, to handle extremely uneven lighting conditions, statistics-based illumination normalization is applied to the eye region to increase the accuracy of the iris features; detection of the iris is based on the AdaBoost algorithm. Second, a weighting scheme is designed using Gaussian functions of the distance to the center of the iris, and a local binary pattern (LBP) histogram is then applied to texture classification with these weights. Experiments showed that the proposed system provides users a more flexible and feasible way to interact with a verification system through iris recognition.
Keywords: authentication, iris recognition, AdaBoost, local binary pattern
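The Gaussian weighting of LBP bins can be sketched as follows; the window size, sigma, and LBP parameters are illustrative assumptions, not the paper's values.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def weighted_lbp_histogram(iris_img, center, sigma=20.0, P=8, R=1):
    """LBP histogram where each pixel's vote is Gaussian-weighted by its
    distance to the iris center (a sketch of the weighting idea above)."""
    lbp = local_binary_pattern(iris_img, P, R, method="uniform")
    ys, xs = np.indices(iris_img.shape)
    d2 = (ys - center[0]) ** 2 + (xs - center[1]) ** 2
    w = np.exp(-d2 / (2 * sigma ** 2))            # Gaussian weight map
    n_bins = P + 2                                # uniform-LBP label count
    hist = np.bincount(lbp.astype(int).ravel(), weights=w.ravel(),
                       minlength=n_bins)
    return hist / hist.sum()

img = (np.random.rand(64, 64) * 255).astype(np.uint8)  # stand-in iris crop
print(weighted_lbp_histogram(img, center=(32, 32)))
```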
Procedia PDF Downloads 225
5713 A Five-Year Experience of Intensity Modulated Radiotherapy in Nasopharyngeal Carcinomas in Tunisia
Authors: Omar Nouri, Wafa Mnejja, Fatma Dhouib, Syrine Zouari, Wicem Siala, Ilhem Charfeddine, Afef Khanfir, Leila Farhat, Nejla Fourati, Jamel Daoud
Abstract:
Purpose and Objective: The intensity-modulated radiotherapy (IMRT) technique, associated with induction chemotherapy (IC) and/or concomitant chemotherapy (CC), is currently the recommended treatment modality for nasopharyngeal carcinoma (NPC). The aim of this study was to evaluate the therapeutic results and the patterns of relapse with this treatment protocol. Material and methods: A retrospective monocentric study of 145 patients with NPC treated between June 2016 and July 2021. All patients received IMRT with a simultaneous integrated boost (SIB) of 33 daily fractions at a dose of 69.96 Gy for the high-risk volume, 60 Gy for the intermediate-risk volume, and 54 Gy for the low-risk volume. The high-risk volume dose was 66.5 Gy in children. Survival analysis was performed according to the Kaplan-Meier method, and the log-rank test was used to compare factors that may influence survival. Results: Median age was 48 years (11-80), with a sex ratio of 2.9. One hundred twenty tumors (82.7%) were classified as stage III-IV according to the 2017 UICC TNM classification. Ten patients (6.9%) were metastatic at diagnosis. One hundred thirty-five patients (93.1%) received IC, 104 of which (77%) were TPF-based (taxanes, cisplatin, and 5-fluorouracil). One hundred thirty-eight patients (95.2%) received CC, mostly cisplatin, in 134 cases (97%). After a median follow-up of 50 months [22-82], 46 patients (31.7%) had a relapse: 12 (8.2%) experienced local and/or regional relapse after a median of 18 months [6-43], 29 (20%) experienced distant relapse after a median of 9 months [2-24], and 5 patients (3.4%) had both. Thirty-five patients (24.1%) died, including 5 (3.4%) from a cause other than their cancer. Three-year overall survival (OS), cancer-specific survival, disease-free survival, metastasis-free survival, and loco-regional relapse-free survival were 78.1%, 81.3%, 67.8%, 74.5%, and 88.1%, respectively. Anatomo-clinical factors predicting OS were age > 50 years (88.7 vs. 70.5%; p=0.004), diabetes history (81.2 vs. 66.7%; p=0.027), UICC N classification (100 vs. 95 vs. 77.5 vs. 68.8% for N0, N1, N2, and N3, respectively; p=0.008), the practice of a lymph node biopsy (84.2 vs. 57%; p=0.05), and UICC TNM stage III-IV (93.8 vs. 73.6% for stage I-II vs. III-IV, respectively; p=0.044). Therapeutic factors predicting OS were the number of CC courses (fewer than 4 courses: 65.8 vs. 86%; p=0.03; fewer than 5 courses: 71.5 vs. 89%; p=0.041), a weight loss > 10% during treatment (84.1 vs. 60.9%; p=0.021), and a total cumulative cisplatin dose, including IC and CC, < 380 mg/m² (64.4 vs. 87.6%; p=0.003). Radiotherapy delay and total duration did not significantly affect OS. No grade 3-4 late side effects were noted in the 127 evaluable patients (87.6%). The most common toxicity was dry mouth, which was grade 2 in 47 cases (37%) and grade 1 in 55 cases (43.3%). Conclusion: IMRT for nasopharyngeal carcinoma achieved a high loco-regional control rate over the last five years. However, distant relapses remain frequent and drive the prognosis. We identified several anatomo-clinical and therapeutic prognostic factors; therefore, high-risk patients require a more aggressive therapeutic approach, such as radiotherapy dose escalation or the addition of adjuvant chemotherapy.
Keywords: therapeutic results, prognostic factors, intensity-modulated radiotherapy, nasopharyngeal carcinoma
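The survival analysis step can be reproduced with standard tooling; the sketch below uses the lifelines library on toy follow-up data (not the study cohort) to fit a Kaplan-Meier curve and run a log-rank comparison between two prognostic groups.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Toy follow-up times (months) and event flags (1 = death observed).
t_young, e_young = [12, 30, 50, 50], [1, 1, 0, 0]   # e.g., age <= 50 group
t_old,   e_old   = [9, 18, 24, 50],  [1, 1, 1, 0]   # e.g., age > 50 group

kmf = KaplanMeierFitter()
kmf.fit(t_young, e_young, label="age <= 50")
print(kmf.survival_function_)                        # Kaplan-Meier estimate

# Log-rank test comparing the two prognostic groups.
res = logrank_test(t_young, t_old,
                   event_observed_A=e_young, event_observed_B=e_old)
print("log-rank p-value:", res.p_value)
```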
Procedia PDF Downloads 64
5712 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser
Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett
Abstract:
Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the hydrogen economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency; this concerns in particular the two-phase flow through the membrane, gas diffusion layers (GDL), and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility to analyse the processes within a PEMWE in a non-destructive, fast, and cheap in-situ way. This work describes the generation and analysis of AE data coming from a PEM water electrolyser for, to the best of our knowledge, the first time in the literature. Different experiments are carried out, each designed so that only specific physical processes occur and AE related solely to one process can be measured. Therefore, a range of experimental conditions is used to induce different flow regimes within the flow channels and GDL. The resulting AE data is first separated into different events, defined by exceeding the noise threshold. Each acoustic event consists of a number of consecutive peaks and ends when the wave diminishes below the noise threshold. For all these acoustic events, the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, peaks before the maximum, average intensity of a peak, and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal component analysis is performed on the resulting data, which orders the criteria by the eigenvalues of their covariance matrix; this can be used as an easy way of determining which criteria convey the most information about the acoustic data. The data is then ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions in this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. The AE data produced in this way is used to train a self-learning algorithm and develop an analytical tool to diagnose different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser
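A simplified sketch of the event segmentation and PCA step described above, run on a synthetic signal; the threshold, burst shapes, and feature subset are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def extract_events(signal, threshold):
    """Segment the signal into events: contiguous runs above the noise
    threshold (a simplified version of the event definition above)."""
    above = (np.abs(signal) > threshold).astype(int)
    edges = np.flatnonzero(np.diff(above))
    starts, ends = edges[::2] + 1, edges[1::2] + 1
    return [signal[s:e] for s, e in zip(starts, ends)]

def event_features(ev):
    # max peak amplitude, duration, average intensity, samples before maximum
    return [np.abs(ev).max(), len(ev), np.mean(np.abs(ev)),
            int(np.argmax(np.abs(ev)))]

rng = np.random.default_rng(1)
sig = rng.normal(0.0, 0.1, 5000)
for pos, amp, width in [(1000, 2.0, 50), (2500, 1.6, 60), (3800, 1.2, 80)]:
    sig[pos:pos + width] += np.hanning(width) * amp   # synthetic AE bursts

events = extract_events(sig, threshold=0.5)
X = StandardScaler().fit_transform([event_features(e) for e in events])
pca = PCA(n_components=2).fit(X)
print(pca.explained_variance_ratio_)  # which attributes carry the information
```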
Procedia PDF Downloads 156
5711 Modeling and Characterization of Organic LED
Authors: Bouanati Sidi Mohammed, N. E. Chabane Sari, Mostefa Kara Selma
Abstract:
Organic light-emitting diodes (OLEDs) are attracting great interest in the display technology industry due to their many advantages, such as low manufacturing cost, large-area electroluminescent displays, and various emission colors, including white light. Recently, much progress has been made in understanding the device physics of OLEDs and their basic operating principles. In OLEDs, light emission is the result of the recombination of electrons and holes in the light-emitting layer, injected from the cathode and anode, respectively. To improve luminescence efficiency, hole and electron pairs must be supplied abundantly and in balance and must recombine swiftly in the emitting layer. The aim of this paper is to model a polymer LED and an OLED made with small molecules in order to study their electrical and optical characteristics. The first simulated structure is a monolayer device, typically consisting of the poly(2-methoxy-5-(2'-ethyl)hexoxy-phenylenevinylene) (MEH-PPV) polymer sandwiched between an anode, usually an indium tin oxide (ITO) substrate, and a cathode such as Al. In the second structure, MEH-PPV is replaced by tris(8-hydroxyquinolinato)aluminum (Alq3). MEH-PPV was chosen for its solubility in common organic solvents, in conjunction with a low operating voltage for light emission and a relatively high conversion efficiency; Alq3 was chosen because it is one of the most important host materials used in OLEDs. In this simulation, the Poole-Frenkel-like mobility model and the Langevin bimolecular recombination model have been used as the transport and recombination mechanisms; these models are enabled in the ATLAS-SILVACO software. The influence of doping and thickness on the I(V) characteristics and luminescence is reported.
Keywords: organic light emitting diode, polymer light emitting diode, organic materials, hexoxy-phenylenevinylene
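For reference, the Poole-Frenkel-like mobility has the widely used form μ(E) = μ₀·exp(γ√E); the sketch below evaluates it with illustrative parameter values, not fitted MEH-PPV or Alq3 constants.

```python
import numpy as np

def poole_frenkel_mobility(e_field, mu0=1e-9, gamma=5e-4):
    """Poole-Frenkel-like field dependence: mu(E) = mu0 * exp(gamma * sqrt(E)).
    mu0 [m^2/(V s)] and gamma [(m/V)^0.5] are illustrative values only."""
    return mu0 * np.exp(gamma * np.sqrt(e_field))

for e in np.logspace(5, 8, 4):  # V/m, a typical OLED operating-field range
    print(f"E = {e:.1e} V/m  ->  mu = {poole_frenkel_mobility(e):.3e} m^2/(V s)")
```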
Procedia PDF Downloads 554
5710 Data-Centric Anomaly Detection with Diffusion Models
Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu
Abstract:
Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data with image generation via diffusion models. The approach addresses data collection challenges in real-world scenarios and points toward data augmentation through the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that with only 30% of the original number of normal images, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated via diffusion models (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.
Keywords: diffusion models, anomaly detection, data-centric, generative AI
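A heavily simplified sketch of the augmentation idea: a stand-in generator plays the role of the diffusion model, and an off-the-shelf one-class detector replaces the state-of-the-art anomaly models; everything here is illustrative, not DCADDM itself.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

def generate_normal_like(samples, n_new):
    """Stand-in for the diffusion-model generator: new 'normal' feature
    vectors are drawn near existing ones. A real pipeline would sample
    images from a diffusion model trained on the normal set and embed them."""
    idx = rng.integers(0, len(samples), n_new)
    return samples[idx] + rng.normal(0, 0.05, (n_new, samples.shape[1]))

normal = rng.normal(0, 1, (300, 16))          # reduced normal feature set
augmented = np.vstack([normal,
                       generate_normal_like(normal, 30)])  # +10% generated

detector = IsolationForest(random_state=0).fit(augmented)  # one-class model
test = np.vstack([rng.normal(0, 1, (5, 16)),  # normal test points
                  rng.normal(4, 1, (5, 16))]) # anomalous test points
print(detector.predict(test))                 # +1 = normal, -1 = anomaly
```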
Procedia PDF Downloads 82
5709 On the Solution of Boundary Value Problems Blended with Hybrid Block Methods
Authors: Kizito Ugochukwu Nwajeri
Abstract:
This paper explores the application of hybrid block methods for solving boundary value problems (BVPs), which are prevalent in fields such as science, engineering, and applied mathematics. Traditional numerical approaches, such as finite difference and shooting methods, often encounter challenges related to stability and convergence, particularly in the context of complex and nonlinear BVPs. To address these challenges, we propose a hybrid block method that integrates features from both single-step and multi-step techniques. This method allows for the simultaneous computation of multiple solution points while maintaining high accuracy. Specifically, we employ a combination of polynomial interpolation and collocation strategies to derive a system of equations that captures the behavior of the solution across the entire domain. By directly incorporating the boundary conditions into the formulation, we enhance the stability and convergence properties of the numerical solution. Furthermore, we introduce an adaptive step-size mechanism to optimize performance based on the local behavior of the solution; this adjustment allows the method to respond effectively to variations in solution behavior, improving both accuracy and computational efficiency. Numerical tests on a variety of boundary value problems demonstrate the effectiveness of the hybrid block methods, showing significant improvements in accuracy and computational efficiency compared to conventional methods. The results suggest that this hybrid block method is robust and versatile, suitable for a wide range of real-world applications, and offers a promising alternative to existing numerical techniques.
Keywords: hybrid block methods, boundary value problem, polynomial interpolation, adaptive step-size control, collocation methods
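As a point of comparison (not the proposed hybrid block method), a standard collocation-based BVP solver can be exercised on a linear test problem with a known solution:

```python
import numpy as np
from scipy.integrate import solve_bvp

# Reference collocation solve of a linear test BVP:
#   y'' + y = 0,  y(0) = 0,  y(pi/2) = 1   (exact solution: y = sin(x))
def ode(x, y):
    return np.vstack([y[1], -y[0]])          # first-order system form

def bc(ya, yb):
    return np.array([ya[0], yb[0] - 1.0])    # boundary conditions

x = np.linspace(0, np.pi / 2, 11)
y0 = np.zeros((2, x.size))                   # initial mesh guess
sol = solve_bvp(ode, bc, x, y0)
print(np.max(np.abs(sol.sol(x)[0] - np.sin(x))))  # error vs exact solution
```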
Procedia PDF Downloads 31
5708 Classification Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach
Authors: H. J. Wattimanela, U. S. Passaribu, N. T. Puspito, S. W. Indratno
Abstract:
The Banda Sea collision zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian, and Pacific plates. It is located in the eastern part of Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the mean square error (MSE). From these results, we identify the Poisson distribution of earthquakes in the BSCZ using a point process approach. The chi-square test and the Anscombe test are applied in the process of identifying a Poisson distribution in each partition area. The data used are earthquakes with magnitude ≥ 6 on the Richter scale for the period 1964-2013, sourced from BMKG Jakarta. This research is expected to help Moluccas Province and the surrounding local governments in preparing spatial planning documents related to disaster management.
Keywords: Moluccas, Banda Sea collision zone, earthquakes, mean square error, Poisson distribution, chi-square test, Anscombe test
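The chi-square goodness-of-fit step can be sketched as follows on toy yearly counts (not the BMKG data); the tail-binning details of a rigorous test are omitted.

```python
import numpy as np
from scipy import stats

# Toy yearly counts of M >= 6 earthquakes in one partition area.
counts = np.array([0, 1, 2, 1, 0, 3, 1, 0, 2, 1])
lam = counts.mean()                      # rate estimate (lambda)

# Observed vs. expected frequencies of count values 0..max under Poisson(lam).
k = np.arange(counts.max() + 1)
observed = np.bincount(counts, minlength=k.size).astype(float)
expected = stats.poisson.pmf(k, lam) * counts.size
expected *= observed.sum() / expected.sum()   # match totals for the test

chi2, p = stats.chisquare(observed, expected, ddof=1)  # 1 fitted parameter
print("lambda =", lam, " chi2 =", chi2, " p =", p)
```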
Procedia PDF Downloads 300
5707 Transdermal Medicated-Layered Extended-Release Patches for Co-delivery of Carbamazepine and Pyridoxine
Authors: Sarah K. Amer, Walaa Alaa
Abstract:
Epilepsy is an important cause of mortality and morbidity, according to WHO statistics. It is characterized by the presence of frequent seizures occurring more than 24 hours apart. Carbamazepine (CBZ) is considered a first-line treatment for epilepsy. However, reports have shown that oral CBZ formulations fail to achieve optimal systemic delivery, minimize side effects, and enhance patient compliance. Moreover, the literature has highlighted the lack of a therapeutically efficient transdermal CBZ formulation and the need for one, owing to the ease and convenience of its application and its potential to attain higher bioavailability and more extended release profiles than conventional oral CBZ tablets. This work aims to prepare CBZ microspheres (MS) embedded in a transdermal gel containing pyridoxine (vitamin B6) for co-delivery. The MS were prepared by the emulsion-solvent diffusion method using Eudragit S as the core-forming polymer together with hydroxypropyl methylcellulose (HPMC). The MS appeared spherical and porous in nature, offering a large surface area and a high entrapment efficiency of CBZ. The transdermal gel was prepared by the solvent-evaporation technique using HPMC, which offered high entrapment efficiency, and Eudragit S, which provided an extended-release profile; polyethylene glycol, Span 80, and pyridoxine were also added. Data indicate that combinations of CBZ with pyridoxine can reduce epileptic seizures without affecting motor coordination, and extended-release profiles were evident for this system. The patches were furthermore tested for thickness, moisture content, folding endurance, spreadability, and viscosity. This novel pharmaceutical formulation could greatly improve seizure control by offering better therapeutic effects.
Keywords: epilepsy, carbamazepine, pyridoxine, transdermal
Procedia PDF Downloads 59
5706 Errors in Selected Writings of EFL Students: A Study of Department of English, Taraba State University, Jalingo, Nigeria
Authors: Joy Aworookoroh
Abstract:
Writing is one of the active skills in language learning. Students of English as a foreign language are expected to write efficiently and proficiently in the language; however, there are usually challenges to optimal performance and competence in writing. Errors in a foreign language learning situation, on the other hand, are more positive than negative, as they provide a basis for addressing the students' limitations. This paper investigates the situation in the Department of English, Taraba State University, Jalingo. Students were administered a descriptive writing test across different levels of study. The target students are multilingual, with an L1 of Kuteb, Hausa, or Jukun. The essays were assessed to identify the different kinds of errors in them, alongside their classification. Errors of correctness, clarity, engagement, and delivery were identified. The study also found that the degree of errors decreases with the students' experience of and exposure to the EFL classroom.
Keywords: errors, writings, descriptive essay, multilingual
Procedia PDF Downloads 63
5705 Rethinking Riba in an Agency Theoretic Framework: Islamic Banking and Finance beyond Sophistry
Authors: Muhammad Arsalan
Abstract:
The efficiency of a financial intermediation system is assessed by its ability to achieve allocative efficiency, asset transformation, and the subsequent economic development. Islamic Banking and Finance (IBF) was conceived to serve as an alternative financial intermediation system adherent to the injunctions of Islam. A critical appraisal of the state of contemporary IBF reveals that it neither fulfills the aspirations of Islamic rhetoric nor is efficient in terms of asset transformation and economic development. This paper is an intuitive pursuit to explore the economic rationale of the established principles of IBF and the reasons for the persistent divergence that leaves IBF accused of ruses and sophistry. Disentangling the varying viewpoints, the underdevelopment of IBF is attributed to the misinterpretation of Riba, which has been explicated through a narrow fiqhi and legally deterministic approach. The paper presents a critical account of how an incorrect conceptualization of the key injunction on Riba steered the flawed institutionalization of an Islamic financial intermediation system. It also emphasizes the incorrect interpretation of the ontological and epistemological sources of Islamic law (primarily on Riba), which explains the perennial economic underdevelopment of the Muslim world. Deeming a collaborative and dynamic Ijtihad to be the elixir, this paper insists on the exigency of redefining Riba with a definition that incorporates modern modes of economic cooperation and the contemporary financial intermediation ecosystem. Finally, Riba is articulated in an agency-theoretic framework to eschew the expropriation of wealth and assure the protection of property rights, aimed at realizing the twin goals of (a) Shari'ah adherence in true spirit and (b) financial and economic development of the Muslim world.
Keywords: agency theory, financial intermediation, Islamic banking and finance, ijtihad, economic development, Riba, information asymmetry
Procedia PDF Downloads 139
5704 Simulation-Based Evaluation of Indoor Air Quality and Comfort Control in Non-Residential Buildings
Authors: Torsten Schwan, Rene Unger
Abstract:
Simulation of thermal and electrical building performance is increasingly becoming part of an integrative planning process. Growing requirements on energy efficiency, the integration of volatile renewable energy, and smart control and storage management often pose tremendous challenges for building engineers and architects. This mainly affects commercial or non-residential buildings, whose energy consumption characteristics differ significantly from residential ones. This work focuses on the many-objective optimization problem of indoor air quality and comfort, especially in non-residential buildings. Based on a brief description of the intermediate dependencies between different requirements on indoor air treatment, it extends existing Modelica-based building physics models with additional system states to adequately represent indoor air conditions. Interfaces to corresponding HVAC (heating, ventilation, and air conditioning) system and control models enable closed-loop analyses of occupants' requirements and energy efficiency, as well as profitability aspects. A complex application scenario of a nearly-zero-energy school building shows the advantages of the presented evaluation process for engineers and architects. In this way, clear identification of the air quality requirements in individual rooms, together with a realistic model-based description of occupants' behavior, helps to optimize the HVAC system already in early design stages. Building planning processes can be greatly improved and accelerated by increasing the integration of advanced simulation methods, which mainly provide suitable answers to engineers' and architects' questions regarding the ever more complex variety of suitable energy supply solutions.
Keywords: indoor air quality, dynamic simulation, energy efficient control, non-residential buildings
Procedia PDF Downloads 232
5703 Detection of COVID-19 Cases From X-Ray Images Using Capsule-Based Network
Authors: Donya Ashtiani Haghighi, Amirali Baniasadi
Abstract:
Coronavirus disease (COVID-19) has spread abruptly all over the world since the end of 2019. Computed tomography (CT) scans and X-ray images are used to detect this disease. Different Deep Neural Network (DNN)-based diagnosis solutions have been developed, mainly based on Convolutional Neural Networks (CNNs), to accelerate the identification of COVID-19 cases. However, CNNs lose important information in intermediate layers and require large datasets. In this paper, a Capsule Network (CapsNet) is used instead, as Capsule Networks perform better than CNNs on small datasets. An accuracy of 0.9885, an f1-score of 0.9883, a precision of 0.9859, a recall of 0.9908, and an Area Under the Curve (AUC) of 0.9948 are achieved with the capsule-based framework after hyperparameter tuning. Moreover, different dropout rates are investigated to decrease overfitting; a dropout rate of 0.1 shows the best results. Finally, we remove one convolution layer and decrease the number of trainable parameters to 146,752, which is a promising result.
Keywords: capsule network, dropout, hyperparameter tuning, classification
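For reference, the characteristic capsule nonlinearity (the "squash" function) is simple to write down; the sketch below is a generic CapsNet ingredient, not the paper's full architecture.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-9):
    """Capsule 'squash' nonlinearity: keeps a vector's direction but maps
    its length into [0, 1), so the length can act as an existence
    probability for the entity the capsule represents."""
    norm2 = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)

caps = np.random.randn(4, 16)        # 4 capsules, 16-D pose vectors
v = squash(caps)
print(np.linalg.norm(v, axis=-1))    # all output lengths are below 1
```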
Procedia PDF Downloads 78
5702 The Potential in the Use of Building Information Modelling and Life-Cycle Assessment for Retrofitting Buildings: A Study Based on Interviews with Experts in Both Fields
Authors: Alex Gonzalez Caceres, Jan Karlshøj, Tor Arvid Vik
Abstract:
The life cycle of residential buildings is expected to span several decades, and 40% of European residential buildings have inefficient energy conservation measures. Existing buildings represent 20-40% of energy use and CO₂ emissions. Since net-zero-energy buildings are a short-term goal (to be achieved by EU countries after 2020), it is necessary to plan the next logical step, which is to prepare the existing, outdated building stock for retrofitting into energy-efficient buildings. To accomplish this, two specialized and widespread tools can be used: Building Information Modelling (BIM) and life-cycle assessment (LCA). BIM and LCA are used by a variety of disciplines, and both are able to represent and analyze constructions at different stages. The combination of these technologies could greatly improve retrofitting techniques by incorporating the carbon footprint and introducing a single database source for different material analyses; added to this is the possibility of considering different analysis approaches, such as costs and energy savings. These measures are expected to enrich decision-making. The methodology is based on two main activities. The first task involves data collection, accomplished through a literature review and interviews with experts in the retrofitting field and in BIM technologies; its results are presented as an evaluation checklist of BIM's ability to manage data and improve decision-making in retrofitting projects. The last activity involves an evaluation, using the results of the previous tasks, of how far the IFC format can support the requirements of each specialist and its use by third-party software. The results indicate that BIM/LCA has great potential to improve the retrofitting process in existing buildings, but some modifications must be made in order to meet the requirements of the specialists for both retrofitting and LCA evaluation.
Keywords: retrofitting, BIM, LCA, energy efficiency
Procedia PDF Downloads 220
5701 An Automated Approach to Consolidate Galileo System Availability
Authors: Marie Bieber, Fabrice Cosson, Olivier Schmitt
Abstract:
Europe's Global Navigation Satellite System, Galileo, provides worldwide positioning and navigation services. The satellites in space are only one part of the Galileo system; an extensive ground infrastructure is essential to oversee the satellites and ensure accurate navigation signals. High reliability and availability of the entire Galileo system are crucial to continuously provide positioning information of high quality to users. Outages are tracked, and operational availability is regularly assessed. A highly flexible and adaptive tool has been developed to automate the Galileo system availability analysis. Not only does it enable quick availability consolidation, but it also provides first steps towards improving the quality of the maintenance-ticket data used for the analysis. This includes data import and data preparation, with a focus on processing the strings used for classification and on identifying faulty data. Furthermore, the tool is able to handle small amounts of data, which is a major constraint when the aim is to provide accurate statistics.
Keywords: availability, data quality, system performance, Galileo, aerospace
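A minimal sketch of the availability consolidation step, assuming tickets have already been reduced to per-element outage intervals; the schema and values are hypothetical, not Galileo data.

```python
import pandas as pd

# Hypothetical maintenance tickets; real inputs would first be imported and
# cleaned (string normalization, faulty-record filtering) as described above.
tickets = pd.DataFrame({
    "element": ["uplink-1", "uplink-1", "clock-2"],
    "outage_start": pd.to_datetime(["2023-01-03 02:00", "2023-02-10 11:30",
                                    "2023-01-20 08:00"]),
    "outage_end":   pd.to_datetime(["2023-01-03 06:00", "2023-02-10 12:00",
                                    "2023-01-22 08:00"]),
})

# Operational availability = 1 - downtime / observation period.
period = pd.Timestamp("2023-03-01") - pd.Timestamp("2023-01-01")
downtime = (tickets["outage_end"] - tickets["outage_start"]).groupby(
    tickets["element"]).sum()
availability = 1.0 - downtime / period
print(availability.round(4))
```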
Procedia PDF Downloads 167
5700 Design and Thermal Analysis of Power Harvesting System of a Hexagonal Shaped Small Spacecraft
Authors: Mansa Radhakrishnan, Anwar Ali, Muhammad Rizwan Mughal
Abstract:
Many universities around the world are working on modular, low-budget small-spacecraft architectures to reduce the development cost of the overall system. This paper focuses on the design of a modular solar power harvesting system for a hexagonal-shaped small satellite. The designed solar power harvesting system is composed of solar panels and power converter subsystems. The solar panel consists of solar cells mounted on the external face of a printed circuit board (PCB), while the electronic components for power conversion are mounted on the interior side of the same PCB. The solar panel, with dimensions of 16.5 cm × 99 cm, is composed of 36 solar cells (each 4 cm × 7 cm) divided into four parallel banks, where each bank consists of 9 series-connected solar cells. The output voltage of a single solar cell is 2.14 V, and the combined output voltage of 9 series-connected cells is around 19.3 V. The output voltage of the solar panel is boosted to the satellite power distribution bus voltage level (28 V) by a boost converter operating with a constant-voltage maximum power point tracking (MPPT) technique. The solar panel module is an eight-layer PCB with a coil embedded in 4 internal layers; this coil is used to control the attitude of the spacecraft and consumes power to generate a magnetic field that rotates the spacecraft. As the power converter and distribution subsystem components are mounted on the internal layers of the PCB, it is mandatory to perform a thermal analysis to ensure that the overall module temperature stays within thermal safety limits. The main focus of the overall design is on compactness, miniaturization, and efficiency enhancement.
Keywords: small satellites, power subsystem, efficiency, MPPT
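The steady-state boost relation and the constant-voltage MPPT loop can be sketched as follows; the reference-voltage fraction and step size are illustrative assumptions, not the flight values.

```python
def boost_duty_cycle(v_in, v_out):
    """Ideal continuous-conduction boost relation: V_out = V_in / (1 - D)."""
    return 1.0 - v_in / v_out

def constant_voltage_mppt(v_panel, v_ref=19.3 * 0.8, duty=0.3, step=0.005):
    """One iteration of constant-voltage MPPT: nudge the duty cycle so the
    panel voltage tracks a fixed reference (a sketch; the reference fraction
    and loop gain are assumptions). Raising the duty loads the panel harder,
    pulling its voltage down."""
    if v_panel > v_ref:
        duty += step
    elif v_panel < v_ref:
        duty -= step
    return min(max(duty, 0.0), 0.95)

print(boost_duty_cycle(19.3, 28.0))   # ~0.31 for the 19.3 V -> 28 V bus
```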
Procedia PDF Downloads 74
5699 Enhance Construction Visual As-Built Schedule Management Using BIM Technology
Authors: Shu-Hui Jan, Hui-Ping Tserng, Shih-Ping Ho
Abstract:
Construction project control attempts to obtain real-time as-built schedule information and to eliminate project delays by effectively enhancing dynamic schedule control and management. Suitable platforms for visually enhancing the as-built schedule during the construction phase are necessary and important for general contractors. As the application of building information modeling (BIM) becomes more common, schedule management integrated with the BIM approach becomes essential to enhance visual construction management implementation for the general contractor during the construction phase. To enhance visualization of the updated as-built schedule for the general contractor, this study presents a novel system called the Construction BIM-assisted Schedule Management (ConBIM-SM) system for general contractors in
Keywords: building information modeling (BIM), construction schedule management, as-built schedule management, BIM schedule updating mechanism
Procedia PDF Downloads 375
5698 Integrated Design of Froth Flotation Process in Sludge Oil Recovery Using Cavitation Nanobubbles for Increase the Efficiency and High Viscose Compatibility
Authors: Yolla Miranda, Marini Altyra, Karina Kalmapuspita Imas
Abstract:
Oily sludge wastes accumulate throughout upstream and downstream petroleum industry processes. This sludge still contains oil that can be used for energy, and recycling it is a way of handling it that reduces its toxicity, with a high probability of recovering the remaining oil, around 20% of its volume. Froth flotation is a common chemical-unit method for separating fine solid particles from an aqueous suspension. Its basis is the capture of oil droplets or small solids by air bubbles in an aqueous slurry, followed by their levitation and collection in a froth layer. The method is known for its modest energy requirements and ease of application, but low efficiency and incompatibility with high viscosity are the biggest problems of a froth flotation unit. This study presents a design that first manages the high viscosity of the sludge and then feeds the froth flotation unit, which includes a cavitation tube that turns the bubbles into nano-scale particles. Recovery in flotation starts with the collision and adhesion of hydrophobic particles to the air bubbles, followed by transportation of the hydrophobic particle-bubble aggregate from the collection zone to the froth zone, drainage and enrichment of the froth, and finally its overflow removal from the cell top. Effective particle separation by froth flotation relies on the efficient capture of hydrophobic particles by air bubbles in three steps, the most important of which is collision: decreasing the bubble size increases the collision effect and makes the process more efficient. The pre-treatment, froth flotation, and cavitation tube are integrated with one another, and the design shows the integrated unit and its process.
Keywords: sludge oil recovery, froth flotation, cavitation tube, nanobubbles, high viscosity
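To illustrate why smaller bubbles help, a common first-order estimate (the Stokes-regime Yoon-Luttrell collision probability, not a result from this paper) can be evaluated across bubble sizes:

```python
def collision_probability(d_particle_um, d_bubble_um):
    """Stokes-regime Yoon-Luttrell estimate, P_c = 1.5 * (dp / db)^2 — a
    common first-order bubble-particle collision model, used here only to
    illustrate why shrinking bubbles toward the micro/nano scale helps."""
    return min(1.0, 1.5 * (d_particle_um / d_bubble_um) ** 2)

for d_bubble in [1000.0, 100.0, 10.0]:   # conventional -> cavitation bubbles
    pc = collision_probability(10.0, d_bubble)   # 10 um hydrophobic particle
    print(f"db = {d_bubble:>6.0f} um  ->  Pc = {pc:.4f}")
```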
Procedia PDF Downloads 378
5697 Seismic Vulnerability Analysis of Arch Dam Based on Response Surface Method
Authors: Serges Mendomo Meye, Li Guowei, Shen Zhenzhong
Abstract:
Earthquake loading is one of the main threats to dam safety. Once a dam is damaged, it brings huge losses of life and property to the country and its people; therefore, researching the seismic safety of dams is very important. Due to complex foundation conditions, high fortification intensity, and high scientific and technological content, reasonable methods are needed to evaluate the seismic safety performance of concrete arch dams built and under construction in strong earthquake areas. Structural seismic vulnerability analysis can predict the probability of structural failure at all levels under earthquakes of different intensities, which provides a scientific basis for reasonable seismic safety evaluation and decision-making. In this paper, the response surface method (RSM) is applied to the seismic vulnerability analysis of arch dams, which improves the efficiency of the vulnerability analysis. Material-seismic intensity samples are established based on the central composite design method. The response surface model, with arch crown displacement as the performance index, is obtained by finite element (FE) calculation of the samples, and the accuracy of the response surface model is then verified. To obtain the seismic vulnerability curves, the seismic intensity measure Sa(T1) is chosen to range from 0.1 to 1.2 g, with an interval of 0.1 g, giving a total of 12 intensity levels. For each seismic intensity level, the arch crown displacement corresponding to 100 sets of different material samples can be calculated by algebraic evaluation of the response surface model, which avoids 1,200 nonlinear dynamic calculations of the arch dam; thus, the efficiency of the vulnerability analysis is greatly improved.
Keywords: high concrete arch dam, performance index, response surface method, seismic vulnerability analysis, vector-valued intensity measure
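A sketch of the surrogate idea, with a quadratic response surface standing in for the FE model; the design bounds, the stand-in displacement function, and the failure limit are all illustrative assumptions.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Toy design samples: (elastic modulus [Pa], damping ratio, intensity [g]).
X = rng.uniform([20e9, 0.03, 0.1], [40e9, 0.07, 1.2], size=(60, 3))
y = 1e-11 * X[:, 2] ** 1.5 * (45e9 - X[:, 0])   # stand-in crest displacement [m]

# Quadratic response surface: cheap algebra replaces repeated FE runs.
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

# Fragility sweep: 12 intensity levels x 100 material samples, no FE calls.
for im in np.arange(0.1, 1.3, 0.1):
    mats = rng.uniform([20e9, 0.03], [40e9, 0.07], size=(100, 2))
    disp = rsm.predict(np.column_stack([mats, np.full(100, im)]))
    print(f"IM={im:.1f}g  P(disp > 0.05 m) = {(disp > 0.05).mean():.2f}")
```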
Procedia PDF Downloads 240
5696 Adaptive Energy-Aware Routing (AEAR) for Optimized Performance in Resource-Constrained Wireless Sensor Networks
Authors: Innocent Uzougbo Onwuegbuzie
Abstract:
Wireless Sensor Networks (WSNs) are crucial for numerous applications, yet they face significant challenges due to resource constraints such as limited power and memory. Traditional routing algorithms like Dijkstra, Ad hoc On-Demand Distance Vector (AODV), and Bellman-Ford, while effective in path establishment and discovery, are not optimized for the unique demands of WSNs due to their large memory footprint and power consumption. This paper introduces the Adaptive Energy-Aware Routing (AEAR) model, a solution designed to address these limitations. AEAR integrates reactive route discovery, localized decision-making using geographic information, energy-aware metrics, and dynamic adaptation to provide a robust and efficient routing strategy. We present a detailed comparative analysis using a dataset of 50 sensor nodes, evaluating power consumption, memory footprint, and path cost across AEAR, Dijkstra, AODV, and Bellman-Ford algorithms. Our results demonstrate that AEAR significantly reduces power consumption and memory usage while optimizing path weight. This improvement is achieved through adaptive mechanisms that balance energy efficiency and link quality, ensuring prolonged network lifespan and reliable communication. The AEAR model's superior performance underlines its potential as a viable routing solution for energy-constrained WSN environments, paving the way for more sustainable and resilient sensor network deployments.
Keywords: wireless sensor networks (WSNs), adaptive energy-aware routing (AEAR), routing algorithms, energy efficiency, network lifespan
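As an illustration of an energy-aware metric (AEAR's exact cost function and geographic rules are not spelled out in the abstract), a Dijkstra-style search can blend link weight with the receiver's residual energy:

```python
import heapq

def energy_aware_path(graph, energy, src, dst, alpha=0.5):
    """Dijkstra-style search whose edge cost blends link weight with the
    receiver's depleted energy (illustrative only; not AEAR itself)."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            cost = alpha * w + (1 - alpha) * (1.0 - energy[v])  # penalize low energy
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

graph = {"A": [("B", 1.0), ("C", 1.0)], "B": [("D", 1.0)],
         "C": [("D", 1.0)], "D": []}
energy = {"A": 1.0, "B": 0.1, "C": 0.9, "D": 0.8}   # residual energy, 0..1
print(energy_aware_path(graph, energy, "A", "D"))    # routes around node B
```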
Procedia PDF Downloads 36
5695 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit
Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic
Abstract:
Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, have established their wide use today in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required; thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where they would otherwise superimpose within a single-energy peak and could potentially compromise the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when identifying radionuclides and their activity concentrations, where high precision is a necessity. In measurements of this nature, in order to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the equipment used. However, experimental determination of the response, i.e., the efficiency curves for a given detector-sample configuration and geometry, is not always easy and requires a certain set of reference calibration sources to account for and cover broader energy ranges of interest. To overcome these difficulties, many researchers have turned towards the application of software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if not properly taken into account. In this study, the optimisation of two HPGe detectors through the implementation of the Geant4 toolkit developed by CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended-range detector (XtRa HPGe, CANBERRA) and a broad-energy-range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The acquired results for both detectors displayed good agreement with the experimental data, falling within an average statistical uncertainty of ∼ 4.6% for the XtRa and ∼ 1.8% for the BEGe detector within the energy ranges of 59.4−1836.1 keV and 59.4−1212.9 keV, respectively.
Keywords: HPGe detector, γ-spectrometry, efficiency, Geant4 simulation, Monte Carlo method
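The FEP efficiency itself reduces to net peak counts over emitted photons; the sketch below computes it from a toy simulated spectrum with a simple linear background estimate (all values illustrative).

```python
import numpy as np

def fep_efficiency(spectrum, peak_lo, peak_hi, n_emitted):
    """Full-energy-peak efficiency: net counts inside the peak window
    divided by the number of photons emitted by the (simulated) source."""
    window = spectrum[peak_lo:peak_hi]
    # Linear background estimated from the channels flanking the peak.
    bg = 0.5 * (spectrum[peak_lo - 1] + spectrum[peak_hi]) * len(window)
    return (window.sum() - bg) / n_emitted

rng = np.random.default_rng(2)
spec = rng.poisson(50, 4096).astype(float)      # flat toy background
spec[660:670] += rng.poisson(900, 10)           # toy photopeak region
print(f"FEP efficiency ~ {fep_efficiency(spec, 660, 670, 1_000_000):.4f}")
```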
Procedia PDF Downloads 120
5694 Efficient Feature Fusion for Noise Iris in Unconstrained Environment
Authors: Yao-Hong Tsai
Abstract:
This paper presents an efficient fusion algorithm for iris images to generate stable features for recognition in unconstrained environments. Recently, iris recognition systems have focused on real scenarios in daily life that do not rely on the subject's cooperation. Under large variations in the environment, the objective of this paper is to combine information from multiple images of the same iris. The result of image fusion is a new image that is more stable for further iris recognition than each original noisy iris image. A wavelet-based approach for multi-resolution image fusion is applied in the fusion process. Detection of the iris image is based on the AdaBoost algorithm, and a local binary pattern (LBP) histogram is then applied to texture classification with a weighting scheme. Experiments showed that the features generated by the proposed fusion algorithm can improve the performance of a verification system based on iris recognition.
Keywords: image fusion, iris recognition, local binary pattern, wavelet
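A common wavelet fusion rule (average the approximation band, keep the max-magnitude detail coefficients) can be sketched with PyWavelets; the paper's exact rule may differ.

```python
import numpy as np
import pywt

def fuse_iris_images(img_a, img_b, wavelet="db2", level=2):
    """Multi-resolution wavelet fusion: average the approximation band and
    keep the stronger detail coefficient from either image (one common
    fusion rule, used here as an illustrative assumption)."""
    ca = pywt.wavedec2(img_a.astype(float), wavelet, level=level)
    cb = pywt.wavedec2(img_b.astype(float), wavelet, level=level)
    fused = [(ca[0] + cb[0]) / 2.0]                 # approximation: average
    for da, db in zip(ca[1:], cb[1:]):              # details: max magnitude
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(da, db)))
    return pywt.waverec2(fused, wavelet)

a = np.random.rand(64, 64)      # stand-in noisy iris captures
b = np.random.rand(64, 64)
print(fuse_iris_images(a, b).shape)
```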
Procedia PDF Downloads 367
5693 Improving University Operations with Data Mining: Predicting Student Performance
Authors: Mladen Dragičević, Mirjana Pejić Bach, Vanja Šimičević
Abstract:
The purpose of this paper is to develop models for predicting student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. For data collection, an anonymous survey was carried out among the final-year undergraduate student population using a random sampling method. Decision trees were created, of which the two most successful in predicting student success were chosen based on two criteria: grade point average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have been shown to be a good method for classifying student success, and they could be improved further by increasing the survey sample and developing specialized decision trees for each type of college. These methods have great potential for use in decision support systems.
Keywords: data mining, knowledge discovery in databases, prediction models, student success
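A minimal sketch of the decision-tree step, with hypothetical survey features (the real predictors come from the survey and are not listed in the abstract):

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical survey records; columns are illustrative placeholders.
df = pd.DataFrame({
    "entrance_score": [78, 55, 90, 62, 85, 48, 70, 93],
    "weekly_study_hours": [12, 4, 15, 6, 10, 3, 8, 18],
    "employed": [0, 1, 0, 1, 0, 1, 1, 0],
    "on_time_degree": [1, 0, 1, 0, 1, 0, 1, 1],   # time-to-degree target
})

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(df.drop(columns="on_time_degree"), df["on_time_degree"])
print(export_text(tree, feature_names=list(df.columns[:-1])))  # readable rules
```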
Procedia PDF Downloads 407
5692 Oil Pollution Analysis of the Ecuadorian Rainforest Using Remote Sensing Methods
Authors: Juan Heredia, Naci Dilekli
Abstract:
The Ecuadorian Rainforest has been polluted for almost 60 years with little to no oversight, law, or regulation. The consequences have been vast environmental damage, such as pollution and deforestation, as well as sickness and the death of many people and animals. The aim of this paper is to quantify and localize the polluted zones, which has not been done before and is the first step toward remediation. To approach this problem, multi-spectral remote sensing imagery was analyzed using a novel algorithm developed for this study, based on four normalized indices available in the literature. The algorithm classifies each pixel as polluted or healthy. The results of this study include a new algorithm for pixel classification and a quantification of the polluted area in the selected image; these results were then validated against ground control points found in the literature. The main conclusion of this work is that polluted vegetation can be identified using hyperspectral images. Future work comprises environmental remediation, in-situ tests, and more extensive results that would inform new policymaking.
Keywords: remote sensing, oil pollution quantification, Amazon forest, hyperspectral remote sensing
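In the spirit of the index-based classification (the paper combines four normalized indices; a single NDVI rule with an assumed cutoff is used here purely for illustration):

```python
import numpy as np

def classify_pollution(nir, red, ndvi_healthy=0.6):
    """Per-pixel rule: vegetation pixels with depressed NDVI are flagged as
    potentially polluted. The 0.6 cutoff and single-index rule are
    illustrative assumptions, not the paper's four-index algorithm."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    vegetation = ndvi > 0.2                   # mask out water / bare soil
    polluted = vegetation & (ndvi < ndvi_healthy)
    return ndvi, polluted

nir = np.random.uniform(0.2, 0.9, (100, 100))   # stand-in reflectance bands
red = np.random.uniform(0.05, 0.4, (100, 100))
ndvi, polluted = classify_pollution(nir, red)
print("flagged pixels:", int(polluted.sum()))
```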
Procedia PDF Downloads 163
5691 Fabrication and Characterization of Folic Acid-Grafted-Thiomer Enveloped Liposomes for Enhanced Oral Bioavailability of Docetaxel
Authors: Farhan Sohail, Gul Shahnaz Irshad Hussain, Shoaib Sarwar, Ibrahim Javed, Zajif Hussain, Akhtar Nadhman
Abstract:
The present study aimed to develop a hybrid nanocarrier (NC) system with enhanced membrane permeability and bioavailability for targeted delivery of docetaxel (DTX) in breast cancer. Hybrid NCs based on folic acid (FA)-grafted thiolated chitosan (TCS)-enveloped liposomes were prepared with DTX and evaluated in vitro and in vivo for enhanced permeability and bioavailability. Physicochemical characterization of the NCs, including particle size, morphology, zeta potential, FTIR, DSC, PXRD, encapsulation efficiency, and drug release, was performed in vitro. Permeation enhancement and P-gp inhibition were assessed by the everted sac method on freshly excised rat intestine, which indicated that permeation was enhanced 5-fold compared to pure DTX and that the hybrid NCs strongly inhibited P-gp activity. In-vitro cytotoxicity and tumor targeting were evaluated using the MDA-MB-231 cell line. A 3-month stability study showed improved stability of the FA-TCS-enveloped liposomes in terms of particle size, zeta potential, and encapsulation efficiency compared to TCS NPs and plain liposomes. The pharmacokinetic study was performed in vivo using rabbits: oral bioavailability and AUC0-96 increased 10.07-fold with the hybrid NCs compared to the positive control, and the half-life (t1/2) increased 4-fold (58.76 h vs. 17.72 h for the positive control). In conclusion, FA-TCS-enveloped liposomes have strong potential to enhance the permeability and bioavailability of hydrophobic drugs after oral administration, as well as tumor targeting.
Keywords: docetaxel, coated liposome, permeation enhancement, oral bioavailability
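The reported pharmacokinetic quantities follow from standard non-compartmental calculations; the sketch below computes a trapezoidal AUC and a terminal half-life on a toy plasma profile, not the rabbit data.

```python
import numpy as np

def auc_trapezoid(t, c):
    """AUC(0-t) by the linear trapezoidal rule, the standard
    non-compartmental estimate behind quantities like AUC0-96."""
    return float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))

def half_life(t, c):
    """Terminal half-life from the slope of log concentration vs. time."""
    k = -np.polyfit(t, np.log(c), 1)[0]   # elimination rate constant (1/h)
    return np.log(2) / k

t = np.array([1, 2, 4, 8, 24, 48, 96], dtype=float)  # sampling times (h)
c = 100.0 * np.exp(-0.03 * t)                        # toy plasma profile
print("AUC0-96 =", round(auc_trapezoid(t, c), 1),
      " t1/2 =", round(half_life(t, c), 1), "h")
```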
Procedia PDF Downloads 408
5690 A Study of the Performance Parameter for Recommendation Algorithm Evaluation
Authors: C. Rana, S. K. Jain
Abstract:
The enormous amount of Web data has challenged its efficient usage in the past few years. A range of techniques is applied to tackle this problem, prominent among them personalization and recommender systems; in fact, these are the tools that assist users in finding relevant information on the web, and most e-commerce websites apply such tools in one way or another. In the past decade, a large number of recommendation algorithms have been proposed, yet there has not been much research into the evaluation criteria for these algorithms. The traditional accuracy and classification metrics are still used for evaluation, which provide only a static view. This paper studies how the evolution of user preference over a period of time can be mapped in a recommender system using a new evaluation methodology that explicitly uses the time dimension. We also present the different types of experimental setups that are generally used for recommender system evaluation. Furthermore, an overview of major accuracy metrics, as well as of metrics that go beyond the scope of accuracy as researched in the past few years, is discussed in detail.
Keywords: collaborative filtering, data mining, evolutionary, clustering, algorithm, recommender systems
Procedia PDF Downloads 414
5689 Assessing Spatio-Temporal Growth of Kochi City Using Remote Sensing Data
Authors: Navya Saira George, Patroba Achola Odera
Abstract:
This study aims to determine the spatio-temporal expansion of Kochi City, situated on the west coast of Kerala State in India. Remote sensing and GIS techniques have been used to determine the land use/cover and urban expansion of the city. Classification of Landsat images from the years 1973, 1988, 2002, and 2018 has been used to reproduce a visual story of the growth of the city over a period of 45 years. An accuracy range of 0.79 to 0.86 was achieved, with a kappa coefficient range of 0.69 to 0.80. The results show that the areas covered by vegetation and water bodies decreased progressively from 53.0 to 30.1% and from 34.1 to 26.2%, respectively, while built-up areas increased steadily from 12.5 to 42.2% over the entire study period (1973-2018). The shift in land use from agriculture to non-agriculture may be attributed to the land reforms since the 1980s.
Keywords: geographical information systems, Kochi City, land use/cover, remote sensing, urban sprawl
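The reported accuracy and kappa statistics are standard classification-validation metrics; a minimal sketch on toy reference/classified labels:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, accuracy_score

# Toy reference vs. classified labels for validation pixels
# (0 = water, 1 = vegetation, 2 = built-up).
reference  = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 1])
classified = np.array([0, 0, 1, 1, 2, 2, 2, 2, 1, 1])

print("overall accuracy:", accuracy_score(reference, classified))
print("kappa coefficient:", cohen_kappa_score(reference, classified))
```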
Procedia PDF Downloads 129
5688 A Hybrid Model Tree and Logistic Regression Model for Prediction of Soil Shear Strength in Clay
Authors: Ehsan Mehryaar, Seyed Armin Motahari Tabari
Abstract:
Without a doubt, soil shear strength is the most important property of soil: the majority of fatal and catastrophic geological accidents are related to shear-strength failure of the soil, so its prediction is a matter of high importance. However, acquiring the shear strength is usually a cumbersome task that may require complicated laboratory testing; predicting it from common, easy-to-obtain soil properties can therefore simplify projects substantially. In this paper, a hybrid model based on the classification and regression tree (CART) algorithm and logistic regression is proposed, in which each leaf of the tree is an independent regression model. A database of 189 points for clay soil, including moisture content, liquid limit, plastic limit, clay content, and shear strength, was collected. The performance of the developed model was compared to existing models and equations using the root mean squared error and the coefficient of correlation.
Keywords: model tree, CART, logistic regression, soil shear strength
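A sketch of the model-tree idea: a CART tree partitions the feature space and an independent model is fitted in each leaf. Plain linear leaves are used here for the continuous shear-strength target, whereas the paper pairs the tree with logistic regression; all data below are synthetic.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

class SimpleModelTree:
    """Model tree sketch: CART partitions the feature space, then an
    independent linear model is fitted in every leaf (illustrative only)."""
    def __init__(self, max_depth=2):
        self.tree = DecisionTreeRegressor(max_depth=max_depth, random_state=0)
        self.leaf_models = {}

    def fit(self, X, y):
        self.tree.fit(X, y)
        leaves = self.tree.apply(X)
        for leaf in np.unique(leaves):
            idx = leaves == leaf
            self.leaf_models[leaf] = LinearRegression().fit(X[idx], y[idx])
        return self

    def predict(self, X):
        leaves = self.tree.apply(X)
        return np.array([self.leaf_models[leaf].predict(x.reshape(1, -1))[0]
                         for leaf, x in zip(leaves, X)])

rng = np.random.default_rng(3)
X = rng.uniform([10, 30, 20], [40, 60, 50], size=(60, 3))  # w%, LL, clay %
y = 5 + 0.8 * X[:, 1] - 0.5 * X[:, 0] + rng.normal(0, 1, 60)
model = SimpleModelTree().fit(X, y)
print(np.corrcoef(y, model.predict(X))[0, 1])   # coefficient of correlation
```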
Procedia PDF Downloads 197