Search results for: data structure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30356

26306 Development of the Internal Educational Quality Assurance System of Suan Sunandha Rajabhat University

Authors: Nipawan Tharasak, Sajeewan Darbavasu

Abstract:

This research aims 1) to study opinions, problems, and obstacles concerning the internal educational quality assurance system at the individual and university levels, and 2) to propose an approach for developing the quality assurance system of Suan Sunandha Rajabhat University. The study of problems and obstacles was conducted with a sample group consisting of staff and quality assurance committee members of the year 2010; there were 152 questionnaire respondents, and 5 executives were interviewed. The research tools were document analysis, structured interview questions, and questionnaires with a 5-point rating scale (reliability 0.981). Data were analyzed using percentage, mean, and standard deviation, together with content analysis. The results fall into 3 main points: (1) Implementation of the internal quality assurance system of the university: overall, the input, process, and output factors received high scores. Considered item by item, preparation, planning, monitoring and evaluation, and the use of evaluation results for reporting and improvement received high scores, whereas the process itself received an average score. (2) Problems and obstacles: the personnel responsible for these duties still lack understanding of the indicators and criteria of quality assurance. (3) Development approach: staff should be encouraged to develop a better understanding of the quality assurance system; a database system for quality assurance should be developed; and the results and suggestions should be applied in the following year's development planning.

Keywords: development system, internal quality assurance, education, educational quality assurance

Procedia PDF Downloads 279
26305 Design of Labview Based DAQ System

Authors: Omar A. A. Shaebi, Matouk M. Elamari, Salaheddin Allid

Abstract:

The Information Computing System of Monitoring (ICSM) for the research reactor of the Tajoura Nuclear Research Centre (TNRC) has been out of service since early 1991. According to the regulations, such a computer system is necessary to operate the reactor up to its maximum power (10 MW). Funding was secured via the IAEA to develop a modern computer-based data acquisition system to replace the old computer. This paper presents the development of a LabVIEW-based data acquisition system that allows automated measurements using National Instruments hardware and its LabVIEW software. The developed system consists of an SCXI 1001 chassis housing four SCXI 1100 modules, each of which can maintain 32 variables. The chassis is interfaced with the PC using an NI PCI-6023 DAQ card. LabVIEW, developed by National Instruments, is used to run and operate the DAQ system. LabVIEW is a graphical programming environment suited for high-level design; it allows integrating different signal processing components or subsystems within a graphical framework. The results demonstrated the system's capabilities in monitoring variables and acquiring and saving data, as well as LabVIEW's ability to control the DAQ hardware.

Keywords: data acquisition, labview, signal conditioning, national instruments

Procedia PDF Downloads 483
26304 An Analysis of Public Environmental Investment on the Sustainable Development in China

Authors: K. Y. Chen, Y. N. Jia, H. Chua, C. W. Kan

Abstract:

As the largest developing country in the world, China is now facing problems arising from the environment. Thus, the Chinese government increases environmental investment yearly. In this study, we analyse the effect of public environmental investment on sustainable development in China. Firstly, we review the current situation of China's environmental issues. Secondly, we collect yearly environmental data as well as information on public environmental investment. Finally, we use the collected data to analyse the strengths, weaknesses, opportunities, and threats (SWOT) of public environmental investment in China. The aim of this paper is therefore to characterize the relationship between public environmental investment and sustainable development in China. Based on the data collected, it was revealed that public environmental investment had a positive impact on sustainable development in China as well as on GDP growth. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for the financial support of this work.

Keywords: China, public environmental investment, sustainable development, analysis

Procedia PDF Downloads 348
26303 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method, able to detect seizures in EEG signals using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during seizure detection. Our detection method is based on an Artificial Neural Network classifier, trained with the multilayer perceptron algorithm, using a software application called Training Builder that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.
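The pipeline described above, windowed feature extraction followed by a multilayer-perceptron classifier, can be sketched with scikit-learn. The real Training Builder tool, EEG recordings, and feature set are not available here, so the features below are synthetic stand-ins and every number is an illustrative assumption:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in for per-window EEG features (e.g., variance, line length, band power):
# 400 background windows and 400 seizure windows, 8 features each.
n = 400
X_background = rng.normal(0.0, 1.0, size=(n, 8))
X_seizure = rng.normal(1.5, 1.2, size=(n, 8))
X = np.vstack([X_background, X_seizure])
y = np.r_[np.zeros(n, dtype=int), np.ones(n, dtype=int)]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Multilayer perceptron trained on standardized features
scaler = StandardScaler().fit(X_tr)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)
acc = clf.score(scaler.transform(X_te), y_te)
```

On real EEG data the sliding-window feature extraction and feature selection steps the abstract mentions would replace the synthetic arrays above.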

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 170
26302 Topology Optimization Design of Transmission Structure in Flapping-Wing Micro Aerial Vehicle via 3D Printing

Authors: Zuyong Chen, Jianghao Wu, Yanlai Zhang

Abstract:

Flapping-wing micro aerial vehicles (FMAVs) are a new type of aircraft that mimics the flight of small birds or insects. Compared with traditional fixed-wing or rotor-type aircraft, an FMAV only needs to control the motion of its flapping wings, changing the size and direction of the lift to control the flight attitude. Therefore, its transmission system should be designed to be very compact. Lightweight design can effectively extend endurance time, while engineering experience alone can hardly meet the FMAV's requirements for structural strength and mass simultaneously. Current research still lacks guidance on accounting for the nonlinear factors of 3D-printing materials when carrying out topology optimization, especially for tiny FMAV transmission systems. The coupling of nonlinear material properties and nonlinear contact behaviors in the FMAV transmission system poses a great challenge to the reliability of topology optimization results. In this paper, a topology optimization design based on the FEA solver package Altair OptiStruct was carried out for the transmission system of an FMAV manufactured by 3D printing. Firstly, the isotropic constitutive behavior of the ultraviolet (UV) curable resin used to fabricate the FMAV structure was evaluated and confirmed through tensile tests. Secondly, a numerical model describing the mechanical behavior of the FMAV transmission structure was established and verified by experiments. Topology optimization modeling methods considering nonlinear factors were then presented, and the optimization results were verified by dynamic simulation and experiments. Finally, different load states and constraints were discussed in detail to explore the leading factors affecting the optimization results. The contributions of this article for guiding the lightweight design of FMAVs are summarized as follows.
First, a dynamic simulation modeling method used to obtain the load states is presented. Second, a verification method for optimized results considering nonlinear factors is introduced. Third, optimizing for a single critical load state can achieve a better weight-reduction effect and improve computational efficiency compared with taking multiple load states into account. Fourth, the adopted constraint formulation helps improve the ability to resist bending deformation. Fifth, a displacement constraint helps improve the structural stiffness of the optimized result. The results and engineering guidance in this paper may shed light on structural optimization and lightweight design for future advanced FMAVs.

Keywords: flapping-wing micro aerial vehicle, 3d printing, topology optimization, finite element analysis, experiment

Procedia PDF Downloads 156
26301 TELUM Land Use Model: An Investigation of Data Requirements and Calibration Results for Chittenden County MPO, U.S.A.

Authors: Georgia Pozoukidou

Abstract:

TELUM is land use modeling software designed specifically to help metropolitan planning organizations (MPOs) prepare their transportation improvement programs and fulfill their numerous planning responsibilities. In this context, obtaining, preparing, and validating socioeconomic forecasts are fundamental tasks for an MPO, ensuring that consistent population and employment data are provided to travel demand models. The Chittenden County Metropolitan Planning Organization of the State of Vermont was used as a case study to test the applicability of the TELUM land use model. The technical insights and lessons learned from this application have transferable value for all MPOs faced with land use forecasting and transportation modelling.

Keywords: calibration data requirements, land use models, land use planning, metropolitan planning organizations

Procedia PDF Downloads 277
26300 Learning and Rethinking Language through Gendered Experiences

Authors: Neha Narayanan

Abstract:

This paper explores the role of language in determining the spaces occupied by women in everyday life. It is inspired by ongoing action research that employs 'immersion', i.e., arriving at a research problematic through community research, as a methodology in Kirkalpadu, a Kondh adivasi village located in Rayagada district of the Indian state of Odisha. In the dominant development discourse, language is associated either with the preservation or conservation of endangered languages or with empowerment through language. Beyond these lies the discourse of language as a structure with the hegemonic quality of organising the lifeworld in a specific manner; this rigid structure leads to an experience of constricted space for women. In Kirkalpadu, the action research is carried out with young, unmarried women aged 15-25. During the daytime, these women are either in the agricultural field or in the bari, the backyard of the house, whose rooms are arranged linearly one after the other, ending with the kitchen followed by an open space called the bari (in Odia), an intimate and gendered space where they are not easily visible. They justify their restricted mobility and their fear of leaving the village alone with the argument that the place and the men are nihi-aaeh (not good). These women, who dropped out of school early to contribute to the (surplus) labour requirement of the household, want to learn English to be able to read signboards on the road, fill in forms at a bank, and use mobile phones to communicate with their romantic partner(s). But the incapacity to have the province of language within their grasp, and to put the mobile phone to these kinds of requirements, restricts them to the bari of the house.
The paper concludes by exploring the possibilities of learning and rethinking languages in a way that takes cognizance of women's gendered experience and their desire to cross borders and occupy the spaces denied to them.

Keywords: action research, gendered experience, language, space

Procedia PDF Downloads 155
26299 Artificial Intelligence Approach to Water Treatment Processes: Case Study of Daspoort Treatment Plant, South Africa

Authors: Olumuyiwa Ojo, Masengo Ilunga

Abstract:

Artificial neural networks (ANNs) have broken the bounds of conventional programming, which is essentially a garbage-in, garbage-out function, through their ability to mimic the human brain. An ANN's ability to adopt, adapt, adjust, evaluate, learn, and recognize the relationships, behaviors, and patterns in the data administered to it is modeled after human reasoning and learning mechanisms. The study therefore aimed at modeling the wastewater treatment process in order to accurately diagnose water control problems for effective treatment. A staged ANN model development and evaluation methodology was employed: a source data analysis stage involving statistical analysis of the modeling data, a model development stage in which candidate ANN architectures were developed, and an evaluation stage using a historical data set. The model was developed using historical data obtained from the Daspoort wastewater treatment plant, South Africa. The resultant design dimensions and model for the wastewater treatment plant provided good results. Parameters considered were temperature, pH value, colour, turbidity, amount of solids, and acidity, as well as total hardness, Ca hardness, Mg hardness, and chloride. This enables the ANN to handle and represent more complex problems than conventional programming is capable of performing.

Keywords: ANN, artificial neural network, wastewater treatment, model, development

Procedia PDF Downloads 135
26298 Using Artificial Intelligence Method to Explore the Important Factors in the Reuse of Telecare by the Elderly

Authors: Jui-Chen Huang

Abstract:

This research used an artificial intelligence method to explore the elderly's opinions on the reuse of telecare, its effect on service quality and satisfaction, and the relationship between customer perceived value and intention to reuse. The study conducted a questionnaire survey of the elderly, obtaining a total of 124 valid copies. It adopted a Backpropagation Network (BPN) to propose an effective and feasible analysis method that differs from traditional approaches. Two-thirds of the samples (82) were taken as training data, and one-third (42) as testing data. The training and testing RMSE (root mean square error) were 0.022 and 0.009, respectively, in the BPN; as shown, these errors are acceptable. By contrast, the training and testing RMSE were 0.100 and 0.099, respectively, in the regression model. In addition, the results showed that service quality has the greatest effect on the intention to reuse, followed by satisfaction and perceived value. The Backpropagation Network method thus outperformed the regression analysis, and this result can serve as a reference for future research.

Keywords: artificial intelligence, backpropagation network (BPN), elderly, reuse, telecare

Procedia PDF Downloads 198
26297 Computer Server Virtualization

Authors: Pradeep M. C. Chand

Abstract:

Virtual infrastructure initiatives often spring from data center server consolidation projects, which focus on reducing the existing infrastructure "box count", retiring older hardware, or life-extending legacy applications. Server consolidation benefits result from a reduction in the overall number of systems and the related recurring costs (power, cooling, rack space, etc.), and it also reduces the heat released to the environment.
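A back-of-the-envelope estimate illustrates where the consolidation savings come from; every figure here (server count, consolidation ratio, recurring cost per server) is an assumption for illustration, not data from the paper:

```python
# Hypothetical consolidation scenario: 120 physical servers virtualized
# at 8 virtual machines per host (both figures assumed).
servers_before = 120
consolidation_ratio = 8
servers_after = -(-servers_before // consolidation_ratio)  # ceiling division

# Assumed annual recurring cost per physical server: power + cooling + rack space
annual_cost_per_server = 1200.0  # USD, illustrative
savings = (servers_before - servers_after) * annual_cost_per_server
```

The recurring-cost term is what the abstract refers to: fewer boxes means proportionally less power, cooling, and rack space every year, independent of the one-time hardware savings.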

Keywords: server virtualization, data center, consolidation, project

Procedia PDF Downloads 513
26296 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications

Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino

Abstract:

The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to progressively more in-depth study of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. In fact, vulnerability models can be integrated with the wind hazard, which consists of associating a probability with each intensity level in a time interval (e.g., by means of return periods), to provide an assessment of future losses due to extreme wind. This has also given impetus to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given the wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires logical combination rules that define the building's damage state given the damage state of each component, and the availability of a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure's behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II.
ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on in-built or user-defined wind hazard data. This software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely: the roof covering, roof structure, envelope wall and envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations that are comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach is shown to lie in the fact that a database of building component fragility curves can be put to use for the development of new wind vulnerability models to cover building typologies not yet adequately covered by existing works and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
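The component-level combination idea above can be sketched as follows. ERMESS's actual fragility curves and combination logic are not given in the abstract, so the lognormal parameters below are invented, and independence of component failures given wind speed is the stated simplifying assumption:

```python
import math
from statistics import NormalDist

def fragility(v, median, beta):
    """Lognormal fragility: P(component damage | gust speed v, in m/s)."""
    return NormalDist().cdf(math.log(v / median) / beta)

# Illustrative (median speed m/s, dispersion) pairs for the four components
# named in the example; all values are assumptions.
components = {
    "roof_covering": (35.0, 0.30),
    "roof_structure": (55.0, 0.25),
    "envelope_wall": (60.0, 0.30),
    "envelope_openings": (40.0, 0.35),
}

def p_any_damage(v):
    # Simple combination rule: the building reaches the damage state if any
    # component fails, assuming independent failures given wind speed.
    p_none = 1.0
    for median, beta in components.values():
        p_none *= 1.0 - fragility(v, median, beta)
    return 1.0 - p_none
```

Integrating `p_any_damage` against a wind hazard curve (annual exceedance probabilities of gust speeds) would then give the annual damage probability, the quantity a vulnerability model feeds into loss estimates.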

Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses

Procedia PDF Downloads 173
26295 Inclusion Advances of Disabled People in Higher Education: Possible Alignment with the Brazilian Statute of the Person with Disabilities

Authors: Maria Cristina Tommaso, Maria Das Graças L. Silva, Carlos Jose Pacheco

Abstract:

Have the advances of Brazilian legislation reflected, or been consonant with, the inclusion of people with disabilities (PwD) in higher education? In 1990, the World Declaration on Education for All, a document organized by the United Nations Educational, Scientific and Cultural Organization (UNESCO), stated that the basic learning needs of people with disabilities, as they were then called, required special attention. Since then, legislation in signatory countries such as Brazil has made considerable progress in guaranteeing, in a gradual and increasing manner, the rights of persons with disabilities to education. Principles, policies, and practices for special educational needs were created and have guided action at the regional, national, and international levels on the structure of action in special education, such as administration, recruitment of educators, and community involvement. Brazilian Education Law No. 3.284 of 2003 ensures the inclusion of people with disabilities in Brazilian higher education institutions, and Law No. 13,146/2015, the Brazilian Law on the Inclusion of Persons with Disabilities (Statute of the Person with Disabilities), regulates the inclusion of PwD by guaranteeing their rights. This study analyses data on the inclusion of people with disabilities in higher education in the southern region of Rio de Janeiro State, Brazil, between 2008 and 2018, correlating them with the changes in Brazilian legislation over the last ten years affecting PwD inclusion processes in Brazilian higher education. The region studied comprises sixteen cities, and this research refers to the largest, Volta Redonda, which represents 25 percent of the total regional population. The PwD reception process was examined using data from the Volta Redonda University Center, which accounts for 35 percent of the higher education students in this territorial area.
The research methodology analyzed the changes in legislation on the inclusion of people with disabilities in higher education over the last ten years and their impact on the samples of this study between 2008 and 2018. An expressive increase in the number of PwD students was verified: from two in 2008 to 190 in 2018. The conclusions are presented in quantitative terms, and the aim of this study was to verify the effectiveness of PwD inclusion in higher education, giving visibility to this social group. This study verified that the guarantee of fundamental human rights is strongly related to advances in legislation, with the State as the guarantor of the rights of people with disabilities, and must be considered a means of consolidating isonomy of educational opportunities. The recognition of full rights and the inclusion of people with disabilities require the efforts of those who hold decision-making power. This study aimed to demonstrate that legislative evolution is an effective instrument in the social integration of people with disabilities. The study confirms the fundamental role of the state in guaranteeing human rights and demonstrates that legislation not only protects the interests of vulnerable social groups but can also, and this is perhaps its main mission, change behavior patterns and provoke the social transformation necessary to reduce inequality of opportunity.

Keywords: higher education, inclusion, legislation, people with disability

Procedia PDF Downloads 138
26294 Evaluation of the Heating Capability and in vitro Hemolysis of Nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) Ferrites Prepared by Sol-gel Method

Authors: Laura Elena De León Prado, Dora Alicia Cortés Hernández, Javier Sánchez

Abstract:

Among the different cancer treatments that are currently used, hyperthermia has a promising potential due to the multiple benefits that are obtained by this technique. In general terms, hyperthermia is a method that takes advantage of the sensitivity of cancer cells to heat, in order to damage or destroy them. Within the different ways of supplying heat to cancer cells and achieve their destruction or damage, the use of magnetic nanoparticles has attracted attention due to the capability of these particles to generate heat under the influence of an external magnetic field. In addition, these nanoparticles have a high surface area and sizes similar or even lower than biological entities, which allow their approaching and interaction with a specific region of interest. The most used magnetic nanoparticles for hyperthermia treatment are those based on iron oxides, mainly magnetite and maghemite, due to their biocompatibility, good magnetic properties and chemical stability. However, in order to fulfill more efficiently the requirements that demand the treatment of magnetic hyperthermia, there have been investigations using ferrites that incorporate different metallic ions, such as Mg, Mn, Co, Ca, Ni, Cu, Li, Gd, etc., in their structure. This paper reports the synthesis of nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) ferrites by sol-gel method and their evaluation in terms of heating capability and in vitro hemolysis to determine the potential use of these nanoparticles as thermoseeds for the treatment of cancer by magnetic hyperthermia. It was possible to obtain ferrites with nanometric sizes, a single crystalline phase with an inverse spinel structure and a behavior near to that of superparamagnetic materials. Additionally, at concentrations of 10 mg of magnetic material per mL of water, it was possible to reach a temperature of approximately 45°C, which is within the range of temperatures used for the treatment of hyperthermia. 
The results of the in vitro hemolysis assay showed that, at the concentrations tested, these nanoparticles are non-hemolytic, as their percentage of hemolysis is close to zero. Therefore, these materials can be used as thermoseeds for the treatment of cancer by magnetic hyperthermia.

Keywords: ferrites, heating capability, hemolysis, nanoparticles, sol-gel

Procedia PDF Downloads 325
26293 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques

Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian

Abstract:

Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses, and material damage. Traditional studies of road traffic accidents in urban zones require a very high investment of time and money, and their results are often not current. Nowadays, however, in many countries, crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studies of road traffic accidents and the urban congestion they cause. In this article, we identified the zones, roads, and specific times in Mexico City (CDMX) in which the largest numbers of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was Knowledge Discovery in Databases (KDD) for the discovery of patterns in the accident reports, using data mining techniques with the help of Weka. The selected algorithms were Expectation Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the QGIS geographic information system.
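The clustering step described above, EM to choose the number of clusters and k-means for the grouping, can be approximated in scikit-learn. The Waze database is not public, so the accident records below are synthetic stand-ins (rough Mexico City coordinates plus hour of day), and BIC-based selection over Gaussian mixtures stands in for Weka's EM cluster-count procedure:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Synthetic accident reports: (longitude, latitude, hour of day), three hotspots
centers = np.array([[-99.15, 19.40, 8.0],
                    [-99.10, 19.35, 18.0],
                    [-99.20, 19.45, 14.0]])
X = np.vstack([c + rng.normal(0, [0.01, 0.01, 0.5], size=(100, 3))
               for c in centers])

# EM-fitted Gaussian mixtures scored by BIC to pick the number of clusters
bics = {k: GaussianMixture(n_components=k, random_state=2).fit(X).bic(X)
        for k in range(1, 7)}
best_k = min(bics, key=bics.get)

# k-means grouping with the selected cluster count
labels = KMeans(n_clusters=best_k, n_init=10, random_state=2).fit_predict(X)
```

The resulting cluster labels, joined back to the coordinates, are what one would export to QGIS for map visualization.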

Keywords: data mining, k-means, road traffic accidents, Waze, Weka

Procedia PDF Downloads 393
26292 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution

Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani Alghamdi

Abstract:

Change point detection is an important part of data analysis. The presence of a change point refers to a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points of parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions while fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer. The change points were successfully located.
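A minimal sketch of binary segmentation with a Schwarz-type penalty follows, applied to synthetic data with two mean shifts. The real ozone-layer temperature series and the exponential Lomax likelihood are not reproduced here; a Gaussian mean-change cost stands in for the paper's distributional model, and the penalty value is an assumed threshold:

```python
import numpy as np

def sse(seg):
    # within-segment sum of squared deviations from the segment mean
    return float(np.sum((seg - seg.mean()) ** 2))

def binary_segmentation(x, min_size=10, penalty=25.0):
    """Recursive binary segmentation for shifts in the mean.

    A split is accepted when the drop in SSE exceeds `penalty`,
    a Schwarz-type threshold that should grow like log(n)."""
    change_points = []

    def split(lo, hi):
        seg = x[lo:hi]
        if hi - lo < 2 * min_size:
            return
        candidates = range(min_size, len(seg) - min_size + 1)
        costs = [sse(seg[:t]) + sse(seg[t:]) for t in candidates]
        best = int(np.argmin(costs))
        t = min_size + best
        if sse(seg) - costs[best] > penalty:
            change_points.append(lo + t)
            split(lo, lo + t)        # recurse on the left segment
            split(lo + t, hi)        # recurse on the right segment

    split(0, len(x))
    return sorted(change_points)

rng = np.random.default_rng(3)
# synthetic "average temperature" series with mean shifts at indices 100 and 200
x = np.r_[rng.normal(0, 1, 100), rng.normal(3, 1, 100), rng.normal(1, 1, 100)]
cps = binary_segmentation(x)
```

In the paper's setting, the SSE cost would be replaced by the negative log-likelihood of the fitted exponential Lomax segments, with the Schwarz information criterion supplying the penalty.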

Keywords: binary segmentation, change point, exponential Lomax distribution, information criterion

Procedia PDF Downloads 162
26291 A Comparison of Methods for Neural Network Aggregation

Authors: John Pomerat, Aviv Segev

Abstract:

Recently, deep learning has had many theoretical breakthroughs. For deep learning to be successful in industry, however, there need to be practical algorithms capable of handling the many real-world hiccups preventing the immediate application of a learning algorithm. Although AI promises to revolutionize the healthcare industry, getting access to patient data in order to train learning algorithms has not been easy. One proposed solution to this is data-sharing. In this paper, we propose an alternative protocol, based on multi-party computation, to train deep learning models while maintaining both the privacy and the security of training data. We examine three methods of training neural networks in this way: transfer learning, average ensemble learning, and series network learning. We compare these methods to the equivalent model obtained through data-sharing across two different experiments. Additionally, we address the security concerns of this protocol. While the motivating example is healthcare, our findings regarding multi-party computation of neural network training are purely theoretical and have use cases outside the domain of healthcare.
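Of the three methods compared, average ensemble learning is the simplest to sketch: each party trains on its own data shard, and only model parameters are exchanged and averaged, so no raw patient data crosses party boundaries. The toy below uses a gradient-descent logistic regression as a stand-in for a small neural network; the data-generating process and all constants are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def train_logreg(X, y, lr=0.1, epochs=300):
    # plain gradient-descent logistic regression (stand-in for a small network)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# two parties hold disjoint shards of the same underlying task
w_true = np.array([2.0, -1.5, 0.5])
def make_shard(n):
    X = rng.normal(size=(n, 3))
    y = (X @ w_true + rng.normal(0, 0.3, n) > 0).astype(float)
    return X, y

shards = [make_shard(500), make_shard(500)]
local_models = [train_logreg(X, y) for X, y in shards]

# average-ensemble aggregation: only the parameter vectors are shared
w_avg = np.mean(local_models, axis=0)

X_test, y_test = make_shard(1000)
acc = np.mean(((X_test @ w_avg) > 0) == y_test)
```

For deep networks, parameter averaging is only meaningful when the local models share an architecture and initialization; that caveat is one reason the paper also examines transfer learning and series network learning.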

Keywords: neural network aggregation, multi-party computation, transfer learning, average ensemble learning

Procedia PDF Downloads 144
26290 The Relationship between Political Risks and Capital Adequacy Ratio: Evidence from GCC Countries Using a Dynamic Panel Data Model (System–GMM)

Authors: Wesam Hamed

Abstract:

This paper contributes to the existing literature by investigating the impact of political risk on the capital adequacy ratio in the banking sector of the Gulf Cooperation Council (GCC) countries, which, to the best of our knowledge, is the first attempt at this nexus. The dynamic panel data model (System-GMM) showed that political risk significantly decreases the capital adequacy ratio in the banking sector. For this purpose, we used political risk, bank-specific, profitability, and macroeconomic variables obtained from the Datastream database for the period 2005-2017. The results also actively support the "too big to fail" hypothesis. Finally, the robustness checks confirm the conclusions derived from the baseline System-GMM model.

Keywords: capital adequacy ratio, system GMM, GCC, political risks

Procedia PDF Downloads 126
26289 Enhancing Photocatalytic Hydrogen Production: Modification of TiO₂ by Coupling with CdS Nanoparticles

Authors: Saud Alshammari, Xiaoan Mao

Abstract:

Photocatalytic water splitting to produce hydrogen (H₂) has attracted significant attention as an environmentally friendly technology. This process, which produces hydrogen from water and sunlight, represents a renewable energy source. Titanium dioxide (TiO₂) plays a critical role in photocatalytic hydrogen production due to its chemical stability, availability, and low cost. Nevertheless, TiO₂'s wide band gap (3.2 eV) limits its visible light absorption and thus its photocatalytic effectiveness. Coupling TiO₂ with other semiconductors is a strategy that can narrow its band gap and improve visible light absorption. This paper studies the modification of TiO₂ by coupling it with another semiconductor, CdS nanoparticles, using a reflux reactor to form a core-shell structure. Characterization techniques, including TEM and UV-Vis spectroscopy, confirmed the successful coating of TiO₂ on the CdS core, a reduction of the band gap from 3.28 eV to 3.1 eV, and enhanced light absorption in the visible region. These modifications are attributed to the heterojunction structure between TiO₂ and CdS. The essential goal of this study is to improve TiO₂ for use in photocatalytic water splitting to enhance hydrogen production. The core-shell TiO₂@CdS nanoparticles exhibited promising results due to band gap narrowing and improved light absorption. Future work will involve adding Pt as a co-catalyst, which is known to increase surface reaction activity by enhancing proton adsorption. Evaluation of the TiO₂@CdS@Pt catalyst will include performance assessments and hydrogen productivity tests, considering factors such as effective shapes and material ratios. Optimizing the photoreactor design for efficient photon delivery and illumination will further enhance the photocatalytic process. These strategies collectively aim to overcome current challenges and improve the efficiency of hydrogen production via photocatalysis.

Keywords: photocatalysis, water splitting, hydrogen production, heterojunction photocatalysts, band gap, nanoparticles

Procedia PDF Downloads 8
26288 Using ALOHA Code to Evaluate CO2 Concentration for Maanshan Nuclear Power Plant

Authors: W. S. Hsu, S. W. Chen, Y. T. Ku, Y. Chiang, J. R. Wang , J. H. Yang, C. Shih

Abstract:

The ALOHA code was used to calculate the CO2 concentration under the CO2 storage burst condition for the Maanshan nuclear power plant (NPP) in this study. Five main categories of data are input into the ALOHA code: location, building, chemical, atmospheric, and source data. Data from the Final Safety Analysis Report (FSAR) and several related reports were used in this study. The ALOHA results are compared with the failure criteria of R.G. 1.78 to confirm the habitability of the control room. The comparison shows that the ALOHA result is below the R.G. 1.78 criteria, which implies that the habitability of the control room can be maintained in this case. A sensitivity study for the atmospheric parameters was also performed. The results show that wind speed has the largest effect on the concentration calculation.

Keywords: PWR, ALOHA, habitability, Maanshan

Procedia PDF Downloads 184
26287 Molecular Engineering of Intrinsically Microporous Polybenzimidazole for Energy-efficient Gas Separation

Authors: Mahmoud Abdulhamid, Rifan Hardian, Prashant Bhatt, Shuvo Datta, Adrian Ramirez, Jorge Gascon, Mohamed Eddaoudi, Gyorgy Szekely

Abstract:

Polybenzimidazole (PBI) is a high-performance polymer that exhibits high thermal and chemical stability. However, it suffers from low porosity and low fractional free volume, which hinder its application as a separation material. Herein, we demonstrate the molecular engineering of gas separation materials by manipulating a PBI backbone possessing kinked moieties. PBI was selected because it contains NH groups, which increase the affinity towards CO₂, increase sorption capacity, and favor CO₂ over other gases. We have designed and synthesized an intrinsically microporous polybenzimidazole (iPBI) featuring a spirobisindane structure. Introducing a kinked moiety in conjunction with crosslinking enhanced the polymer properties, markedly increasing the gas separation performance. In particular, the BET surface area of PBI increased 30-fold by replacing a flat benzene ring with a kinked structure. iPBI displayed a good CO₂ uptake of 1.4 mmol g⁻¹ at 1 bar and 3.6 mmol g⁻¹ at 10 bar. Gas sorption uptake and breakthrough experiments were conducted using mixtures of CO₂/CH₄ (50%/50%) and CO₂/N₂ (50%/50%), which revealed the high selectivity of CO₂ over both CH₄ and N₂. The obtained CO₂/N₂ selectivity is attractive for power plant flue gas applications requiring CO₂-capturing materials. Energy and process simulations of biogas CO₂ removal demonstrated that up to 70% of the capture energy could be saved when iPBI was used instead of the current amine technology (methyl diethanolamine [MDEA]). Similarly, the combination of iPBI and MDEA in a hybrid system exhibited the highest CO₂ capture yield (99%), resulting in nearly 50% energy saving. The concept of enhancing the porosity of PBI using kinked moieties provides new scope for designing highly porous polybenzimidazoles for various separation processes.

Keywords: polybenzimidazole (PBI), intrinsically microporous polybenzimidazole (iPBI), gas separation, energy and process simulations

Procedia PDF Downloads 69
26286 Estimation of Foliar Nitrogen in Selected Vegetation Communities of Uttrakhand Himalayas Using Hyperspectral Satellite Remote Sensing

Authors: Yogita Mishra, Arijit Roy, Dhruval Bhavsar

Abstract:

The study estimates the nitrogen concentration in selected vegetation communities, i.e., chir pine (Pinus roxburghii), using hyperspectral satellite data, and also identifies the appropriate spectral bands and nitrogen indices. The shortwave infrared reflectance spectrum at 1790 nm and 1680 nm shows the maximum possible absorption by nitrogen in the selected species. Among the nitrogen indices, the log-normalized nitrogen index performed both positively and negatively. A strong positive correlation with leaf nitrogen concentration and leaf nitrogen mass was obtained from the 1510 nm and 760 nm bands for Pinus roxburghii when using NDNI. The R² value of the linear regression reached a maximum of 0.7525 for the analysis of the satellite image data and 0.547 for the ground truth data for Pinus roxburghii.
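The abstract does not spell out the index formula, but the NDNI it refers to has a standard definition in the remote-sensing literature, using log-transformed reflectances at 1510 nm (nitrogen absorption feature) and 1680 nm (reference band). A minimal sketch, with purely illustrative reflectance values (not the study's data):

```python
import math

def ndni(r1510: float, r1680: float) -> float:
    """Normalized Difference Nitrogen Index from reflectances in (0, 1]
    at 1510 nm (nitrogen absorption) and 1680 nm (reference band)."""
    a = math.log10(1.0 / r1510)
    b = math.log10(1.0 / r1680)
    return (a - b) / (a + b)

# Illustrative reflectance values only; higher absorption at 1510 nm
# (lower reflectance) yields a positive index.
print(round(ndni(0.28, 0.35), 4))
```

NDNI values then serve as the predictor in the linear regression against measured leaf nitrogen.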

Keywords: hyperspectral, NDNI, nitrogen concentration, regression value

Procedia PDF Downloads 279
26285 Functional and Efficient Query Interpreters: Principle, Application and Performances’ Comparison

Authors: Laurent Thiry, Michel Hassenforder

Abstract:

This paper presents a general approach to implementing efficient query interpreters in a functional programming language. Indeed, most of the standard tools currently available use an imperative and/or object-oriented language for the implementation (e.g., Java for Jena-Fuseki), but other paradigms are possible, potentially with better performance. To proceed, the paper first explains how to model data structures and queries from a functional point of view. Then, it proposes a general methodology to evaluate performance (i.e., the number of computation steps needed to answer a query) and explains how to integrate some optimization techniques (short-cut fusion and, more importantly, data transformations). It then compares the proposed functional server to a standard tool (Fuseki), demonstrating that the former can be two to ten times faster at answering queries.
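The functional modeling of queries can be illustrated with a small sketch: a query is a composition of stage functions over a stream of triples, and lazy evaluation gives a fusion-like effect (no intermediate collections). This is an illustrative analogue, not the paper's implementation, and the triple data is hypothetical:

```python
from functools import reduce

# A "query" is a list of stage functions; each stage maps a record stream
# to a new stream. Triples are plain (subject, predicate, object) tuples.
def q_filter(pred):
    return lambda stream: (t for t in stream if pred(t))

def q_map(f):
    return lambda stream: (f(t) for t in stream)

def run(stages, data):
    # Function composition replaces an imperative query plan: because each
    # stage is a lazy generator, the stages are effectively fused into a
    # single traversal of the data (no intermediate lists are built).
    return list(reduce(lambda s, stage: stage(s), stages, iter(data)))

triples = [
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "likes", "music"),
]

# "SELECT ?o WHERE { alice knows ?o }" expressed as composed stages:
query = [q_filter(lambda t: t[0] == "alice" and t[1] == "knows"),
         q_map(lambda t: t[2])]
print(run(query, triples))  # ['bob']
```

In a language such as Haskell, the same composition is what short-cut fusion rules optimize automatically.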

Keywords: data transformation, functional programming, information server, optimization

Procedia PDF Downloads 145
26284 Dimension Free Rigid Point Set Registration in Linear Time

Authors: Jianqin Qu

Abstract:

This paper proposes a rigid point set matching algorithm in arbitrary dimensions based on the idea of symmetric covariant functions. A group of functions of the points in the set is formulated using rigid invariants. Each of these functions computes a correspondence pair from the given point set. The computed correspondences are then used to recover the unknown rigid transform parameters. Each computed point can be geometrically interpreted as a weighted mean center of the point set. The algorithm is compact, fast, and dimension free, without any optimization process. It either computes the desired transform for noiseless data in linear time or fails quickly in exceptional cases. Experimental results for synthetic data and 2D/3D real data are provided, demonstrating potential applications of the algorithm to a wide range of problems.
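The abstract does not give the covariant functions themselves, but the final step, recovering a rigid transform once correspondences are known, can be sketched with the standard SVD-based (Kabsch) closed-form solution, which likewise works in any dimension. This is a stand-in illustration, not the paper's method:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with Q ≈ P @ R.T + t.
    P, Q: (n, d) arrays of corresponding points in any dimension d."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)          # mean centers
    H = (P - cp).T @ (Q - cq)                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    S = np.eye(H.shape[0])
    S[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid reflections
    R = Vt.T @ S @ U.T
    t = cq - R @ cp
    return R, t

# Noiseless 3D check: recover a known rotation about the z-axis.
theta = 0.5
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
P = np.random.default_rng(0).normal(size=(10, 3))
Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_transform(P, Q)
print(np.allclose(R, Rz))  # True
```

For noiseless data this recovery is exact, matching the linear-time, optimization-free setting the abstract describes.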

Keywords: covariant point, point matching, dimension free, rigid registration

Procedia PDF Downloads 157
26283 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect a change in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties and are sometimes conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict to achieve fast change detection and effectively deal with complementary hypotheses. Specifically, Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we apply the method to estimate the minimum number of sensors needed for combination, so that computational efficiency can be improved. A cumulative sum (CUSUM) test is then applied to the pignistic probability ratio to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
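The detection step alone can be sketched with a classical CUSUM test on the log-likelihood ratio between assumed pre- and post-change Gaussian models; the evidence-theory fusion across sensors is omitted, and the distributions, threshold, and toy stream below are illustrative assumptions, not the paper's setup:

```python
import math

def gauss_logpdf(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def cusum(samples, mu0, mu1, sigma, threshold):
    """Return the index at which the CUSUM statistic first exceeds
    the threshold (change declared), or None if it never does."""
    s = 0.0
    for i, x in enumerate(samples):
        # log-likelihood ratio of post-change vs pre-change model
        llr = gauss_logpdf(x, mu1, sigma) - gauss_logpdf(x, mu0, sigma)
        s = max(0.0, s + llr)   # resetting at zero keeps the detection delay small
        if s > threshold:
            return i
    return None

# Toy deterministic stream: the mean shifts from 0 to 2 at index 50.
stream = [0.0] * 50 + [2.0] * 50
print(cusum(stream, mu0=0.0, mu1=2.0, sigma=1.0, threshold=8.0))  # → 54
```

The short delay between the true change (index 50) and the declaration (index 54) is the "quickest detection" delay that the threshold trades off against the false alarm rate.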

Keywords: CUSUM, evidence theory, kl divergence, quickest change detection, time series data

Procedia PDF Downloads 318
26282 Chromatographic Preparation and Performance on Zinc Ion Imprinted Monolithic Column and Its Adsorption Property

Authors: X. Han, S. Duan, C. Liu, C. Zhou, W. Zhu, L. Kong

Abstract:

The ionic imprinting technique refers to a three-dimensional rigid structure with fixed pore sizes, formed by the binding interactions of ions and functional monomers, with the ions used as the template; it has a high level of recognition for the ionic template. The preparation of a monolithic column by in-situ polymerization requires dissolving the template, functional monomers, cross-linking agent, and initiator in solution, injecting the mixture into the column tube, and letting the mixture polymerize at a certain temperature; after the synthesis, the unreacted template and solvent are washed out. Monolithic columns are easy to prepare, low in consumption, and cost-effective, with fast mass transfer; in addition, they offer many chemical functions. However, monolithic columns have some problems in practical application, such as low efficiency: quantitative analysis cannot be performed accurately because the peaks are wide and show tailing; the choice of polymerization systems is limited; and theoretical foundations are lacking. Thus, the optimization of components and preparation methods is an important research direction. During the preparation of ionic imprinted monolithic columns, the pore-forming agent makes the polymer generate a porous structure, which influences the physical properties of the polymer; moreover, it directly determines the stability and selectivity of the polymerization reaction. The compounds generated in the pre-polymerization reaction directly determine the identification and screening capabilities of the imprinted polymer; thus, the choice of pore-forming agent is critical in the preparation of imprinted monolithic columns. This article mainly focuses on the impact of different pore-forming agents on the zinc ion enrichment performance of a zinc ion imprinted monolithic column.

Keywords: high performance liquid chromatography (HPLC), ionic imprinting, monolithic column, pore-forming agent

Procedia PDF Downloads 201
26281 A Crowdsourced Homeless Data Collection System and Its Econometric Analysis

Authors: Praniil Nagaraj

Abstract:

This paper proposes a method to collect homeless data using crowdsourcing and presents an approach to analyze the data, demonstrating its potential to strengthen existing and future policies aimed at promoting socio-economic equilibrium. The 2022 Annual Homeless Assessment Report (AHAR) to Congress highlighted alarming statistics, emphasizing the need for effective decision-making and budget allocation within local planning bodies known as Continuums of Care (CoC). This paper's contributions can be categorized into three main areas. Firstly, a unique method for collecting homeless data is introduced, utilizing a user-friendly smartphone app (currently available for Android). The app enables the general public to quickly record information about homeless individuals, including the number of people and details about their living conditions. The collected data, including date, time, and location, is anonymized and securely transmitted to the cloud. It is anticipated that an increasing number of users motivated to contribute to society will adopt the app, thus expanding the data collection efforts. Duplicate data is addressed through simple classification methods, and historical data is utilized to fill in missing information. The second contribution of this paper is the description of the data analysis techniques applied to the collected data. By combining the new data with existing information, statistical regression analysis is employed to gain insights into various aspects, such as distinguishing between unsheltered and sheltered homeless populations and examining their correlation with factors like unemployment rates, housing affordability, and labor demand. Initial data is collected in San Francisco, while pre-existing information is drawn from three cities: San Francisco, New York City, and Washington D.C., facilitating simulations. The third contribution focuses on demonstrating the practical implications of the data processing results.
The challenges faced by key stakeholders, including charitable organizations and local city governments, are taken into consideration. Two case studies are presented as examples. The first case study explores improving the efficiency of food and necessities distribution, as well as medical assistance, driven by charitable organizations. The second case study examines the correlation between micro-geographic budget expenditure by local city governments and homeless information to justify budget allocation and expenditures. The ultimate objective of this endeavor is to enable the continuous enhancement of the quality of life for the underprivileged. It is hoped that through increased crowdsourcing of data from the public, the Generosity Curve and the Need Curve will intersect, leading to a better world for all.
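The regression step can be illustrated with a minimal ordinary-least-squares sketch; the variable names and numbers below are synthetic placeholders, not the study's data:

```python
import numpy as np

# Hypothetical monthly observations for one city (synthetic, illustrative).
# Columns: unemployment rate (%), median rent index.
X = np.array([[4.1, 100], [4.5, 104], [5.0, 110],
              [5.6, 113], [6.2, 118], [6.8, 125]], dtype=float)
y = np.array([780, 800, 845, 870, 905, 950], dtype=float)  # unsheltered count

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 = {r2:.3f}")  # close to 1 on this near-linear synthetic data
```

On real homeless counts the fit would be far noisier; the coefficients are what link budget decisions to the measured covariates.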

Keywords: crowdsourcing, homelessness, socio-economic policies, statistical analysis

Procedia PDF Downloads 43
26280 Application of KL Divergence for Estimation of Each Metabolic Pathway Genes

Authors: Shohei Maruyama, Yasuo Matsuyama, Sachiyo Aburatani

Abstract:

The development of methods to annotate unknown gene functions is an important task in bioinformatics. One approach to the annotation is the identification of the metabolic pathway that genes are involved in. Gene expression data have been utilized for this identification, since gene expression data reflect various intracellular phenomena. However, it has been difficult to estimate gene function with high accuracy. The low accuracy of the estimation is considered to be caused by the difficulty of accurately measuring gene expression: even when measured under the same condition, gene expressions usually vary. In this study, we propose a feature extraction method focusing on the variability of gene expressions to estimate genes' metabolic pathways accurately. First, we estimate the distribution of each gene expression from replicate data. Next, we calculate the similarity between all gene pairs by KL divergence, a method for calculating the similarity between distributions. Finally, we utilize the similarity vectors as feature vectors and train a multiclass SVM for identifying the genes' metabolic pathway. To evaluate the developed method, we applied it to budding yeast and trained the multiclass SVM to identify seven metabolic pathways. As a result, the accuracy achieved by the developed method was higher than that calculated from the raw gene expression data. Thus, our method, combined with KL divergence, is useful for identifying genes' metabolic pathways.
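The feature extraction can be sketched as follows under a Gaussian assumption for the per-gene expression distributions (the abstract does not state the distribution family, so this is illustrative); the downstream multiclass SVM training is omitted, and the gene names and replicate values are hypothetical:

```python
import math
from statistics import mean, stdev

def kl_gauss(m0, s0, m1, s1):
    """Closed-form KL divergence D(N(m0, s0^2) || N(m1, s1^2))."""
    return math.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

def feature_vector(gene, replicates):
    """Symmetrized KL distances from `gene` to every gene in the set;
    this vector is what would be fed to the multiclass SVM."""
    m0, s0 = mean(replicates[gene]), stdev(replicates[gene])
    vec = []
    for reps in replicates.values():
        m1, s1 = mean(reps), stdev(reps)
        vec.append(0.5 * (kl_gauss(m0, s0, m1, s1) + kl_gauss(m1, s1, m0, s0)))
    return vec

# Toy replicate measurements (hypothetical expression levels):
replicates = {
    "geneA": [2.0, 2.2, 1.9, 2.1],
    "geneB": [2.1, 2.0, 2.2, 1.9],   # same distribution as geneA
    "geneC": [5.0, 5.3, 4.8, 5.1],   # clearly different distribution
}
fv = feature_vector("geneA", replicates)
print([round(v, 2) for v in fv])
```

Genes with indistinguishable replicate distributions get near-zero distance, while differently expressed genes get large values, which is the variability signal the method exploits.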

Keywords: metabolic pathways, gene expression data, microarray, Kullback–Leibler (KL) divergence, support vector machines (SVM), machine learning

Procedia PDF Downloads 386
26279 Impact of Instagram Food Bloggers on Consumer (Generation Z) Decision Making Process in Islamabad, Pakistan

Authors: Tabinda Sadiq, Tehmina Ashfaq Qazi, Hoor Shumail

Abstract:

Recently, the advent of emerging technology has created a new generation of restaurant marketing. This study explores the aspects that influence customers' decision-making process in selecting a restaurant after reading food bloggers' reviews online. The motivation behind this research is to investigate the correlation between source credibility and attitude toward restaurant visits. Employing source credibility theory, the researcher collected data from 250 respondents by distributing a survey questionnaire through Google Forms in order to investigate the influence of food bloggers on Generation Z's decision-making process. A non-probability purposive sampling technique was used. The questionnaire used a pre-developed and validated scale by Ohanian to measure the relationship. SPSS version 26 was used for statistical testing and analysis of the data. The findings of the survey revealed a moderate positive correlation between the variables, so it can be concluded that food bloggers do have an impact on Generation Z's decision-making process.

Keywords: credibility, decision making, food bloggers, generation z, e-wom

Procedia PDF Downloads 58
26278 Performance Measurement of Logistics Systems for Thailand's Wholesales and Retails Industries by Data Envelopment Analysis

Authors: Pornpimol Chaiwuttisak

Abstract:

The study aims to compare the performance of logistics for Thailand's wholesale and retail trade industries (except motor vehicles, motorcycles, and stalls) by using data envelopment analysis (DEA). The Thailand Standard Industrial Classification of 2009 (TSIC-2009) categorizes these industries into sub-group no. 45: wholesale and retail trade (except for the repair of motor vehicles and motorcycles), sub-group no. 46: wholesale trade (except motor vehicles and motorcycles), and sub-group no. 47: retail trade (except motor vehicles and motorcycles). Data used in the study were collected by the National Statistical Office, Thailand. The study considered four input factors: the number of companies, the number of logistics personnel, logistics training costs, and outsourced logistics management. The output factor was the percentage of enterprises having inventory management. The results showed that the average relative efficiency of small-sized enterprises was 27.87 percent, versus 49.68 percent for medium-sized enterprises.

Keywords: DEA, wholesales and retails, logistics, Thailand

Procedia PDF Downloads 403
26277 Event Data Representation Based on Time Stamp for Pedestrian Detection

Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita

Abstract:

In association with the wave of electric vehicles (EVs), low-energy-consumption systems have become more and more important. One of the key technologies to realize low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several features, such as a high temporal resolution, which can achieve 1 Mframe/s, and a high dynamic range (120 dB). However, the point that contributes most to low energy consumption is its sparsity: the sensor only captures pixels that have an intensity change. In other words, there is no signal in areas without any intensity change. That is to say, this sensor is more energy efficient than conventional sensors such as RGB cameras because redundant data is removed. On the other hand, the data is difficult to handle because its format is completely different from an RGB image: the acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a timestamp; it does not include an intensity such as an RGB value. Therefore, since existing algorithms cannot be used straightforwardly, a new processing algorithm has to be designed to cope with DVS data. To overcome the difficulties caused by the data format differences, most prior art builds frame data and feeds it to a deep learning model such as a convolutional neural network (CNN) for object detection and recognition purposes. However, even when the data can be fed in this way, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of an RGB pixel value, polarity information is clearly not rich enough. In this context, we propose to use the timestamp information as the data representation fed to deep learning.
Concretely, we first build frame data divided by a certain time period and then assign an intensity value according to the timestamp within each frame; for example, a high value is given to a recent signal. We expect this data representation to capture features, especially of moving objects, because the timestamps represent the movement direction and speed. Using the proposed method, we built our own dataset with a DVS fixed on a parked car to develop an application for a surveillance system that can detect persons around the car. We consider the DVS one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in a static scene. For comparison purposes, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours and feeds polarity information to a CNN. We then measured the object detection performance of the benchmark and our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
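The timestamp-based frame construction can be sketched as follows; the linear decay, window length, and event format are illustrative assumptions, not the paper's exact parameters:

```python
# Build a "time-surface" frame from DVS events: each pixel stores an
# intensity proportional to how recent its latest event is.
def events_to_frame(events, width, height, t_start, t_end):
    """events: iterable of (x, y, polarity, timestamp); returns a 2D list
    with values in [0, 1], where 1 means an event at the end of the window."""
    frame = [[0.0] * width for _ in range(height)]
    span = float(t_end - t_start)
    for x, y, pol, ts in events:
        if t_start <= ts < t_end:
            # Linear decay: recent events get values close to 1, old events
            # close to 0; polarity is ignored in this sketch.
            frame[y][x] = max(frame[y][x], (ts - t_start) / span)
    return frame

# Toy events: a point moving right across a 4x1 strip over a 40 ms window.
events = [(0, 0, 1, 0), (1, 0, 1, 10), (2, 0, 1, 20), (3, 0, 1, 30)]
frame = events_to_frame(events, width=4, height=1, t_start=0, t_end=40)
print(frame[0])  # [0.0, 0.25, 0.5, 0.75] – the gradient encodes direction/speed
```

The resulting intensity gradient along the motion path is exactly the direction-and-speed cue that a polarity-only frame cannot provide.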

Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption

Procedia PDF Downloads 80