Search results for: Privacy and Data Protection Law
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26332

23092 A Natural Killer T Cell Subset That Protects against Airway Hyperreactivity

Authors: Ya-Ting Chuang, Krystle Leung, Ya-Jen Chang, Rosemarie H. DeKruyff, Paul B. Savage, Richard Cruse, Christophe Benoit, Dirk Elewaut, Nicole Baumgarth, Dale T. Umetsu

Abstract:

We examined characteristics of a Natural Killer T (NKT) cell subpopulation that developed during influenza infection in neonatal mice, and that suppressed the subsequent development of allergic asthma in a mouse model. This NKT cell subset expressed CD38 but not CD4, produced IFN-γ, but not IL-17, IL-4 or IL-13, and inhibited the development of airway hyperreactivity (AHR) through contact-dependent suppressive activity against helper CD4 T cells. The NKT subset expanded in the lungs of neonatal mice after infection with influenza, but also after treatment of neonatal mice with a Th1-biasing α-GalCer glycolipid analogue, Nu-α-GalCer. These results suggest that early/neonatal exposure to infection or to antigenic challenge can affect subsequent lung immunity by altering the profile of cells residing in the lung and that some subsets of NKT cells can have direct inhibitory activity against CD4+ T cells in allergic asthma. Importantly, our results also suggest a potential therapy for young children that might provide protection against the development of asthma.

Keywords: NKT subset, asthma, airway hyperreactivity, hygiene hypothesis, influenza

Procedia PDF Downloads 222
23091 Theoretical, Numerical and Experimental Assessment of Elastomeric Bearing Stability

Authors: Manuel A. Guzman, Davide Forcellini, Ricardo Moreno, Diego H. Giraldo

Abstract:

Elastomeric bearings (EB) are used in many applications, such as base isolation of bridges, seismic protection and vibration control of other structures and machinery. Their versatility is due to their particular behavior, since they have different stiffnesses in the vertical and horizontal directions, allowing them to sustain vertical loads and horizontal displacements at the same time. Therefore, the vertical, horizontal and bending stiffnesses are important parameters to take into account in the design of EB. In order to establish a proper design methodology for EB, all three approaches, theoretical, finite element analysis and experimental, should be taken into account: to assess stability under different loading states, to predict their behavior and consequently their effects on the dynamic response of structures, and to understand the complex behavior and properties of rubber-like materials, respectively. In particular, the recent large-displacement theory on the stability of EB formulated by Forcellini and Kelly is validated with both numerical simulations using the finite element method and experimental results obtained at the University of Antioquia in Medellin, Colombia. In this regard, this study reproduces the behavior of EB under compression loads and investigates their stability from the three mentioned points of view.

Keywords: elastomeric bearings, experimental tests, numerical simulations, stability, large-displacement theory

Procedia PDF Downloads 443
23090 Time Series Simulation by Conditional Generative Adversarial Net

Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto

Abstract:

The Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose the use of the Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as the dependence structures of different time series, and to generate conditional predictive distributions consistent with the training data distributions. We also provide an in-depth discussion of the rationale behind GAN, viewing neural networks as hierarchical splines, to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of market risk factors. We present a real data analysis, including backtesting, to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis for calculating VaR. CGAN can also be applied to economic time series modeling and forecasting; in this regard, we include an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
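To make the benchmark concrete, the Historical Simulation baseline mentioned in the abstract can be sketched in a few lines: VaR and ES are read directly off the empirical distribution of historical returns. This is a minimal illustrative sketch, not the authors' implementation; the returns and confidence level below are invented.

```python
# Historical Simulation: VaR is an empirical quantile of the loss
# distribution; ES is the average loss beyond VaR. Toy data only.
def historical_var_es(returns, alpha=0.99):
    """Return (VaR, ES) at confidence level alpha; losses are -returns."""
    losses = sorted(-r for r in returns)
    k = int(alpha * len(losses))           # index of the alpha-quantile
    var = losses[min(k, len(losses) - 1)]
    tail = [l for l in losses if l >= var]
    es = sum(tail) / len(tail)             # mean loss in the tail
    return var, es

returns = [0.01, -0.02, 0.005, -0.03, 0.015, -0.01, 0.02, -0.05, 0.0, 0.03]
var, es = historical_var_es(returns, alpha=0.9)
```

A generative alternative such as CGAN would replace the fixed historical sample with simulated scenarios drawn conditionally on current market information, then apply the same quantile logic.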

Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series

Procedia PDF Downloads 125
23089 New Vision of 'Social Europe': Renationalising the Integration Process in the Internal Market of the European Union

Authors: Robert Grzeszczak, Magdalena Gniadzik

Abstract:

The article deals with one of the most significant issues concerning the functioning of the internal market of the European Union: the free movement of workers and the free movement of persons. The purpose is to identify the political and legal effects of the “renationalisation process” on the EU and its Member States. The concept of renationalisation is expressed in Member States’ aim to re-examine their relationship with the EU. The tendency is most visible in the public opinion of several Member States of the ‘EU core’ and may be confirmed by the changes applied by regulatory bodies. The thesis of the article is the return of renationalisation tendencies in the area of the Single Market, which is supported by, among others, open criticism of the foundations of EU integration and considerations of withdrawal from the EU by some Member States. This analysis focuses primarily on the effects that renationalisation may have on the free movement of persons. The free movement of persons is one of the key issues for the development of European integration, and it is still subject to theoretical reflection, new doubts and practical issues. The latest developments in politics, law and jurisprudence demonstrate the need to reflect on attempts to redefine certain principles regarding migrant EU workers and their protection against nationality-based discrimination.

Keywords: European Union, Single Market, free movement of persons, posting of workers

Procedia PDF Downloads 218
23088 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics

Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee

Abstract:

Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucination", the generation of outputs that are not grounded in the input data, which hinders their adoption in production. A common practice to mitigate the hallucination problem is a Retrieval Augmented Generation (RAG) system, which grounds LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts based on vector similarity between the user's query and the documents, and then generates a response based not only on the model's pre-trained knowledge but also on the specific information in the retrieved documents. However, a RAG system is not well suited to tabular data and subsequent data analysis tasks, for reasons such as information loss, data format, and the retrieval mechanism. In this study, we explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When a beta version was deployed on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it was able to serve market insight and data visualization needs with high accuracy and extensive coverage, abstracting the complexities for real-estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding without the need for programming skills. The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
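The planning-and-execution pattern described in the abstract can be sketched as a control loop: decompose the question into sub-tasks, generate a code segment per sub-task, execute the segments in a shared environment, and read off the result. This is a highly simplified sketch; a real system would call an LLM for both the plan and the code, and every name here (`plan_task`, `generate_code`, `answer`) is a hypothetical stub, not PropertyGuru's API.

```python
# Stubbed planning-and-execution loop: the "LLM" outputs are hard-coded
# so the control flow itself is runnable. Toy real-estate rows only.
def plan_task(question):
    """Stub planner: decompose an analytics question into sub-tasks."""
    return ["filter rows for the district", "compute median price"]

def generate_code(step):
    """Stub code generator: map each sub-task to an executable snippet."""
    snippets = {
        "filter rows for the district":
            "rows = [r for r in data if r['district'] == 'D1']",
        "compute median price":
            "prices = sorted(r['price'] for r in rows); "
            "result = prices[len(prices) // 2]",
    }
    return snippets[step]

def answer(question, data):
    env = {"data": data}
    for step in plan_task(question):
        exec(generate_code(step), env)  # run each generated segment
    return env["result"]

data = [{"district": "D1", "price": 300}, {"district": "D2", "price": 900},
        {"district": "D1", "price": 500}, {"district": "D1", "price": 400}]
median = answer("median price in D1?", data)
```

The shared `env` dictionary is what lets later segments build on earlier ones, mirroring how the final response is generated from the output of the executed code.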

Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru

Procedia PDF Downloads 59
23087 Neuro-Connectivity Analysis Using ABIDE Data in Autism Study

Authors: Dulal Bhaumik, Fei Jie, Runa Bhaumik, Bikas Sinha

Abstract:

The human brain is an amazingly complex network. Aberrant activity in this network can lead to various neurological disorders such as multiple sclerosis, Parkinson’s disease, Alzheimer’s disease and autism. fMRI has emerged as an important tool for delineating the neural networks affected by such diseases, particularly autism. In this paper, we propose mixed-effects models together with an appropriate procedure for controlling false discoveries to detect disrupted connectivities in whole-brain studies. Results are illustrated with a large data set known as the Autism Brain Imaging Data Exchange (ABIDE), which includes 361 subjects from 8 medical centers. We believe that our findings adequately address the small-sample inference problem and are thus more reliable for identifying therapeutic targets for intervention. In addition, our results can be used for early detection of subjects who are at high risk of developing neurological disorders.
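One standard procedure for "controlling false discoveries" across many simultaneous connectivity tests is the Benjamini-Hochberg step-up method, sketched below. This is an illustrative sketch of the general technique, not the authors' code, and the p-values are invented.

```python
# Benjamini-Hochberg: sort p-values, find the largest rank k with
# p_(k) <= (k/m) * q, and reject the k smallest. Toy p-values only.
def benjamini_hochberg(pvals, q=0.05):
    """Return indices of hypotheses rejected at FDR level q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k_max = rank           # largest rank passing the threshold
    return sorted(order[:k_max])   # reject the k_max smallest p-values

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
rejected = benjamini_hochberg(pvals, q=0.05)
```

In a whole-brain study, each p-value would come from a mixed-effects test on one connection, and the procedure bounds the expected fraction of false positives among the connections declared disrupted.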

Keywords: ABIDE, autism spectrum disorder, fMRI, mixed-effects model

Procedia PDF Downloads 268
23086 From Vertigo to Verticality: An Example of Phenomenological Design in Architecture

Authors: E. Osorio Schmied

Abstract:

Architects commonly attempt a depiction of organic forms when their works are inspired by nature, regardless of the building site. Nevertheless, it is also possible to match structures with natural scenery by applying a phenomenological approach in terms of spatial operations, addressing perceptions of nature through architectural aspects such as protection, views, and orientation. This method acknowledges a relationship between place and space, where intentions towards tangible facts become design statements. Although spaces resulting from such a process may present an effective response to the environment, they can also offer further outcomes beyond the realm of form. The hypothesis is that, in addition to recognising a bond between architecture and nature, it is also plausible to associate such perceptions with the inner ambience of buildings by analysing features such as daylight. The case study of a single-family house in a rainforest near Valdivia, in Chilean Patagonia, is presented with the intention of addressing the above notions through a discussion of the actual effects of inhabiting a place, by way of a series of insights, including a revision of diagrams and photographs that assist in understanding the implications of this design practice. In addition, figures based on post-occupancy behaviour and daylighting performance relate both architectural and environmental issues to a decision-making process motivated by the observation of nature.

Keywords: architecture, design statements, nature, perception

Procedia PDF Downloads 327
23085 Environment Situation Analysis of Germany

Authors: K. Y. Chen, H. Chua, C. W. Kan

Abstract:

In this study, we analyze Germany’s environmental situation, such as water and air quality, and review its environmental policy. In addition, we collect yearly environmental data as well as information concerning public environmental investment. Based on the data collected, we try to find out the relationship between public environmental investment and sustainable development in Germany. After comparing the trend of environmental quality with the situation of environmental policy and investment, we draw conclusions and identify aspects worth learning from. The data revealed that Germany has established a well-developed institutionalization of environmental education, and the ecological culture at school is dynamic and continuously renewed. The booming green market in Germany is a very successful experience to learn from: the green market not only creates a number of job opportunities but also helps the government to improve and protect the environment. Acknowledgement: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.

Keywords: Germany, public environmental investment, environment quality, sustainable development

Procedia PDF Downloads 235
23084 Optimal Allocation of Multiple Emergency Resources for a Single Potential Accident Node: A Mixed Integer Linear Program

Authors: Yongjian Du, Jinhua Sun, Kim M. Liew, Huahua Xiao

Abstract:

Optimal allocation of emergency resources before a disaster is of great importance for emergency response. In practice, pre-protection of a single critical node where accidents may occur is common. In this study, a model is developed to determine the location and inventory decisions for multiple emergency resources among a set of candidate stations, minimizing the total cost subject to budget and capacity constraints. The total cost includes the economic accident loss, which follows a probability distribution over time, and the warehousing cost of the resources, which increases over time. A ratio is introduced to measure the degree to which a storage station serves only the target node; it becomes larger as the distance between them decreases. To enable a linear program formulation, it is assumed that the travel time of emergency resources to the accident scene has a linear relationship with the economic accident loss. A computational experiment is conducted to illustrate how the proposed model works, and the results indicate its effectiveness and practicability.
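The trade-off the model captures, cheap-but-distant stations versus expensive-but-close ones under a budget, can be illustrated with a toy brute-force search. This is not the paper's mixed integer linear program; every number (distances, costs, budget, demand) below is invented for illustration.

```python
# Toy allocation: place units of one resource at candidate stations to
# minimize warehousing cost plus a linearized travel-time loss, under a
# budget and a demand constraint at the single accident node.
from itertools import product

stations = ["A", "B", "C"]
travel_time = {"A": 1.0, "B": 2.0, "C": 4.0}   # to the accident node
unit_cost = {"A": 3.0, "B": 2.0, "C": 1.0}     # cheaper farther away
budget = 6.0
demand = 2           # units needed at the accident node
loss_per_time = 5.0  # loss per unit of travel time per unit of resource

def plan_cost(alloc):
    """Total cost of an allocation dict, or inf if infeasible."""
    spend = sum(alloc[s] * unit_cost[s] for s in stations)
    if spend > budget or sum(alloc.values()) < demand:
        return float("inf")  # violates budget or demand
    loss = sum(alloc[s] * travel_time[s] * loss_per_time for s in stations)
    return spend + loss

best = min(
    (dict(zip(stations, a)) for a in product(range(demand + 1), repeat=3)),
    key=plan_cost,
)
```

A real MILP solver would replace the enumeration, but the objective and constraints have the same shape; here the nearest station wins despite its higher warehousing cost.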

Keywords: emergency response, integer linear program, multiple emergency resources, pre-allocation decisions, single potential accident node

Procedia PDF Downloads 139
23083 Textile Based Physical Wearable Sensors for Healthcare Monitoring in Medical and Protective Garments

Authors: Sejuti Malakar

Abstract:

Textile sensors have gained a lot of interest in recent years, as they are instrumental in monitoring physiological and environmental changes for better diagnosis, which is useful in various fields such as medical textiles, sports textiles, protective textiles, agro-textiles, and geo-textiles. Moreover, with the development of flexible textile-based wearable sensors, the functionality of smart clothing is augmented for an improved user experience in technical textiles. In this context, conductive textiles using new composites and nanomaterials are being developed with attention to their compatibility with textile manufacturing processes. This review aims to provide a comprehensive and detailed overview of contemporary advancements in textile-based wearable physical sensors used in the fields of medicine, security, surveillance, and protection, from a global perspective. The methodology is to analyse various examples of the integration of wearable textile-based sensors with clothing for daily use, keeping in mind the technological advances in the field. By comparing various case studies, we identify challenges for textile sensors in terms of stability, comfort of movement, and reliable sensing components that enable accurate measurements, in spite of progress in the engineering of wearables. Addressing such concerns is critical for the future success of wearable sensors.

Keywords: flexible textile-based wearable sensors, contemporary advancements, conductive textiles, body conformal design

Procedia PDF Downloads 164
23082 A Prediction Model of Adopting IPTV

Authors: Jeonghwan Jeon

Abstract:

With the advent of IPTV and its fierce competition with the existing broadcasting system, predicting the extent of IPTV service adoption has emerged as an important issue. This paper aims to suggest a prediction model for IPTV adoption using Classification and Ranking Belief Simplex (CaRBS). A simplex plot method of representing data allows a clear visual representation of the degree to which the variables support the prediction for each object. CaRBS is applied to survey data on IPTV adoption.

Keywords: prediction, adoption, IPTV, CaRBS

Procedia PDF Downloads 398
23081 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, its processes must first be digitized: the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding, as Industry 4.0 originated as a German strategy, with strong policy instruments utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved divide between data science experts and manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized.
A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 of the papers contributed a framework, another 12 of the papers were based on a case study, and 11 of the papers focused on theory. However, there were only three papers that contributed a methodology. This provides further evidence for the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 93
23080 Grammatically Coded Corpus of Spoken Lithuanian: Methodology and Development

Authors: L. Kamandulytė-Merfeldienė

Abstract:

The paper deals with the main methodological issues of the Corpus of Spoken Lithuanian, whose development began in 2006. At present, the corpus consists of 300,000 grammatically annotated word forms. The creation of the corpus consists of three main stages: collecting the data, transcribing the recorded data, and grammatical annotation. Data collection was based on the principles of balance and naturalness. The recorded speech was transcribed according to the CHAT requirements of CHILDES. The transcripts were double-checked and annotated grammatically using CHILDES. The development of the Corpus of Spoken Lithuanian has led to a steady increase in studies of spontaneous communication, and various papers have dealt with the distribution of parts of speech, the use of different grammatical forms, variation of inflectional paradigms, the distribution of fillers, the syntactic functions of adjectives, and the mean length of utterances.

Keywords: CHILDES, corpus of spoken Lithuanian, grammatical annotation, grammatical disambiguation, lexicon, Lithuanian

Procedia PDF Downloads 222
23079 Disaggregation the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rainfall data are very important inputs for hydrological models. Among methods for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate three different rainfall resolutions (4-hourly, hourly and 10-minute) from daily data for an approximately 20-year record period. The process was done with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rainfall datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov test (K-S), the usual statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value had to be employed to show that their differences were reasonable. The results are encouraging considering the overestimation of the generated high-resolution rainfall data.
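The two-sample Kolmogorov-Smirnov statistic used above to compare observed and disaggregated rainfall is simply the maximum gap between the two empirical CDFs. The sketch below shows the computation in its simplest form; the sample values are invented, not DiMoN output.

```python
# Two-sample K-S statistic: max |F_obs(t) - F_sim(t)| over all sample
# points, where F is the empirical CDF. Toy rainfall values only.
def ks_statistic(x, y):
    """Max absolute difference between empirical CDFs of samples x and y."""
    xs, ys = sorted(x), sorted(y)
    points = sorted(set(xs + ys))
    def ecdf(sample, t):
        return sum(v <= t for v in sample) / len(sample)
    return max(abs(ecdf(xs, t) - ecdf(ys, t)) for t in points)

observed = [0.0, 1.2, 3.4, 5.1, 7.8]
simulated = [0.0, 0.9, 2.8, 6.0, 9.5]
d = ks_statistic(observed, simulated)
```

In practice the statistic `d` would be compared against the K-S critical value for the two sample sizes, which is the test criterion the abstract refers to.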

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 190
23078 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes currently experienced by the world emphasize the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, accurate modeling of the tails of statistical distributions and characterization of low-occurrence events can be achieved with the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a predefined threshold u. The Generalised Pareto distribution is widely used to approximate the tail of the empirical distribution mathematically, although in the case of exceedances of significant wave data (H_s) the two-parameter Weibull and the Exponential distribution, the latter a specific case of the Generalised Pareto distribution, are frequently used as alternatives. The Generalised Pareto, despite the existence of practical cases where it is applied, is not universally recognized as the adequate solution for modeling exceedances over a threshold u; references that treat the Generalised Pareto distribution as a secondary solution in the case of significant wave data can be found in the literature. In this framework, the current study tackles the discussion of the application of statistical models to characterize exceedances of wave data. Comparisons of the Generalised Pareto, the two-parameter Weibull and the Exponential distribution are presented for different values of the threshold u. Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. The results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully: in each particular case one of the statistical models fits the data better than the others, and different results are obtained depending on the value of the threshold u. Other aspects of the fit, such as the number of points and the estimation of the model parameters, are analyzed and the respective conclusions drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, in the present case, a highly non-linear task and, given its growing importance, should be addressed carefully for an efficient estimation of very-low-occurrence events.
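The POT workflow described above can be sketched for the simplest of the three candidate models: keep only the H_s values exceeding a threshold u, then fit the Exponential distribution (the shape-zero special case of the Generalised Pareto) to the exceedances by maximum likelihood. This is an illustrative sketch under that simplifying choice; the wave heights below are invented, not the Irish buoy data.

```python
# Peak-Over-Threshold with an exponential tail: the MLE rate is simply
# the reciprocal of the mean exceedance. Toy H_s values only.
import math

def pot_exponential_fit(data, u):
    """Return (exceedances, MLE rate) for an exponential fit above u."""
    exceedances = [x - u for x in data if x > u]
    rate = len(exceedances) / sum(exceedances)  # 1 / mean exceedance
    return exceedances, rate

def exceedance_prob(x, rate):
    """P(H_s - u > x | H_s > u) under the fitted exponential tail."""
    return math.exp(-rate * x)

hs = [1.2, 2.5, 3.1, 4.0, 2.8, 5.2, 3.6, 4.4, 2.1, 6.0]
exc, rate = pot_exponential_fit(hs, u=3.0)
```

Fitting the two-parameter Weibull or the full Generalised Pareto follows the same pattern but requires numerical likelihood maximization, which is where the sensitivity to the threshold u discussed above enters.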

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 254
23077 Carotenoids a Biologically Important Bioactive Compound

Authors: Aarti Singh, Anees Ahmad

Abstract:

Carotenoids comprise a group of isoprenoid pigments. Carotenes, xanthophylls and their derivatives play an important role in all living beings through foods, nutraceuticals and pharmaceuticals. α-carotene, β-carotene and β-cryptoxanthin provide the vitamin A source in humans for growth, development and the proper functioning of the immune system and vision. They are crucial for plants and humans as they protect from photooxidative damage and are excellent antioxidants, quenching singlet molecular oxygen and peroxyl radicals. Diets with a higher intake of carotenoids are associated with a reduced threat of various chronic diseases such as cancer (lung, breast, prostate, colorectal and ovarian cancers) and coronary heart disease. The blue-light-filtering efficiency of carotenoids in liposomes has been reported to be highest for lutein, followed by zeaxanthin, β-carotene and lycopene. Lycopene plays a vital role in protection from cardiovascular disease (CVD), and serum lycopene is directly related to a reduced risk of osteoporosis in postmenopausal women. Carotenoids also have a major role in the treatment of skin disorders. There is a need to identify and isolate novel carotenoids from diverse natural sources for human health benefits.

Keywords: antioxidants, carotenoids, nutraceuticals, osteoporosis, pharmaceuticals

Procedia PDF Downloads 365
23076 Cadmium Separation from Aqueous Solutions by Natural Biosorbents

Authors: Z. V. P. Murthy, Preeti Arunachalam, Sangeeta Balram

Abstract:

The removal of metal ions from different wastewaters has become important due to their effects on living beings. Cadmium is one of the heavy metals found in various industrial wastewaters. Many conventional methods are available to remove heavy metals from wastewaters, such as adsorption, membrane separations, precipitation and electrolytic methods, each with its own advantages and disadvantages. The present work deals with the use of natural biosorbents (chitin and chitosan) to separate cadmium ions from aqueous solutions. The adsorption data were fitted with different isotherm and kinetics models. Among the adsorption isotherms used, the Freundlich isotherm showed the better fit for both biosorbents. The kinetics data for cadmium adsorption showed a better fit with the pseudo-second-order model for both biosorbents. Chitosan, the derivative of chitin, showed better performance than chitin. The separation results are encouraging.
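The Freundlich isotherm fit mentioned above, q = K·C^(1/n), is usually obtained by linear regression on the log-linearized form log q = log K + (1/n)·log C. The sketch below shows that fit on synthetic data generated from K = 2, n = 2; the points are invented, not the paper's equilibrium measurements.

```python
# Freundlich isotherm: linearize with logs, fit slope = 1/n and
# intercept = log K by ordinary least squares. Synthetic data only.
import math

def fit_freundlich(C, q):
    """Return (K, n) from least squares on the log-linearized isotherm."""
    x = [math.log(c) for c in C]
    y = [math.log(v) for v in q]
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    intercept = ybar - slope * xbar
    return math.exp(intercept), 1.0 / slope  # K, n

# Synthetic data from K = 2, n = 2, i.e. q = 2 * sqrt(C):
C = [1.0, 4.0, 9.0, 16.0]
q = [2.0, 4.0, 6.0, 8.0]
K, n = fit_freundlich(C, q)
```

Goodness of fit across candidate isotherms (Freundlich versus Langmuir, for instance) is then compared via the regression residuals, which is how a "better fit" is established.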

Keywords: chitin, chitosan, cadmium, isotherm, kinetics

Procedia PDF Downloads 397
23075 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar

Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna

Abstract:

This paper presents the analysis of airborne Synthetic Aperture Radar (SAR) data using the Range Migration Algorithm (RMA) for the spotlight mode of operation. Unlike the Polar Format Algorithm (PFA), RMA mitigates space-variant defocusing and geometric distortion effects, since it does not assume that the illuminating wave-fronts are planar. This facilitates the use of RMA for imaging scenarios involving severe differential range curvatures, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz. range de-skew and motion compensation to a line, are performed on the raw data before it is fed to the RMA component. The various stages of the RMA, viz. 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the imaging geometry on the resolution of the final image. The ability of RMA to compensate for severe differential range curvatures in the two-dimensional spatial frequency domain is also illustrated.

Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation

Procedia PDF Downloads 226
23074 Prevention of Student Radicalism in School through Civic Education

Authors: Triyanto

Abstract:

Radicalism poses a real threat to Indonesia’s future, and its target is Indonesian youth, as evidenced by the fact that the majority of terrorists are young people. Countering radicalism requires not only repressive action but also educational action. One such educational effort is civic education. This study discusses the prevention of radicalism among students through civic education, and the constraints on it. This is qualitative research. Data were collected through literature studies, observations and in-depth interviews, and validated by triangulation. The sample of this research is 30 high school students in Surakarta. Data were analyzed with the interactive model of analysis of Miles & Huberman. The results show that (1) civic education can be a way of preventing student radicalism in schools, in the form of cultivating educational values through learning inside and outside the classroom; and (2) the obstacles encountered include a lack of learning facilities, the limited ability of teachers and the low attention of students to civic education.

Keywords: prevention, radicalism, senior high school student, civic education

Procedia PDF Downloads 216
23073 Two-Channels Thermal Energy Storage Tank: Experiments and Short-Cut Modelling

Authors: M. Capocelli, A. Caputo, M. De Falco, D. Mazzei, V. Piemonte

Abstract:

This paper presents the experimental results and related modeling of a thermal energy storage (TES) facility conceived and built by ENEA, which realizes the thermocline with an innovative geometry. First, a thermal energy exchange model of an equivalent shell-and-tube heat exchanger is described and tested to reproduce the performance of the spiral exchanger installed in the TES. Through regression of the experimental data, a first-order thermocline model was also validated, providing an analytical function of the thermocline that is useful for performance evaluation, for comparison with other systems, and for implementation in simulations of integrated systems (e.g. power plants). The experimental data obtained from plant start-up and the short-cut modeling of the system can support process analysis, scale-up of the thermal storage system, and feasibility studies of its implementation in actual case studies.
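As an illustration of what an analytical thermocline function can look like, the sketch below uses a logistic profile, one common first-order form; the function names and parameter values are hypothetical and are not ENEA's fitted model:

```python
import math

def thermocline_profile(x, t_cold, t_hot, x0, s):
    """Logistic thermocline shape: smooth transition from the cold to
    the hot end of the tank. x0 is the thermocline centre, s its
    characteristic width. Illustrative form, not the validated fit."""
    return t_cold + (t_hot - t_cold) / (1.0 + math.exp(-(x - x0) / s))

def thermocline_thickness(s, lo=0.1, hi=0.9):
    # Invert the logistic analytically: the distance between the points
    # where the normalized temperature crosses lo and hi. x0 cancels,
    # so thickness depends only on the width parameter s.
    return s * (math.log(hi / (1 - hi)) - math.log(lo / (1 - lo)))
```

A closed form like this makes performance metrics (e.g. thermocline thickness versus charge time) cheap to evaluate inside plant-level simulations.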

Keywords: CSP plants, thermal energy storage, thermocline, mathematical modelling, experimental data

Procedia PDF Downloads 315
23072 Recovery of Waste Acrylic Fibers for the Elimination of Basic Dyes

Authors: N. Ouslimani, M. T. Abadlia

Abstract:

Environmental protection is a precondition for sustained growth and a better quality of life for all people on earth. Aqueous industrial effluents are among the main sources of pollution. Among the compounds in these effluents, dyes are particularly resistant to discoloration by conventional methods, and their discharge presents many problems that must be addressed. The scientific literature shows that synthetic organic dyes are used in many industrial sectors. They are found in the chemical, automotive, and paper industries, and particularly in the textile industry, where every line and grade of the chemical family is represented. The affinity between fibers and dyes varies with the chemical structure of the dye and the type of material to which it is applied. It is not uncommon for 15 to 20% of sulfur dyes, and sometimes up to 40% of the reactants, to be discharged with the effluent during the dyeing operation. This study was conducted to remove basic dyes from wastewater using waste fiber material as an adsorbent. This technique is an interesting alternative to conventional treatment, as it recovers waste fibers, which can then serve as raw material for the manufacture of cleaning products or in other sectors. The results obtained with waste fibers are encouraging, with a color removal rate of about 90%. The method also effectively decreases BOD and suspended solids (SS).

Keywords: adsorption, dyes, fiber, valorization, wastewater

Procedia PDF Downloads 272
23071 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images

Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge

Abstract:

Hyperspectral images and remote sensing are important for many applications. One problem in the use of these images is the high volume of data to be processed, stored, and transferred. Dimensionality reduction techniques can be used to reduce this volume. In this paper, an approach to band selection based on clustering algorithms is presented. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. Attributes not considered in other studies in the literature, such as kurtosis and low correlation, are also used. The results of the approach using Fuzzy C-Means and K-Means with different attributes are compared. Both algorithms show similarly good results on hyperspectral images, particularly when the variance and kurtosis attributes are used in the clustering process.
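The Fuzzy C-Means step of such a band-selection scheme can be sketched as follows, with each spectral band represented by a row of per-band attributes (e.g. variance and kurtosis) and one representative band kept per fuzzy cluster. This is a generic illustration, not the authors' implementation:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    # X: one row per spectral band, columns are per-band attributes.
    # Returns cluster centroids and the fuzzy membership matrix U,
    # whose rows sum to 1 (degree of membership of a band in each cluster).
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))       # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centroids, U

def select_bands(X, c):
    # Keep the band closest to each centroid as that cluster's
    # representative: a c-band subset of the original cube.
    centroids, _ = fuzzy_c_means(X, c)
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return sorted(set(int(i) for i in d.argmin(axis=0)))
```

Swapping the membership update for a hard argmin turns the same scaffold into K-Means, which is what makes the two variants easy to compare.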

Keywords: band selection, fuzzy c-means, k-means, hyperspectral image

Procedia PDF Downloads 386
23070 Development of a Remote Testing System for Performance of Gas Leakage Detectors

Authors: Gyoutae Park, Woosuk Kim, Sangguk Ahn, Seungmo Kim, Minjun Kim, Jinhan Lee, Youngdo Jo, Jongsam Moon, Hiesik Kim

Abstract:

In this research, we designed a remote system to test parameters of gas detectors such as gas concentration and initial response time. The testing system can measure two gas instruments simultaneously. We first assembled an experimental jig with a square structure, comprising a glass flask, two high-quality cameras, and two Ethernet modems for transmitting data. The remote testing system extracts numerals from video of the detectors' LCDs while the gas concentration is varied continuously. The extracted numerical data are transmitted to a laptop computer through the Ethernet modems, and the gas concentrations and measured initial response times are then recorded and graphed. Our remote testing system can be applied to a wide range of gas detector tests and certified both domestically and internationally.

Keywords: gas leak detector, inspection instrument, extracting numerals, concentration

Procedia PDF Downloads 365
23069 Assessment of Indoor Air Pollution in Naturally Ventilated Dwellings of Mega-City Kolkata

Authors: Tanya Kaur Bedi, Shankha Pratim Bhattacharya

Abstract:

The US Environmental Protection Agency defines indoor air quality as "the air quality within and around buildings, especially as it relates to the health and comfort of building occupants". According to a 2021 report by the Energy Policy Institute at Chicago, residents of India, the country with the highest levels of air pollution in the world, lose about 5.9 years of life expectancy due to poor air quality, and yet numerous dwellings remain dependent on natural ventilation. The urban population currently spends 90% of its time indoors, a scenario that raises concern for occupant health and well-being. This study attempts to demonstrate the causal relationship between indoor air pollution and its determining aspects. Detailed indoor air pollution audits were conducted in residential buildings located in Kolkata, India in December and January 2021. According to the air pollution knowledge assessment city program in India, Kolkata is the second most polluted mega-city after Delhi. Although air pollution levels are alarming year-round, the winter months are the most critical due to unfavourable environmental conditions: while emissions remain roughly constant throughout the year, cold air is denser and moves more slowly than warm air, trapping pollution in place for much longer, so it is breathed in at a higher rate than in summer. The air pollution monitoring period was selected considering environmental factors and major pollution contributors such as traffic and road dust. This study focuses on the relationship between the built environment and the spatial-temporal distribution of air pollutants in and around it. The measured parameters include temperature, relative humidity, air velocity, particulate matter, volatile organic compounds, formaldehyde, and benzene. A total of 56 rooms were audited, selectively targeting the dominant middle-income group in the urban area of the metropolis.
Data collection was conducted using a set of instruments positioned in the human breathing zone. The study assesses the relationship between indoor air pollution levels and the factors determining natural ventilation and air pollution dispersion, such as the surrounding environment, dominant wind, openable-window-to-floor-area ratio, windward or leeward openings, the type of natural ventilation in the room (single-sided or cross-ventilation), floor height, residents' cleaning habits, etc.
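One simple way to quantify such relationships is to correlate a ventilation factor with a pollutant level across the audited rooms. The sketch below computes a Pearson correlation in pure Python; the window-to-floor-area ratios and PM2.5 values are illustrative stand-ins, not the Kolkata measurements:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical audit rows: openable-window-to-floor-area ratio versus
# indoor PM2.5 (ug/m3). Larger openings should associate with lower
# indoor concentrations, i.e. a negative correlation.
wfr = [0.05, 0.08, 0.10, 0.12, 0.15, 0.20]
pm25 = [180.0, 165.0, 150.0, 140.0, 120.0, 95.0]
r = pearson_r(wfr, pm25)
```

With 56 audited rooms, the same calculation can be repeated per determining factor to rank their influence on indoor pollution levels.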

Keywords: indoor air quality, occupant health, air pollution, architecture, urban environment

Procedia PDF Downloads 92
23068 The Galactic Magnetic Field in the Light of Starburst-Generated Ultrahigh-Energy Cosmic Rays

Authors: Luis A. Anchordoqui, Jorge F. Soriano, Diego F. Torres

Abstract:

Auger data show evidence for a correlation between ultrahigh-energy cosmic rays (UHECRs) and nearby starburst galaxies. This intriguing correlation is consistent with data collected by the Telescope Array, which have revealed a much more pronounced directional 'hot spot' in arrival directions not far from the starburst galaxy M82. In this work, we assume starbursts are sources of UHECRs, and we investigate the prospects of using the observed distribution of UHECR arrival directions to constrain galactic magnetic field models. We show that if the Telescope Array hot spot indeed originates from M82, UHECR data would place a strong constraint on the turbulent component of the galactic magnetic field.
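The constraint hinges on how strongly the turbulent field smears UHECR arrival directions. A back-of-the-envelope sketch follows, using the standard Larmor-radius prefactor of about 1.08 kpc for energy in EeV and field strength in microgauss; the field strength, path length, and coherence length passed in below are illustrative values, not the paper's fitted parameters:

```python
import math

def larmor_radius_kpc(E_EeV, Z, B_uG):
    """Larmor radius in kpc for a nucleus of charge Z, energy E_EeV (EeV),
    in a field of B_uG microgauss (standard ~1.08 kpc prefactor)."""
    return 1.08 * E_EeV / (Z * B_uG)

def turbulent_deflection_deg(E_EeV, Z, B_uG, d_kpc, lc_kpc):
    """Order-of-magnitude RMS deflection (degrees) after a random walk
    through d_kpc of turbulent field with coherence length lc_kpc:
    theta ~ sqrt(d * lc) / r_L in the small-angle regime."""
    r_l = larmor_radius_kpc(E_EeV, Z, B_uG)
    return math.degrees(math.sqrt(d_kpc * lc_kpc) / r_l)
```

Because the deflection scales as 1/E and linearly with B, a hot spot of known angular size directly bounds the turbulent field strength along the line of sight.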

Keywords: galactic magnetic field, Pierre Auger observatory, telescope array, ultra-high energy cosmic rays

Procedia PDF Downloads 135
23067 Biostabilisation of Sediments for the Protection of Marine Infrastructure from Scour

Authors: Rob Schindler

Abstract:

Industry-standard methods of mitigating erosion of seabed sediments rely on 'hard engineering' approaches which have numerous environmental shortcomings: (1) direct loss of habitat by smothering of benthic species, (2) disruption of sediment transport processes, damaging geomorphic and ecosystem functionality, (3) generation of secondary erosion problems, (4) introduction of material that may propagate non-local species, and (5) provision of pathways for the spread of invasive species. Recent studies have also revealed the importance of biological cohesion, the result of naturally occurring extra-cellular polymeric substances (EPS), in stabilizing natural sediments. Mimicking the strong bonding kinetics through the deliberate addition of EPS to sediments - henceforth termed 'biostabilisation' - offers a means to mitigate erosion induced by structures or by episodic increases in hydrodynamic forcing (e.g. storms and floods) whilst avoiding, or reducing, hard engineering. Here we present unique experiments that systematically examine how biostabilisation reduces scour around a monopile in a current, a first step to realizing the potential of this new method of scour reduction for a wide range of engineering purposes in aquatic substrates. Experiments were performed in Plymouth University's recirculating sediment flume, which includes a recessed scour pit. The model monopile was 0.048 m in diameter, D. Assuming a prototype monopile diameter of 2.0 m yields a geometric ratio of 41.67. When applied to a 10 m prototype water depth, this yields a model depth, d, of 0.24 m. The sediment pit containing the monopile was filled with different biostabilised substrata prepared using a mixture of fine sand (D50 = 230 μm) and EPS (xanthan gum). Nine sand-EPS mixtures were examined spanning EPS contents of 0.0% < b0 < 0.50%. Scour development was measured using a laser point gauge along a 530 mm centreline at 10 mm increments at regular intervals over 5 h.
Maximum scour depth and excavated area were determined at different time steps and plotted against time to yield equilibrium values. After 5 hours the current was stopped and a detailed scan of the final scour morphology was taken. Results show that increasing EPS content causes a progressive reduction in the equilibrium depth and lateral extent of scour, and hence in the excavated material. Very small amounts, equating to natural communities (< 0.1% by mass), reduce the rate, depth, and extent of scour around monopiles. Furthermore, the strong linear relationships between EPS content, equilibrium scour depth, excavated area, and the timescale of scouring offer a simple index on which to modify existing scour prediction methods. We conclude that the biostabilisation of sediments with EPS may offer a simple, cost-effective and ecologically sensitive means of reducing scour in a range of contexts including offshore wind farms (OWFs), bridge piers, pipeline installation, and void filling in rock armour. Biostabilisation may also reduce economic costs through (1) use of existing site sediments or waste dredged sediments, (2) reduced fabrication of materials, (3) lower transport costs, and (4) less dependence on specialist vessels and precise sub-sea assembly. Further, its potential environmental credentials may allow sensitive use of the seabed in marine protection zones across the globe.
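The quoted geometric scaling and the reported linear EPS-scour trend can be sketched numerically. The EPS/depth pairs below are hypothetical stand-ins for the measured data, used only to show the kind of index-style linear fit the abstract describes:

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x (pure Python)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b   # intercept, slope

# Geometric scaling quoted in the abstract: prototype/model diameter.
scale = 2.0 / 0.048          # ~41.67
model_depth = 10.0 / scale   # ~0.24 m model water depth

# Hypothetical EPS content (%) vs equilibrium scour depth (mm) pairs,
# illustrating the reported linear trend (NOT the paper's measurements).
eps = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
depth = [62.0, 48.0, 36.0, 23.0, 11.0, 0.0]
intercept, slope = linear_fit(eps, depth)
```

A negative fitted slope of this kind is exactly the "simple index" that could be folded into existing scour prediction methods as an EPS-dependent correction.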

Keywords: biostabilisation, EPS, marine, scour

Procedia PDF Downloads 152
23066 Emotion Mining and Attribute Selection for Actionable Recommendations to Improve Customer Satisfaction

Authors: Jaishree Ranganathan, Poonam Rajurkar, Angelina A. Tzacheva, Zbigniew W. Ras

Abstract:

In today's world, business often depends on customer feedback and reviews. Sentiment analysis helps identify and extract information about the sentiment or emotion of a topic or document. Attribute selection is a challenging problem in actionable pattern mining algorithms, especially with large datasets. Action rule mining is one method of discovering actionable patterns in data. Action rules describe specific actions, in the form of conditions, that help move an object from an undesirable or negative state to a more desirable or positive one. In this paper, we present a lexicon-based weighted-scheme approach to identifying emotions in customer feedback data from the manufacturing business. We also use rough sets and explore attribute selection methods for large-scale datasets. We then apply actionable pattern mining to extract possible emotion-change recommendations. Such recommendations help business analysts improve customer service, which leads to customer satisfaction and increased sales revenue.
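A lexicon-based weighted scheme can be sketched as follows: each lexicon term carries an emotion label and a weight, and a feedback text is scored by summing the weights per emotion. The lexicon entries and weights below are invented for illustration and are far smaller than any production resource:

```python
# Hypothetical weighted emotion lexicon (term -> (emotion, weight)).
LEXICON = {
    "excellent": ("joy", 0.9), "happy": ("joy", 0.7),
    "broken": ("anger", 0.6), "terrible": ("anger", 0.9),
    "delay": ("sadness", 0.5), "refund": ("sadness", 0.4),
}

def score_emotions(feedback):
    """Sum lexicon weights per emotion over the tokenized feedback."""
    scores = {}
    for token in feedback.lower().split():
        token = token.strip(".,!?")
        if token in LEXICON:
            emotion, w = LEXICON[token]
            scores[emotion] = scores.get(emotion, 0.0) + w
    return scores

def dominant_emotion(feedback):
    """Highest-scoring emotion, or 'neutral' if no lexicon term matched."""
    s = score_emotions(feedback)
    return max(s, key=s.get) if s else "neutral"
```

The per-emotion scores then become attributes of each feedback record, over which attribute selection and action rule mining can operate.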

Keywords: actionable pattern discovery, attribute selection, business data, data mining, emotion

Procedia PDF Downloads 184
23065 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis

Authors: Shriya Shukla, Lachin Fernando

Abstract:

Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.

Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning

Procedia PDF Downloads 61
23064 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life

Authors: Desplanches Maxime

Abstract:

Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis and t-Distributed Stochastic Neighbor Embedding, we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. Furthermore, the database was built from the Newman model calibrated for aging and performance using data from a European project called Teesmat. The model was then initialized numerous times with different aging values, for instance, with varying thicknesses of SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of our predictive model. Of particular importance is our reliance on the database generated through the integration of the electrochemical model. This database serves as a crucial asset in advancing our understanding of aging states. 
Beyond its capability for precise remaining life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios. This underscores the practical significance of our method in facilitating better decision-making regarding lithium-ion battery management.
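The dimensionality-reduction-plus-regression pipeline described above can be sketched as follows. PCA is computed via SVD, and a k-nearest-neighbour regressor stands in for the study's random forest / SVM; the SEI-driven features and life labels are synthetic placeholders, not the Teesmat-calibrated database:

```python
import numpy as np

def pca(X, k):
    """PCA via SVD: returns the k-component projection of mean-centred X
    together with the principal axes."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]

def knn_predict(Z_train, y_train, z, k=3):
    """k-nearest-neighbour regression in the reduced space (a simple
    stand-in for the random forest / SVM of the abstract)."""
    d = np.linalg.norm(Z_train - z, axis=1)
    idx = np.argsort(d)[:k]
    return float(np.mean(y_train[idx]))
```

In the actual workflow, each row would be one simulated aging state of the Newman model (e.g. one SEI thickness), the columns non-destructive indicators, and the target the remaining useful life.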

Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression

Procedia PDF Downloads 54
23063 An Improved Image Steganography Technique Based on Least Significant Bit Insertion

Authors: Olaiya Folorunsho, Comfort Y. Daramola, Joel N. Ugwu, Lawrence B. Adewole, Olufisayo S. Ekundayo

Abstract:

In today's world, there is a tremendous rise in internet usage, as almost all communication and information sharing is done over the web. Conversely, unauthorized access to confidential data continues to grow, posing a challenge to information security experts whose major goal is to curtail the menace. One approach to securing the delivery of data to its rightful destination without modification is steganography, the art of hiding information inside other, innocuous-looking information. This research paper aims to design a secure algorithm using an image steganographic technique based on the Least Significant Bit (LSB) method, which embeds data into a bitmap (BMP) image in order to enhance security and reliability. In the LSB approach, the basic idea is to replace the LSBs of the pixels of the cover image with the bits of the message to be hidden, without significantly degrading the properties of the cover image. The system was implemented in the C# programming language on the Microsoft .NET framework. The performance of the proposed system was evaluated through benchmarking tests analyzing parameters such as the Mean Squared Error (MSE) and Peak Signal-to-Noise Ratio (PSNR). The results show that image steganography performs well in securing data hiding and information transmission over networks.
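The core LSB embed/extract cycle, together with the PSNR figure of merit used in the evaluation, can be sketched in a few lines. This is a generic Python illustration operating on a flat list of pixel bytes, not the authors' C# implementation:

```python
import math

def embed_lsb(cover, message):
    """Replace the LSB of successive cover bytes with the message bits.
    cover: list of pixel bytes (0-255); message: bytes to hide."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover image too small for message")
    stego = list(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit   # each pixel changes by at most 1
    return stego

def extract_lsb(stego, n_bytes):
    """Read back n_bytes hidden bytes from the stego pixels' LSBs."""
    bits = [b & 1 for b in stego[:8 * n_bytes]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

def psnr(cover, stego):
    """Peak Signal-to-Noise Ratio (dB) between cover and stego pixels."""
    mse = sum((a - b) ** 2 for a, b in zip(cover, stego)) / len(cover)
    return float("inf") if mse == 0 else 10 * math.log10(255 ** 2 / mse)
```

Because each pixel value changes by at most 1, the MSE stays tiny and the PSNR high, which is why LSB insertion is visually imperceptible on typical covers.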

Keywords: steganography, image steganography, least significant bits, bit map image

Procedia PDF Downloads 247