Search results for: time series representation
19905 A Biomimetic Approach for the Multi-Objective Optimization of Kinetic Façade Design
Authors: Do-Jin Jang, Sung-Ah Kim
Abstract:
A kinetic façade responds to user requirements and environmental conditions. In designing a kinetic façade, kinetic patterns play a key role in determining its performance. This paper proposes a biomimetic method for the multi-objective optimization of kinetic façade design. An autonomous decentralized control system is combined with a flocking algorithm: the flocking agents react autonomously to sensor values and produce kinetic patterns that change over time. A series of experiments was conducted to verify the potential and limitations of flocking-based decentralized control. As a result, it showed the best balance among multiple objectives, such as solar radiation and openness, within the comparison group.
Keywords: biomimicry, flocking algorithm, autonomous decentralized control, multi-objective optimization
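The abstract leaves the control rule implicit; as a rough illustration of how a decentralized flocking rule can drive façade-panel states, here is a minimal sketch. The ring topology, the weights, and the sensor-to-target mapping are all assumptions for illustration, not the authors' actual controller.

```python
def flock_step(states, sensor, w_neigh=0.4, w_sensor=0.3):
    """One decentralized update on a ring of facade-panel agents: each
    agent blends its opening fraction toward the mean of its two
    neighbours (a local flocking/alignment term) and toward a target
    derived from its own sensor reading. Weights and topology are
    illustrative, not the paper's controller."""
    n = len(states)
    new = []
    for i, s in enumerate(states):
        neigh = 0.5 * (states[(i - 1) % n] + states[(i + 1) % n])
        target = sensor[i]                     # e.g. desired openness vs. solar gain
        s_new = (1 - w_neigh - w_sensor) * s + w_neigh * neigh + w_sensor * target
        new.append(min(1.0, max(0.0, s_new)))  # opening fraction stays in [0, 1]
    return new

# a local disturbance (strong irradiation at panel 3) propagates over
# repeated steps, producing a kinetic pattern that changes over time
states = [0.5] * 8
sensor = [0.5] * 8
sensor[3] = 1.0
for _ in range(20):
    states = flock_step(states, sensor)
```

Because every agent uses only its own sensor and its immediate neighbours, the update is fully decentralized, which is the property the abstract attributes to the flocking approach.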
Procedia PDF Downloads 520
19904 Primary Level Teachers’ Response to Gender Representation in Textbook Contents
Authors: Pragya Paneru
Abstract:
This paper explores the views of ten primary-level teachers on gender representation in primary-level textbooks. Data were collected from teachers who taught in private schools in the Kailali and Kathmandu districts. This research uses a semi-structured interview method to obtain information regarding teachers’ attitudes toward gender representation in textbook content. The interview data were analysed using qualitative analysis methods, as suggested by Saldana and Omasta (2018). The findings revealed that most of the teachers were unaware of gender issues and regarded them as insignificant to discuss in primary-level classes. Most of them responded to the questions personally and claimed that there were no gender issues in their classrooms. Some of the teachers connected gender issues with contexts other than textbook representation, such as discrimination in the distribution of salary among male and female teachers, school practices of awarding girls rather than boys as the most disciplined students, having girls march first in assembly, encouraging only girls in stage shows, and involving students in gender-specific activities such as decorating work for girls and physical tasks for boys. The interviews also revealed teachers’ covert gendered attitudes in their remarks. Nevertheless, most of the teachers accepted that gender-biased content has an impact on learners and that this problem can be addressed with more gender-centred research in education, discussions, and training to increase awareness of gender issues. Agreeing with the teachers’ suggestions, this paper recommends proper training and awareness regarding how to confront gender issues in textbooks.
Keywords: content analysis, gender equality, school education, critical awareness
Procedia PDF Downloads 95
19903 Recycling of End of Life Concrete Based on C2CA Method
Authors: Somayeh Lotfi, Manuel Eggimann, Eckhard Wagner, Radosław Mróz, Jan Deja
Abstract:
One of the main environmental challenges in the construction industry is strong social pressure to decrease the bulk transport of building materials in urban environments. Considering this fact, applying more in-situ recycling technologies for Construction and Demolition Waste (CDW) is an urgent need. The European C2CA project develops a novel concrete recycling technology that can be performed purely mechanically and in situ. The technology consists of a combination of smart demolition, gentle grinding of the crushed concrete in an autogenous mill, and a novel dry classification technology called ADR to remove the fines. The feasibility of this recycling process was examined in demonstration projects involving a total of 20,000 tons of End of Life (EOL) concrete from two office towers in Groningen, The Netherlands. This paper concentrates on the second demonstration project of C2CA, where EOL concrete was recycled on an industrial site. After recycling, the properties of the produced Recycled Aggregate (RA) were investigated, and the results are presented. An experimental study was carried out on the mechanical and durability properties of the produced Recycled Aggregate Concrete (RAC) compared to those of Natural Aggregate Concrete (NAC). The aim was to understand the influence of RA substitution level, w/c ratio, and cement type on the properties of RAC. In this regard, two series of reference concrete with strength classes of C25/30 and C45/55 were produced using natural coarse aggregates (rounded and crushed) and natural sand. The RAC series were created by replacing parts of the natural aggregate, resulting in a series of concrete mixes with 0%, 20%, 50% and 100% RA. Results show that the concrete mix design and type of cement have a decisive effect on the properties of RAC. On the other hand, the substitution of RA, even at a high replacement level, has a minor and manageable impact on the performance of RAC. This result is a good indication of the feasibility of using RA in structural concrete by modifying the mix design and using a proper type of cement.
Keywords: C2CA, ADR, concrete recycling, recycled aggregate, durability
Procedia PDF Downloads 392
19902 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components
Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura
Abstract:
This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain-activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were inspected, and those reflecting brain activity were selected. Finally, the time series of the selected ICs were used to predict the eight finger-movement directions using Sparse Logistic Regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied at two levels, namely the single-participant level and the group level. At the single-participant level, the EEG datasets used in the first and second stages originated from the same participant. At the group level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination, without repetition, of the EEG datasets of five participants out of six, whereas the EEG dataset used in the second stage originated from the remaining participant. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant level, which was significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group level, which was also significantly higher than the chance level (12.49 ± 0.01%). The classification accuracy within [–45°, 45°] of the true direction was 70.03 ± 8.14% for the single-participant level and 62.63 ± 6.07% for the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.
Keywords: brain-computer interface, electroencephalography, finger motion decoding, independent component analysis, pseudo real-time motion decoding
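The two-stage scheme described above, a fixed unmixing matrix, a fixed set of selected ICs, and a fixed classifier applied to new data, can be sketched with toy numbers. The matrix, selection indices, and linear scoring below are illustrative stand-ins; the study uses full ICA decompositions and sparse logistic regression.

```python
def apply_unmixing(W, x):
    """Project one multichannel EEG sample x through a previously
    learned unmixing matrix W (one row per independent component)."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

def classify(ic_values, weights, biases):
    """Return the direction with the highest linear score -- a minimal
    stand-in for SLR inference, where the argmax of the class scores
    decides the predicted movement direction."""
    scores = [b + sum(w * v for w, v in zip(ws, ic_values))
              for ws, b in zip(weights, biases)]
    return max(range(len(scores)), key=scores.__getitem__)

W = [[1.0, -0.5], [0.2, 0.8]]   # toy 2x2 unmixing matrix from "stage one"
selected = [0, 1]               # indices of the brain-activity ICs kept
x = [0.3, 1.1]                  # one incoming EEG sample ("stage two")
all_ics = apply_unmixing(W, x)
ics = [all_ics[i] for i in selected]

# eight movement directions, toy linear weights (one row per direction)
weights = [[1, 0], [0, 1], [-1, 0], [0, -1],
           [1, 1], [1, -1], [-1, 1], [-1, -1]]
biases = [0.0] * 8
direction = classify(ics, weights, biases)
```

Because the unmixing matrix and classifier are frozen after stage one, applying them to each new sample is a single matrix-vector product plus a scoring pass, which is what makes pseudo-real-time operation feasible.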
Procedia PDF Downloads 138
19901 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques
Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev
Abstract:
Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation. Thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. The hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes and lithotripters) and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was the open database of the statistical service Eurostat. This source was chosen because its databases contain the complete and open information necessary for research tasks in the field of public health, and because it has a user-friendly interface that allows analytical reports to be built quickly. The study provides information on 28 European countries for the period from 2007 to 2016. For all countries included in the study with sufficiently accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made through cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration, to identify groups of similar countries, and to construct separate regression models for them. Therefore, the original time series were used as the objects of clustering. The k-medoids algorithm was used, with sampled objects serving as the cluster centers, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters, the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the resource provision of hospitals with medical personnel. The research reveals strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has huge potential to significantly improve health services. Medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
Keywords: data analysis, demand modeling, healthcare, medical facilities
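Using observed series themselves as cluster centres, as described above, sidesteps the centroid problem for time series. A tiny exhaustive k-medoids together with the MAPE metric used to judge the models can be sketched as follows; the data and the Euclidean distance are illustrative choices, not the study's.

```python
import itertools
import math

def dist(a, b):
    """Euclidean distance between two equal-length time series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def k_medoids(series, k):
    """Exhaustive PAM for tiny data: pick the k observed series
    (medoids) that minimise the total distance of every series to its
    nearest medoid, then label each series by its nearest medoid."""
    best, best_cost = None, float("inf")
    for medoids in itertools.combinations(range(len(series)), k):
        cost = sum(min(dist(s, series[m]) for m in medoids) for s in series)
        if cost < best_cost:
            best, best_cost = medoids, cost
    labels = [min(best, key=lambda m: dist(s, series[m])) for s in series]
    return best, labels

def mape(actual, forecast):
    """Mean absolute percentage error, the metric quoted for the
    cluster-level forecast models."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

series = [[1, 2, 3], [1.1, 2.1, 3.2],      # "countries" with rising series
          [9, 8, 7], [9.2, 8.1, 6.9]]      # "countries" with falling series
medoids, labels = k_medoids(series, 2)
err = mape([100, 110, 120], [99, 111, 119])
```

The exhaustive search is only viable for toy inputs; real k-medoids implementations use the iterative PAM swap phase, but the medoid-as-centre idea is the same.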
Procedia PDF Downloads 145
19900 Voting Representation in Social Networks Using Rough Set Techniques
Authors: Yasser F. Hassan
Abstract:
Social networking involves the use of an online platform or website that enables people to communicate, usually for a social purpose, through a variety of services, most of which are web-based and offer opportunities for people to interact over the internet, e.g., via e-mail and instant messaging. This paper analyzes the voting behavior and ratings of judges on popular comments in social networks. While most of the party literature omits the electorate, this paper presents a model where elites and parties are emergent consequences of the behavior and preferences of voters. Research in artificial intelligence and psychology has provided powerful illustrations of the way in which the emergence of intelligent behavior depends on the development of representational structure. As opposed to the classical voting system (one person, one decision, one vote), a new voting system is designed in which agents with opposed preferences are endowed with a given number of votes to distribute freely among some issues. The paper uses ideas from machine learning, artificial intelligence, and soft computing to provide a model of the development of voting-system response in a simulated agent. The modeled development process involves (simulated) processes of evolution, learning, and representation development. The main value of the model is that it provides an illustration of how simple learning processes may lead to the formation of structure. We employ agent-based computer simulation to demonstrate the formation and interaction of coalitions that arise from individual voter preferences. We are interested in coordinating the local behavior of individual agents to provide an appropriate system-level behavior.
Keywords: voting system, rough sets, multi-agent, social networks, emergence, power indices
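The "given number of votes freely distributed among issues" scheme can be sketched independently of the rough-set machinery. The ballot format and the rejection rule for over-budget ballots below are illustrative assumptions, not the paper's simulation design.

```python
def tally(ballots, n_issues):
    """Aggregate cumulative-voting ballots: each agent freely splits a
    fixed budget of votes across issues (ballot = {issue: votes}).
    Ballots that exceed their budget are rejected -- an illustrative
    validity rule, not necessarily the paper's."""
    totals = [0] * n_issues
    for ballot, budget in ballots:
        if sum(ballot.values()) > budget:
            continue                     # over-budget ballot is discarded
        for issue, votes in ballot.items():
            totals[issue] += votes
    return totals

ballots = [({0: 3, 2: 2}, 5),            # agent concentrates on issues 0 and 2
           ({1: 5}, 5),                  # agent spends everything on issue 1
           ({0: 4, 1: 4}, 5)]            # invalid: 8 votes exceed budget 5
totals = tally(ballots, 3)
```

Unlike one-person-one-vote, this lets agents express preference intensity: an agent who cares strongly about a single issue can stake its whole budget there.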
Procedia PDF Downloads 395
19899 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually perform the node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
Keywords: electronic health record, graph neural network, heterogeneous data, prediction
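The core operation of a GNN layer, aggregating each node's neighbourhood before a learned transform, can be illustrated without the framework stack. This pure-Python mean-aggregation step over a toy EHR-style graph is a conceptual sketch, not the pyTigerGraph/PyG pipeline itself.

```python
def mean_aggregate(features, adjacency):
    """One message-passing step: each node's new feature vector is the
    average of its own and its neighbours' vectors -- the aggregation a
    GNN layer applies before its learned weight matrix and nonlinearity
    (omitted here for clarity)."""
    new = {}
    for node, feat in features.items():
        rows = [feat] + [features[n] for n in adjacency.get(node, [])]
        new[node] = [sum(col) / len(rows) for col in zip(*rows)]
    return new

# toy heterogeneous graph: a patient node linked to two condition nodes
features = {"patient": [1.0, 0.0],
            "cond_a":  [0.0, 1.0],
            "cond_b":  [0.0, 1.0]}
adjacency = {"patient": ["cond_a", "cond_b"],
             "cond_a":  ["patient"],
             "cond_b":  ["patient"]}
embeddings = mean_aggregate(features, adjacency)
```

Stacking such steps lets information flow across multi-hop relations (patient to condition to other patients), which is exactly what relational joins struggle to express cheaply.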
Procedia PDF Downloads 87
19898 A Graph Library Development Based on the Service-Oriented Architecture: Used for Representation of the Biological Systems in the Computer Algorithms
Authors: Mehrshad Khosraviani, Sepehr Najjarpour
Abstract:
Considering the usage of graph-based approaches in systems and synthetic biology, and the various types of graphs they employ, a comprehensive graph library based on the three-tier architecture (3TA) was previously introduced for full representation of biological systems. Despite that 3TA-based library, the following three reasons motivated us to redesign the graph library based on the service-oriented architecture (SOA): (1) Maintaining the accuracy of the data related to an input graph (including its edges, vertices, topology, etc.) without involving the end user: since, in the case of 3TA, the library files are available to end users, they may be used incorrectly, and consequently invalid graph data may be provided to the computer algorithms. With the SOA, by contrast, graph registration is specified as a service through encapsulation of the library files, so all control operations needed to register valid data are the responsibility of the services. (2) Partitioning of the library product into different parts: with 3TA, the library was provided as a whole, whereas here the product can be divided into smaller units, such as an AND/OR graph drawing service, each provided individually. As a result, the end user can select any part of the library product, instead of all features, to add to a project. (3) Reduction of complexity: while 3TA requires several other libraries to be added for connecting to the database, in the SOA-based graph library the provision of the needed library resources is entrusted to the services themselves. Therefore, the end user of the graph library is not exposed to its complexity. In the end, in order to make the library easier to control in the system and to restrict the end user from accessing the files, the service-oriented architecture (SOA) was preferred over the three-tier architecture (3TA), and the previously proposed graph library was redeveloped based on it.
Keywords: Bio-Design Automation, Biological System, Graph Library, Service-Oriented Architecture, Systems and Synthetic Biology
Procedia PDF Downloads 311
19897 Multi-Objective Production Planning Problem: A Case Study of Certain and Uncertain Environment
Authors: Ahteshamul Haq, Srikant Gupta, Murshid Kamal, Irfan Ali
Abstract:
This case study designs and builds a multi-objective production planning model for a hardware firm with both certain and uncertain data. During interaction with the manager of the firm, it emerged that some of the parameters may be vague. This vagueness in the formulated model is handled with the concept of fuzzy set theory. Triangular and trapezoidal fuzzy numbers are used to represent the uncertainty in the collected data. The fuzzy values are defuzzified into crisp form using the well-known graded mean integration representation method. The proposed model attempts to maximize the firm's production and the profit related to the manufactured items, and to minimize the inventory carrying costs, in both certain and uncertain environments. The recommended optimal plan is determined via a fuzzy programming approach, and the formulated models are solved using the optimization software LINGO 16.0 to obtain the optimal production plan. The proposed model yields an efficient compromise solution with the overall satisfaction of the decision maker.
Keywords: production planning problem, multi-objective optimization, fuzzy programming, fuzzy sets
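The graded mean integration representation mentioned above has standard closed forms: P(a, b, c) = (a + 4b + c) / 6 for a triangular fuzzy number and P(a, b, c, d) = (a + 2b + 2c + d) / 6 for a trapezoidal one. A minimal sketch, where the numeric inputs are invented for illustration:

```python
def gmir_triangular(a, b, c):
    """Graded mean integration representation (GMIR) of a triangular
    fuzzy number (a, b, c): P = (a + 4b + c) / 6."""
    return (a + 4 * b + c) / 6

def gmir_trapezoidal(a, b, c, d):
    """GMIR of a trapezoidal fuzzy number (a, b, c, d):
    P = (a + 2b + 2c + d) / 6."""
    return (a + 2 * b + 2 * c + d) / 6

# e.g. a vaguely known unit profit "about 10" and a capacity "8 to 12,
# plausibly between 6 and 14" (hypothetical parameter values)
profit = gmir_triangular(8, 10, 12)
capacity = gmir_trapezoidal(6, 8, 12, 14)
```

After this defuzzification, each vague parameter becomes an ordinary crisp number, so the multi-objective model can be passed to a standard solver such as LINGO.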
Procedia PDF Downloads 215
19896 Multisensory Science, Technology, Engineering and Mathematics Learning: Combined Hands-on and Virtual Science for Distance Learners of Food Chemistry
Authors: Paulomi Polly Burey, Mark Lynch
Abstract:
It has been shown that laboratory activities can help cement understanding of theoretical concepts, but it is difficult to deliver such an activity to an online cohort, and issues such as occupational health and safety in the students’ learning environment need to be considered. Chemistry, in particular, is one of the sciences where practical experience is beneficial for learning; however, typical university experiments may not be suitable for the learning environment of a distance learner. Food provides an ideal medium for demonstrating chemical concepts, and along with a few simple physical and virtual tools provided by educators, analytical chemistry can be experienced by distance learners. Food chemistry experiments were designed to be carried out in a home-based environment that 1) had sufficient scientific rigour and skill-building to reinforce theoretical concepts; 2) were safe for use at home by university students; and 3) had the potential to enhance student learning by linking simple hands-on laboratory activities with high-level virtual science. Two main components of the resources were developed: a home laboratory experiment component and a virtual laboratory component. For the home laboratory component, students were provided with laboratory kits, as well as a list of supplementary inexpensive chemical items that they could purchase from hardware stores and supermarkets. The experiments used were typical proximate analyses of food, as well as experiments focused on techniques such as spectrophotometry and chromatography. Written instructions for each experiment, coupled with video laboratory demonstrations, were used to train students in appropriate laboratory technique. Data that students collected in their home laboratory environment were collated across the class through shared documents, so that the group could carry out statistical analysis and experience a full laboratory experience from their own homes.
For the virtual laboratory component, students were able to view a laboratory safety induction and were advised on the characteristics of a good home laboratory space prior to carrying out their experiments. Following on from this activity, students observed laboratory demonstrations of the experimental series they would carry out in their learning environment. Finally, students were embedded in a virtual laboratory environment to experience complex chemical analyses with equipment that would be too costly and sensitive to be housed in their learning environment. To investigate the impact of the intervention, students were surveyed before and after the laboratory series to evaluate engagement and satisfaction with the course. Students were also assessed on their understanding of theoretical chemical concepts before and after the laboratory series to determine the impact on their learning. At the end of the intervention, focus groups were run to determine which aspects helped and hindered learning. It was found that the physical experiments helped students to understand laboratory technique, as well as methodology interpretation, particularly if they had not been in such a laboratory environment before. The virtual learning environment aided learning as it could be utilized for longer than a typical physical laboratory class, allowing more time to understand the techniques.
Keywords: chemistry, food science, future pedagogy, STEM education
Procedia PDF Downloads 169
19895 Qualitative Narrative Framework as Tool for Reduction of Stigma and Prejudice
Authors: Anastasia Schnitzer, Oliver Rehren
Abstract:
Mental health has become an increasingly important topic in society in recent years, not least due to the challenges posed by the coronavirus pandemic. Along with this, the public has become increasingly aware that a lack of enlightenment and proper coping mechanisms may result in a notable risk of developing mental disorders. Yet there are still many biases against those affected, which are further connected to issues of stigmatization and societal exclusion. One of the main strategies to combat these forms of prejudice and stigma is to induce intergroup contact. More specifically, the Intergroup Contact Theory states that engaging in certain types of contact with members of marginalized groups may be an effective way to improve attitudes towards these groups. However, due to persistent prejudice and stigmatization, affected individuals often do not dare to speak openly about their mental disorders, so that intergroup contact often goes unnoticed. As a result, many people only experience conscious contact with individuals with a mental disorder through media. As an analogy to the Intergroup Contact Theory, the Parasocial Contact Hypothesis proposes that repeated exposure to positive media representations of outgroup members can lead to a reduction of negative prejudices and attitudes towards this outgroup. While there is a growing body of research on the merit of this mechanism, measurements often consist only of 'positive' or 'negative' parasocial contact conditions (or examine the valence or quality of previous contact with the outgroup), while more specific conditions are often neglected. The current study aims to tackle this shortcoming. By scrutinizing the potential of contemporary series as a narrative framework of high quality, we strive to elucidate more detailed aspects of beneficial parasocial contact, for the sake of reducing prejudice and stigma towards individuals with mental disorders. Thus, a two-factorial between-subjects online panel study with three measurement points was conducted (N = 95). Participants were randomly assigned to one of two groups and watched episodes of either a series with a narrative framework of high (Quality-TV) or low quality (Continental-TV), with a one-week interval between episodes. Suitable series were determined with the help of a pretest. Prejudice and stigma towards people with mental disorders were measured at the beginning of the study, before and after each episode, and in a final follow-up one week after the last two episodes. Additionally, parasocial interaction (PSI), quality of contact (QoC), and transportation were measured several times. Based on these data, multivariate multilevel analyses were performed in R using the lavaan package. Latent growth models showed moderate to high increases in QoC and PSI as well as small to moderate decreases in stigma and prejudice over time. Multilevel path analysis at the individual and group levels further revealed that a qualitative narrative framework leads to a higher-quality contact experience, which in turn leads to lower prejudice and stigma, with effects ranging from moderate to high.
Keywords: prejudice, quality of contact, parasocial contact, narrative framework
Procedia PDF Downloads 85
19894 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. Unmanned Aerial Vehicle (UAV) techniques have become more popular than traditional techniques, and UAV photogrammetry in particular allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution. The direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of visible Red-Green-Blue (RGB) and thermal images in parallel. Hence, the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The mining/industrial facility representations obtained can be used for inspection activities.
Keywords: aerial thermography, data processing, drone, low-cost, point cloud
Procedia PDF Downloads 145
19893 Impact of Climate on Sugarcane Yield Over Belagavi District, Karnataka Using Statistical Model
Authors: Girish Chavadappanavar
Abstract:
The impact of climate on agriculture could result in problems with food security and may threaten the livelihood activities upon which much of the population depends. In the present study, a statistical yield forecast model has been developed for sugarcane production over Belagavi district, Karnataka, using weather variables of the crop growing season and past observed yield data for the period 1971 to 2010. The study shows that this type of statistical yield forecast model can efficiently forecast yield 5 and even 10 weeks in advance of the harvest for sugarcane within an acceptable limit of error. The performance of the model in predicting yields at the district level for sugarcane crops is found quite satisfactory for both validation (2007 and 2008) and forecasting (2009 and 2010). In addition, the climate variability of the area has been studied, and the data series was tested with the Mann-Kendall rank statistical test. The maximum and minimum temperatures were found to have significant, opposite trends (a decreasing trend in maximum and an increasing trend in minimum temperature), while the other three variables were found to be insignificant, with differing trends (rainfall and evening relative humidity increasing, and morning relative humidity decreasing).
Keywords: climate impact, regression analysis, yield and forecast model, sugar models
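The Mann-Kendall test used above has a compact form: the S statistic counts concordant minus discordant pairs, and a normal score Z is derived from its variance. This sketch omits the tie correction, and the temperature series is invented for illustration.

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns the S
    statistic and the standard normal score Z. |Z| > 1.96 indicates a
    trend significant at the 5% level."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# a mostly increasing series (e.g. annual minimum temperature, toy
# values) yields a significant positive trend
s, z = mann_kendall([10.1, 10.3, 10.2, 10.6, 10.8, 11.0, 11.2, 11.5])
```

Being rank-based, the test needs no distributional assumption about the climate series, which is why it is a standard choice for trend detection in hydro-meteorological data.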
Procedia PDF Downloads 72
19892 Synchronous Reference Frame and Instantaneous P-Q Theory Based Control of Unified Power Quality Conditioner for Power Quality Improvement of Distribution System
Authors: Ambachew Simreteab Gebremedhn
Abstract:
Context: The paper explores the use of synchronous reference frame theory (SRFT) and instantaneous reactive power theory (IRPT) based control of a Unified Power Quality Conditioner (UPQC) for improving power quality in distribution systems. Research Aim: To investigate the performance of different control configurations of the UPQC using SRFT and IRPT for mitigating power quality issues in distribution systems. Methodology: The study compares three control techniques (SRFT-IRPT, SRFT-SRFT, IRPT-IRPT) implemented in the series and shunt active filters of the UPQC. Data are collected under various control algorithms to analyze UPQC performance. Findings: Results indicate the effectiveness of SRFT- and IRPT-based control techniques in addressing power quality problems such as voltage sags, swells, unbalance, harmonics, and current harmonics in distribution systems. Theoretical Importance: The study provides insights into the application of SRFT and IRPT in improving power quality, specifically in mitigating unbalanced voltage sags, where conventional methods fall short. Data Collection: Data are collected under various control algorithms via simulation in MATLAB Simulink, with real-time operation executed and experimental results obtained using RT-LAB. Analysis Procedures: Performance analysis of the UPQC under different control algorithms is conducted to evaluate the effectiveness of SRFT- and IRPT-based control techniques in mitigating power quality issues. Questions Addressed: How do SRFT- and IRPT-based control techniques compare in improving power quality in distribution systems? What is the impact of using different control configurations on the performance of the UPQC? Conclusion: The study demonstrates the efficacy of SRFT- and IRPT-based control of the UPQC in mitigating power quality issues in distribution systems, highlighting their potential for enhancing voltage and current quality.
Keywords: power quality, UPQC, shunt active filter, series active filter, non-linear load, RT-LAB, MATLAB
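SRFT control rests on the Park (abc to dq0) transform, which rotates three-phase signals into a frame synchronous with the grid angle, so a balanced fundamental maps to a constant d component while harmonics and unbalance appear as ripple that the UPQC filters can target. A sketch using one common amplitude-invariant convention (sign and scaling conventions differ across texts):

```python
import math

def abc_to_dq0(ia, ib, ic, theta):
    """Amplitude-invariant Park transform: rotate three-phase signals
    into a synchronous frame at angle theta. A balanced fundamental
    locked to theta gives d = amplitude, q = 0, zero = 0."""
    k = 2.0 / 3.0
    d = k * (ia * math.cos(theta)
             + ib * math.cos(theta - 2 * math.pi / 3)
             + ic * math.cos(theta + 2 * math.pi / 3))
    q = -k * (ia * math.sin(theta)
              + ib * math.sin(theta - 2 * math.pi / 3)
              + ic * math.sin(theta + 2 * math.pi / 3))
    zero = (ia + ib + ic) / 3.0
    return d, q, zero

# a balanced 10 A three-phase set, sampled at one instant
theta = 0.7
ia = 10 * math.cos(theta)
ib = 10 * math.cos(theta - 2 * math.pi / 3)
ic = 10 * math.cos(theta + 2 * math.pi / 3)
d, q, z0 = abc_to_dq0(ia, ib, ic, theta)
```

In an SRFT controller, low-pass filtering d and q isolates the fundamental component; subtracting it from the measured signal yields the compensation reference for the series or shunt active filter.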
Procedia PDF Downloads 14
19891 Analyses of Soil Volatile Contaminants Extraction by Hot Air Injection
Authors: Abraham Dayan
Abstract:
Remediation of soil containing volatile contaminants is often conducted by vapor extraction (SVE) technique. The operation is based on injection of air at ambient temperatures with or without thermal soil warming. Thermal enhancements of soil vapor extraction (TESVE) processes are usually conducted by soil heating, sometimes assisted by added steam injections. The current study addresses a technique which has not received adequate attention and is based on using exclusively hot air as an alternative to the common TESVE practices. To demonstrate the merit of the hot air TESVE technique, a sandy soil containing contaminated water is studied. Numerical and analytical tools were used to evaluate the rate of decontamination processes for various geometries and operating conditions. The governing equations are based on the Darcy law and are applied to an expanding compressible flow within a sandy soil. The equations were solved to determine the minimal time required for complete soil remediation. An approximate closed form solution was developed based on the assumption of local thermodynamic equilibrium and on a linearized representation of temperature dependence of the vapor to air density ratio. The solution is general in nature and offers insight into the governing processes of the soil remediation operation, where self-similar temperature profiles under certain conditions may exist, and the noticeable role of the contaminants evaporation and recondensation processes in affecting the remediation time. Based on analyses of the hot air TESVE technique, it is shown that it is sufficient to heat the air during a certain period of the decontamination process without compromising its full advantage, and thereby, entailing a minimization of the air-heating-energy requirements. 
This, in effect, is achieved by regeneration: the energy stored in the soil during the early period of the remediation process heats the subsequently injected ambient air, which infiltrates through the treated zone to decontaminate the remaining untreated soil. The characteristic time required to complete SVE operations is calculated as a function of both the injected air temperature and humidity. For a specific set of conditions, it is demonstrated that elevating the injected air temperature by 20°C reduces the soil remediation time by 50%, while requiring 30% additional energy consumption. These evaluations clearly unveil the advantage of the hot air SVE process: for an insignificant cost of added air-heating energy, the substantial expenditures on manpower and equipment utilization are reduced.Keywords: porous media, soil decontamination, hot air, vapor extraction
Procedia PDF Downloads 16
19890 Atmospheric Circulation Drivers Of Nationally-Aggregated Wind Energy Production Over Greece
Authors: Kostas Philippopoulos, Chris G. Tzanis, Despina Deligiorgi
Abstract:
Climate change adaptation requires the exploitation of renewable energy sources such as wind. However, climate variability can affect the regional wind energy potential and consequently the available wind power production. The goal of the research project is to examine the impact of atmospheric circulation on wind energy production over Greece. In the context of synoptic climatology, the proposed novel methodology employs Self-Organizing Maps for grouping and classifying the atmospheric circulation and nationally-aggregated capacity factor time series for a 30-year period. The results indicate the critical effect of atmospheric circulation on the national aggregated wind energy production values and therefore address the issue of optimum distribution of wind farms for a specific region.Keywords: wind energy, atmospheric circulation, capacity factor, self-organizing maps
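Self-Organizing Maps of the kind used in the abstract above group similar input states onto neighbouring map nodes. As a hedged illustration of the underlying mechanics only (the two-dimensional synthetic "circulation" samples and all parameter values below are invented for the sketch, not taken from the study), a minimal one-dimensional SOM can be written in plain Python:

```python
import math
import random

def train_som(data, n_nodes=2, epochs=300, lr0=0.5, sigma0=1.0, seed=0):
    """Train a tiny 1-D Self-Organizing Map: each node holds a weight vector
    pulled toward the input samples, with neighbouring nodes updated less
    strongly (Gaussian neighbourhood that shrinks over time)."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                 # decaying learning rate
        sigma = max(sigma0 * (1 - t / epochs), 0.1)  # decaying neighbourhood
        x = rng.choice(data)
        # best-matching unit = node closest to the sample
        bmu = min(range(n_nodes),
                  key=lambda i: sum((w - v) ** 2 for w, v in zip(nodes[i], x)))
        for i in range(n_nodes):
            h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
            nodes[i] = [w + lr * h * (v - w) for w, v in zip(nodes[i], x)]
    return nodes

def classify(nodes, x):
    """Assign a sample to its best-matching node (its 'circulation type')."""
    return min(range(len(nodes)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(nodes[i], x)))

# Two synthetic clusters standing in for two circulation regimes
data = [(0.10, 0.10), (0.15, 0.05), (0.05, 0.20),
        (0.90, 0.80), (0.85, 0.90), (0.95, 0.85)]
nodes = train_som(data)
a = classify(nodes, (0.1, 0.1))   # regime A
b = classify(nodes, (0.9, 0.9))   # regime B maps to a different node
```

In the study proper, each input vector would be a gridded circulation field paired with a capacity-factor value rather than a 2-D point, and the map would have many more nodes.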
Procedia PDF Downloads 162
19889 Comparison of Machine Learning Models for the Prediction of System Marginal Price of Greek Energy Market
Authors: Ioannis P. Panapakidis, Marios N. Moschakis
Abstract:
The Greek Energy Market is structured as a mandatory pool where the producers make their bid offers on a day-ahead basis. The System Operator solves an optimization routine aiming at the minimization of the cost of produced electricity. The solution of the optimization problem leads to the calculation of the System Marginal Price (SMP). Accurate forecasts of the SMP can lead to increased profits and more efficient portfolio management from the producer's perspective. The aim of this study is to provide a comparative analysis of various machine learning models, such as artificial neural networks and neuro-fuzzy models, for the prediction of the SMP of the Greek market. Machine learning algorithms are favored in prediction problems since they can capture and simulate the volatilities of complex time series.Keywords: deregulated energy market, forecasting, machine learning, system marginal price
Procedia PDF Downloads 216
19888 Improvement Perturb and Observe for a Fast Response MPPT Applied to Photovoltaic Panel
Authors: Labar Hocine, Kelaiaia Mounia Samira, Mesbah Tarek, Kelaiaia Samia
Abstract:
Maximum power point tracking (MPPT) techniques are used in photovoltaic (PV) systems to maximize the PV array output power by continuously tracking the maximum power point (MPP), which depends on panel temperature and irradiance conditions. The main drawback of P&O is that the operating point oscillates around the MPP, giving rise to the waste of some amount of available energy; moreover, it is well known that the P&O algorithm can be confused during time intervals characterized by rapidly changing atmospheric conditions. In this paper, it is shown that in order to limit the negative effects associated with the above drawbacks, the P&O MPPT parameters must be customized to the dynamic behavior of the specific converter adopted. A theoretical analysis allowing the optimal choice of the initial parameter set is also carried out. The fast convergence of the proposed method is proven.Keywords: P&O, Taylor’s series, MPPT, photovoltaic panel
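The perturb-and-observe loop that the abstract above improves upon can be sketched in a few lines: the controller nudges the operating voltage, keeps the perturbation direction that increased power, and reverses otherwise, which is exactly what produces the characteristic oscillation around the MPP. The PV curve below is a toy model, not the authors' converter, and the step size and starting point are illustrative assumptions:

```python
import math

def pv_power(v, v_oc=40.0, i_sc=8.0):
    """Toy PV characteristic: current collapses exponentially near the
    open-circuit voltage v_oc, so power v*i peaks somewhere below v_oc."""
    i = i_sc * (1 - math.exp((v - v_oc) / 3.0))
    return v * max(i, 0.0)

def perturb_and_observe(v0=20.0, step=0.5, iters=200):
    """Classic P&O: perturb the operating voltage and keep moving in the
    direction that increased power; reverse when power drops. The fixed
    step causes a persistent oscillation around the MPP."""
    v, p, direction = v0, pv_power(v0), 1.0
    for _ in range(iters):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:            # power dropped -> reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v, p

v_mpp, p_mpp = perturb_and_observe()
# v_mpp ends up cycling within one step of the true MPP (~32.6 V here)
```

The paper's contribution is precisely tuning `step` and the sampling interval to the converter dynamics so that this oscillation and the confusion under fast irradiance changes are minimized.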
Procedia PDF Downloads 587
19887 The Influence of Microsilica on the Cluster Cracks' Geometry of Cement Paste
Authors: Maciej Szeląg
Abstract:
The changing nature of the environmental impacts under which cement composites operate causes in the structure of the material a number of phenomena, which result in volume deformation of the composite. These strains can cause cracking of the composite. Cracks merge by propagation or intersect to form a characteristic structure of cracks known as cluster cracks. This characteristic mesh of cracks is crucial to almost all building materials working under service-load conditions. Particularly dangerous for a cement matrix is a sudden load of elevated temperature – the thermal shock. A large temperature gradient between the outer surface and the material’s interior, arising in a relatively short period of time, can result in cracks forming on the surface and in the volume of the material. In this paper, image analysis tools were used to analyze the geometry of the cluster cracks of the cement pastes. Four series of specimens made of two different Portland cements were tested. In addition, two series included microsilica as a substitute for 10% of the cement. Within each series, specimens were prepared at three w/b (water/binder) ratios: 0.4, 0.5, and 0.6. The cluster cracks were created by suddenly loading the samples with an elevated temperature of 250°C. Images of the cracked surfaces were obtained via scanning at 2400 DPI. Digital processing and measurements were performed using ImageJ v. 1.46r software. To describe the structure of the cluster cracks, three stereological parameters were proposed: the average cluster area Ā, the average length of the cluster perimeter L̄, and the average opening width of a crack between clusters Ī. The aim of the study was to identify and evaluate the relationships between the measured stereological parameters and the compressive strength and bulk density of the modified cement pastes. The tests of the mechanical and physical features were carried out in accordance with EN standards.
The curves describing the relationships were developed using the least squares method, and the quality of the curve fitting to the empirical data was evaluated using three diagnostic statistics: the coefficient of determination R², the standard error of estimation Se, and the coefficient of random variation W. The use of image analysis allowed for a quantitative description of the cluster cracks’ geometry. Based on the obtained results, a strong correlation was found between Ā and L̄, reflecting the fractal nature of the cluster crack formation process. It was noted that the compressive strength and the bulk density of the cement pastes decrease with an increase in the values of the stereological parameters. It was also found that the main factors impacting the cluster cracks’ geometry are the cement particles’ size and the overall content of the binder in a volume of the material. The microsilica caused a reduction in the Ā, L̄ and Ī values compared to the values obtained for the classical cement paste samples, which is caused by the pozzolanic properties of the microsilica.Keywords: cement paste, cluster cracks, elevated temperature, image analysis, microsilica, stereological parameters
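The three diagnostic statistics named above are straightforward to compute once a curve has been fitted by least squares. A minimal sketch, assuming W is the standard error of estimation expressed as a percentage of the mean response (a common definition; the abstract does not spell it out), with invented strength-versus-w/b data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def diagnostics(xs, ys, a, b):
    """R^2, standard error of estimation Se (n-2 dof for two fitted
    parameters), and coefficient of random variation W = Se/mean(y)*100%."""
    n = len(xs)
    my = sum(ys) / n
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    se = (ss_res / (n - 2)) ** 0.5
    w = se / my * 100
    return r2, se, w

# Hypothetical compressive strength (MPa) falling with w/b ratio
xs = [0.4, 0.5, 0.6, 0.7]
ys = [60.1, 52.2, 44.0, 35.9]
a, b = fit_line(xs, ys)
r2, se, w = diagnostics(xs, ys, a, b)
```

A near-linear data set like this yields R² close to 1 and W well under 1%, which is the kind of fit quality the diagnostics are meant to certify.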
Procedia PDF Downloads 246
19886 Efficiency Improvement of REV-Method for Calibration of Phased Array Antennas
Authors: Daniel Hristov
Abstract:
The paper describes the principle of operation, simulation, and physical validation of a method for the simultaneous acquisition of the gain and phase states of multiple antenna elements and the corresponding feed lines across a Phased Array Antenna (PAA). The derived values for gain and phase are used for PAA calibration. The method utilizes the Rotating-Element Electric-Field Vector (REV) principle, currently used for gain and phase state estimation of a single antenna element across an active antenna aperture. A significant reduction of the procedure execution time is achieved by simultaneously setting different phase delays to multiple phase shifters, followed by a single power measurement. The initial gain and phase states are calculated using spectral and correlation analysis of the measured power series.Keywords: antenna, antenna arrays, calibration, phase measurement, power measurement
Procedia PDF Downloads 138
19885 Mapping Crime against Women in India: Spatio-Temporal Analysis, 2001-2012
Authors: Ritvik Chauhan, Vijay Kumar Baraik
Abstract:
Women are most vulnerable to crime despite occupying a central position in shaping a society as the first teachers of children. In India too, despite equal rights and constitutional safeguards, the incidence of crime against them is large and grave. In this context, crime against women, especially rape, has been increasing over time. This paper explores the spatial and temporal aspects of crime against women in India, with special reference to rape. It also examines crime against women together with its spatial, socio-economic, and demographic associates, using related data obtained from the National Crime Records Bureau of India, the Indian Census, and other sources of the Government of India. Simple statistical methods, choropleth mapping, and other cartographic representation methods have been used to examine the crime rates, the spatio-temporal patterns of crime, and the association of crime with its correlates. The major findings are visible spatial variations across the country as well as rising trends in terms of incidence and rates over the reference period. The study also indicates that geographical associations are observed to some extent. However, the selected indicators of socio-economic factors seem to have no significant bearing on crime against women at this level.Keywords: crime against women, crime mapping, trend analysis, society
Procedia PDF Downloads 334
19884 Circadian Clock and Subjective Time Perception: A Simple Open Source Application for the Analysis of Induced Time Perception in Humans
Authors: Agata M. Kołodziejczyk, Mateusz Harasymczuk, Pierre-Yves Girardin, Lucie Davidová
Abstract:
Subjective time perception implies a connection to cognitive functions, attention, memory, and awareness, but little is known about its connections with the homeostatic states of the body coordinated by the circadian clock. In this paper, we present results from an experimental study of subjective time perception in volunteers performing physical activity on a treadmill in various phases of their circadian rhythms. Subjects were exposed to several time illusions simulated by programmed timing systems. This study brings a better understanding for further improvement of work quality in isolated areas.Keywords: biological clock, light, time illusions, treadmill
Procedia PDF Downloads 338
19883 Econometric Analysis of West African Countries’ Container Terminal Throughput and Gross Domestic Products
Authors: Kehinde Peter Oyeduntan, Kayode Oshinubi
Abstract:
West African ports have been experiencing a large inflow and outflow of containerized cargo in the last decades, and this has created a quest amongst the countries to attain the status of hub port for the sub-region. This study analyzed the relationship between the container throughput and the Gross Domestic Product (GDP) of nine West African countries, using Simple Linear Regression (SLR), a Polynomial Regression Model (PRM), and Support Vector Machines (SVM) with a time series of 20 years. The results showed that there exists a high correlation between GDP and container throughput. The model also predicted the container throughput in West Africa for the next 20 years. The findings and recommendations presented in this research will guide policy makers and help improve the management of container ports and terminals in West Africa, thereby boosting the economy.Keywords: container, ports, terminals, throughput
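A polynomial regression of throughput on GDP, of the kind the PRM model above represents, reduces to solving the normal equations. The sketch below fits a quadratic with pure-Python Gaussian elimination; the GDP and TEU figures are hypothetical placeholders for illustration, not data from the study:

```python
def polyfit2(xs, ys):
    """Fit y = c0 + c1*x + c2*x^2 by solving the 3x3 normal equations
    with Gaussian elimination and partial pivoting."""
    n = 3
    # Normal-equation matrix A[i][j] = sum x^(i+j); RHS r[i] = sum x^i * y
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    r = [sum((x ** i) * y for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda k: abs(A[k][col]))
        A[col], A[piv] = A[piv], A[col]
        r[col], r[piv] = r[piv], r[col]
        for k in range(col + 1, n):
            f = A[k][col] / A[col][col]
            for j in range(col, n):
                A[k][j] -= f * A[col][j]
            r[k] -= f * r[col]
    c = [0.0] * n
    for i in reversed(range(n)):
        c[i] = (r[i] - sum(A[i][j] * c[j] for j in range(i + 1, n))) / A[i][i]
    return c

# Hypothetical GDP (billion USD) vs container throughput (million TEU)
gdp = [10, 20, 30, 40, 50]
teu = [0.5, 1.1, 1.8, 2.6, 3.5]       # slightly convex growth
c0, c1, c2 = polyfit2(gdp, teu)
predict = lambda x: c0 + c1 * x + c2 * x * x   # extrapolation, as in the study
```

The same machinery, with degree 1, gives the SLR model; the SVM variant would substitute a kernel regression for the polynomial.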
Procedia PDF Downloads 215
19882 The Use of Image Analysis Techniques to Describe a Cluster Cracks in the Cement Paste with the Addition of Metakaolinite
Authors: Maciej Szeląg, Stanisław Fic
Abstract:
The impact of elevated temperatures on construction materials manifests in changes to their physical and mechanical characteristics. Stresses and thermal deformations that occur inside the volume of the material cause its progressive degradation as the temperature increases. Finally, the reactions and transformations of the multiphase structure of the cementitious composite cause its complete destruction. A particularly dangerous phenomenon is the impact of thermal shock – a sudden high-temperature load. The thermal shock leads to a high value of the temperature gradient between the outer surface and the interior of the element in a relatively short time. The result of the process mentioned above is the formation of cracks and scratches on the material’s surface and inside the material. The article describes the use of computer image analysis techniques to identify and assess the structure of the cluster cracks on the surfaces of modified cement pastes, caused by thermal shock. Four series of specimens were tested. Two Portland cements were used (CEM I 42.5R and CEM I 52.5R). In addition, two of the series contained metakaolinite as a replacement for 10% of the cement content. Samples in each series were prepared at three w/b (water/binder) ratios: 0.4, 0.5, and 0.6. Surface cracks in the samples were created by a sudden temperature load at 200°C for 4 hours. Images of the cracked surfaces were obtained via scanning at 1200 DPI; digital processing and measurements were performed using ImageJ v. 1.46r software. In order to examine the cracked surface of the cement paste as a system of closed clusters, the dispersal systems theory was used to describe the structure of the cement paste. Water is treated as the dispersing phase and the binder as the dispersed phase – which is the initial stage of cement paste structure creation.
A cluster itself is considered to be the area on the specimen surface that is limited by cracks (created by sudden temperature loading) or by the edge of the sample. To describe the structure of the cracks, two stereological parameters were proposed: Ā, the average cluster area, and L̄, the average cluster perimeter. The goal of this study was to compare the investigated stereological parameters with the mechanical properties of the tested specimens. Compressive and tensile strength tests were carried out according to EN standards. The method used in the study allowed the quantitative determination of defects occurring on the surfaces of the examined modified cement pastes. Based on the results, it was found that the nature of the cracks depends mainly on the physical parameters of the cement and the intermolecular interactions in the dispersal environment. Additionally, it was noted that the Ā/L̄ relation of the created clusters can be described by one function for all tested samples. This fact testifies to the constant geometry of the thermal cracks regardless of the presence of metakaolinite, the type of cement, and the w/b ratio.Keywords: cement paste, cluster cracks, elevated temperature, image analysis, metakaolinite, stereological parameters
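Parameters like Ā and L̄ can be measured from a binarized crack map by labelling the intact regions between cracks. A pixel-level sketch of the idea (the tiny cross-shaped crack pattern is a made-up example, not the ImageJ workflow the study used):

```python
from collections import deque

# 0 = intact surface, 1 = crack; clusters are 4-connected regions of 0
crack_map = [
    "0001000",
    "0001000",
    "1111111",
    "0001000",
    "0001000",
]
grid = [[int(ch) for ch in row] for row in crack_map]
H, W = len(grid), len(grid[0])

def clusters(grid):
    """Label 4-connected intact regions by flood fill and return, for each,
    its area (pixel count) and perimeter (pixel edges bordering a crack or
    the image boundary): a pixel-level analogue of the A-bar/L-bar measures."""
    seen = [[False] * W for _ in range(H)]
    out = []
    for sy in range(H):
        for sx in range(W):
            if grid[sy][sx] or seen[sy][sx]:
                continue
            area = perim = 0
            q = deque([(sy, sx)])
            seen[sy][sx] = True
            while q:
                y, x = q.popleft()
                area += 1
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < H and 0 <= nx < W) or grid[ny][nx]:
                        perim += 1      # edge against a crack or the border
                    elif not seen[ny][nx]:
                        seen[ny][nx] = True
                        q.append((ny, nx))
            out.append((area, perim))
    return out

stats = clusters(grid)
A_bar = sum(a for a, _ in stats) / len(stats)   # average cluster area
L_bar = sum(p for _, p in stats) / len(stats)   # average cluster perimeter
```

The cross-shaped crack splits the 5x7 map into four 2x3 intact clusters, so this toy map yields Ā = 6 pixels and L̄ = 10 pixel edges.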
Procedia PDF Downloads 390
19881 Film Review of 'Heroic Saviours and Survivors': The Representation of Sex Trafficking in Popular Films in India
Authors: Nisha James, Shubha Ranganathan
Abstract:
One of the most poignant forms of organized crime against women, which has rarely made it to the world of Indian cinema, is sex trafficking, i.e., the forcible involvement of women in the sex trade through fraud or coercion (Hughes, 2005). In the space of Indian cinema, much of the spotlight has been on the sensational drug trafficking and gang mafia of Bombay. During our research on sex trafficking, the rehabilitated women interviewed often expressed strong criticism of the mass media’s naive portrayal of prostitutes as money-minting, happy, and sexually driven women. They argued that this unrealistic portrayal ignored the fact that this is not the reality for the majority of trafficked women. Given the gravity of sex trafficking as a human rights issue, it is therefore refreshing to see three recent films on sex trafficking in Indian languages: Naa Bangaaru Talli (2014, Telugu), Mardaani (2014, Hindi), and Lakshmi (2014, Hindi). This paper reviews these three films to explore the portrayal of the everyday reality of trafficking for women. Film analysis was used to understand the representation of psychological issues in the media. The strength of these movies starts with their inspirations, which are true stories, and with the fact that they all aim at bringing awareness to the issue of sex trafficking, a rising social evil in Indian society. None of the three films, however, moves on to portray the next phase of rehabilitation and reintegration of victims, which is a very complex and important process in the life of a survivor. According to the findings, survivors of sex trafficking find rehabilitation and reintegration into society to be a slow and tough part of their lives, as they continuously face stigma and social exclusion and have to strive to live against all odds of non-acceptance, starting from their families.Keywords: film review, Indian films, sex trafficking, survivors
Procedia PDF Downloads 440
19880 A Comparative Analysis on QRS Peak Detection Using BIOPAC and MATLAB Software
Authors: Chandra Mukherjee
Abstract:
The present paper presents work done in the field of ECG signal analysis using the MATLAB 7.1 platform. An accurate and simple ECG feature extraction algorithm is presented in this paper, and the developed algorithm is validated using BIOPAC software. To detect the QRS peak, the ECG signal is processed through the following stages: first derivative, second derivative, and then squaring of the second derivative. The efficiency of the developed algorithm is tested on ECG samples from different databases and on real-time ECG signals acquired using the BIOPAC system. First, a lead-wise threshold value is specified; the samples above that value are marked, and the points in the original signal where these marked samples face a change of slope are spotted as R-peaks. On the left and right sides of the R-peak, the points facing a change of slope are identified as the Q and S peaks, respectively. The inbuilt detection algorithm of the BIOPAC software is then run on the same samples, and both outputs are compared. ECG baseline modulation correction is done after detecting the characteristic points. The efficiency of the algorithm is tested using validation parameters such as Sensitivity and Positive Predictivity, and satisfactory values of these parameters were obtained.Keywords: first derivative, variable threshold, slope reversal, baseline modulation correction
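The derivative-and-threshold pipeline described above translates directly into code: differentiate twice, square to emphasise the sharp QRS slopes, threshold, and locate the peak of the raw signal inside each above-threshold run. The sketch below runs on a synthetic spike train rather than real ECG data, and the threshold ratio is an illustrative assumption:

```python
def detect_r_peaks(sig, thresh_ratio=0.3):
    """Derivative-based QRS sketch: square the second derivative to
    emphasise steep QRS slopes, threshold it, then take the local maximum
    of the raw signal inside each above-threshold run as the R-peak."""
    d1 = [sig[i + 1] - sig[i] for i in range(len(sig) - 1)]      # 1st derivative
    d2 = [d1[i + 1] - d1[i] for i in range(len(d1) - 1)]         # 2nd derivative
    feat = [v * v for v in d2]                                   # squared
    th = thresh_ratio * max(feat)
    peaks, i = [], 0
    while i < len(feat):
        if feat[i] > th:
            j = i
            while j < len(feat) and feat[j] > th:
                j += 1
            # local maximum of the original signal within the run
            seg = range(i, min(j + 2, len(sig)))
            peaks.append(max(seg, key=lambda k: sig[k]))
            i = j
        else:
            i += 1
    return peaks

# Synthetic ECG-like trace: narrow tall spikes every 100 samples on a flat baseline
sig = [0.0] * 300
for c in (50, 150, 250):
    sig[c - 1], sig[c], sig[c + 1] = 0.3, 1.0, 0.3
peaks = detect_r_peaks(sig)   # recovers the three spike positions
```

A real implementation would add the lead-wise thresholding, Q/S localisation by slope reversal around each R-peak, and the baseline modulation correction described in the abstract.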
Procedia PDF Downloads 411
19879 Detection of Parkinsonian Freezing of Gait
Authors: Sang-Hoon Park, Yeji Ho, Gwang-Moon Eom
Abstract:
Fast and accurate detection of Freezing of Gait (FOG) is desirable for the appropriate application of cueing, which has been shown to ameliorate FOG. Utilization of the frequency spectrum of leg acceleration to derive the freeze index requires much calculation, and it would lead to delayed cueing. We hypothesized that FOG can be reasonably detected from the time-domain amplitude of foot acceleration. A time instant was recognized as FOG if the mean amplitude of the acceleration in the time window surrounding the time instant was in the specific FOG range. The parameters required in the FOG detection were optimized by simulated annealing. The suggested time-domain method showed performance comparable to that of frequency-domain methods.Keywords: freezing of gait, detection, Parkinson's disease, time-domain method
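The time-domain rule described above (flag a sample when the windowed mean of the absolute foot acceleration falls inside an assumed FOG amplitude band) can be sketched as follows. The band limits, window length, and synthetic signals are invented for illustration; in the study these parameters were optimized by simulated annealing:

```python
def detect_fog(acc, fs=50, win_s=1.0, lo=0.1, hi=0.5):
    """Time-domain FOG sketch: flag a sample when the mean absolute foot
    acceleration in the window centred on it lies inside the assumed FOG
    band [lo, hi]: trembling weaker than normal stepping but stronger
    than standing still."""
    half = int(win_s * fs / 2)
    flags = []
    for i in range(len(acc)):
        w = acc[max(0, i - half): i + half + 1]
        m = sum(abs(v) for v in w) / len(w)
        flags.append(lo <= m <= hi)
    return flags

# Synthetic trace: normal gait (large swings), freezing (small tremor), standing
normal = [1.0 if i % 2 else -1.0 for i in range(100)]   # mean |a| = 1.0
freeze = [0.3 if i % 2 else -0.3 for i in range(100)]   # mean |a| = 0.3
stand  = [0.0] * 100                                    # mean |a| = 0.0
acc = normal + freeze + stand
flags = detect_fog(acc)   # True only around the freezing segment
```

Because the rule needs only a running mean of absolute values, it avoids the spectral computation that the abstract identifies as the source of cueing delay.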
Procedia PDF Downloads 445
19878 Didactics of Literature within the Brechtian Theatre in Edward Albee's Who's Afraid of Virginia Woolf? and Ernest Lehman's Screenplay Adaptation from an Audiovisual Perspective
Authors: Angel Mauricio Castillo
Abstract:
The theatrical performances and music dramas of the mid-nineteenth century provided the audience with a complete immersion into the feelings of the characters through poetry, music, and other artistic representations which create a false sense of reality. However, a novel representation on stage some eighty years later, which is non-cathartic, is significant because it represents the antithesis to the common creations of the period, originating from the separation of the elements as a dominant. As to the basic methodology, the sense of defamiliarization that results as a near translation of the German word Verfremdung will be referred to throughout this work as the V-effect (also known as the ‘alienation effect’) and will embody the performing techniques that enable the audience to watch a play while being fully aware of its nature. A play might sometimes present the audience with a constant reminder that it is only a play; therefore, all elements are introduced to provoke dissimilar reactions and opinions. A clear indication of the major findings of the study is that there is a strong correlation between Hegel, Marx, and Brecht, as it is disclosed how the didactics of literature have been influencing not only Brecht’s productions but also every educational context in which these ideas are intertwined. The result is a new dialectical process, that is to say, a new thesis that creates independent thinking skills on the part of the audience. Therefore, this model opposes the Hegelian formula thesis-antithesis-synthesis in that the synthesis in the Brechtian theatre will inevitably fall into the category of a different thesis within an enlightening type of discourse. The confronting ideas of illusion versus reality will create a new dialectical thesis instead of resulting in a synthesis.Keywords: Brechtian theatre, didactics, literature, education
Procedia PDF Downloads 181
19877 Investigation of Microstructure and Mechanical Properties of Friction Stir Welded Dissimilar Aluminium Alloys
Authors: Gurpreet Singh, Hazoor Singh, Kulbir Singh Sandhu
Abstract:
The friction stir welding process emerged as a promising solid-state welding technique that eliminates various welding defects, such as cracks and porosity, in the joining of dissimilar aluminum alloys. In the present research, Friction Stir Welding (FSW) is carried out on dissimilar aluminum alloys of the 2000 series and the 6000 series; this combination of alloys is highly used in the automobile and aerospace industries due to their good strength-to-weight ratio and their mechanical and corrosion properties. The joints were characterized by applying various destructive and non-destructive tests. Three critical welding parameters were considered, i.e., tool rotation speed, traverse speed, and tool geometry. The effective ranges of tool rotation speed (1200-1800 rpm) and traverse speed (60-240 mm/min), together with the tool geometry, were studied. The two different, difficult-to-weld alloys were successfully welded. All the samples showed different microstructures with different sets of welding parameters. Microstructure scans revealed that grain refinement plays a crucial role in the mechanical properties.Keywords: aluminum alloys, friction stir welding, mechanical properties, microstructure
Procedia PDF Downloads 284
19876 The Effect of Regulation and Investment in Sustainable Practices on Environmental Performance and Consumer Trust: a Time Series Analysis of the Dominant Companies within the Energy Sector
Authors: Sempiga Olivier, Dominika Latusek-Jurczak
Abstract:
Climate change has largely been attributed to the high consumption of fossil fuels, leading to severe environmental problems. The energy sector has been among the most polluting sectors for many decades. Consequently, there is a lack of trust in several energy firms, especially those in fossil fuels and nuclear energy. A robust regulatory framework is needed, and more investment in renewable energy sources is paramount for a better environmental outcome. Given the significant environmental impact of energy production and consumption, sustainable marketing practices have become increasingly important in the energy sector. Although the sector has had the lion’s share in polluting the environment, much effort has been made recently to move away from fossil fuels and privilege renewable energy sources. How this shift would help rebuild trust in the energy industry is unclear. For the shift to have lasting effects, it may be essential that regulatory agencies examine how energy firms engage in sustainable investment. There is little empirical evidence on whether adopting regulated marketing practices and investment initiatives can help different organizations reduce their environmental impact and promote sustainable development. Little is known about how and whether the environmental value in firms goes beyond rhetoric, greenwashing, and publicity to translate into economic gains and environmental performance. The study investigates how regulatory agencies can help energy firms invest sustainably and take sustainable initiatives even amid the energy crisis caused by the Russia-Ukraine conflict, and how these sustainable practices relate to renewed consumer trust. Using data from Corporate Knights, the study employs time series analysis to examine the relationship between sustainable regulation and the sustainable practices of energy firms from around the world, and how these relate to consumer trust and environmental performance over the past 20 years.
It examines how their sustainable investment, energy, and carbon productivity relate to environmental sustainability and consumer trust. This longitudinal study provides empirical evidence of the interplay between regulation, trust and environmental performance. The research is grounded in institutional trust theory, which emphasizes the role of regulatory frameworks and organizational practices in shaping public perceptions of fairness, transparency, and legitimacy. Results show that organizations in the energy sector, supported by robust regulatory tools, can overcome the negative image of polluters and compete with other companies in the fight against climate change and global warming. However, to do so, energy firms should consider investing more in renewable energy sources and implementing sustainable strategies and practices that go beyond greenwashing to improve their environmental performance, thereby rebuilding consumer trust in the energy sector. Results allow regulatory regimes and organizations to learn why it is crucial for energy firms to invest in renewable energy sources and engage in various sustainable initiatives and practices to contribute to better environmental outcomes and higher levels of trust.Keywords: consumer trust, energy, environmental performance, regulation, renewable energy sources, sustainable practices
Procedia PDF Downloads 15