Search results for: co-authorship network analysis
28663 Pricing Techniques to Mitigate Recurring Congestion on Interstate Facilities Using Dynamic Feedback Assignment
Authors: Hatem Abou-Senna
Abstract:
Interstate 4 (I-4) is a primary east-west transportation corridor between Tampa and Daytona Beach, serving commuter, commercial, and recreational traffic. I-4 is known to have severe recurring congestion during peak hours. The congestion spans about 11 miles of the central corridor in the evening peak period, as I-4 is the only non-tolled limited-access facility connecting the Orlando Central Business District (CBD) and the tourist attractions area (Walt Disney World). Florida officials had been skeptical of tolling I-4 prior to the recent legislation, and the public, through the media, had been complaining about the excessive number of toll facilities in Central Florida. In search of a plausible mitigation of the congestion on the I-4 corridor, this research evaluates the effectiveness of different toll pricing alternatives that might divert traffic from I-4 to the toll facilities during the peak period. The network is composed of two main diverging limited-access highways, the freeway (I-4) and toll road SR 417, in addition to two east-west parallel toll roads, SR 408 and SR 528, which intersect the above-mentioned highways at both ends. I-4 and toll road SR 408 are the routes most frequently used by commuters. SR 417 is a relatively uncongested toll road that is 15 miles longer than I-4 and costs $5 in tolls, compared with no monetary cost on I-4 for the same trip. The results of the calibrated Orlando PARAMICS network showed that percentages of route diversion vary from one route to another and depend primarily on the travel cost between specific origin-destination (O-D) pairs. Most drivers going from Disney (O1) or Lake Buena Vista (O2) to Lake Mary (D1) were found to have a high propensity towards using I-4, even when tolls were eliminated and/or real-time information was provided.
However, a diversion from I-4 to SR 417 for these O-D pairs occurred only in the incident and lane-closure cases on I-4, due to the increase in delay and travel costs, and when information was provided to travelers. Furthermore, drivers who diverted from I-4 to SR 417 and SR 528 did not gain significant travel-time savings. This was attributed to the limited extra capacity of the alternative routes in the peak period and the longer traveling distance. When the remaining origin-destination pairs were analyzed, average travel-time savings on I-4 ranged between 10% and 16%, amounting to 10 minutes at most, with a 10% increase in the network average speed. Propensity for diversion on the network increased significantly when tolls were eliminated on SR 417 and SR 528 while tolls on SR 408 were doubled, together with the incident and lane-closure scenarios on I-4 and with real-time information provided. The toll roads were found to be a viable alternative to I-4 for these specific O-D pairs, depending on the users' perception of the toll cost as reflected in their specific travel times. However, on the macroscopic level, it was concluded that route diversion through toll reduction or elimination on surrounding toll roads would have only a minimal impact on reducing I-4 congestion during the peak period.
Keywords: congestion pricing, dynamic feedback assignment, microsimulation, PARAMICS, route diversion
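The driver's choice between a free congested route and a tolled faster route can be sketched with a generalized-cost comparison, where travel time and toll are combined through a value of time (VOT). This is an illustrative stand-in for the abstract's PARAMICS route-choice behaviour; the route names, times, tolls, and VOT values below are hypothetical, not the study's data.

```python
# Illustrative sketch: route diversion driven by generalized cost,
# where each driver weighs travel time against tolls via a value of
# time (VOT). Numbers are hypothetical, not the paper's results.
def generalized_cost(time_min, toll_usd, vot_usd_per_hr=15.0):
    """Convert a route's time and toll into one comparable cost (USD)."""
    return toll_usd + time_min / 60.0 * vot_usd_per_hr

def preferred_route(routes, vot_usd_per_hr=15.0):
    """Pick the route with the lowest generalized cost."""
    return min(routes,
               key=lambda r: generalized_cost(r["time"], r["toll"],
                                              vot_usd_per_hr))

routes = [
    {"name": "I-4 (free, congested)",  "time": 55, "toll": 0.0},
    {"name": "SR 417 (tolled, longer)", "time": 40, "toll": 5.0},
]
# At a low VOT the free route wins; at a high VOT the toll road wins.
low_vot_choice = preferred_route(routes, vot_usd_per_hr=10.0)
high_vot_choice = preferred_route(routes, vot_usd_per_hr=30.0)
```

This mirrors the abstract's finding that diversion "depends primarily on the travel cost between specific O-D pairs": the same pair of routes flips preference as the traveler's implied value of time rises.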
Procedia PDF Downloads 180
28662 Structural Performance Evaluation of Power Boiler for the Pressure Release Valve in Consideration of the Thermal Expansion
Authors: Young-Hun Lee, Tae-Gwan Kim, Jong-Kyu Kim, Young-Chul Park
Abstract:
In this study, a heat-structure coupled analysis of a spring safety valve was carried out. The full analysis procedure consisted of performing a thermal analysis at the maximum operating temperature, applying the resulting temperature field together with an additional load and the internal pressure to the valve, and then carrying out a heat-structure coupled analysis of the disc holder. The safety valve was modeled using the 3D design program SolidWorks, and the 3D finite element analysis program ANSYS was used for the analysis. The final results were examined for the stability of the valve's internal components under high-pressure conditions in terms of the maximum displacement and the maximum stress.
Keywords: finite element method, spring safety valve, gap, stress, strain, deformation
Procedia PDF Downloads 371
28661 Graph Based Traffic Analysis and Delay Prediction Using a Custom Built Dataset
Authors: Gabriele Borg, Alexei Debono, Charlie Abela
Abstract:
There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data that is continuously being sourced and converts it into useful information related to the traffic problem on Maltese roads. The scope of this paper is to provide a methodology for creating a custom dataset (MalTra - Malta Traffic), compiled from multiple participants at various locations across the island, to identify the most common routes taken and expose the main areas of activity. Such uses of big data appear in various technologies, collectively referred to as Intelligent Transportation Systems (ITSs), and it has been concluded that there is significant potential in utilising such sources of data on a nationwide scale. Furthermore, a series of traffic prediction experiments with graph neural network models is conducted to compare MalTra to large-scale traffic datasets.
Keywords: graph neural networks, traffic management, big data, mobile data patterns
Procedia PDF Downloads 133
28660 Emotion-Convolutional Neural Network for Perceiving Stress from Audio Signals: A Brain Chemistry Approach
Authors: Anup Anand Deshmukh, Catherine Soladie, Renaud Seguier
Abstract:
Emotion plays a key role in many applications, such as healthcare, where it helps to gather patients' emotional behavior. Unlike typical ASR (Automated Speech Recognition) problems, which focus on 'what was said', it is equally important to understand 'how it was said.' Certain emotions are given more importance due to their effectiveness in understanding human feelings. In this paper, we propose an approach that models human stress from audio signals. The research challenge in speech emotion detection is finding the appropriate set of acoustic features corresponding to an emotion. Another difficulty lies in defining the very meaning of emotion and being able to categorize it in a precise manner. Supervised machine learning models, including state-of-the-art deep learning classification methods, rely on the availability of clean and labelled data. One of the problems in affective computing is the limited amount of annotated data; the existing labelled emotion datasets are highly subjective to the perception of the annotator. We address the first issue of feature selection by exploiting traditional MFCC (Mel-Frequency Cepstral Coefficients) features in a Convolutional Neural Network. Our proposed Emo-CNN (Emotion-CNN) architecture treats speech representations in a manner similar to how CNNs treat images in a vision problem. Our experiments show that Emo-CNN consistently and significantly outperforms popular existing methods over multiple datasets. It achieves 90.2% categorical accuracy on the Emo-DB dataset. We claim that Emo-CNN is robust to speaker variations and environmental distortions. The proposed approach achieves 85.5% speaker-dependent categorical accuracy on the SAVEE (Surrey Audio-Visual Expressed Emotion) dataset, beating the existing CNN-based approach by 10.2%. To tackle the second problem of subjectivity in stress labels, we use Lovheim's cube, which is a 3-dimensional projection of emotions.
Monoamine neurotransmitters are chemical messengers in the brain that transmit signals involved in perceiving emotions. The cube aims to explain the relationship between these neurotransmitters and the positions of emotions in 3D space. The emotion representations learnt by Emo-CNN are mapped to the cube using three-component PCA (Principal Component Analysis), which is then used to model human stress. This approach not only circumvents the need for labelled stress data but also complies with the psychological theory of emotions given by Lovheim's cube. We believe that this work is the first step towards creating a connection between Artificial Intelligence and the chemistry of human emotions.
Keywords: deep learning, brain chemistry, emotion perception, Lovheim's cube
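The cube-mapping idea above can be loosely illustrated: each cube corner is an emotion at a 3D point (axes coded 0/1 for low/high neurotransmitter levels), and a 3D emotion embedding can be scored for "stress" by its distance to stress-like corners. The corner labels follow Lovheim's model, but the particular axis coding and the scoring rule below are our own simplification, not the paper's method.

```python
# Loose illustration of mapping 3D emotion embeddings onto the
# Lovheim cube and deriving a stress score. The axis coding and the
# distance-based score are hypothetical simplifications.
import math

CORNERS = {
    "shame":      (0, 0, 0),
    "distress":   (0, 0, 1),
    "fear":       (0, 1, 0),  # hypothetical axis coding
    "anger":      (0, 1, 1),
    "contempt":   (1, 0, 0),
    "surprise":   (1, 0, 1),
    "enjoyment":  (1, 1, 0),
    "excitement": (1, 1, 1),
}

def nearest_emotion(point):
    """Label a 3D embedding with the closest cube corner."""
    return min(CORNERS, key=lambda e: math.dist(point, CORNERS[e]))

def stress_score(point, stressful=("distress", "fear", "anger", "shame")):
    """1 / (1 + distance to the closest stress-like corner)."""
    d = min(math.dist(point, CORNERS[e]) for e in stressful)
    return 1.0 / (1.0 + d)
```

In the paper's pipeline the `point` would come from the three-component PCA of the learnt Emo-CNN representation; here it is just any 3D coordinate.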
Procedia PDF Downloads 157
28659 Fault Tree Analysis (FTA) of CNC Turning Center
Authors: R. B. Patil, B. S. Kothavale, L. Y. Waghmode
Abstract:
Today, the CNC turning center is an important machine tool for the manufacturing industry worldwide. However, the breakdown of a single CNC turning center may result in the production of an entire plant being halted. For this reason, downtime for operations and preventive maintenance has to be minimized to ensure availability of the system. Indeed, improving the availability of the CNC turning center as a whole objectively leads to a substantial reduction in production loss and in operating, maintenance, and support costs. In this paper, the fault tree analysis (FTA) method is used for reliability analysis of a CNC turning center. The major faults associated with the system and the causes of the faults are presented graphically. Boolean algebra is used for evaluating the fault tree (FT) diagram and for deriving a governing reliability model for the CNC turning center. Failure data collected over a period of six years has been used for evaluating the model. Qualitative and quantitative analyses are also carried out to identify critical sub-systems and components of the CNC turning center. It is found that, at the end of the warranty period (one year), the reliability of the CNC turning center as a whole is around 0.61628.
Keywords: fault tree analysis (FTA), reliability analysis, risk assessment, hazard analysis
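The Boolean-algebra evaluation of a fault tree can be sketched in a few lines: basic-event failure probabilities combine through OR gates (any one cause fails the sub-system) and AND gates (all redundant causes must fail together). The tiny tree and its probabilities below are made-up illustrations, not the paper's actual fault tree or failure data.

```python
# Minimal sketch of evaluating a fault tree with Boolean algebra,
# assuming independent basic events. Probabilities are invented.
def or_gate(*probs):
    """P(at least one independent event occurs) = 1 - prod(1 - p_i)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(*probs):
    """P(all independent events occur) = prod(p_i)."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical one-year failure probabilities of basic events.
spindle   = or_gate(0.08, 0.05)        # bearing wear OR belt failure
hydraulic = and_gate(0.20, 0.15)       # pump AND its backup both fail
top_event = or_gate(spindle, hydraulic, 0.10)  # plus control-unit fault

reliability = 1.0 - top_event  # system reliability at one year
```

The paper's own model, evaluated on six years of real failure data, yields a one-year system reliability of about 0.616; the arithmetic pattern is the same.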
Procedia PDF Downloads 418
28658 A Mobile Application for Analyzing and Forecasting Crime Using Autoregressive Integrated Moving Average with Artificial Neural Network
Authors: Gajaanuja Megalathan, Banuka Athuraliya
Abstract:
Crime is one of our society's most intimidating and threatening challenges. With the majority of the population residing in cities, many experts, and data provided by local authorities, suggest a rapid increase in the number of crimes committed in these cities in recent years; crime rates have been on a steadily rising curve. People living in Sri Lanka have the right to know the exact crime rates, and the projected future crime rates, of the place they are living in. Due to the current economic crisis, crime rates have spiked, with many thefts and murders recorded within the last 6-10 months. Although there are many sources to consult, there is no solid way of searching for and finding out the safety of a place. For all these reasons, the public needs a way to feel safe when they are introduced to new places. Through this research, the authors aim to develop a mobile application that addresses this problem. It is mainly targeted at tourists, and people who have recently relocated will also benefit from it. Moreover, an ARIMA model combined with an ANN is used to predict crime rates. From past researchers' works, it is evident that the ARIMA model combined with artificial neural networks has not previously been used to forecast crime.
Keywords: ARIMA model, ANN, crime prediction, data analysis
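The hybrid idea can be sketched without any ML libraries: a linear autoregressive model captures the trend in a crime-count series, and a small neural unit is trained on the residuals to pick up leftover nonlinear structure. This pure-Python stand-in uses an AR(1) fit and a single tanh neuron in place of a full ARIMA model and ANN (which in practice would come from libraries such as statsmodels and Keras); the decomposition "forecast = linear part + learned residual correction" is the point being illustrated.

```python
# Pure-Python sketch of the ARIMA+ANN hybrid forecasting idea:
# linear AR(1) model + one tanh neuron trained on its residuals.
import math

def fit_ar1(series):
    """Least-squares AR(1) coefficient: x_t ~ phi * x_{t-1}."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def hybrid_forecast(series, epochs=500, lr=0.01):
    phi = fit_ar1(series)
    # residuals left over by the linear model
    resid = [series[t] - phi * series[t - 1] for t in range(1, len(series))]
    # one tanh neuron mapping lagged residual -> next residual
    w, b = 0.1, 0.0
    for _ in range(epochs):
        for t in range(1, len(resid)):
            pred = math.tanh(w * resid[t - 1] + b)
            err = pred - resid[t]
            grad = 1.0 - pred ** 2          # derivative of tanh
            w -= lr * err * grad * resid[t - 1]
            b -= lr * err * grad
    next_linear = phi * series[-1]
    next_resid = math.tanh(w * resid[-1] + b)
    return next_linear + next_resid
```

The linear stage handles trend and autocorrelation; the residual learner handles whatever pattern the linear model misses, which is exactly the division of labour the abstract proposes.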
Procedia PDF Downloads 135
28657 Resilient Analysis as an Alternative to Conventional Seismic Analysis Methods for the Maintenance of a Socioeconomical Functionality of Structures
Authors: Sara Muhammad Elqudah, Vigh László Gergely
Abstract:
Catastrophic events, such as earthquakes, are sudden, short, and devastating, threatening lives, demolishing futures, and causing huge economic losses. Current seismic analysis and design standards are based on life-safety levels, where only some residual strength and stiffness are left in the structure, leaving it beyond economical repair. Consequently, it has become necessary to introduce and implement the concept of resilient design. Resilient design is about designing for ductility over time by resisting, absorbing, and recovering from the effects of a hazard in an appropriate and timely manner while maintaining the functionality of the structure in the aftermath of the incident. Resilient analysis is mainly based on the fragility, vulnerability, and functionality curves; a resilience index is eventually generated from these curves, and the higher this index is, the better the performance of the structure. In this paper, the seismic performance of a simple two-story reinforced concrete building, located in a moderate seismic region, has been evaluated using the conventional seismic analysis methods, namely linear static analysis, response spectrum analysis, and pushover analysis, and the results of these methods are compared to those of the resilient analysis. Results highlight that resilient analysis was the most effective method for producing a more ductile and functional structure from a socio-economic perspective, in comparison to the standard seismic analysis methods.
Keywords: conventional analysis methods, functionality, resilient analysis, seismic performance
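One common way to turn a functionality curve into a resilience index is to take the normalised area under the functionality Q(t) over the control period: full functionality throughout gives an index of 1, and a deep or slow recovery lowers it. The trapezoidal sketch below follows that general definition; the curve values are invented for illustration and are not the paper's results.

```python
# Simplified sketch of a resilience index: normalised area under the
# functionality curve Q(t) over the control period. Values invented.
def resilience_index(times, functionality):
    """Trapezoidal area under Q(t), normalised by the period length."""
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        area += 0.5 * (functionality[i] + functionality[i - 1]) * dt
    return area / (times[-1] - times[0])

# Functionality (0..1) over weeks: full, sudden drop at the quake,
# then gradual recovery back to full service.
t = [0, 1, 1.01, 4, 8, 12]
q = [1.0, 1.0, 0.4, 0.6, 0.9, 1.0]
ri = resilience_index(t, q)
```

A structure that loses less functionality, or recovers faster, keeps more area under Q(t) and therefore scores a higher index, matching the abstract's "the higher this index is, the better the performance".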
Procedia PDF Downloads 118
28656 Developing a Spatial Transport Model to Determine Optimal Routes When Delivering Unprocessed Milk
Authors: Sunday Nanosi Ndovi, Patrick Albert Chikumba
Abstract:
In Malawi, smallholder dairy farmers transport unprocessed milk to sell at Milk Bulking Groups (MBGs). MBGs store and chill the milk while awaiting collection by processors. The farmers deliver milk by various modes of transportation, such as foot, bicycle, and motorcycle. As a perishable food, milk requires timely transportation to avoid deterioration. In some instances, farmers bypass the nearest MBG for facilities located further away. Untimely delivery worsens quality and results in rejection at the MBG, and these rejections lead to revenue losses for dairy farmers. Therefore, the objective of this study was to optimize routes when transporting milk by selecting the shortest route, using time as the cost attribute in Geographic Information Systems (GIS). A spatially organized transport system impedes milk deterioration while promoting profitability for dairy farmers. A transportation system was modeled using the Route Analysis and Closest Facility network extensions, with the final output finding the quickest routes and identifying the milk facilities nearest to incidents. Face-to-face interviews targeted leaders from all 48 MBGs in the study area and 50 farmers from Namahoya MBG. During field interviews, coordinates were captured in order to create maps, which subsequently supported the selection of optimal routes based on the least travel times. The questionnaire targeted 200 respondents, of whom 182 were available. Findings showed that of the 50 sampled farmers who supplied milk to Namahoya, only 8% were nearest to that facility, while 92% were closest to 9 different MBGs. Delivering milk to the nearest MBGs would minimize travel time and distance by 14.67 hours and 73.37 km, respectively.
Keywords: closest facility, milk, route analysis, spatial transport
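The "quickest route with time as the cost attribute" computation that GIS Route Analysis performs is, at its core, a shortest-path search over a road graph weighted by travel time. A minimal sketch using Dijkstra's algorithm is shown below; the toy network, node names, and travel times are hypothetical, not the Malawian road data.

```python
# Sketch of a quickest-route query over a travel-time-weighted road
# graph (Dijkstra's algorithm). Network and minutes are invented.
import heapq

def quickest_route(graph, origin, dest):
    """Return (total_minutes, path) using travel time as the cost."""
    pq = [(0.0, origin, [origin])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dest:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, minutes in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(pq, (cost + minutes, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical farm-to-MBG road network; edge weights in minutes.
roads = {
    "farm":     {"junction": 15, "village": 25},
    "junction": {"mbg_a": 20, "mbg_b": 35},
    "village":  {"mbg_b": 10},
}
```

A closest-facility query is the same search run against each candidate MBG, keeping the one with the lowest total time, which is how the 8%/92% nearest-facility split in the findings would be computed.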
Procedia PDF Downloads 59
28655 Data and Biological Sharing Platforms in Community Health Programs: Partnership with Rural Clinical School, University of New South Wales and Public Health Foundation of India
Authors: Vivian Isaac, A. T. Joteeshwaran, Craig McLachlan
Abstract:
The University of New South Wales (UNSW) Rural Clinical School has a strategic collaborative focus on chronic disease and public health. Our objectives are to understand rural environmental and biological interactions in vulnerable community populations. The UNSW Rural Clinical School translational model is a spoke-and-hub network that connects rural data and biological specimens with city-based collaborative public health research networks. Similar spoke-and-hub models are prevalent across research centers in India. An Australia-India Council grant was awarded so we could establish sustainable public health and community research collaborations. As part of the collaborative network, we are developing strategies around data and biological sharing platforms between the Indian Institute of Public Health, Public Health Foundation of India (PHFI), Hyderabad, and the Rural Clinical School UNSW. The key objective is to understand how research collaborations are conducted in India and how data can be shared and tracked with external collaborators such as ourselves. A framework to improve data sharing for research collaborations, including DNA, was proposed as a project outcome. The complexities of sharing biological data have been investigated via a visit to India. A flagship sustainable project between the Rural Clinical School UNSW and PHFI would illustrate a model of data sharing platforms.
Keywords: data sharing, collaboration, public health research, chronic disease
Procedia PDF Downloads 451
28654 Crack Width Analysis of Reinforced Concrete Members under Shrinkage Effect by Pseudo-Discrete Crack Model
Authors: F. J. Ma, A. K. H. Kwan
Abstract:
Cracking caused by the shrinkage movement of concrete is a serious problem, especially when restraint is provided, and it may cause severe serviceability and durability problems. The existing methods for predicting the crack width of concrete due to shrinkage movement are mainly numerical methods developed under simplified circumstances, and they do not agree with each other. To obtain a more unified prediction method applicable to more sophisticated circumstances, finite element crack width analysis for the shrinkage effect should be developed. However, no existing finite element analysis can be carried out to predict the crack width of concrete due to shrinkage movement because of unresolved limitations of conventional finite element analysis. In this paper, a crack width analysis implemented by finite element analysis is presented with a pseudo-discrete crack model, which combines the traditional smeared crack model with a newly proposed crack queuing algorithm. The proposed pseudo-discrete crack model is capable of simulating separate, individual cracks without adopting discrete crack elements. The improved finite element analysis can successfully simulate the stress redistribution when concrete is cracked, which is crucial for predicting crack width, crack spacing, and crack number.
Keywords: crack queuing algorithm, crack width analysis, finite element analysis, shrinkage effect
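The "crack queuing" idea can be illustrated with a toy parallel-element model: instead of letting every over-stressed element crack at once (as a plain smeared model would), only the single most over-stressed element cracks per iteration, and stresses are redistributed before the next check. This is our own drastically simplified reading of the algorithm, not the paper's finite element implementation; the stiffnesses, load, and strengths are invented.

```python
# Toy sketch of the crack queuing idea: crack one element at a time,
# redistribute load, then re-check. Numbers are illustrative only.
def crack_queuing(stiffness, load, strength, softening=0.05):
    """Return (cracked element indices, final stress distribution)."""
    stiffness = list(stiffness)   # do not mutate the caller's list
    cracked = set()
    while True:
        total_k = sum(stiffness)
        # load shared in proportion to remaining stiffness (parallel model)
        stresses = [load * k / total_k for k in stiffness]
        # the "queue": only the single most over-stressed element cracks
        worst = max(range(len(stresses)),
                    key=lambda i: stresses[i] - strength[i])
        if stresses[worst] <= strength[worst] or worst in cracked:
            return cracked, stresses
        cracked.add(worst)
        stiffness[worst] *= softening  # softened, not removed entirely
```

Redistributing after each single crack is what lets the model decide whether neighbouring elements actually need to crack, producing separate individual cracks rather than a smeared band.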
Procedia PDF Downloads 419
28653 Approximation of a Wanted Flow via Topological Sensitivity Analysis
Authors: Mohamed Abdelwahed
Abstract:
We propose an optimization algorithm for the geometric control of fluid flow. The approach is based on the topological sensitivity analysis method. It consists of studying the variation of a cost function with respect to the insertion of a small obstacle in the domain. Some theoretical and numerical results are presented in 2D and 3D.
Keywords: sensitivity analysis, topological gradient, shape optimization, Stokes equations
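The variation studied by this method is conventionally written as a topological asymptotic expansion; the generic form below is the standard statement of the technique, with the specific scale function and gradient expression depending on the operator (here the Stokes equations) and the boundary condition on the obstacle.

```latex
% Generic topological asymptotic expansion: inserting a small obstacle
% B(x_0, \varepsilon) perturbs the cost function j at first order
% through the topological gradient g.
j\bigl(\Omega \setminus \overline{B(x_0,\varepsilon)}\bigr)
  = j(\Omega) + \rho(\varepsilon)\, g(x_0) + o\bigl(\rho(\varepsilon)\bigr),
\qquad \rho(\varepsilon) > 0,\ \lim_{\varepsilon \to 0} \rho(\varepsilon) = 0 .
```

The optimization algorithm then inserts obstacles where the topological gradient $g$ is most negative, since those locations give the largest first-order decrease of the cost function.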
Procedia PDF Downloads 540
28652 Supporting 'Vulnerable' Students to Complete Their Studies During the Economic Crisis in Greece: The Umbrella Program of International Hellenic University
Authors: Rigas Kotsakis, Nikolaos Tsigilis, Vasilis Grammatikopoulos, Evridiki Zachopoulou
Abstract:
During the last decade, Greece has faced an unprecedented financial crisis affecting various aspects and functions of Higher Education. Besides the restricted funding of academic institutions, students and their families have encountered economic difficulties that undoubtedly influence the effective completion of studies. In this context, a fairly large number of students at the Alexander campus of the International Hellenic University (IHU) delay, interrupt, or even abandon their studies, especially when they come from low-income families, belong to sensitive social or special-needs groups, have different cultural origins, etc. For this reason, a European project named "Umbrella" was initiated, aiming to provide the necessary psychological support and counseling, especially to disadvantaged students, towards the completion of their studies. To this end, a network of academic members (academic staff and students) from IHU, named iMentor, was engaged in different roles. Specifically, experienced academic staff trained students to serve as intermediate links for the integration and educational support of students who fall into the aforementioned sensitive social groups and face problems in completing their studies. The main idea of the project rests upon its person-centered character, which facilitates direct student-to-student communication without the intervention of teaching staff. The backbone of the iMentor network is senior students who face no problems in their academic life and volunteered for this project; there is a provision in the Umbrella structure for substantial and ethical rewards for their engagement. In this context, a well-defined, stringent methodology was implemented to evaluate the extent of the problem at IHU and detect the profile of "candidate" disadvantaged students.
The first phase included two steps: (a) data collection and (b) data cleansing/preprocessing. The first step involved collecting data from the Secretary Services of all Schools in IHU, from 1980 to 2019, which resulted in 96,418 records. The data set included the school name, the semester of studies, the student enrollment criterion, the nationality, and the graduation year or the current, up-to-date academic state (still studying, delayed, dropped out, etc.). The second step of the methodology involved data cleansing/preprocessing because of the existence of "noisy" data, missing and erroneous values, etc. Furthermore, several assumptions and grouping actions were imposed to achieve data homogeneity and an easy-to-interpret subsequent statistical analysis. Specifically, the 40 years of records were limited to the last 15 years (2004-2019), since in 2004 the Greek Technological Institutions were evolved into Higher Education Universities, leading to a stable and unified frame of graduate studies. In addition, the data concerning active students were excluded from the analysis, since the initial processing effort focused on detecting factors/variables that differentiated graduated students from those who abandoned their studies. The final working dataset included 21,432 records with only two categories of students: those who hold a degree and those who abandoned their studies. Findings of the first phase are presented across faculties and further discussed.
Keywords: higher education, students support, economic crisis, mentoring
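The cleansing and grouping steps described above reduce to a simple filter: restrict the raw records to 2004-2019, drop still-active students, and keep only the two outcome groups (graduated vs abandoned). The sketch below assumes a record layout reconstructed from the abstract's description, not the project's real schema.

```python
# Sketch of the cleansing/grouping step: limit years, drop active
# students, keep two outcome categories. Field names are assumed.
RAW = [
    {"school": "Health",      "year": 1998, "state": "graduated"},
    {"school": "Health",      "year": 2010, "state": "graduated"},
    {"school": "Engineering", "year": 2015, "state": "studying"},
    {"school": "Engineering", "year": 2017, "state": "abandoned"},
]

def clean(records, start=2004, end=2019):
    """Keep 2004-2019 records whose outcome is already decided."""
    keep_states = {"graduated", "abandoned"}
    return [r for r in records
            if start <= r["year"] <= end and r["state"] in keep_states]
```

Applied to the real 96,418-record extract, this kind of filter is what yields the 21,432-record two-category working dataset the abstract reports.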
Procedia PDF Downloads 116
28651 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform
Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry
Abstract:
The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed, multi-owner resources to solve large-scale problems in science, engineering, and commerce, and recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in these new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing, based on an R-tree and the domain-range entropy, is proposed to improve the fault tolerance and load balancing algorithm by improving connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing the mean job response time. Experimental results show that the proposed method yields superior performance over other methods.
Keywords: grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems
Procedia PDF Downloads 492
28650 Integrating a Security Operations Centre with an Organization's Existing Procedures, Policies and Information Technology Systems
Authors: M. Mutemwa
Abstract:
A Security Operations Centre (SOC) is a centralized hub for network event monitoring and incident response. SOCs are critical when determining an organization's cybersecurity posture because they can be used to detect, analyze, and report on various malicious activities. For most organizations, a SOC is not part of the initial design and implementation of the Information Technology (IT) environment but rather an afterthought. As a result, it is not natively a plug-and-play component; there are integration challenges when a SOC is introduced into an organization. A SOC is an independent hub that needs to be integrated with the existing procedures, policies, and IT systems of an organization, such as the service desk, ticket logging system, reporting, etc. This paper discusses the challenges of integrating a newly developed SOC into an organization's existing IT environment. Firstly, the paper looks at which data sources should be incorporated into the Security Information and Event Management (SIEM) system, i.e., which host machines, servers, network endpoints, software, applications, web servers, etc. should be monitored for security posture, which systems need to be monitored first, and the order in which the rest of the systems follow. Secondly, the paper describes how to integrate the organization's ticket logging system with the SOC SIEM: how cybersecurity-related incidents should be logged by both analysts and non-technical employees of the organization, together with the priority matrix for incident types and incident notifications. Thirdly, the paper looks at how to communicate awareness campaigns from the SOC and how to report on incidents that are found inside the SOC. Lastly, the paper looks at how to show value for the large investments that are poured into designing, building, and running a SOC.
Keywords: cybersecurity operation centre, incident response, priority matrix, procedures and policies
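The priority matrix mentioned above is typically a lookup from impact and urgency to a priority class, in the style of ITIL-derived schemes. The levels and mapping below are illustrative assumptions; the actual matrix would be agreed with the organization during SOC integration.

```python
# Hedged sketch of an incident priority matrix: priority is looked up
# from (impact, urgency). Levels and mapping are illustrative only.
PRIORITY = {
    ("high",   "high"):   "P1",  # e.g. ransomware on a core server
    ("high",   "medium"): "P2",
    ("high",   "low"):    "P3",
    ("medium", "high"):   "P2",
    ("medium", "medium"): "P3",
    ("medium", "low"):    "P4",
    ("low",    "high"):   "P3",
    ("low",    "medium"): "P4",
    ("low",    "low"):    "P4",
}

def ticket_priority(impact, urgency):
    """Map an incident's impact and urgency to a priority class."""
    return PRIORITY[(impact.lower(), urgency.lower())]
```

Encoding the matrix this way lets both analysts and the ticket logging system assign the same priority to the same incident type, which is the consistency the integration aims for.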
Procedia PDF Downloads 155
28649 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia
Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman
Abstract:
Through progress in pavement design developments, a pavement design method titled the Mechanistic-Empirical Pavement Design Guide (MEPDG) was developed. Nowadays, evolution of the road and highway network is observed in Saudi Arabia as a result of increasing traffic volume, and the MEPDG is currently implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementation of the MEPDG for local pavement design requires the calibration of distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for calibration of the MEPDG in Central Saudi Arabia. Thus, the first goal is data collection for the design of flexible pavement under the local conditions of the Riyadh region. Since the collected data must be modified into input data, the main goal of this paper is the analysis of the collected data. The data analysis in this paper includes processing each of the following: truck classification, traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axle types (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for the local calibration. Detailed descriptions of the input parameters are explained in this paper, which leads to an approach for successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.
Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh
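Two of the traffic inputs listed above have simple, standard arithmetic behind them: AADTT is projected over the design life with a compound growth factor, and monthly adjustment factors are normalised so they average to 1.0 across the year. The sketch below follows those standard definitions; the numbers are illustrative, not Riyadh measurements.

```python
# Small sketch of two MEPDG traffic inputs: compound-growth AADTT
# projection and monthly adjustment factor (MAF) normalisation.
def project_aadtt(base_aadtt, growth_rate, years):
    """Compound growth: AADTT_n = AADTT_0 * (1 + g)^n."""
    return base_aadtt * (1.0 + growth_rate) ** years

def normalise_maf(monthly_truck_counts):
    """MAF_i = 12 * month_i / annual total, so mean(MAF) == 1.0."""
    total = sum(monthly_truck_counts)
    return [12.0 * m / total for m in monthly_truck_counts]

# Hypothetical inputs: 1,000 trucks/day base AADTT, 4% annual growth.
aadtt_year_10 = project_aadtt(1000, 0.04, 10)
maf = normalise_maf([80, 90, 100, 110, 120, 110,
                     100, 90, 80, 90, 100, 110])
```

The same normalisation pattern (shares that sum to a fixed total) also underlies the vehicle class distribution and the hourly distribution factors in the input list.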
Procedia PDF Downloads 226
28648 A Case Study on Re-Assessment Study of an Earthfill Dam at Latamber, Pakistan
Authors: Afnan Ahmad, Shahid Ali, Mujahid Khan
Abstract:
This research presents a parametric study of an existing earthfill dam located at Latamber, Karak city, Pakistan. The study consists of carrying out seepage analysis, slope stability analysis, and earthquake analysis of the dam for the existing dam geometry, and repeating the same analyses for a modified geometry. Dams are massive as well as expensive hydraulic structures and therefore need proper attention. Additionally, this dam falls in the zone 2B region of Pakistan, an earthquake-prone area where peak ground accelerations range from 0.16g to 0.24g, so it should be dealt with great care, as the failure of any dam can cause irreparable losses. Similarly, seepage as well as slope failure can cause damage that can lead to failure of the dam. Therefore, keeping in view the importance of dam construction and the associated costs, our main focus is to carry out a parametric study of the newly constructed dam. GeoStudio software is used for the analysis in this study, in which Seep/W is used for seepage analysis, Slope/W for slope stability analysis, and Quake/W for earthquake analysis. Based on the geometrical, hydrological, and geotechnical data, seepage and slope stability analyses of different proposed geometries of the dam are carried out along with the seismic analysis. A rigorous analysis was carried out in 2D limit equilibrium using finite element analysis. The seismic study began with a static analysis, continuing with a dynamic response analysis. The seismic analyses permitted evaluation of the overall patterns of the Latamber dam behavior in terms of displacement, stress, strain, and acceleration fields. Similarly, the seepage analysis evaluates seepage through the foundation and embankment of the dam, while the slope stability analysis estimates the factor of safety of the upstream and downstream slopes of the dam.
The results of the analysis demonstrate that, among multiple geometries, the Latamber dam is secure against seepage piping failure and slope stability (upstream and downstream) failure. Moreover, the dam is safe against dynamic loading, and no liquefaction has been observed while changing its geometry within permissible limits.
Keywords: earth-fill dam, finite element, liquefaction, seepage analysis
Procedia PDF Downloads 164
28647 A Case Study Approach on Co-Constructing the Idea of 'Safety' with Children
Authors: Beng Zhen Yeow
Abstract:
In most work that involves children, the voice of the children is often not heard. This is ironic, since many discussions concern their welfare and safety. It might seem natural for professionals to hear from children about what they wish for, instead of deciding what is best for them. However, this is unfortunately more the exception than the norm, and in many instances children are merely 'subjects' in conversations about safety instead of active participants in the construction or creation of safety in the family. There might be many reasons why this does not happen in our work. Firstly, professionals have learnt to 'socialise' into their professional roles and hence, in the process, become 'un-childlike'. Secondly, there is a lack of professional training in how to talk with children. Finally, there might also be a lack of concrete tools and techniques developed to facilitate the process. In this paper, the case study method is used to show how the idea of safety can be concretised and discussed with children and their family members, making them active participants and co-creators of their own safety. Specific skills and techniques are highlighted through the case study. In this case, there was improvement in outcomes, such as no repeated offence or abuse. In addition, the children were able to advocate for their own safety after six months of intervention, and the family members were able to say explicitly what they could do to improve safety. The professionals in the safety network reported significant improvements. On top of that, the abused child, who had been removed due to child protection concerns, verbalized observations of change in her mother's parenting abilities and requested for home leave to begin, owing to her ownership of safety planning and her confidence to co-create safety for her siblings and herself together with the professionals in the safety network.
Children becoming active participants in the co-creation of safety not only allows them to own a 'voice' but also gives them greater confidence to protect themselves at home and in other contexts outside of the home.
Keywords: partnering for safety, collaborative social work, family and systemic psychotherapy, child protection
Procedia PDF Downloads 122
28646 Lean Comic GAN (LC-GAN): A Light-Weight GAN Architecture Leveraging Factorized Convolution and Teacher Forcing Distillation Style Loss Aimed to Capture Two Dimensional Animated Filtered Still Shots Using Mobile Phone Camera and Edge Devices
Authors: Kaustav Mukherjee
Abstract:
In this paper, we propose a neural style transfer solution: a lightweight separable convolution kernel based GAN architecture (LC-GAN) that is very useful for designing filters for mobile phone cameras and edge devices, converting any image to the 2D animated comic style of movies like HEMAN, SUPERMAN, and JUNGLE-BOOK. This will help 2D animation artists by relieving them of creating new characters from images of real-life people, without endless hours of manually drawing each and every pose of a cartoon. It can even be used to create scenes from real-life images. This will greatly reduce the turnaround time to make 2D animated movies and decrease costs in terms of manpower and time. In addition, being extremely lightweight, it can serve as a camera filter capable of taking comic-style shots using mobile phone cameras or edge-device cameras such as the Raspberry Pi 4, NVIDIA Jetson Nano, etc. Existing methods like CartoonGAN, with a model size close to 170 MB, are too heavyweight for mobile phones and edge devices, given their scarce resources. Compared to the current state of the art, our proposed method has a total model size of 31 MB, which makes it ideal and ultra-efficient for designing camera filters on low-resource devices like mobile phones, tablets, and edge devices running an OS or RTOS. Owing to the use of high-resolution input and a bigger convolution kernel size, it produces richer-resolution comic-style pictures with six times fewer parameters, trained for just 25 extra epochs on a dataset of fewer than 1000 images, which breaks the myth that all GANs need mammoth amounts of data.
Our network reduces the density of the GAN architecture by using depthwise separable convolution, which performs the convolution operation on each of the RGB channels separately; we then use a pointwise convolution with a 1x1 kernel to bring the network back to the required channel number. This reduces the number of parameters substantially and makes the network extremely lightweight and suitable for mobile phones and edge devices. The architecture in the present paper makes use of parameterised batch normalization (Goodfellow et al., Deep Learning, 'Optimization for Training Deep Models', p. 320), which lets the network exploit the advantages of batch norm for easier training while maintaining non-linear feature capture through the learnable parameters.
Keywords: comic stylisation from camera image using GAN, creating 2D animated movie style custom stickers from images, depth-wise separable convolutional neural network for light-weight GAN architecture for EDGE devices, GAN architecture for 2D animated cartoonizing neural style, neural style transfer for edge, model distillation, perceptual loss
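As a rough illustration of why the depthwise separable factorization described above shrinks a model, the parameter counts of the two convolution styles can be compared directly. The channel counts below are assumptions for illustration, not the paper's actual layer sizes:

```python
def standard_conv_params(c_in, c_out, k):
    # A standard k x k convolution mixes all input channels at once.
    return k * k * c_in * c_out

def separable_conv_params(c_in, c_out, k):
    # Depthwise step: one k x k filter per input channel (groups = c_in),
    # then a pointwise 1 x 1 convolution to reach c_out channels.
    depthwise = k * k * c_in
    pointwise = c_in * c_out
    return depthwise + pointwise

std = standard_conv_params(256, 256, 3)   # 589824 parameters
sep = separable_conv_params(256, 256, 3)  # 67840 parameters
print(std, sep, round(std / sep, 1))      # reduction factor of roughly 8.7x
```

For a 3x3 layer with 256 channels in and out, the factorized form needs under an eighth of the parameters, which is the mechanism behind the model-size reductions claimed above.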
Procedia PDF Downloads 134
28645 A Study on Analysis of Magnetic Field in Induction Generator for Small Francis Turbine Generator
Authors: Young-Kwan Choi, Han-Sang Jeong, Yeon-Ho Ok, Jae-Ho Choi
Abstract:
The purpose of this study is to verify the validity of the design by testing the output of the designed induction generator through finite element analysis before manufacture. The characteristics of the induction generator in its operating domain can be understood through analysis of the magnetic field according to the load (rotational speed) of the generator. Characteristics such as induced voltage, current, torque, magnetic flux density (magnetic flux saturation), and loss can be predicted by magnetic field analysis.
Keywords: electromagnetic analysis, induction generator, small hydro power generator, small Francis turbine generator
Procedia PDF Downloads 1476
28644 Human Factors Considerations in New Generation Fighter Planes to Enhance Combat Effectiveness
Authors: Chitra Rajagopal, Indra Deo Kumar, Ruchi Joshi, Binoy Bhargavan
Abstract:
The role of fighter planes in modern network-centric military warfare scenarios has changed significantly in the recent past. New-generation fighter planes have the multirole capability of engaging both air and ground targets with high precision. A multirole aircraft undertakes missions such as air-to-air combat, air defense, air-to-surface roles (including air interdiction, close air support, maritime attack, and suppression and destruction of enemy air defense), reconnaissance, electronic warfare missions, etc. Designers have primarily focused on the development of technologies to enhance the combat performance of fighter planes, and very little attention has been given to the human-factor aspects of these technologies. Unique physical and psychological challenges are imposed on pilots to meet operational requirements during these missions. Newly evolved technologies have enhanced aircraft performance in terms of speed, firepower, stealth, electronic warfare, situational awareness, and vulnerability-reduction capabilities. This paper highlights the impact of emerging technologies on human factors for various military operations and missions. Technologies such as cooperative knowledge-based systems to aid the pilot's decision making in military conflict scenarios, as well as simulation technologies to enhance human performance, are also studied as part of the research work. Current and emerging pilot protection technologies and systems, which form part of the integrated life support systems in new-generation fighter planes, are discussed. The application of system safety analysis to quantify human reliability in military operations is also studied.
Keywords: combat effectiveness, emerging technologies, human factors, systems safety analysis
Procedia PDF Downloads 143
28643 Study of Pottery and Glazed Canopic Vessels
Authors: Abdelrahman Mohamed
Abstract:
The ancient Egyptians used canopic vessels in embalming in order to preserve the guts of the mummified corpse. Canopic vessels were made of many materials, including pottery and glazed pottery. In this research, we study one pottery canopic vessel and one glazed pottery vessel. Analyses were carried out to identify the compounds and elements of the materials from which each vessel and its colors were made, as well as the organic materials present inside, using Fourier transform infrared spectroscopy (FTIR) and gas chromatography-mass spectrometry (GC-MS) of the organic residue. The study and analysis proved that the materials present in one pot included coniferous oil and animal fats. In the other pot, the analysis showed the presence of some plant resins (mastic) inside rolls of linen. Restoration operations were then carried out, such as mechanical cleaning, strengthening, and completing the reinforcement of the pots.
Keywords: canopic jar, embalming, FTIR, GCMS, linen
Procedia PDF Downloads 85
28642 Optimization Modeling of the Hybrid Antenna Array for the DoA Estimation
Authors: Somayeh Komeylian
Abstract:
Direction of arrival (DoA) estimation is a crucial aspect of radar technologies for detecting and resolving several signal sources. In this scenario, modeling the antenna array output involves numerous parameters, including noise samples, signal waveform, signal directions, signal number, and signal-to-noise ratio (SNR), and thereby DoA estimation methods rely heavily on generalization over a large number of training data sets. Hence, we have comparatively implemented two different optimization models for DoA estimation: (1) a decision directed acyclic graph (DDAG) for the multiclass least-squares support vector machine (LS-SVM), and (2) an optimized deep neural network (DNN) with radial basis functions (RBF). We have rigorously verified that the LS-SVM DDAG algorithm is capable of accurately classifying DoAs for the three classes. However, the accuracy and robustness of the DoA estimation remain highly sensitive to technological imperfections of the antenna arrays, such as non-ideal array design and manufacture, array implementation, mutual coupling effects, and background radiation, and thereby the method may fail to achieve high precision for the DoA estimation. Therefore, this work makes a further contribution by developing the DNN-RBF model for DoA estimation, to overcome the limitations of non-parametric and data-driven methods in terms of array imperfection and generalization. The numerical results of implementing the DNN-RBF model have confirmed better DoA estimation performance compared with the LS-SVM algorithm. Finally, we have comparatively evaluated the performance of the two aforementioned optimization methods for DoA estimation using the mean squared error (MSE).
Keywords: DoA estimation, adaptive antenna array, deep neural network, LS-SVM optimization model, radial basis function, MSE
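The MSE criterion used to compare the two estimators can be sketched as follows. The angles below are hypothetical inputs for illustration, not results from the paper:

```python
def mse_degrees(estimated, true):
    # Mean squared error between estimated and ground-truth DoAs, in degrees.
    assert len(estimated) == len(true)
    return sum((e - t) ** 2 for e, t in zip(estimated, true)) / len(true)

true_doas = [-30.0, 0.0, 45.0]   # three hypothetical source directions
estimated = [-29.2, 1.1, 44.5]   # a hypothetical estimator's output
print(mse_degrees(estimated, true_doas))  # approximately 0.7
```

A lower MSE over many trials is what the abstract's claim of "better performance" for the DNN-RBF model amounts to.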
Procedia PDF Downloads 101
28641 A Low-Cost Air Quality Monitoring Internet of Things Platform
Authors: Christos Spandonidis, Stefanos Tsantilas, Elias Sedikos, Nektarios Galiatsatos, Fotios Giannopoulos, Panagiotis Papadopoulos, Nikolaos Demagos, Dimitrios Reppas, Christos Giordamlis
Abstract:
In the present paper, a low-cost, compact, and modular Internet of Things (IoT) platform for air quality monitoring in urban areas is presented. This platform comprises dedicated low-cost, low-power hardware and the associated embedded software that enable measurement of particle (PM2.5 and PM10), NO, CO, CO2, and O3 concentrations in the air, along with relative temperature and humidity. The integrated platform acts as part of a greater air-pollution data collecting wireless network that is able to monitor the air quality in various regions and neighborhoods of an urban area by providing sensor measurements at a high rate of up to one sample per second. It is therefore suitable for big-data analysis applications such as air quality forecasts, weather forecasts, and traffic prediction. The first real-world test of the developed platform took place in Thessaloniki, Greece, where 16 devices were installed in various buildings in the city. In the near future, many more of these devices will be installed in the greater Thessaloniki area, giving a detailed air quality map of the city.
Keywords: distributed sensor system, environmental monitoring, Internet of Things, smart cities
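A node in such a network might emit one measurement frame per second. The sketch below shows what a frame covering the quantities listed above could look like; the field names, units, and values are illustrative assumptions, not the platform's actual protocol:

```python
import json
import time

def make_sample(node_id):
    # One hypothetical measurement frame, sent at up to 1 Hz per node.
    return {
        "node": node_id,
        "ts": int(time.time()),  # Unix timestamp of the sample
        "pm2_5": 12.4,   # ug/m3
        "pm10": 21.7,    # ug/m3
        "no": 0.031,     # ppm
        "co": 0.6,       # ppm
        "co2": 415.0,    # ppm
        "o3": 0.024,     # ppm
        "temp_c": 23.8,  # relative temperature
        "rh_pct": 41.0,  # relative humidity
    }

payload = json.dumps(make_sample("thessaloniki-07"))
print(payload)
```

At one sample per second per device, 16 devices produce over a million frames a day, which is the scale that makes the big-data applications mentioned above feasible.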
Procedia PDF Downloads 147
28640 Devulcanization of Waste Rubber Tyre Utilizing Deep Eutectic Solvents and Ultrasonic Energy
Authors: Ricky Saputra, Rashmi Walvekar, Mohammad Khalid, Kaveh Shahbaz, Suganti Ramarad
Abstract:
This study examines the effect of coupling ultrasonic treatment with eutectic solvents in the devulcanization of waste rubber tyre. Specifically, three different deep eutectic solvents (DESs) were utilized, namely ChCl:urea (1:2), ChCl:ZnCl₂ (1:2), and ZnCl₂:urea (2:7); their physicochemical properties were analysed and shown to meet the requirements of a water content below 3.0 wt%, a degradation temperature below 200°C, and a freezing point below 60°C. The mass ratio of rubber to DES was varied from 1:20 to 1:40, sonicated for 1 hour at 37 kHz, and heated for 5-30 min at 180°C. Energy-dispersive X-ray (EDX) results revealed that the first two DESs give the highest degrees of sulphur removal, at 74.44% and 76.69% respectively, with an optimum heating time of 15 minutes, beyond which reformation of the crosslink network occurs. This is supported by both FTIR and FESEM results: the di-sulfide peak reappears at 30 minutes, and the morphological structure changes from smooth with high voidage at 15 minutes to rigid with low voidage at 30 minutes. Furthermore, the TGA curve reveals a similar phenomenon, whereby at 15 minutes the thermal decomposition temperature is at its lowest, due to the decrease in molecular weight resulting from sulphur removal, but increases again at 30 minutes. The type of bond change was also analysed, and it was found that only the di-sulphide bond was cleaved, indicating partial devulcanization. Overall, the results show that DESs have great potential as devulcanizing solvents.
Keywords: crosslink network, devulcanization, eutectic solvents, reformation, ultrasonic
Procedia PDF Downloads 173
28639 An Analysis of Twitter Use of Slow Food Movement in the Context of Online Activism
Authors: Kubra Sultan Yuzuncuyil, Aytekin İsman, Berkay Bulus
Abstract:
With the development of information and communication technologies, the forms of molding public opinion have changed. In the presence of the Internet, the notion of activism has been endowed with digital codes. Activists have engaged the Internet in their campaigns and in the process of creating collective identity, and activist movements have been incorporating new communication technologies in pursuit of their goals and opposition. Creating and managing activism through the Internet is called online activism. In this vein, the Slow Food Movement, which emerged around a philosophy of defending regional, fair, and sustainable food, has engaged the Internet in its activist campaign. This movement supports the idea that a new food system, one that allows strong connections between plate and planet, is possible. In order to make its voice heard, it has utilized social networks and developed particular skills in the framework of online activism. This study analyzes the online activist skills the Slow Food Movement (SFM) has developed and attempts to measure their effectiveness. To achieve this aim, it adopts the model proposed by Sivitandies and Shah and conducts both qualitative and quantitative content analysis of the movement's social network use. The sample is the movement's official profile, analyzed over the three-month period March-May 2017. It was found that SFM has developed particular techniques that correspond to the model of Sivitandies and Shah. The most prominent skills in this regard were hyperlink abbreviation and the use of multimedia elements. On the other hand, there are inadequacies in hashtag and interactivity use. The importance of this study is that it highlights and discusses how online activism can be engaged in a social movement. It also reveals the current online activism skills of the SFM and their effectiveness. Furthermore, it makes suggestions to enhance the related abilities and strengthen the movement's voice on social networks.
Keywords: slow food movement, Twitter, internet, online activism
Procedia PDF Downloads 282
28638 Assessment and Analysis of Literary Criticism and Consumer Research
Authors: Mohammad Mirzaei
Abstract:
This article proposes literary criticism as a source of insight into consumer behavior, provides an extensive overview of literary criticism, offers a concrete illustrative analysis, and makes suggestions for further research. To do so, a literary analysis of advertising copy identifies elements that provide additional information to consumer researchers, and the contribution of literary criticism to consumer research is discussed. Important post-war critical schools of thought are reviewed, and relevant theoretical concepts are summarized. Ivory Flakes advertisements are analyzed using a variety of concepts drawn from literary schools, primarily sociocultural and reader-response criticism. Suggestions for further research on content analysis, image analysis, and consumption history are presented.
Keywords: consumer behaviour, consumer research, consumption history, criticism
Procedia PDF Downloads 103
28637 Building Energy Modeling for Networks of Data Centers
Authors: Eric Kumar, Erica Cochran, Zhiang Zhang, Wei Liang, Ronak Mody
Abstract:
The objective of this article was to create a modelling framework that exposes the marginal costs of shifting workloads across geographically distributed data centers. The geographical distribution of internet services helps to optimize their performance for localized end users, with lowered communication times and increased availability. However, due to geographical and temporal effects, the physical embodiments of a service's data center infrastructure can vary greatly. In this work, we first identify that the variances in physical infrastructure stem primarily from local weather conditions, specific user traffic profiles, energy sources, and the types of IT hardware available at the time of deployment. Second, we create a traffic simulator that indicates the IT load at each data center in the set as an approximation of user traffic profiles. Third, we implement a framework that quantifies global energy demands using building energy models and the traffic profiles. The results of the model provide a time series of energy demands that can be used for further life cycle analysis of internet services.
Keywords: data-centers, energy, life cycle, network simulation
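The shape of the computation can be sketched with a crude PUE-style scaling of an IT-load time series. This is a simplification of the building energy models the authors describe, with assumed load profiles and per-site overhead factors, but it shows how a marginal cost of shifting work between sites falls out:

```python
def energy_series(it_load_kw, overhead_factor):
    # Facility-level energy per interval: IT load scaled by a site-specific
    # overhead factor (a PUE-like ratio). The factor varies with local
    # weather and cooling design; the values used below are assumptions.
    return [load * overhead_factor for load in it_load_kw]

# The same hourly IT load profile (kW) served from two hypothetical sites.
profile = [100, 140, 180, 120]
site_a = energy_series(profile, overhead_factor=1.2)  # cool climate
site_b = energy_series(profile, overhead_factor=1.6)  # warm climate

# The difference in total energy is the marginal cost of serving the
# workload from site B instead of site A over this window.
print(sum(site_b) - sum(site_a))
```

In the paper's framework, the per-interval factors would come from full building energy models driven by local weather data rather than a single constant per site.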
Procedia PDF Downloads 148
28636 Understanding the Reasons for Flooding in Chennai and Strategies for Making It Flood Resilient
Authors: Nivedhitha Venkatakrishnan
Abstract:
Flooding in urban areas in India has become a recurring phenomenon and a nightmare for most cities, a consequence of man-made disruption resulting in disaster. City planning in India falls short of withstanding hydrologically generated disasters. This has become a barrier to and a challenge for the development brought by urbanization: high population density, expanding informal settlements, and environmental degradation from uncollected and untreated waste that flows into natural drains and water bodies have disrupted natural hazard-protection mechanisms such as drainage channels, wetlands, and floodplains. The magnitude and impact of the mishap were high because of the failure of the development policies, strategies, and plans that the city had adopted. In the current scenario, cities are becoming the home of the future, with economic diversification bringing more investment into cities, especially in the domains of urban infrastructure, planning, and design. The uncertainty of urban futures in these low-elevation coastal zones faces unprecedented risk and threat. The study focuses on three major pillars of resilience: recover, resist, and restore. This process of getting ready to handle the situation bridges the gap between disaster response management and risk reduction, and requires a shift in paradigm. The study involved qualitative research and a system design approach (framework). The initial stages involved mapping the urban water morphology with respect to spatial growth, which gave an insight into the water bodies that have gone missing over the years during the process of urbanization. A major finding of the study was that missing links in the traditional water harvesting network were a major reason resulting in a man-made disaster. The research conceptualized the ideology of a sponge-city framework that would guide growth through institutional frameworks at different levels.
The next stage was understanding the implementation process at various stages to ensure the shift in paradigm. The concepts are demonstrated at the neighborhood level, showing where and how each component works and what its functions and benefits are. Design decisions are quantified in terms of rainwater harvest and surface runoff: how much water is collected, and how it could be collected, stored, and reused. The study closes with further recommendations for water mitigation spaces that will revive the traditional harvesting network.
Keywords: flooding, man made disaster, resilient city, traditional harvesting network, waterbodies
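The kind of quantification described above can be sketched with the standard rooftop-harvest estimate (catchment area times rainfall depth times a runoff coefficient). The roof area and coefficient below are assumptions, not figures from the study, and Chennai's average annual rainfall of roughly 1400 mm is used only as a ballpark:

```python
def harvest_potential_l(roof_area_m2, annual_rain_mm, runoff_coeff):
    # Potential rooftop harvest in litres: 1 mm of rain falling on 1 m2
    # yields 1 litre before losses; the runoff coefficient accounts for
    # evaporation, splash, and first-flush losses (assumed value here).
    return roof_area_m2 * annual_rain_mm * runoff_coeff

# A hypothetical 100 m2 roof with an assumed coefficient of 0.8:
print(harvest_potential_l(100, 1400, 0.8))  # roughly 112000 litres per year
```

Scaled across a neighborhood, estimates of this kind show how much runoff a revived harvesting network could intercept instead of sending it into the drains.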
Procedia PDF Downloads 141
28635 Thermodynamic Analysis of GT Cycle with Naphtha or Natural Gas as the Fuel: A Thermodynamic Comparison
Authors: S. Arpit, P. K. Das, S. K. Dash
Abstract:
In this paper, a comparative study is carried out between two fuels, naphtha and natural gas (NG), for a 32.5 MW gas turbine (GT) plant with the same thermodynamic configuration. The energy analysis confirms that the turbine inlet temperature (TIT) of the gas turbine is higher with natural gas than with naphtha, and hence the isentropic efficiency of the turbine is better. The exergy analysis also confirms that, due to the higher turbine inlet temperature in the case of natural gas, exergy destruction in the combustion chamber is lower. In the overall comparison of the two fuels, however, naphtha has higher energy and exergetic efficiency than natural gas.
Keywords: exergy analysis, gas turbine, naphtha, natural gas
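The isentropic turbine efficiency compared here has the standard first-law definition; this is the textbook formula, not an equation taken from the paper:

```latex
% Actual enthalpy drop across the turbine over the ideal drop for an
% isentropic expansion between the same inlet state (1) and the same
% outlet pressure (2s denotes the isentropic exit state):
\eta_{t} = \frac{h_{1} - h_{2}}{h_{1} - h_{2s}}
```

A higher TIT raises the inlet enthalpy $h_{1}$ and, with the same expansion ratio, tends to bring the actual drop closer to the ideal one, which is consistent with the better turbine efficiency reported for natural gas.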
Procedia PDF Downloads 210
28634 A Highly Accurate Computer-Aided Diagnosis: CAD System for the Diagnosis of Breast Cancer by Using Thermographic Analysis
Authors: Mahdi Bazarganigilani
Abstract:
Computer-aided diagnosis (CAD) systems can play a crucial role in diagnosing critical diseases such as breast cancer at the earliest possible stage. In this paper, a CAD system for the diagnosis of breast cancer is introduced and evaluated. The system was developed using spatio-temporal analysis of a set of consecutive thermographic images, employing the wavelet transform. From this analysis, a very accurate machine learning model based on random forest was obtained. The final results showed a promising accuracy of 91% in terms of the F1 measure on sample data from 200 patients. The CAD system was further extended to obtain a detailed analysis of the effect of smaller sub-areas of each breast on the occurrence of cancer.
Keywords: computer-aided diagnosis systems, thermographic analysis, spatio-temporal analysis, image processing, machine learning
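The F1 measure reported above is the harmonic mean of precision and recall. The confusion counts below are hypothetical, chosen only to show a case that lands near an F1 of 0.91; they are not the paper's data:

```python
def f1_score(tp, fp, fn):
    # F1 is the harmonic mean of precision and recall, which reduces to
    # 2*tp / (2*tp + fp + fn).
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts over 200 patients: 90 true positives,
# 8 false positives, 10 false negatives.
print(round(f1_score(90, 8, 10), 2))  # -> 0.91
```

Unlike plain accuracy, F1 ignores true negatives, which matters in screening settings where healthy cases dominate the sample.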
Procedia PDF Downloads 212