Search results for: memory network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5746

2656 Task Scheduling and Resource Allocation in Cloud Based on AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Task scheduling and optimal resource allocation in the cloud must account for the dynamic nature of tasks and the heterogeneity of resources. Scientific-workflow applications are among the most widely used in this field and are characterized by high processing power and storage requirements. To increase their efficiency, it is necessary to schedule the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, and they depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment based on a modified AHP algorithm is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resources are prioritized by the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods are adopted, as they are the best choice for the mentioned algorithm and have a great impact on the ranking. The simulation results show a decrease in the average response time, turnaround time, and execution time of input tasks in the proposed method compared to similar (baseline) methods.
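
As a rough illustration of the AHP-style prioritization with normalization described above, the following Python sketch ranks candidate virtual machines by main memory size, processor speed, and bandwidth under both Linear Max and Linear Max-Min normalization. The criteria values and weights are invented placeholders, not the authors' implementation.

```python
import numpy as np

# Hypothetical criteria matrix: rows = candidate VMs,
# columns = [main memory (GB), processor speed (MIPS), bandwidth (Mbps)]
vms = np.array([
    [8.0, 2000.0, 100.0],
    [16.0, 1500.0, 1000.0],
    [4.0, 3000.0, 500.0],
])

# Assumed AHP-style weights for the three criteria (sum to 1)
weights = np.array([0.5, 0.3, 0.2])

def linear_max(matrix):
    """Linear Max normalization: divide each column by its maximum."""
    return matrix / matrix.max(axis=0)

def linear_max_min(matrix):
    """Linear Max-Min normalization: scale each column to [0, 1]."""
    col_min, col_max = matrix.min(axis=0), matrix.max(axis=0)
    return (matrix - col_min) / (col_max - col_min)

# Score and rank the VMs under each normalization (higher score = higher priority)
for name, norm in (("Linear Max", linear_max), ("Linear Max-Min", linear_max_min)):
    scores = norm(vms) @ weights
    ranking = np.argsort(scores)[::-1]
    print(f"{name}: priority order {ranking.tolist()}, scores {np.round(scores, 3).tolist()}")
```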

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 147
2655 Survey Paper on Graph Coloring Problem and Its Application

Authors: Prateek Chharia, Biswa Bhusan Ghosh

Abstract:

Graph coloring is one of the prominent concepts in graph theory. It can be defined as a coloring of the various regions (or vertices) of a graph such that all the constraints are fulfilled. In this paper, various graph coloring approaches are described, such as greedy coloring, heuristic search for a maximum independent set, and graph coloring using an edge table. Graph coloring can be used in various real-time applications such as student timetable generation, Sudoku as a graph coloring problem, and GSM phone networks.
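
A minimal sketch of the greedy coloring approach mentioned above is given below; the example graph (an adjacency list standing in for, say, timetabling conflicts) and the vertex order are arbitrary.

```python
def greedy_coloring(adjacency):
    """Assign to each vertex the smallest color not used by its already-colored neighbors."""
    colors = {}
    for vertex in adjacency:
        used = {colors[n] for n in adjacency[vertex] if n in colors}
        color = 0
        while color in used:
            color += 1
        colors[vertex] = color
    return colors

# Example conflict graph given as an adjacency list
graph = {
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B"],
    "D": ["B"],
}
print(greedy_coloring(graph))  # e.g. {'A': 0, 'B': 1, 'C': 2, 'D': 0}
```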

Keywords: graph coloring, greedy coloring, heuristic search, edge table, sudoku as a graph coloring problem

Procedia PDF Downloads 544
2654 The Impact of an Improved Strategic Partnership Programme on Organisational Performance and Growth of Firms in the Internet Protocol Television and Hybrid Fibre-Coaxial Broadband Industry

Authors: Collen T. Masilo, Brane Semolic, Pieter Steyn

Abstract:

The Internet Protocol Television (IPTV) and Hybrid Fibre-Coaxial (HFC) Broadband industrial sector landscape is rapidly changing, and organisations within the industry need to stay competitive by exploring new business models so that they can offer new services and products to customers. The business challenge in this industrial sector is meeting or exceeding high customer expectations across multiple content delivery modes. The increasing challenges in the IPTV and HFC broadband industrial sector encourage service providers to form strategic partnerships with key suppliers, marketing partners, advertisers, and technology partners. The need to form enterprise collaborative networks poses a challenge for any organisation in this sector: selecting the right strategic partners who will ensure that the organisation’s services and products are marketed in new markets, that customers are efficiently supported by meeting and exceeding their expectations, and that cooperation partners represent the organisation in a positive manner and contribute to improving its performance. Companies in the IPTV and HFC broadband industrial sector tend to form informal partnerships with suppliers, vendors, system integrators and technology partners. Generally, partnerships are formed without thorough analysis of the real reason a company is forming collaborations, without proper evaluation of prospective partners using specific selection criteria, and with ineffective performance monitoring of partners to ensure that a firm gains real long-term benefits from its partners and gains competitive advantage. Similar tendencies are illustrated in the research case study, which is based on Skyline Communications, a global leader in end-to-end, multi-vendor network management and operational support systems (OSS) solutions. The organisation’s flagship product is the DataMiner network management platform, used by many operators across multiple industries; it can be described as a smart system that intelligently manages complex technology ecosystems for its customers in the IPTV and HFC broadband industry. The approach of the research is to develop the most efficient business model that can be deployed to improve a strategic partnership programme in order to significantly improve the performance and growth of organisations participating in a collaborative network in the IPTV and HFC broadband industrial sector. This involves proposing and implementing a new strategic partnership model and its main features within the industry, which should bring about significant benefits for all involved companies, helping them achieve value-add and an optimal growth strategy. The proposed business model has been developed based on research into existing relationships, value chains and business requirements in this industrial sector and validated at 'Skyline Communications'. The outputs of the business model have been demonstrated and evaluated in the research business case study of the IPTV and HFC broadband service provider 'Skyline Communications'.

Keywords: growth, partnership, selection criteria, value chain

Procedia PDF Downloads 134
2653 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra

Authors: Bitewulign Mekonnen

Abstract:

Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine-learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine-learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep-learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression (SVMR), partial least squares regression, extra tree regression (ETR), random forest regression, extreme gradient boosting, and principal component analysis-neural network (PCA-NN), are employed to predict glucose concentration. The NIR spectral data are randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²) > 0.985. The deep learning model achieves high macro-averaging scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy. Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine-learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding references for glucose concentration are measured in increments of 20 mg/dl. The data is randomly divided into train and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels.
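
To illustrate the kind of model comparison described above, the following Python sketch trains three of the listed regressors on synthetic stand-in spectra, repeats the random train/test split several times, and reports R and R². The data generation, preprocessing, and hyperparameters are placeholders rather than the study's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in for NIR spectra (1000 samples x 200 wavelengths)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 200))
y = X[:, :10].sum(axis=1) * 20 + rng.normal(scale=2.0, size=1000)  # "glucose" reference values

models = {
    "SVMR": SVR(kernel="rbf", C=10.0),
    "ETR": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
}

# Repeat the random split several times to gauge generalization, as in the study
for name, model in models.items():
    r2_runs = []
    for seed in range(3):
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=seed)
        model.fit(X_tr, y_tr)
        r2_runs.append(r2_score(y_te, model.predict(X_te)))
    r2 = np.mean(r2_runs)
    print(f"{name}: R^2 = {r2:.3f}, R = {np.sqrt(max(r2, 0)):.3f}")
```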

Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network

Procedia PDF Downloads 95
2652 The Documentation of Modernisation Processes in Spain Based on the Residential Architecture of the 1960s. A Patrimonial Perspective on El Plantinar Neighbourhood in Seville

Authors: Julia Rey-Pérez, Julia Díaz Borrego

Abstract:

The modernisation process of the city of Sevilla in Spain and the transformation of the city took place through national and local government initiatives from the 1960s onwards. Part of these actions was the execution of numerous residential neighbourhoods that prepared Sevilla for the change of era. This process was possible thanks to the implementation of public policies that showed the imminent need for new architectural programmes, as well as for high-rise architecture built in reinforced concrete. However, very little is known to this day about the modernisation process in Sevilla and the development of these neighbourhoods, which were designed to house a large number of people and are today a key reference point in the Historic Urban Landscape of the city of Seville. Therefore, the present research aims to learn and reflect upon the urban transformation of the city at this time and to deepen the heritage uniqueness of these neighbourhoods, as is the case of El Plantinar neighbourhood. The methodology proposed for this research is structured in three phases, where in the first stage, a general study of the El Plantinar neighbourhood was carried out on three scales: urban, object-typological and perceptive. In the second stage, the cultural attributes and values of the urban complex in question were identified in order to determine whether the case study is truly representative of the beginnings of modernity in Spain and whether it needs a heritage approach. Finally, a third phase is proposed in which criteria will be defined on how to intervene in this neighbourhood to guarantee its presence in the urban landscape of the city of Seville. The expected results will help to understand the process of modernisation that the city has undergone, as well as the heritage value of this architecture in the construction of the collective memory.

Keywords: modern heritage, urban obsolescence, methodology, develop

Procedia PDF Downloads 151
2651 Political Communication in Twitter Interactions between Government, News Media and Citizens in Mexico

Authors: Jorge Cortés, Alejandra Martínez, Carlos Pérez, Anaid Simón

Abstract:

The presence of government, news media, and general citizenry in social media allows considering interactions between them as a form of political communication (i.e. the public exchange of contradictory discourses about politics). Twitter’s asymmetrical following model (users can follow, mention or reply to other users that do not follow them) could foster alternative democratic practices and have an impact on Mexican political culture, which has been marked by a lack of direct communication channels between these actors. The research aim is to assess Twitter’s role in political communication practices through the analysis of interaction dynamics between government, news media, and citizens by extracting and visualizing data from Twitter’s API to observe general behavior patterns. The hypothesis is that regardless of the fact that Twitter’s features enable direct and horizontal interactions between actors, users repeat traditional dynamics of interaction, without taking full advantage of the possibilities of this medium. Through an interdisciplinary team including Communication Strategies, Information Design, and Interaction Systems, the activity on Twitter generated by the controversy over the presence of Uber in Mexico City was analysed; an issue of public interest, involving aspects such as public opinion, economic interests and a legal dimension. This research includes techniques from social network analysis (SNA), a methodological approach focused on the comprehension of the relationships between actors through the visual representation and measurement of network characteristics. The analysis of the Uber event comprised data extraction, data categorization, corpus construction, corpus visualization and analysis. In the recovery stage, TAGS, a Google Sheets template, was used to extract tweets that included the hashtags #UberSeQueda and #UberSeVa, posts containing the string Uber and tweets directed to @uber_mx. Using scripts written in Python, the data was filtered, discarding tweets with no interaction (replies, retweets or mentions) and locations outside of México. Considerations regarding bots and the omission of anecdotal posts were also taken into account. The utility of graphs to observe interactions of political communication in general was confirmed by the analysis of visualizations generated with programs such as Gephi and NodeXL. However, some aspects require improvements to obtain more useful visual representations for this type of research. For example, link crossings complicate following the direction of an interaction, forcing users to manipulate the graph to see it clearly. It was concluded that some practices prevalent in political communication in Mexico are replicated in Twitter. Media actors tend to group together instead of interacting with others. The political system tends to tweet as an advertising strategy rather than to generate dialogue. However, some actors were identified as bridges establishing communication between the three spheres, generating a more democratic exercise and taking advantage of Twitter’s possibilities. Although interactions in Twitter could become an alternative to political communication, this potential depends on the intentions of the participants and to what extent they are aiming for collaborative and direct communications. Further research is needed to get a deeper understanding of the political behavior of Twitter users and the possibilities of SNA for its analysis.
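
A minimal sketch of the kind of interaction-graph construction and SNA measurement described above is shown below, using networkx; the sample interactions and actor names are invented for illustration only.

```python
import networkx as nx

# Toy sample of interactions extracted from tweets: (source, target, kind)
interactions = [
    ("citizen_1", "uber_mx", "mention"),
    ("news_outlet_a", "gobierno_cdmx", "mention"),
    ("citizen_2", "news_outlet_a", "reply"),
    ("citizen_1", "news_outlet_a", "retweet"),
    ("gobierno_cdmx", "uber_mx", "mention"),
]

G = nx.DiGraph()
for source, target, kind in interactions:
    G.add_edge(source, target, kind=kind)

# Simple SNA measures: who receives the most interactions, who bridges spheres
in_degree = dict(G.in_degree())
betweenness = nx.betweenness_centrality(G)
print("In-degree:", in_degree)
print("Betweenness (bridging actors):", {k: round(v, 2) for k, v in betweenness.items()})
```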

Keywords: interaction, political communication, social network analysis, Twitter

Procedia PDF Downloads 223
2650 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet

Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel

Abstract:

Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of the imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution to sustainable sanitation with the development of an innovative toilet system, called Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. The particular technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in SimaPro software employing the Ecoinvent v3.3 database. The particular study has determined the most contributory factors to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain critical operating parameters for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash and transportation of fertilizer. The given analysis has provided the distribution and the confidence intervals of the selected impact categories and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of NMT system. Last but not least, the specific study will also yield essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
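
The stochastic treatment of the LCI can be illustrated with a small Monte Carlo sketch in Python: uncertain inputs are sampled, propagated through a (here, deliberately simplified linear) impact model, and summarized with confidence intervals. The parameter distributions and characterization factors below are illustrative placeholders, not values from the NMT study, which couples the simulations with an ANN surrogate model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 10_000

# Illustrative uncertain LCI inputs (per functional unit); distributions are assumed
raw_material = rng.normal(1.0, 0.1, n_runs)                 # kg
electricity = rng.normal(0.5, 0.05, n_runs)                  # kWh produced (credit)
nox_emissions = rng.lognormal(np.log(0.02), 0.3, n_runs)     # kg NOx
ash = rng.normal(0.1, 0.02, n_runs)                          # kg
transport = rng.normal(2.0, 0.5, n_runs)                     # tonne-km (fertilizer transport)

# Simplified linear impact model for one category; characterization factors are assumed
gwp = (2.1 * raw_material - 0.6 * electricity
       + 8.0 * nox_emissions + 0.05 * ash + 0.1 * transport)

mean = gwp.mean()
ci_low, ci_high = np.percentile(gwp, [2.5, 97.5])
print(f"GWP: mean = {mean:.3f}, 95% CI = [{ci_low:.3f}, {ci_high:.3f}] kg CO2-eq")
```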

Keywords: sanitation systems, nano-membrane toilet, lca, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network

Procedia PDF Downloads 227
2649 Estimating Algae Concentration Based on Deep Learning from Satellite Observation in Korea

Authors: Heewon Jeong, Seongpyo Kim, Joon Ha Kim

Abstract:

Over the last few decades, the coastal regions of Korea have experienced red tide algal blooms, which are harmful and toxic to both humans and marine organisms. These blooms have been accelerated by eutrophication from human activities, certain oceanic processes, and climate change. Previous studies have tried to monitor and predict the algae concentration of the ocean with bio-optical algorithms applied to satellite color images. However, accurate estimation of algal blooms remains a challenge because of the complexity of coastal waters. Therefore, this study suggests a new method to identify the concentration of red tide algal blooms from images of the Geostationary Ocean Color Imager (GOCI), which represent the water environment of the sea around Korea. The method employed GOCI images containing the water-leaving radiances centered at 443 nm, 490 nm and 660 nm, as well as observed weather data (i.e., humidity, temperature and atmospheric pressure), to build the database, apply the optical characteristics of algae, and train the deep learning algorithm. A convolutional neural network (CNN) was used to extract the significant features from the images, and an artificial neural network (ANN) was then used to estimate the concentration of algae from the extracted features. For training of the deep learning model, a backpropagation learning strategy was developed. The established methods were tested and compared with the performance of the GOCI data processing system (GDPS), which is based on standard image processing algorithms and optical algorithms. The model performed better at estimating algae concentration than the GDPS, which cannot estimate concentrations greater than 5 mg/m³. Thus, the deep learning model was trained successfully to assess algae concentration in spite of the complexity of the water environment. Furthermore, the results of this system and methodology can be used to improve the performance of remote sensing. Acknowledgement: This work was supported by the 'Climate Technology Development and Application' research project (#K07731) through a grant provided by GIST in 2017.
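
A compact PyTorch sketch of the CNN-feature-extraction-plus-ANN-regression idea is shown below. The input shape (three radiance bands on a small image patch plus three weather variables), layer sizes, and training step are assumptions for illustration only, not the architecture used in the study.

```python
import torch
import torch.nn as nn

class AlgaeNet(nn.Module):
    """CNN extracts features from radiance patches; an ANN head regresses algae concentration."""
    def __init__(self, n_weather=3):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.ann = nn.Sequential(
            nn.Linear(32 + n_weather, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, patch, weather):
        feats = self.cnn(patch).flatten(1)            # (N, 32)
        return self.ann(torch.cat([feats, weather], dim=1)).squeeze(1)

# Toy training step on random data (stand-in for GOCI patches and weather records)
model = AlgaeNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
patches = torch.randn(8, 3, 16, 16)   # 443/490/660 nm radiance patches
weather = torch.randn(8, 3)           # humidity, temperature, pressure
target = torch.rand(8) * 5.0          # algae concentration (mg/m^3)

pred = model(patches, weather)
loss = nn.functional.mse_loss(pred, target)
loss.backward()                        # backpropagation, as in the paper
optimizer.step()
print("training loss:", loss.item())
```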

Keywords: deep learning, algae concentration, remote sensing, satellite

Procedia PDF Downloads 186
2648 Orthogonal Basis Extreme Learning Algorithm and Function Approximation

Authors: Ying Li, Yan Li

Abstract:

A new algorithm for single hidden layer feedforward neural networks (SLFNs), the Orthogonal Basis Extreme Learning (OBEL) algorithm, is proposed, and its derivation is given in the paper. The algorithm can determine both the network parameters and the number of hidden-layer neurons during training while providing extremely fast learning speed. It provides a practical way to develop neural networks. The simulation results for function approximation show that the algorithm is effective and feasible, with good accuracy and adaptability.
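
As a rough sketch of the extreme-learning idea behind OBEL, the NumPy code below builds a single-hidden-layer network with random (untrained) hidden weights, orthogonalizes the hidden-layer output matrix via QR decomposition, and solves the output weights directly through that orthogonal basis. This is a generic illustration of the family of methods, not the authors' derivation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Function approximation target: y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

n_hidden = 30
W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (not trained)
b = rng.normal(size=n_hidden)                 # random biases

H = np.tanh(X @ W + b)                        # hidden-layer output matrix
Q, R = np.linalg.qr(H)                        # orthogonal basis for the hidden outputs
beta = Q.T @ y                                # output weights via the orthogonal basis
y_hat = Q @ beta                              # least-squares fit in span(H)

rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"RMSE on sin(x): {rmse:.4f}")
```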

Keywords: neural network, orthogonal basis extreme learning, function approximation

Procedia PDF Downloads 537
2647 Automatic Detection of Sugarcane Diseases: A Computer Vision-Based Approach

Authors: Himanshu Sharma, Karthik Kumar, Harish Kumar

Abstract:

The major problem in crop cultivation is the occurrence of multiple crop diseases. During the growth stage, timely identification of crop diseases is paramount to ensure high crop yield, lower production costs, and minimal pesticide usage. In most cases, crop diseases produce observable characteristics and symptoms. Surveyors usually diagnose crop diseases as they walk through the fields. However, surveyor inspections tend to be biased and error-prone due to the monotonous nature of the task and the subjectivity of individuals. In addition, visual inspection of each leaf or plant is costly, time-consuming, and labour-intensive. Furthermore, the plant pathologists and experts who can often identify a disease from its symptoms at an early stage are not readily available in remote regions. Therefore, this study specifically addresses early detection of leaf scald, red rot, and eyespot diseases in sugarcane plants. The study proposes a computer vision-based approach using a convolutional neural network (CNN) for automatic identification of crop diseases. To facilitate this, images of sugarcane diseases were first taken from Google without modifying the scene or background or controlling the illumination to build the training dataset. The testing dataset was then developed from images collected in real time from sugarcane fields in India. The image dataset was pre-processed for feature extraction and selection. Finally, the CNN-based Visual Geometry Group (VGG) model was deployed on the training and testing datasets to classify the images into diseased and healthy sugarcane plants and to measure the model's performance using various parameters, i.e., accuracy, sensitivity, specificity, and F1-score. The promising results of the proposed model lay the groundwork for the automatic early detection of sugarcane disease. The proposed research directly supports an increase in crop yield.
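
A minimal PyTorch/torchvision sketch of fine-tuning a VGG backbone for a two-class (diseased vs. healthy) sugarcane classifier is shown below; the variant depth, data, and hyperparameters are placeholders, and the actual study may differ in training details.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a VGG-16 backbone (pretrained ImageNet weights could be loaded here if available)
model = models.vgg16()

# Replace the final classifier layer for two classes: healthy vs. diseased
model.classifier[6] = nn.Linear(4096, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

# One illustrative training step on random stand-in images (224x224 RGB)
images = torch.randn(4, 3, 224, 224)
labels = torch.tensor([0, 1, 1, 0])   # 0 = healthy, 1 = diseased

optimizer.zero_grad()
outputs = model(images)
loss = criterion(outputs, labels)
loss.backward()
optimizer.step()
print("batch loss:", loss.item())
```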

Keywords: automatic classification, computer vision, convolutional neural network, image processing, sugarcane disease, visual geometry group

Procedia PDF Downloads 117
2646 Effectiveness of Visual Auditory Kinesthetic Tactile Technique on Reading Level among Dyslexic Children in Helikx Open School and Learning Centre, Salem

Authors: J. Mano Ranjini

Abstract:

Each and every child is special, born with a unique talent to explore this world. The word dyslexia is derived from the Greek, in which “dys” means poor or inadequate and “lexis” means words or language. Dyslexia describes a different kind of mind, often gifted and productive, that learns concepts differently. The main aim of the study is to achieve a positive outcome in reading level by examining the effectiveness of the Visual Auditory Kinesthetic Tactile technique on reading level among dyslexic children at Helikx Open School and Learning Centre. A quasi-experimental one-group pretest-posttest design was adopted for this study. The reading level was assessed using the Schonell Graded Word Reading Test. Thirty subjects were drawn using a purposive sampling technique, and the Visual Auditory Kinesthetic Tactile intervention was implemented with the dyslexic children for 30 consecutive days; the post-intervention reading level assessment revealed a 12% improvement in the mean reading level score. Multi-sensory (VAKT) teaching uses all learning pathways in the brain (visual, auditory, kinesthetic-tactile) in order to enhance memory and learning, and it can uplift emotional, physical and societal dimensions. VAKT is an effective method to improve the reading skill of dyslexic children, underscoring the significance of learning and thereby benefiting the whole of the child’s life.

Keywords: visual auditory kinesthetic tactile technique, reading level, dyslexic children, Helikx Open School

Procedia PDF Downloads 602
2645 Determination of the Walkability Comfort for Urban Green Space Using Geographical Information System

Authors: Muge Unal, Cengiz Uslu, Mehmet Faruk Altunkasa

Abstract:

Walkability relates to the ability of places to connect people with varied destinations within a reasonable amount of time and effort, and to offer visual interest in journeys throughout the network. The quality of the physical environment and the arrangement of walkways and sidewalks therefore appear to be crucial in influencing pedestrian route choice. Proximity, connectivity, and accessibility are also significant factors for walkability in terms of providing equal opportunity for using public spaces. As a result, there are two important points for walkability: firstly, the place should have a well-planned, accessible street network, and secondly, it should meet pedestrians' need for comfort. In this respect, this study aims to examine both the physical and bioclimatic comfort levels of current pedestrian routes to urban green spaces with reference to street design criteria. These aspects have been identified as the main indicators of walkable streets: continuity, materials, slope, bioclimatic condition, walkway width, greenery, and surface. Additionally, the aim was to identify the factors that need to be considered in future guidelines and policies for planning and design in urban spaces, especially streets. The city of Adana was chosen as the study area; Adana is a province of Turkey located in south-central Anatolia. The study workflow can be summarized in four stages: (1) environmental and physical data were collected from the literature and used in a weighted criteria method to determine the importance level of each criterion; (2) environmental characteristics of pedestrian routes obtained from survey studies were evaluated to rank these criteria; (3) each pedestrian route was then assigned a score reflecting how comfortably it provides access to the park; and (4) finally, the comfortable routes to the park were mapped using GIS. It is hoped that this study will provide insight for future development planning and design to create a friendlier and more comfortable street environment for users.
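
A tiny sketch of the weighted criteria scoring step (stage 3) is given below; the criteria weights and route attribute values are invented for illustration, whereas the actual study derives weights from surveys and maps the resulting scores in GIS.

```python
# Hypothetical walkability criteria weights (summing to 1.0)
weights = {
    "continuity": 0.20,
    "surface": 0.15,
    "slope": 0.15,
    "bioclimatic_comfort": 0.20,
    "walkway_width": 0.15,
    "greenery": 0.15,
}

# Candidate pedestrian routes to a park, each criterion scored on a 0-1 scale
routes = {
    "route_A": {"continuity": 0.9, "surface": 0.8, "slope": 0.6,
                "bioclimatic_comfort": 0.7, "walkway_width": 0.5, "greenery": 0.8},
    "route_B": {"continuity": 0.6, "surface": 0.9, "slope": 0.9,
                "bioclimatic_comfort": 0.5, "walkway_width": 0.7, "greenery": 0.4},
}

def comfort_score(route):
    """Weighted sum of normalized criteria values for one route."""
    return sum(weights[c] * route[c] for c in weights)

for name, attrs in routes.items():
    print(name, round(comfort_score(attrs), 3))
```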

Keywords: comfort level, geographical information system (GIS), walkability, weighted criteria method

Procedia PDF Downloads 313
2644 Intelligent Cooperative Integrated System for Road Safety and Road Infrastructure Maintenance

Authors: Panagiotis Gkekas, Christos Sougles, Dionysios Kehagias, Dimitrios Tzovaras

Abstract:

This paper presents the architecture of the “Intelligent cooperative integrated system for road safety and road infrastructure maintenance towards 2020” (ODOS2020) advanced infrastructure, which implements a number of cooperative ITS applications based on Internet of Things and Infrastructure-to-Vehicle (V2I) technologies with the purpose of enhancing the active road safety level of vehicles through the provision of a fully automated V2I environment. The primary objective of the ODOS2020 project is to contribute to increased road safety but also to the optimization of time for maintenance of road infrastructure. The integrated technological solution presented in this paper addresses all types of vehicles and requires minimum vehicle equipment. Thus, the ODOS2020 comprises a low-cost solution, which is one of its main benefits. The system architecture includes an integrated notification system to transmit personalized information on road, traffic, and environmental conditions, in order for the drivers to receive real-time and reliable alerts concerning upcoming critical situations. The latter include potential dangers on the road, such as obstacles or road works ahead, extreme environmental conditions, etc., but also informative messages, such as information on upcoming tolls and their charging policies. At the core of the system architecture lies an integrated sensorial network embedded in special road infrastructures (strips) that constantly collect and transmit wirelessly information about passing vehicles’ identification, type, speed, moving direction and other traffic information in combination with environmental conditions and road wear monitoring and predictive maintenance data. Data collected from sensors is transmitted by roadside infrastructure, which supports a variety of communication technologies such as the ITS-G5 (IEEE-802.11p) wireless network and Internet connectivity through cellular networks (3G, LTE). All information could be forwarded to both vehicles and Traffic Management Centers (TMC) operators, either directly through the ITS-G5 network, or to smart devices with Internet connectivity, through cloud-based services. Therefore, through its functionality, the system could send personalized notifications/information/warnings and recommendations for upcoming events to both road users and TMC operators. In the course of the ODOS2020 project, a pilot operation has been conducted to allow drivers of both C-ITS equipped and non-equipped vehicles to experience the provided added-value services. For non-equipped vehicles, the provided information is transmitted to a smartphone application. Finally, the ODOS2020 system and infrastructure are appropriate for installation in urban, rural, and highway environments. The paper presents the various parts of the system architecture and concludes by outlining the various challenges that had to be overcome during its design, development, and deployment in a real operational environment. Acknowledgments: Work presented in this paper was co-financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation (call RESEARCH–CREATE–INNOVATE) under contract no. Τ1EDK-03081 (project ODOS2020).

Keywords: infrastructure to vehicle, intelligent transportation systems, internet of things, road safety

Procedia PDF Downloads 126
2643 Encoding the Design of the Memorial Park and the Family Network as the Icon of 9/11 in Amy Waldman's The Submission

Authors: Masami Usui

Abstract:

After 9/11, the American literary scene was confronted with new perspectives that enabled both writers and readers to recognize the hidden aspects of their political, economic, legal, social, and cultural phenomena. An argument arose over new and challenging multicultural aspects after 9/11, and this argument is presented through a tension of space related to 9/11. In Amy Waldman’s The Submission (2011), designing both the memorial park and the family network has a significant meaning in establishing the progress of understanding from multiple perspectives. The most intriguing and controversial topic, racism, is reflected in The Submission, where one young architect’s blind entry to the competition for the memorial at Ground Zero is nominated, yet he is confronted with strong objections and hostility as soon as he turns out to be a Muslim named Mohammad Khan. This ‘Khan’ issue, immediately enlarged into a controversial social issue on American soil, causes repeated acts of hostility toward Muslim women by ignorant citizens all over America. His idea for the park is to design a new concept that traces the cultural background of the open space. Against his will, his name is identified as the ‘ingredient’ of the networking of the resistant community with his supporters; on the other hand, the post-9/11 hysteria and victimization are presented in such family associations as the Angry Family Members and Grieving Family Members. These rapidly expanding networks, whether political or not, constructed by the internet, embody contemporary societal connection and representation. The contemporary quest for the significance of human relationships is recognized as a quest for global peace. Designing both the memorial park and the communication networks strengthens a process of facing the shared conflicts and healing the survivors’ trauma. The tension between the idea and networking of the Garden for the memorial site and the collapse of Ground Zero signifies the double mission of the site: to establish a space to ease the wounded and to remember the catastrophe. Reading the design of these icons of 9/11 in The Submission means decoding the myth of globalization and its representations in this century.

Keywords: American literature, cultural studies, globalization, literature of catastrophe

Procedia PDF Downloads 535
2642 Investigation of Delivery of Triple Play Service in GE-PON Fiber to the Home Network

Authors: Anurag Sharma, Dinesh Kumar, Rahul Malhotra, Manoj Kumar

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network (PON). This paper demonstrates the simultaneous delivery of triple-play services (data, voice and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.

Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT

Procedia PDF Downloads 736
2641 Monitoring Large-Coverage Forest Canopy Height by Integrating LiDAR and Sentinel-2 Images

Authors: Xiaobo Liu, Rakesh Mishra, Yun Zhang

Abstract:

Continuous monitoring of forest canopy height with large coverage is essential for obtaining forest carbon stocks and emissions, quantifying biomass estimation, analyzing vegetation coverage, and determining biodiversity. LiDAR can be used to collect accurate woody vegetation structure such as canopy height. However, LiDAR’s coverage is usually limited because of its high cost and limited maneuverability, which constrains its use for dynamic and large area forest canopy monitoring. On the other hand, optical satellite images, like Sentinel-2, have the ability to cover large forest areas with a high repeat rate, but they do not have height information. Hence, exploring the solution of integrating LiDAR data and Sentinel-2 images to enlarge the coverage of forest canopy height prediction and increase the prediction repeat rate has been an active research topic in the environmental remote sensing community. In this study, we explore the potential of training a Random Forest Regression (RFR) model and a Convolutional Neural Network (CNN) model, respectively, to develop two predictive models for predicting and validating the forest canopy height of the Acadia Forest in New Brunswick, Canada, with a 10 m ground sampling distance (GSD), for the years 2018 and 2021. Two 10 m airborne LiDAR-derived canopy height models, one for 2018 and one for 2021, are used as ground truth to train and validate the RFR and CNN predictive models. To evaluate the prediction performance of the trained RFR and CNN models, two new predicted canopy height maps (CHMs), one for 2018 and one for 2021, are generated using the trained RFR and CNN models and 10 m Sentinel-2 images of 2018 and 2021, respectively. The two 10 m predicted CHMs from Sentinel-2 images are then compared with the two 10 m airborne LiDAR-derived canopy height models for accuracy assessment. The validation results show that, for 2018, the mean absolute error (MAE) of the RFR model is 2.93 m and that of the CNN model is 1.71 m, while for 2021 the MAE of the RFR model is 3.35 m and that of the CNN model is 3.78 m. These results demonstrate the feasibility of using the RFR and CNN models developed in this research for predicting large-coverage forest canopy height at 10 m spatial resolution and a high revisit rate.
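
The RFR side of the workflow can be sketched as follows: Sentinel-2 band values per 10 m pixel serve as predictors, LiDAR-derived canopy height is the target, and MAE is the validation metric. The synthetic arrays below stand in for the co-registered rasters used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(7)

# Stand-in for co-registered 10 m pixels: 10 Sentinel-2 band reflectances per pixel
X = rng.uniform(0, 1, size=(5000, 10))
# Stand-in for LiDAR-derived canopy height (m) with some band-dependent structure
y = 25 * X[:, 3] - 10 * X[:, 7] + rng.normal(0, 2, 5000) + 10

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

rfr = RandomForestRegressor(n_estimators=200, random_state=0)
rfr.fit(X_train, y_train)
pred = rfr.predict(X_test)

print(f"MAE: {mean_absolute_error(y_test, pred):.2f} m")
```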

Keywords: remote sensing, forest canopy height, LiDAR, Sentinel-2, artificial intelligence, random forest regression, convolutional neural network

Procedia PDF Downloads 96
2640 Cognitive Relaying in Interference Limited Spectrum Sharing Environment: Outage Probability and Outage Capacity

Authors: Md Fazlul Kader, Soo Young Shin

Abstract:

In this paper, we consider a cognitive relay network (CRN) in which the primary receiver (PR) is protected by a peak transmit power $\bar{P}_{ST}$ and/or a peak interference power $Q$ constraint. In addition, the interference effect from the primary transmitter (PT) is considered to show its impact on the performance of the CRN. We investigate the outage probability (OP) and outage capacity (OC) of the CRN by deriving closed-form expressions over the Rayleigh fading channel. Results show that both the OP and OC improve by increasing the number of cooperative relay nodes, as well as when the PT is far away from the secondary receiver (SR).
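
A Monte Carlo cross-check of an interference-limited outage probability over Rayleigh fading can be sketched as below; the link gains, power limits, and target rate are illustrative values, and the sketch does not reproduce the paper's closed-form expressions or relay-selection scheme.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Assumed system parameters (linear scale)
P_bar, Q = 10.0, 1.0       # peak transmit power and peak interference constraint
P_primary = 5.0            # primary transmitter power (interference source)
N0 = 1.0                   # noise power
R_target = 1.0             # target rate (bits/s/Hz)

# Rayleigh fading -> exponentially distributed channel power gains
g_sp = rng.exponential(1.0, n)   # secondary Tx -> primary Rx
g_ss = rng.exponential(1.0, n)   # secondary Tx -> secondary Rx
g_ps = rng.exponential(1.0, n)   # primary Tx -> secondary Rx (interference)

# Transmit power limited by both the peak power and the interference constraint
P_tx = np.minimum(P_bar, Q / g_sp)
sinr = P_tx * g_ss / (P_primary * g_ps + N0)

outage_prob = np.mean(np.log2(1.0 + sinr) < R_target)
print(f"Outage probability: {outage_prob:.4f}")
```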

Keywords: cognitive relay, outage, interference limited, decode-and-forward (DF)

Procedia PDF Downloads 512
2639 Real-Time Demonstration of Visible Light Communication Based on Frequency-Shift Keying Employing a Smartphone as the Receiver

Authors: Fumin Wang, Jiaqi Yin, Lajun Wang, Nan Chi

Abstract:

In this article, we demonstrate a visible light communication (VLC) system over 8 meters of free-space transmission based on a commercial LED and a receiver connected to the audio interface of a smartphone. The signal uses the frequency-shift keying (FSK) modulation format. The successful experimental demonstration validates the feasibility of the proposed system for future wireless communication networks.
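
A small NumPy sketch of binary FSK generation and naive non-coherent demodulation of the kind usable through a phone audio interface is shown below; the bit rate, mark/space tone frequencies, and sample rate are arbitrary example values within the audio band, not the parameters of the demonstrated system.

```python
import numpy as np

fs = 44_100                       # sample rate typical of a phone audio interface (Hz)
bit_rate = 100                    # bits per second (illustrative)
f_mark, f_space = 2_000, 4_000    # tone frequencies for bits 1 and 0 (Hz)

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
samples_per_bit = fs // bit_rate

# Build the FSK waveform bit by bit
t_bit = np.arange(samples_per_bit) / fs
waveform = np.concatenate([
    np.sin(2 * np.pi * (f_mark if b else f_space) * t_bit) for b in bits
])

# Naive demodulation: compare energy near each tone in every bit window
def tone_energy(x, f):
    ref = np.exp(-2j * np.pi * f * np.arange(len(x)) / fs)
    return np.abs(np.sum(x * ref))

decoded = []
for i in range(len(bits)):
    chunk = waveform[i * samples_per_bit:(i + 1) * samples_per_bit]
    decoded.append(1 if tone_energy(chunk, f_mark) > tone_energy(chunk, f_space) else 0)

print("sent:   ", bits.tolist())
print("decoded:", decoded)
```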

Keywords: visible light communication, smartphone communication, frequency shift keying, wireless communication

Procedia PDF Downloads 394
2638 Fast Estimation of Fractional Process Parameters in Rough Financial Models Using Artificial Intelligence

Authors: Dávid Kovács, Bálint Csanády, Dániel Boros, Iván Ivkovic, Lóránt Nagy, Dalma Tóth-Lakits, László Márkus, András Lukács

Abstract:

The modeling practice of financial instruments has seen significant change over the last decade due to the recognition of time-dependent and stochastically changing correlations among the market prices or the prices and market characteristics. To represent this phenomenon, the Stochastic Correlation Process (SCP) has come to the fore in the joint modeling of prices, offering a more nuanced description of their interdependence. This approach has allowed for the attainment of realistic tail dependencies, highlighting that prices tend to synchronize more during intense or volatile trading periods, resulting in stronger correlations. Evidence in statistical literature suggests that, similarly to the volatility, the SCP of certain stock prices follows rough paths, which can be described using fractional differential equations. However, estimating parameters for these equations often involves complex and computation-intensive algorithms, creating a necessity for alternative solutions. In this regard, the Fractional Ornstein-Uhlenbeck (fOU) process from the family of fractional processes offers a promising path. We can effectively describe the rough SCP by utilizing certain transformations of the fOU. We employed neural networks to understand the behavior of these processes. We had to develop a fast algorithm to generate a valid and suitably large sample from the appropriate process to train the network. With an extensive training set, the neural network can estimate the process parameters accurately and efficiently. Although the initial focus was the fOU, the resulting model displayed broader applicability, thus paving the way for further investigation of other processes in the realm of financial mathematics. The utility of SCP extends beyond its immediate application. It also serves as a springboard for a deeper exploration of fractional processes and for extending existing models that use ordinary Wiener processes to fractional scenarios. In essence, deploying both SCP and fractional processes in financial models provides new, more accurate ways to depict market dynamics.
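
To give a concrete sense of the training-data generation step, the sketch below simulates fractional Ornstein-Uhlenbeck paths by generating fractional Gaussian noise via a Cholesky factorization and applying an Euler scheme. This is a simple, slow reference method under assumed parameters, not the fast sampler or the network-based estimator developed in the paper.

```python
import numpy as np

def fgn(n, hurst, rng):
    """Fractional Gaussian noise of length n via Cholesky factorization (O(n^3), for illustration)."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst) - 2 * np.abs(k) ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]      # covariance depends only on the lag
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def fou_path(n, dt, hurst, theta, mu, sigma, rng):
    """Euler scheme for dX_t = theta*(mu - X_t) dt + sigma dB^H_t."""
    increments = sigma * fgn(n, hurst, rng) * dt ** hurst
    x = np.empty(n + 1)
    x[0] = mu
    for i in range(n):
        x[i + 1] = x[i] + theta * (mu - x[i]) * dt + increments[i]
    return x

rng = np.random.default_rng(0)
path = fou_path(n=500, dt=1 / 252, hurst=0.1, theta=2.0, mu=0.0, sigma=0.3, rng=rng)
print("simulated fOU path: mean %.3f, std %.3f" % (path.mean(), path.std()))
```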

Keywords: fractional Ornstein-Uhlenbeck process, fractional stochastic processes, Heston model, neural networks, stochastic correlation, stochastic differential equations, stochastic volatility

Procedia PDF Downloads 120
2637 Anton Bruckner’s Requiem in Dm: The Reinterpretation of a Liturgical Genre in the Viennese Romantic Context

Authors: Sara Ramos Contioso

Abstract:

The premiere of Anton Bruckner's Requiem in D minor, in September 1849, represents a turning point in the composer's creative evolution. This Mass for the Dead, which was dedicated to the memory of his esteemed friend and mentor Franz Sailer, establishes the beginning of a new creative aesthetic in the composer's production and links its liturgical development, which is contextualized in the monastery of St. Florian, to a range of musical possibilities that Bruckner projects onto an orchestral texture with choir and organ. Set on a strict Tridentine ritual model, this requiem exemplifies the religious aesthetics of a composer committed to the Catholic faith, and it links to its structure the reinterpretation of a religious model that, despite being Romantic, shows a strong influence derived from the Baroque and Viennese Classical languages. Consequently, the study responds to the need to show the survival of the Requiem Mass within the Romantic context of Vienna. Therefore, it draws on a detailed analysis of the score and the creative context of the composer with the intention of linking the work to the tradition of the genre and also specifying the stylistic particularities of its musical model within a range of possibilities such as the contrasting precedents of the requiems of Mozart, Haydn, Cherubini or Berlioz. Tradition or modernity, liturgy or concert hall: these are aesthetic references that condition the development of the Requiem Mass in the middle of the nineteenth century. In this context, this paper tries to recover Bruckner's Requiem in D minor as a musical model of the Romantic ritual for the deceased and as a stylistic reference for a creative composition that would condition the development of later liturgical works such as those of Liszt or De Lange (1868).

Keywords: liturgy, religious symbolism, requiem, romanticism

Procedia PDF Downloads 339
2636 Using the Minnesota Multiphasic Personality Inventory-2 and Mini Mental State Examination-2 in Cognitive Behavioral Therapy: Case Studies

Authors: Cornelia-Eugenia Munteanu

Abstract:

From a psychological perspective, psychopathology is the area of clinical psychology that has at its core psychological assessment and psychotherapy. In day-to-day clinical practice, psychodiagnosis and psychotherapy are used independently, according to their intended purpose and their specific methods of application. The paper explores how the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and Mini Mental State Examination-2 (MMSE-2) psychological tools contribute to enhancing the effectiveness of cognitive behavioral psychotherapy (CBT). This combined approach, psychotherapy in conjunction with assessment of personality and cognitive functions, is illustrated by two cases, a severe depressive episode with psychotic symptoms and a mixed anxiety-depressive disorder. The order in which CBT, MMPI-2, and MMSE-2 were used in the diagnostic and therapeutic process was determined by the particularities of each case. In the first case, the sequence started with psychotherapy, followed by the administration of blue form MMSE-2, MMPI-2, and red form MMSE-2. In the second case, the cognitive screening with blue form MMSE-2 led to a personality assessment using MMPI-2, followed by red form MMSE-2; reapplication of the MMPI-2 due to the invalidation of the first profile, and finally, psychotherapy. The MMPI-2 protocols gathered useful information that directed the steps of therapeutic intervention: a detailed symptom picture of potentially self-destructive thoughts and behaviors otherwise undetected during the interview. The memory loss and poor concentration were confirmed by MMSE-2 cognitive screening. This combined approach, psychotherapy with psychological assessment, aligns with the trend of adaptation of the psychological services to the everyday life of contemporary man and paves the way for deepening and developing the field.

Keywords: assessment, cognitive behavioral psychotherapy, MMPI-2, MMSE-2, psychopathology

Procedia PDF Downloads 328
2635 Development of Monitoring Blood Bank Center Based PIC Microcontroller Using CAN Communication

Authors: Kaiwan S. Ismael, Ergun Ercelebi, Majeed Nader

Abstract:

This paper describes the design and implementation of a hardware setup for online monitoring of 24 refrigerators inside a blood bank center using PIC microcontrollers and a CAN bus for communication between nodes. Due to the security of locations in the blood bank hall and the difficulty of monitoring each refrigerator separately, this work proposes a solution to monitor all the blood bank refrigerators from one location. A CAN-bus system is used because it has many applications and advantages, and it is especially suited to this system: it is easy to use, low in cost, reduces wiring, is fast to repair, and allows the project to be expanded easily.
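
As a rough illustration of how a monitoring node's reading could be packed into an 8-byte CAN data-frame payload, the following Python sketch encodes a refrigerator ID, a temperature, and an alarm flag. The frame layout and IDs are assumptions for illustration; on the actual PIC hardware this would be implemented in C against the CAN peripheral.

```python
import struct

def pack_reading(fridge_id, temperature_c, alarm):
    """Pack one reading into an 8-byte CAN payload:
    1-byte fridge ID, 4-byte float temperature, 1-byte alarm flag, 2 bytes reserved."""
    return struct.pack("<BfB", fridge_id, temperature_c, int(alarm)) + b"\x00\x00"

def unpack_reading(payload):
    fridge_id, temperature_c, alarm = struct.unpack("<BfB", payload[:6])
    return fridge_id, temperature_c, bool(alarm)

payload = pack_reading(fridge_id=7, temperature_c=4.2, alarm=False)
assert len(payload) == 8          # CAN 2.0 data field is at most 8 bytes
print(unpack_reading(payload))    # (7, ~4.2, False)
```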

Keywords: controller area network (CAN), monitoring blood bank center, PIC microcontroller, MPLAB IDE

Procedia PDF Downloads 486
2634 Cultural Works Interacting with the Generational Aesthetic Gap between Gen X and Gen Z in China: A Qualitative Study

Authors: Qianyu Zhang

Abstract:

The spread of digital technology in China has worsened the generation gap and intergenerational competition for cultural and aesthetic discourse. Meanwhile, the increased accessibility of cultural works has encouraged the sharing and inheritance of collective cultural memories between generations. However, not every cultural work can engage positively with efforts to bridge intergenerational aesthetic differences. This study argues that in contemporary China, where new media and the Internet are widely available, featured cultural works have more potential to help enhance the cultural aesthetic consensus among different generations, thus becoming an effective countermeasure to narrow the intergenerational aesthetic rift and cultural discontinuity. Specifically, the generational aesthetic gap is expected to be bridged or improved through the shared appreciation or consumption of cultural works that meet certain conditions by several generations. In-depth interviews of Gen X and Gen Z (N=15, respectively) in China uncovered their preferences for cultural works, their commonalities, and their shared experiences in appreciating them. Results demonstrate that both generations’ shared appreciation of a cultural work is a necessary but insufficient condition for its effective response to the generational aesthetic gap. Coding analysis rendered six dimensions that cultural works with the potential to bridge the intergenerational aesthetic divide should satisfy simultaneously: genre, theme, content, elements, quality, and accessibility. Cultural works that engage multiple senses, blend realistic, domestic and contemporary cultural memories, contain narratives of family life and nationalism, include more elements familiar to the previous generation, are superbly produced and unaffected, and are more accessible better promote intergenerational aesthetic exchange and value recognition. Moreover, compared to the dilemma of the previous generation facing the aesthetic gap, the later generation plays a crucial role in bridging the generational aesthetic divide.

Keywords: cultural works, generation gap, generation X, generation Z, cultural memory

Procedia PDF Downloads 156
2633 Designing Garments Ergonomically to Improve Life Quality of Elderly People

Authors: Nagda Ibrahim Mady, Shimaa Mohamed Atiha

Abstract:

In light of the actual needs of elderly people and the changes that accompany age in eyesight, hearing, dexterity, mobility, and memory, which leave aged people unable to carry out the simplest living affairs, especially clothing demands, it is notable that these needs are almost neglected in the current clothing market, which obligates aged people to wear the available choices without any consideration of their actual desires and needs. The fashion designer has gained experience that can bring together ergonomics and the stages of the fashion design process. The fashion designer can determine the actual needs of aged people and respond to these needs with designs that improve the life quality of aged people besides maintaining a good appearance. Thus, the fashion designer can help elderly people avoid the negative impacts age leaves on them, whether psychological, kinetic, or related to dementia. Ergonomics in clothing is considered the set of tools and mechanisms used to satisfy aged people's needs, supporting them to improve their living using the least time and effort, providing the elderly with comfort besides maintaining a good appearance that builds self-confidence and independence. From this point of view, the research looks forward to improving the life of aged people by addressing functional clothes that can make the elderly independent in the dressing process; providing in these designs comfort, quality, practicality, and economical cost; suggesting suitable fabrics and materials and applying them to the designs to help the elderly perform their daily living customs; and reaching successful designs that are acceptable to specialists and to consumers, who confirm that the clothing supplies their needs and provides aesthetic and functional performance, therefore giving them a better life.

Keywords: ergonomic, design garments, elderly people, life quality

Procedia PDF Downloads 569
2632 Long-Term Resilience Performance Assessment of Dual and Singular Water Distribution Infrastructures Using a Complex Systems Approach

Authors: Kambiz Rasoulkhani, Jeanne Cole, Sybil Sharvelle, Ali Mostafavi

Abstract:

Dual water distribution systems have been proposed as solutions to enhance the sustainability and resilience of urban water systems by improving performance and decreasing energy consumption. The objective of this study was to evaluate the long-term resilience and robustness of dual water distribution systems versus singular water distribution systems under various stressors such as demand fluctuation, aging infrastructure, and funding constraints. To this end, the long-term dynamics of these infrastructure systems were captured using a simulation model that integrates institutional agency decision-making processes with physical infrastructure degradation to evaluate the long-term transformation of water infrastructure. A set of model parameters that varies for dual and singular distribution infrastructure based on the system attributes, such as pipe length and material, energy intensity, water demand, water price, average pressure and flow rate, as well as operational expenditures, was considered and input into the simulation model. Accordingly, the model was used to simulate various scenarios of demand changes, funding levels, water price growth, and renewal strategies. The long-term resilience and robustness of each distribution infrastructure were evaluated based on various performance measures including network average condition, break frequency, network leakage, and energy use. An ecologically-based resilience approach was used to examine regime shifts and tipping points in the long-term performance of the systems under different stressors. Also, Classification and Regression Tree analysis was adopted to assess the robustness of each system under various scenarios. Using data from the City of Fort Collins, the long-term resilience and robustness of the dual and singular water distribution systems were evaluated over a 100-year analysis horizon for various scenarios. The results of the analysis enabled: (i) comparison between dual and singular water distribution systems in terms of long-term performance, resilience, and robustness; (ii) identification of renewal strategies and decision factors that enhance the long-term resiliency and robustness of dual and singular water distribution systems under different stressors.

Keywords: complex systems, dual water distribution systems, long-term resilience performance, multi-agent modeling, sustainable and resilient water systems

Procedia PDF Downloads 292
2631 Designing Space through Narratives: The Role of the Tour Description in the Architectural Design Process

Authors: A. Papadopoulou

Abstract:

When people are asked to provide an oral description of a space, they usually provide a Tour description, which is a dynamic type of spatial narrative centered on the narrator’s body, rather than a Map description, which is a static type of spatial narrative focused on the organization of the space as seen from above. Also, subjects with training in the architecture discipline tend to adopt a Tour perspective of space when the narrative refers to a space they have actually experienced but tend to adopt a Map perspective when the narrative refers to a space they have merely imagined. This pilot study aims to investigate whether the Tour description, which is the most common mode in oral descriptions of experienced space, is a cognitive perspective taken in the process of designing a space. The study investigates whether a spatial description provided by a subject with architecture training in the form of a Tour description would be accurately translated into a spatial layout by other subjects with architecture training. The subjects were given the Tour description in written form and were asked to make a plan drawing of the described space. The results demonstrate that when we conceive and design space we do not adopt the same rules and cognitive patterns that we adopt when we reconstruct space from our memory. As shown by the results of this pilot study, the rules that underlie the Tour description were not detected in the translation from narratives to drawings. In a different phase, the study also investigates how subjects with architecture training would describe space when forced to take a Tour perspective in their oral description of a space. The results of this second phase demonstrate that, if intentionally taken, the Tour perspective leads to descriptions of space that are more detailed and focused on experiential aspects.

Keywords: architecture, design process, embodied cognition, map description, oral narratives, tour description

Procedia PDF Downloads 159
2630 Study on the Impact of Default Converter on the Quality of Energy Produced by DFIG Based Wind Turbine

Authors: N. Zerzouri, N. Benalia, N. Bensiali

Abstract:

This work is devoted to an analysis of the operation of a doubly fed induction generator (DFIG) integrated with a wind system. The power transfer between the stator and the network is carried out by acting on the rotor via a bidirectional signal converter. The analysis focuses on a fault in the converter caused by an interruption in the control of one semiconductor. Simulation results obtained with the MATLAB/Simulink software illustrate the quality of the power generated under this fault condition.

Keywords: doubly fed induction generator (DFIG), wind energy, PWM inverter, modeling

Procedia PDF Downloads 318
2629 Seamless Mobility in Heterogeneous Mobile Networks

Authors: Mohab Magdy Mostafa Mohamed

Abstract:

The objective of this paper is to introduce a vertical handover (VHO) algorithm between wireless LANs (WLANs) and LTE mobile networks. The proposed algorithm is based on the fuzzy control theory and takes into consideration power level, subscriber velocity, and target cell load instead of only power level in traditional algorithms. Simulation results show that network performance in terms of number of handovers and handover occurrence distance is improved.
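
A toy sketch of a fuzzy-style handover decision of the kind described is given below, with simple membership functions over WLAN signal level, subscriber velocity, and target-cell load, and a basic rule aggregation; the membership breakpoints and rules are invented for illustration and do not reproduce the paper's fuzzy controller.

```python
import numpy as np

def ramp(x, a, b):
    """Membership rising linearly from 0 at a to 1 at b."""
    return float(np.clip((x - a) / (b - a), 0.0, 1.0))

def handover_score(wlan_rssi_dbm, velocity_kmh, target_load):
    """Fuzzy-style score in [0, 1]; higher means 'hand over from WLAN to LTE'."""
    weak_signal = 1.0 - ramp(wlan_rssi_dbm, -90.0, -70.0)   # weak WLAN coverage
    fast_user = ramp(velocity_kmh, 10.0, 80.0)              # high subscriber velocity
    loaded_cell = ramp(target_load, 0.4, 0.9)               # congested LTE target cell

    # Simple rule aggregation: weak signal or fast movement pushes toward LTE,
    # but a heavily loaded target cell discourages the handover.
    push = max(weak_signal, fast_user)
    return push * (1.0 - 0.5 * loaded_cell)

print(handover_score(wlan_rssi_dbm=-85, velocity_kmh=70, target_load=0.3))  # likely hand over
print(handover_score(wlan_rssi_dbm=-60, velocity_kmh=5, target_load=0.9))   # stay on WLAN
```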

Keywords: vertical handover, fuzzy control theory, power level, speed, target cell load

Procedia PDF Downloads 354
2628 Winning Consumers and Influencing Them Using Social Media: A Cross Generational Impact Case Study

Authors: J. Garfield, B. O'Hare, V. Bell

Abstract:

The use of social media is continuing to grow and is now widely used for product and service advertising. This research investigated the social media usage across all age ranges in the United Kingdom to determine the impact on purchasing habits. A questionnaire was distributed to people of different ages and with different experiences of social media usage. The results showed that Facebook continues to be the most popular social media network. Respondents in the younger age group were more likely to be influenced by brand marketing and advertising, but the study concluded that celebrity endorsements had little or no influence.

Keywords: social media advertising, social networking sites, electronic word of mouth, celebrity endorsements

Procedia PDF Downloads 132
2627 Deep Learning for SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring. SAR Systems are often considered to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the properties of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. The search for solutions to augment dual polarimetric data to full polarimetric data will therefore take advantage of full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature. Although the improvements achieved by the newly investigated and experimented reconstruction techniques are undeniable, the existing methods are, however, mostly based upon model assumptions (especially the assumption of reflectance symmetry), which may limit their reliability and applicability to vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses Deep Learning solutions to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known and standard approach from the literature. From the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
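
To make the idea of controlling the CNN training through a composite cost function more concrete, the PyTorch sketch below combines a per-channel reconstruction term with a total-power (span) consistency term; the specific terms, weights, and tensor layout are illustrative assumptions, not the loss actually used in the paper.

```python
import torch
import torch.nn as nn

class PolarimetricLoss(nn.Module):
    """Composite loss: channel-wise reconstruction error plus a total-power consistency term."""
    def __init__(self, alpha=1.0, beta=0.1):
        super().__init__()
        self.alpha, self.beta = alpha, beta
        self.mse = nn.MSELoss()

    def forward(self, predicted, target):
        # predicted/target: (N, C, H, W) stacks of polarimetric channel intensities
        channel_term = self.mse(predicted, target)
        span_term = self.mse(predicted.sum(dim=1), target.sum(dim=1))  # total backscattered power
        return self.alpha * channel_term + self.beta * span_term

# Toy usage with random stand-ins for reconstructed and fully polarimetric target patches
criterion = PolarimetricLoss()
predicted = torch.randn(2, 4, 32, 32, requires_grad=True)
target = torch.randn(2, 4, 32, 32)
loss = criterion(predicted, target)
loss.backward()
print("composite loss:", loss.item())
```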

Keywords: SAR image, polarimetric SAR image, convolutional neural network, deep learning, deep neural network

Procedia PDF Downloads 72