Search results for: decision based artificial neural network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32965

28765 Robust Control of a Single-Phase Inverter Using Linear Matrix Inequality Approach

Authors: Chivon Choeung, Heng Tang, Panha Soth, Vichet Huy

Abstract:

This paper presents a robust control strategy for a single-phase DC-AC inverter with an output LC-filter. An all-pass filter is utilized to create an artificial β-signal so that the proposed controller can be applied directly in the dq-synchronous frame. The proposed robust controller uses state feedback with integral action in the dq-synchronous frame. A linear matrix inequality (LMI) based optimization scheme determines stabilizing controller gains that maximize the convergence rate to steady state in the presence of uncertainties. The uncertainties of the system are described as the potential variation ranges of the inductance and resistance in the LC-filter.
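The LMI synthesis itself requires a semidefinite-programming solver, but the underlying robustness requirement can be illustrated with a toy check. The sketch below assumes a simplified two-state LC-filter model (inductor current and capacitor voltage, no load) under state feedback u = -k1·i - k2·v, and merely grid-checks that fixed gains keep the closed loop Hurwitz over a stated L and R variation range; the gains and parameter values are hypothetical, not the paper's:

```python
import itertools

def closed_loop_stable(L, C, R, k1, k2):
    # Closed-loop matrix A - B*K for the simplified LC-filter model is 2x2;
    # a 2x2 continuous-time system is Hurwitz iff trace < 0 and det > 0.
    trace = -(R + k1) / L
    det = (1.0 + k2) / (L * C)
    return trace < 0 and det > 0

# Fixed gains; sweep the assumed uncertainty box on inductance L and resistance R.
K1, K2 = 5.0, 0.5
robust = all(
    closed_loop_stable(L, 10e-6, R, K1, K2)
    for L, R in itertools.product([0.8e-3, 1.0e-3, 1.2e-3], [0.05, 0.1, 0.2])
)
print(robust)
```

A full LMI design would instead search for a common Lyapunov matrix certifying a decay rate over the whole uncertainty polytope, rather than sampling grid points.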

Keywords: single-phase inverter, linear matrix inequality, robust control, all-pass filter

Procedia PDF Downloads 124
28764 Identification of Groundwater Potential Zones Using Geographic Information System and Multi-Criteria Decision Analysis: A Case Study in Bagmati River Basin

Authors: Hritik Bhattarai, Vivek Dumre, Ananya Neupane, Poonam Koirala, Anjali Singh

Abstract:

The availability of clean and reliable groundwater is essential for human and environmental health. Groundwater is a crucial resource that contributes significantly to the total annual water supply; however, over-exploitation has depleted its availability considerably and led to some land subsidence. Delineating groundwater potential zones is therefore vital for protecting water quality and managing groundwater systems. In this study, groundwater potential zones were mapped using Geographic Information System (GIS) techniques, following a standard methodology that integrates GIS with the Analytic Hierarchy Process (AHP). Thematic layers were prepared for parameters such as geology, slope, soil, temperature, rainfall, drainage density, and lineament density; identifying and mapping potential zones nonetheless remains challenging due to the complex and dynamic nature of aquifer systems. A weighted overlay was then performed in ArcGIS, with appropriate ranks assigned to the classes of each parameter, and multi-criteria decision analysis (MCDA) was applied to weigh and prioritize the parameters according to their relative influence on groundwater potential. The basin was classified into three zones: low, moderate, and high potential. The analysis showed that the central and lower parts of the Bagmati River Basin have the highest potential, covering 7.20% of the total area, whereas the northern and eastern parts have lower potential. The identified potential zones can guide future groundwater exploration and management strategies in the region.
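As a rough illustration of the AHP-plus-weighted-overlay step, the sketch below derives criterion weights from a pairwise comparison matrix (geometric-mean approximation of the AHP priority vector) and applies them to small ranked raster layers. The three criteria, the comparison values, and the grids are all hypothetical, chosen only to show the mechanics:

```python
import math

def ahp_weights(pairwise):
    # Approximate the AHP priority vector by the geometric mean of each row,
    # then normalise so the weights sum to 1.
    gm = [math.prod(row) ** (1 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison: geology vs. slope vs. drainage density.
pairwise = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
w = ahp_weights(pairwise)

def weighted_overlay(layers, weights):
    # Each layer is a grid of ranks (1 = low suitability ... 5 = high);
    # the overlay score per cell is the weighted sum of ranks.
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(wt * lyr[r][c] for wt, lyr in zip(weights, layers))
             for c in range(cols)] for r in range(rows)]

geology  = [[5, 3], [2, 4]]
slope    = [[4, 2], [1, 5]]
drainage = [[3, 3], [2, 2]]
score = weighted_overlay([geology, slope, drainage], w)
print([[round(s, 2) for s in row] for row in score])
```

In the actual workflow the same weighted sum is computed per raster cell by the ArcGIS Weighted Overlay tool, and an AHP consistency ratio check would precede it.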

Keywords: groundwater, geographic information system, analytic hierarchy processes, multi-criteria decision analysis, Bagmati

Procedia PDF Downloads 86
28763 Glossematics and Textual Structure

Authors: Abdelhadi Nadjer

Abstract:

This paper examines the structure of the text from the perspective of the systemic school of glossematics (glossématique, Hjelmslev). It opens with a brief survey of the concepts of general linguistics, the science that studies human language scientifically, describing and presenting the facts as they are rather than prescribing usage. It then gives a detailed overview of the school's founder, its principal adherents, and its methods, curriculum, theory, and modes of analysis, which the glossematicians extend to all the humanities: every practical action is matched by a theoretical one, and any procedure can be analyzed through its constituent elements. The paper also discusses the school's links with other schools of linguistics; glossematics rests on a sharp critique of earlier approaches to language, turning its attention to language as a network in its own right and to its involvement in non-linguistic actions. Finally, the glossematic analysis of textual structure is presented: the text is the terminal output, the totality of words put forward for expression. The text is divisible into types, which are in turn divided into classes; a class must not contain a contradiction and must be inclusive. On this same material, the analysis describes the relationships that language combines and seeks to characterize and identify those relations.

Keywords: text, language schools, linguistics, human language

Procedia PDF Downloads 439
28762 Towards a Security Model against Denial of Service Attacks for SIP Traffic

Authors: Arellano Karina, Diego Avila-Pesántez, Leticia Vaca-Cárdenas, Alberto Arellano, Carmen Mantilla

Abstract:

Nowadays, security threats in Voice over IP (VoIP) systems are an essential and latent concern for those in charge of security in a corporate network, because new Denial-of-Service (DoS) attacks are developed every day. These attacks affect the business continuity of an organization with respect to the confidentiality, availability, and integrity of its services, causing frequent losses of both information and money. The purpose of this study is to establish the measures necessary to mitigate DoS threats that affect the availability of VoIP systems based on the Session Initiation Protocol (SIP). A security model called MS-DoS-SIP is proposed, based on two approaches: the first analyzes the recommendations of international security standards, and the second takes into account known weaknesses and threats. Implementing this model in a simulated VoIP system reduced the identified vulnerabilities by 92% and increased the availability of the VoIP service within the organization.
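The abstract does not disclose the internals of MS-DoS-SIP, but one standard building block for mitigating SIP INVITE floods is per-source rate limiting. A minimal token-bucket sketch, with hypothetical rate and burst parameters and explicit timestamps for clarity:

```python
class TokenBucket:
    # One illustrative DoS mitigation: rate-limit SIP INVITEs per source IP.
    def __init__(self, rate, burst, start=0.0):
        self.rate, self.burst = rate, burst
        self.tokens, self.last = float(burst), start

    def allow(self, now):
        # Refill proportionally to elapsed time, capped at the burst size;
        # admit the request only if a whole token is available.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=2.0, burst=5)                # 2 INVITEs/s, burst of 5
flood = [bucket.allow(now=0.0) for _ in range(8)]      # 8 INVITEs arriving at t=0
later = bucket.allow(now=1.0)                          # one more, a second later
print(flood, later)
```

In a deployment, one bucket would be kept per source address (or per SIP URI), and rejected requests would receive a 503 or be silently dropped, depending on policy.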

Keywords: Denial-of-Service SIP attacks, MS-DoS-SIP, security model, VoIP-SIP vulnerabilities

Procedia PDF Downloads 185
28761 Present Status, Driving Forces and Pattern Optimization of Territory in Hubei Province, China

Authors: Tingke Wu, Man Yuan

Abstract:

“National Territorial Planning (2016-2030)” was issued by the State Council of China in 2017. As an important step toward putting it into effect, territorial planning at the provincial level makes overall arrangements for territorial development, resource and environmental protection, comprehensive renovation, and security system construction. Hubei Province, the pivot of the national “Rise of Central China” strategy, is now confronted with great opportunities and challenges in territorial development, protection, and renovation. The territorial spatial pattern is the product of a long evolution influenced by multiple internal and external driving forces, and it has not been clear what the main causes of its formation are or what the effective ways of optimizing it would be. By analyzing land use data from 2016, this paper reveals the present status of territory in Hubei. Combined with economic and social data and construction information, the driving forces of the territorial spatial pattern are then analyzed. The research demonstrates that the three types of territorial space aggregate distinctly. The driving forces comprise four aspects: the natural background, which sets the stage for main functions; population and economic factors, which generate agglomeration effects; transportation infrastructure construction, which leads to axial expansion; and significant provincial strategies, which reinforce the established path. On this basis, targeted strategies for optimizing the territorial spatial pattern are put forward. A hierarchical protection pattern should be established based on development intensity control, out of respect for nature; and by optimizing the layout of population and industry and improving the transportation network, a polycentric, network-based development pattern could be established. These findings provide a basis for the Hubei Territorial Planning and a reference for future territorial planning in other provinces.

Keywords: driving forces, Hubei, optimizing strategies, spatial pattern, territory

Procedia PDF Downloads 89
28760 Adaptive Routing in NoC-Based Heterogeneous MPSoCs

Authors: M. K. Benhaoua, A. E. H. Benyamina, T. Djeradi, P. Boulet

Abstract:

In this paper, we propose an adaptive routing technique that optimizes the routing of communications in order to improve overall performance. The technique uses a newly proposed algorithm to route communications between tasks. The proposed routing of communications leads to better optimization of several performance metrics (execution time and energy consumption). Experimental results show that the proposed routing approach provides significant performance improvements compared to static routing.
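The abstract does not specify the routing algorithm itself, but the general idea of congestion-aware adaptive routing on a 2D-mesh NoC can be sketched as follows. The mesh, the congestion estimates, and the tie-breaking rule are all hypothetical; a real router would use local link-load signals rather than a global table:

```python
def adaptive_route(src, dst, congestion):
    # Minimal adaptive routing on a 2D mesh: at each hop, among the
    # productive directions (those that reduce the remaining distance to
    # dst), pick the neighbour with the lowest congestion estimate.
    x, y = src
    tx, ty = dst
    path = [src]
    while (x, y) != (tx, ty):
        candidates = []
        if x != tx:
            candidates.append((x + (1 if tx > x else -1), y))
        if y != ty:
            candidates.append((x, y + (1 if ty > y else -1)))
        x, y = min(candidates, key=lambda n: congestion.get(n, 0))
        path.append((x, y))
    return path

# Hypothetical 3x3 mesh with a congestion hotspot at node (1, 0).
congestion = {(1, 0): 9, (0, 1): 1}
route = adaptive_route((0, 0), (2, 1), congestion)
print(route)
```

Because only productive directions are ever considered, every route stays minimal in hop count; the adaptivity lies solely in which minimal path is taken around hotspots.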

Keywords: multi-processor systems-on-chip (MPSoCs), network-on-chip (NoC), heterogeneous architectures, adaptive routing

Procedia PDF Downloads 360
28759 A Study of the Establishment of the Evaluation Index System for Tourist Attraction Disaster Resilience

Authors: Chung-Hung Tsai, Ya-Ping Li

Abstract:

The tourism industry is highly dependent on the natural environment and climate and, compared to other industries, is more susceptible to both. Taiwan is an island nation located in the subtropical monsoon zone, where climate variability and frequent, severe typhoons and rainfall regularly cause serious disasters. Traditional disaster assessment usually focuses on damage and risk assessment, and lacks the industry-specific features needed to understand post-disaster resilience and the main factors that constitute it. This study draws on the disaster recovery experience of tourist areas to identify the main factors affecting their disaster resilience. A preliminary indicator system for disaster resilience was prepared through a combination of literature review and expert interviews, then screened with the Fuzzy Delphi Method and weighted with the Analytic Network Process. Finally, the study establishes a tourism disaster resilience evaluation index system that reflects the characteristics of Taiwan's tourism industry. We hope this will enhance the disaster resilience of tourist areas and increase the sustainability of the industry's development. The results are expected to help government departments manage tourism assets in response to extreme climate events.

Keywords: resilience, Fuzzy Delphi Method, Analytic Network Process, industrial development

Procedia PDF Downloads 387
28758 Intelligent Fishers Harness Aquatic Organisms and Climate Change

Authors: Shih-Fang Lo, Tzu-Wei Guo, Chih-Hsuan Lee

Abstract:

Tropical fisheries are vulnerable to the physical and biogeochemical oceanic changes associated with climate change. Warmer temperatures and extreme weather have been damaging the abundance and growth patterns of aquatic organisms. In recent years, shrinking fish stocks and labor shortages have increased the threat to global aquacultural production, so building a climate-resilient and sustainable mechanism has become an urgent and important task for global citizens. To tackle the problem, Taiwanese fishers apply artificial intelligence (AI) technology. In brief, the AI system (1) measures real-time water quality and chemical parameters in fish ponds; (2) monitors fish stock through segmentation, detection, and classification; and (3) incorporates fishers' previous experiences, perceptions, and real-life practices. Applying this system can stabilize aquacultural production and partially offset the labor shortage. Furthermore, the technology gives fishers a more resilient and sustainable system with which to mitigate the influence of extreme weather while maintaining or even increasing their aquacultural production. As the AI system collects and analyzes more and more data, it can be applied to other regions of the world and adapted to future technological and societal changes, continuously providing the most relevant and useful information for fishers worldwide.

Keywords: aquaculture, artificial intelligence (AI), real-time system, sustainable fishery

Procedia PDF Downloads 100
28757 Improved K-Means Clustering Algorithm Using RHadoop with Combiner

Authors: Ji Eun Shin, Dong Hoon Lim

Abstract:

Data clustering is a common technique in data analysis, with applications in many fields such as artificial intelligence, pattern recognition, economics, ecology, psychiatry, and marketing. K-means is a well-known clustering algorithm that partitions a set of data points into a predefined number of clusters. In this paper, we implement the K-means algorithm on the MapReduce framework with RHadoop to make the clustering method applicable to large-scale data. RHadoop is a collection of R packages that allow users to manage and analyze data with Hadoop. The main idea is to introduce a combiner as a function on the map output to decrease the amount of data that the reducers must process. The experimental results demonstrate that the K-means algorithm using RHadoop can scale well and efficiently process large data sets on commodity hardware. We also show that our K-means implementation with a combiner becomes increasingly faster than the version without one as the size of the data set grows.
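The combiner idea can be sketched in a few lines: each map split emits one (sum, count) pair per centroid instead of every raw point, so the reducer only merges a handful of partial aggregates. The sketch below is a single-process, one-dimensional analogue of the RHadoop job, not the R/Hadoop code itself; the data and centroids are hypothetical:

```python
def assign(point, centroids):
    # Map step: assign each point to its nearest centroid (1-D for brevity).
    return min(range(len(centroids)), key=lambda i: (point - centroids[i]) ** 2)

def combine(split, centroids):
    # Combiner: emit one (sum, count) pair per centroid *per split*, so the
    # reducer sees a few partial aggregates instead of every raw point.
    partial = {}
    for p in split:
        k = assign(p, centroids)
        s, n = partial.get(k, (0.0, 0))
        partial[k] = (s + p, n + 1)
    return partial

def reduce_step(partials, k):
    # Reducer: merge the partial (sum, count) pairs and emit new centroids.
    totals = {i: (0.0, 0) for i in range(k)}
    for part in partials:
        for i, (s, n) in part.items():
            ts, tn = totals[i]
            totals[i] = (ts + s, tn + n)
    return [ts / tn for ts, tn in totals.values()]

splits = [[1.0, 1.2, 0.8], [9.0, 9.5], [1.1, 8.5]]   # three map splits
centroids = [0.0, 10.0]
for _ in range(5):                                   # a few Lloyd iterations
    centroids = reduce_step([combine(s, centroids) for s in splits], 2)
print(centroids)
```

The shuffle volume drops from one record per point to at most k records per split, which is exactly why the combiner version scales better as the data grows.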

Keywords: big data, combiner, K-means clustering, RHadoop

Procedia PDF Downloads 415
28756 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm

Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali

Abstract:

Since optimization problems in water resources are complicated by the variety of decision-making criteria and objective functions, they are sometimes impossible to resolve with conventional optimization methods, or doing so is prohibitively expensive in time or money. The use of modern tools and methods is therefore inevitable in resolving such problems. An accurate utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Reservoir operation studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; the dam utilization rule curve is also provided in such studies. The basic information applied in these studies generally includes meteorological, hydrological, agricultural, and reservoir-related data, together with the geometric characteristics of the reservoir. The Dez dam water resource system was simulated on this basis in order to determine the capability of its reservoir to meet the objectives of the proposed plan. A genetic algorithm, as a metaheuristic method, was applied to derive utilization rule curves by partitioning the reservoir volume, and MATLAB was used to solve the model. Rule curves were first obtained with the genetic algorithm; the significance of using rule curves, and the resulting decrease in the number of decision variables, was then determined by simulating the system and comparing the results with the optimization results (Standard Operating Procedure). One of the most essential issues in optimizing a complicated water resource system is the growing number of variables, which makes finding an optimum answer time-consuming and, in some cases, fruitless; partitioning the reservoir volume was applied here as a means of reducing the number of variables.
The reservoir operation study was performed from the basic information, general hypotheses, and standards, applying a monthly simulation technique over a statistical period of 30 years. Results indicate that applying the rule curve prevents extreme shortages and decreases the monthly shortages.
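A minimal sketch of the approach, assuming a hypothetical 12-month inflow/demand series and a fitness function that penalizes unmet demand; the study's actual agricultural models, reservoir geometry, and 30-year record are not reproduced:

```python
import random

random.seed(7)
INFLOW = [60, 55, 40, 30, 20, 15, 10, 12, 25, 45, 55, 65]   # hypothetical monthly inflows
DEMAND = [30, 30, 35, 40, 45, 50, 50, 45, 40, 35, 30, 30]   # hypothetical demands
CAPACITY = 200.0

def shortage(rule):
    # Simulate one year: release min(rule[m], storage); accumulate unmet demand.
    storage, deficit = 100.0, 0.0
    for m in range(12):
        storage = min(CAPACITY, storage + INFLOW[m])
        release = min(rule[m], storage)
        storage -= release
        deficit += max(0.0, DEMAND[m] - release)
    return deficit

def ga(pop_size=40, gens=80):
    # Each chromosome is a 12-value monthly release rule curve.
    pop = [[random.uniform(0, 60) for _ in range(12)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=shortage)                        # elitist selection
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 12)             # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(12)                  # point mutation
            child[i] = min(60.0, max(0.0, child[i] + random.gauss(0, 3)))
            children.append(child)
        pop = parents + children
    return min(pop, key=shortage)

best = ga()
print(round(shortage(best), 1))
```

A real formulation would add carry-over storage between years, evaporation, and spill constraints, and would compare the evolved rule curve against the Standard Operating Procedure baseline.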

Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir

Procedia PDF Downloads 251
28755 Surface Modified Quantum Dots for Nanophotonics, Stereolithography and Hybrid Systems for Biomedical Studies

Authors: Redouane Krini, Lutz Nuhn, Hicham El Mard, Cheol Woo Ha, Yoondeok Han, Kwang-Sup Lee, Dong-Yol Yang, Jinsoo Joo, Rudolf Zentel

Abstract:

To use quantum dots (QDs) in the two-photon initiated polymerization (TPIP) technique for 3D patterning, the QDs were surface-modified with photosensitive end groups that can undergo photopolymerization. We fabricated fluorescent 3D lattice structures from these photopatternable QDs by TPIP for photonic devices such as photonic crystals and metamaterials. QDs of different diameters have different emission colors, and by mixing red, green, and blue QDs, white fluorescence was obtained from the polymeric structures. Metamaterials are capable of unique interactions with the electric and magnetic components of electromagnetic radiation, and a negative refractive index is crucial for manipulating light. In combination with QDs, the TPIP technique allows polymeric structures to be designed with properties that cannot be found in nature, which gives these artificial materials great importance for real-life applications in photonics and optoelectronics. Understanding the interactions between nanoparticles and biological systems is also of great interest in biomedical research. We therefore developed a synthetic strategy for polymer-functionalized nanoparticles for biomedical studies: hybrid systems of QDs and copolymers with a strongly binding network in the inner shell, which can be further modified through their poly(ethylene glycol)-functionalized outer shell. These hybrid systems can serve as models for investigating cell penetration and drug delivery using a combination of cryo-TEM and fluorescence measurements.

Keywords: biomedical study models, lithography, photo induced polymerization, quantum dots

Procedia PDF Downloads 511
28754 Factors Influencing University Student's Acceptance of New Technology

Authors: Fatma Khadra

Abstract:

The objective of this research is to identify the acceptance of new technology in a sample of 150 participants from Qatar University. Based on the Technology Acceptance Model (TAM), we used Davis's (1989) scale, which contains two item scales: Perceived Usefulness and Perceived Ease of Use. The TAM represents an important theoretical contribution toward understanding how users come to accept and use technology. The model suggests that when people are presented with a new technology, a number of variables influence their decision about how and when to use it. The results showed that participants accept new technology because of its flexibility, clarity, usefulness, ease of use, and the enjoyment and enhanced experience it provides. The results also showed that younger participants accept new technology more readily than others.

Keywords: new technology, perceived usefulness, perceived ease of use, technology acceptance model

Procedia PDF Downloads 302
28753 Hierarchical Queue-Based Task Scheduling with CloudSim

Authors: Wanqing You, Kai Qian, Ying Qian

Abstract:

Cloud computing provides users with infrastructure, platform, and software as a service, making those services more accessible via the Internet. To better analyze the performance of cloud computing provisioning policies and resource allocation strategies, a toolkit named CloudSim was proposed. With CloudSim, a cloud computing environment can easily be constructed by modelling and simulating its components, such as datacenters, hosts, and virtual machines. A good scheduling strategy is the key to achieving load balance among different machines and to improving the utilization of basic resources. Existing scheduling algorithms may work well in some presumptive single-machine cases, but they are unable to make the best decision for an unforeseen future. In a real-world scenario, there are many tasks and several virtual machines working in parallel. Based on the concept of multiple queues, this paper presents a new scheduling algorithm for CloudSim that takes into account several parameters: the machines' capacity, the priority of tasks, and the history log.
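A simplified sketch of multi-queue, capacity-aware scheduling: tasks are drained in priority order, and each is assigned to the VM that would finish it earliest given the VMs' speeds and current backlogs. The task set, VM speeds, and priority scheme are hypothetical, and the sketch runs standalone rather than inside CloudSim:

```python
import heapq
from collections import namedtuple

Task = namedtuple("Task", "name priority length")   # length in instructions

def schedule(tasks, vm_speeds):
    # Priority queue over tasks (lower value = more urgent); each popped
    # task goes to the VM with the earliest projected finish time for it.
    queue = [(t.priority, i, t) for i, t in enumerate(tasks)]
    heapq.heapify(queue)
    finish = [0.0] * len(vm_speeds)                 # current finish time per VM
    placement = {}
    while queue:
        _, _, task = heapq.heappop(queue)
        vm = min(range(len(vm_speeds)),
                 key=lambda v: finish[v] + task.length / vm_speeds[v])
        finish[vm] += task.length / vm_speeds[vm]
        placement[task.name] = vm
    return placement, max(finish)                   # placement and makespan

tasks = [Task("t1", 0, 400), Task("t2", 1, 100),
         Task("t3", 0, 300), Task("t4", 2, 200)]
placement, makespan = schedule(tasks, vm_speeds=[100.0, 50.0])
print(placement, makespan)
```

A hierarchical variant would keep one such heap per priority class and could weight the finish-time estimate with history-log statistics, as the paper suggests.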

Keywords: hierarchical queue, load balancing, CloudSim, information technology

Procedia PDF Downloads 405
28752 Deep Reinforcement Learning for Optimal Decision-Making in Supply Chains

Authors: Nitin Singh, Meng Ling, Talha Ahmed, Tianxia Zhao, Reinier van de Pol

Abstract:

We propose the use of reinforcement learning (RL) as a viable alternative for optimizing supply chain management, particularly in scenarios with stochasticity in product demands. RL’s adaptability to changing conditions and its demonstrated success in diverse fields of sequential decision-making makes it a promising candidate for addressing supply chain problems. We investigate the impact of demand fluctuations in a multi-product supply chain system and develop RL agents with learned generalizable policies. We provide experimentation details for training RL agents and statistical analysis of the results. We study the generalization ability of RL agents for different demand uncertainty scenarios and observe superior performance compared to the agents trained with fixed demand curves. The proposed methodology has the potential to lead to cost reduction and increased profit for companies dealing with frequent inventory movement between supply and demand nodes.
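As a toy analogue of the approach, the sketch below trains a tabular Q-learning agent on a single-node inventory problem with stochastic demand; the paper's multi-product setting and deep RL agents are not reproduced, and all costs, capacities, and demand distributions are hypothetical:

```python
import random

random.seed(0)
CAP, MAX_ORDER = 10, 5
HOLD_COST, STOCKOUT_COST, UNIT_PROFIT = 1.0, 4.0, 2.0

def step(stock, order):
    # One period: receive the order, observe stochastic demand, settle costs.
    stock = min(CAP, stock + order)
    demand = random.choice([1, 2, 3, 4])
    sold = min(stock, demand)
    reward = (UNIT_PROFIT * sold - HOLD_COST * (stock - sold)
              - STOCKOUT_COST * (demand - sold))
    return stock - sold, reward

def train(episodes=3000, alpha=0.2, gamma=0.95, eps=0.1):
    # Tabular Q-learning over (stock level, order quantity) pairs.
    q = {(s, a): 0.0 for s in range(CAP + 1) for a in range(MAX_ORDER + 1)}
    for _ in range(episodes):
        stock = random.randrange(CAP + 1)
        for _ in range(20):                       # 20-period episode
            if random.random() < eps:             # epsilon-greedy exploration
                a = random.randrange(MAX_ORDER + 1)
            else:
                a = max(range(MAX_ORDER + 1), key=lambda x: q[(stock, x)])
            nxt, r = step(stock, a)
            best_next = max(q[(nxt, x)] for x in range(MAX_ORDER + 1))
            q[(stock, a)] += alpha * (r + gamma * best_next - q[(stock, a)])
            stock = nxt
    return q

q = train()
policy = {s: max(range(MAX_ORDER + 1), key=lambda a: q[(s, a)])
          for s in range(CAP + 1)}
print(policy)
```

The learned policy typically resembles a base-stock rule (order more when inventory is low); scaling the same loop to many products and demand scenarios is what motivates the deep, generalizing agents studied in the paper.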

Keywords: inventory management, reinforcement learning, supply chain optimization, uncertainty

Procedia PDF Downloads 91
28751 Ocean Planner: A Web-Based Decision Aid to Design Measures to Best Mitigate Underwater Noise

Authors: Thomas Folegot, Arnaud Levaufre, Léna Bourven, Nicolas Kermagoret, Alexis Caillard, Roger Gallou

Abstract:

Concern about the negative impacts of anthropogenic noise on the ocean's ecosystems has increased over recent decades, and with it a willingness to regulate noise-generating activities, of which shipping is one of the most significant. Dealing with ship noise requires knowledge not only of the noise from individual ships, but also of how that noise is distributed in time and space within the habitats of concern. Marine mammals, but also fish, sea turtles, larvae, and invertebrates, depend largely on sound to hunt, feed, avoid predators, socialize and communicate during reproduction, or defend a territory. In the marine environment, sight is useful only up to a few tens of meters, whereas sound can propagate over hundreds or even thousands of kilometers. Directive 2008/56/EC of the European Parliament and of the Council of June 17, 2008, known as the Marine Strategy Framework Directive (MSFD), requires the Member States of the European Union to take the necessary measures to reduce the impacts of maritime activities in order to achieve and maintain a good environmental status of the marine environment. Ocean-Planner is a web-based platform that provides regulators, managers of protected or sensitive areas, and similar stakeholders with a decision support tool for anticipating and quantifying the effectiveness of management measures in reducing or modifying the distribution of underwater noise, in response to Descriptor 11 of the MSFD and to the Maritime Spatial Planning Directive. Built on the operational sound modelling tool Quonops Online Service, Ocean-Planner allows the user, via an intuitive geographical interface, to define management measures at local (marine protected area, Natura 2000 site, harbor, etc.) or global (particularly sensitive sea area) scales; seasonal (regulation over a period of time) or permanent; partial (focused on some maritime activities) or complete (all maritime activities).
Speed limits, exclusion areas, traffic separation schemes (TSS), and vessel sound level limitations are among the measures supported by the tool. Ocean-Planner helps decision makers choose the most effective measure for maintaining or restoring the biodiversity and functioning of coastal seabed ecosystems, keeping sensitive areas in a good state of conservation, and maintaining or restoring the populations of marine species.

Keywords: underwater noise, marine biodiversity, marine spatial planning, mitigation measures, prediction

Procedia PDF Downloads 106
28750 Authentic Connection between the Deity and the Individual Human Being Is Vital for Psychological, Biological, and Social Health

Authors: Sukran Karatas

Abstract:

Authentic energy-network interrelations between the Creator and creations, and among creations themselves, are the most important points for the worlds of physics and metaphysics to unite and work in harmony within human beings. Human beings combine the automated, involuntary working systems of spirit, soul, and body with voluntary actions, and they have the ability to choose their own lifestyle; their voluntary actions involve personal, cultural, and universal values, rational or irrational and variable. It is therefore necessary for human beings to know how authentic energy-network connections are established, so that physical and metaphysical entities can communicate, correlate, and accommodate one another as a properly functioning unity; this is essential for complete psychological, biological, and social well-being. Authentic knowledge is necessary for human beings to verify the position of the self within itself and with others, and to regulate conscious, voluntary actions accordingly in order to prevent oppression and friction within the self and between the self and others. Unfortunately, the absence of genuine individual and universal knowledge about how to establish an authentic energy-network connection within the self, with the deity, and with the environment remains the most problematic issue even in the twenty-first century. The second most problematic issue is how to maintain freedom, equality, and justice among human beings within these tightly interwoven network connections, which naturally involve the physical, metaphysical, and behavioral actions of the self and others. The third, and probably most complicated, problem is the scientific identification and authentication of the deity, which not only gives choosers the power and control to set their life orders but also establishes fully coordinated physical and metaphysical links as a functional energy network.
This indicates that choosing an authentic deity is the key point that influences automated, emotional, and behavioral actions together, shaping human perception, personal actions, and life orders. The study therefore considers the four existing types of energy-wave end-boundary behaviors: free-end and fixed-end boundary behaviors, as well as boundary behaviors from a denser medium to a less dense medium and from a less dense medium to a denser medium. In conclusion, this article aims to demonstrate that the authentication and choice of deity have an important effect on individual psychological, biological, and social health. It is hoped that it will encourage new research in the field of authentic energy-network connections, so as to establish the best position and the most correct interrelations with the self and others without violating one another's authorized orders and borders, and thus to live happier and healthier lives together. In addition, the book ‘Deity and Freedom, Equality, Justice in History, Philosophy, Science’ provides more detailed information for those interested in this subject.

Keywords: deity, energy network, power, freedom, equality, justice, happiness, sadness, hope, fear, psychology, biology, sociology

Procedia PDF Downloads 334
28749 Three-Dimensional Carbon Foam Based Asymmetric Assembly of Metal Oxides Electrodes for High-Performance Solid-State Micro-Supercapacitor

Authors: Sumana Kumar, Abha Misra

Abstract:

Micro-supercapacitors attract great attention as promising energy storage devices that satisfy the increasing quest for miniaturized and portable devices. Despite impressive power density, superior cyclic lifetime, and high charge-discharge rates, micro-supercapacitors still suffer from low energy density, which limits their practical application. The energy density (E = 1/2CV²) can be increased either by increasing the specific capacitance (C) or the voltage range (V). Asymmetric micro-supercapacitors have attracted great attention because using two different electrode materials expands the voltage window and thus increases the energy density. Versatile fabrication technologies such as inkjet printing, lithography, and laser scribing are currently used to pattern the electrode material directly or indirectly, but these techniques still suffer from limited scalability and cost inefficiency. Here, we demonstrate the scalable production of a three-dimensional (3D) carbon foam (CF) based asymmetric micro-supercapacitor by a spray printing technique on an array of interdigital electrodes. The solid-state asymmetric micro-supercapacitor, comprising a CF-MnO positive electrode and a CF-Fe₂O₃ negative electrode, achieves a high areal capacitance of 18.4 mF/cm² (2326.8 mF/cm³) at 5 mV/s and a wide potential window of 1.4 V. Consequently, a superior energy density of 5 µWh/cm² is obtained, and high cyclic stability is confirmed, with 86.1% of the initial capacitance retained after 10000 electrochemical cycles. The optimized decoration of pseudocapacitive metal oxides in the 3D carbon network enables high electrochemical utilization of the materials, while the 3D interconnected carbon network provides overall electrical conductivity and structural integrity. The research provides a simple and scalable spray printing method, using a custom-made mask, to fabricate an asymmetric micro-supercapacitor that can be integrated on a large scale.

Keywords: asymmetric micro-supercapacitors, high energy-density, hybrid materials, three-dimensional carbon-foam

Procedia PDF Downloads 102
28748 An Activity Based Trajectory Search Approach

Authors: Mohamed Mahmoud Hasan, Hoda M. O. Mokhtar

Abstract:

With the enormous increase in mobile application use and the spread of positioning and location-aware technologies that we are seeing today, new procedures and methodologies for location-based strategies are required. Location recommendation is one of the most demanded location-aware applications, especially with the wide availability of location-aware social network applications such as Facebook check-ins and Foursquare. In this paper, we present a new methodology for location recommendation. The proposed approach combines customary spatial attributes with other essential factors, including shortest distance and user interests. We also introduce a new notion, the "activity trajectory": a trajectory that fulfills the set of activities the user is interested in doing. The approach uses the associated distance value to select the trajectory or trajectories with minimum cost (distance), and uses the spatial area to prune unneeded directions. The proposed algorithm uses the notion of activity trajectory to recommend the N trajectories most similar to the user's required activity pattern with the least traveling distance. To improve the performance of the proposed approach, parallel processing is applied through a MapReduce-based implementation. Experiments on real data sets were set up to evaluate the proposed approach; they show how it outperforms other strategies in both precision and run time.
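The core filtering-and-ranking idea can be sketched compactly: keep only trajectories whose activity set covers the required activities, then rank the survivors by traveling distance. The trajectories, activity labels, and coordinates below are hypothetical, and the MapReduce parallelization is omitted:

```python
import math

def travel_length(points):
    # Total polyline length of a trajectory.
    return sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))

def top_n(trajectories, required, n=1):
    # Keep only "activity trajectories" that cover every required activity,
    # then rank the survivors by total traveling distance (ascending).
    covering = [(name, pts) for name, (pts, acts) in trajectories.items()
                if required <= acts]
    return sorted(covering, key=lambda t: travel_length(t[1]))[:n]

# Hypothetical check-in trajectories: (points, set of activities fulfilled).
trajectories = {
    "A": ([(0, 0), (1, 0), (1, 1)], {"cafe", "museum", "park"}),
    "B": ([(0, 0), (5, 5)],         {"cafe", "museum"}),
    "C": ([(0, 0), (0, 3), (4, 3)], {"cafe", "museum", "park"}),
}
result = top_n(trajectories, required={"cafe", "park"}, n=2)
print(result)
```

In the MapReduce version, mappers would evaluate the coverage test and distance per trajectory partition, and a reducer would merge the per-partition top-N lists.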

Keywords: location based recommendation, map-reduce, recommendation system, trajectory search

Procedia PDF Downloads 206
28747 Evaluation of Urban Parks Based on POI Data: Taking Futian District of Shenzhen as an Example

Authors: Juanling Lin

Abstract:

The construction of urban parks is an important part of eco-city construction. The intervention of big data provides a more scientific and rational platform for the assessment of urban parks, by identifying and correcting the irrationality of urban park planning at the macroscopic level and thereby promoting rational urban park planning. This study builds an urban park assessment system based on urban road network data and POI data, taking Futian District of Shenzhen as the research object, and uses a geographic information system (GIS) to assess the park system of Futian District in five aspects: park spatial distribution, accessibility, service capacity, demand, and the supply-demand relationship. The urban park assessment system can effectively reflect the current situation of urban park construction and provides a useful exploration toward realizing the rationality and fairness of urban park planning.
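The weighted-overlay scoring that typically underlies such GIS assessment systems can be sketched in a few lines; the criteria, weights, and per-cell ranks below are illustrative assumptions, not the study's actual values:

```python
# Hypothetical weights for three of the assessment aspects (must sum to 1).
weights = {"accessibility": 0.5, "service_capacity": 0.3, "demand": 0.2}

def weighted_overlay(cell_ranks, weights):
    """Weighted overlay: score = sum(weight_i * rank_i) for one raster cell."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * cell_ranks[k] for k in weights)

# One cell's ranks on a 1-5 scale for each criterion.
cell = {"accessibility": 4, "service_capacity": 3, "demand": 5}
print(weighted_overlay(cell, weights))  # 0.5*4 + 0.3*3 + 0.2*5 ≈ 3.9
```

In a real GIS workflow, this computation runs per raster cell over layers derived from the road network and POI data.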

Keywords: urban parks, assessment system, POI, supply and demand

Procedia PDF Downloads 28
28746 Rethinking Urban Floodplain Management: The Case of Colombo, Sri Lanka

Authors: Malani Herath, Sohan Wijesekera, Jagath Munasingha

Abstract:

The impact of recent floods has become significant, and extraordinary flood events cause considerable damage to lives, properties, and the environment, negatively affecting the whole development of the Colombo urban region. Even though the Colombo urban region experiences recurrent flood impacts, several spatial planning interventions have been taken from time to time since the early 20th century. All past plans have adopted a traditional approach to flood management, using infrastructural measures to reduce the chance of flooding together with rigid planning regulations. The existing flood risk management practices are not well accepted by the local community, particularly the urban poor. Researchers have consistently reported differences between the flood risk estimates, priorities, and concerns of experts and those of the local community. Risk-based decision making in flood management is not only a matter of technical facts; it has a significant bearing on how flood risk is viewed by the local community and individuals. Moreover, sustainable flood management is an integrated approach, which highlights joint actions of experts and the community. This indicates the necessity of further societal discussion on the acceptable level of flood risk indicators, in order to prioritize and identify the appropriate flood management measures for Colombo. The understanding and evaluation of flood risk by local people are important to integrate into the decision-making process. This research questions the gap between the level of flood risk acceptable to spatial planners and that acceptable to the local communities in Colombo. A comprehensive literature review was conducted to prepare a framework for analyzing public perception in Colombo. This research identifies the factors that affect the variation of flood risk and the levels acceptable to both the local community and the planning authorities.

Keywords: Colombo basin, public perception, urban flood risk, multi-criteria analysis

Procedia PDF Downloads 297
28745 Game of Funds: Efficiency and Policy Implications of the United Kingdom Research Excellence Framework

Authors: Boon Lee

Abstract:

Research publication is an essential output of universities because it not only promotes university recognition but also attracts government funding. The history of university research culture has been one of ‘publish or perish’, and universities have consistently encouraged their academics and researchers to produce research articles in reputable journals in order to maintain a level of competitiveness. In turn, United Kingdom (UK) government funding is determined by the number and quality of research publications. This paper aims to investigate whether more government funding leads to more quality papers. To that end, the paper employs a Network DEA model to evaluate UK higher education performance over a period of time. Sources of efficiency are also determined via a second-stage regression analysis.
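As a toy illustration of efficiency measurement: a Network DEA model solves linear programs over linked production stages, but the degenerate single-input, single-output case reduces to a simple ratio, sketched below with invented data (not the paper's model or figures):

```python
def simple_efficiency(units):
    """Single-input, single-output efficiency (output/input, normalised so
    the best unit scores 1.0) - a degenerate special case of DEA, shown
    here only to illustrate what an efficiency score means."""
    ratios = {u: out / inp for u, (inp, out) in units.items()}
    best = max(ratios.values())
    return {u: r / best for u, r in ratios.items()}

# Hypothetical universities: (research funding, weighted publications).
unis = {"A": (10.0, 50.0), "B": (20.0, 80.0), "C": (15.0, 75.0)}
print(simple_efficiency(unis))  # A and C are efficient (1.0); B scores 0.8
```

A full (Network) DEA model instead solves one linear program per unit, allowing multiple inputs and outputs and, in the network variant, intermediate products between stages.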

Keywords: efficiency, higher education, network data envelopment analysis, universities

Procedia PDF Downloads 105
28744 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception

Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu

Abstract:

Opinion mining (OM) is one of the natural language processing (NLP) problems to determine the polarity of opinions, mostly represented on a positive-neutral-negative axis. The data for OM is usually collected from various social media platforms. In an era where social media has considerable control over companies’ futures, it’s worth understanding social media and taking actions accordingly. OM comes to the fore here as the scale of the discussion about companies increases, and it becomes unfeasible to gauge opinion on individual levels. Thus, the companies opt to automize this process by applying machine learning (ML) approaches to their data. For the last two decades, OM or sentiment analysis (SA) has been mainly performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to a bag of n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. Transfer learning paradigm that has been commonly used in computer vision (CV) problems started to shape NLP approaches and language models (LM) lately. This gave a sudden rise to the usage of the pretrained language model (PTM), which contains language representations that are obtained by training it on the large datasets using self-supervised learning objectives. The PTMs are further fine-tuned by a specialized downstream task dataset to produce efficient models for various NLP tasks such as OM, NER (Named-Entity Recognition), Question Answering (QA), and so forth. In this study, the traditional and modern NLP approaches have been evaluated for OM by using a sizable corpus belonging to a large private company containing about 76,000 comments in Turkish: SVM with a bag of n-grams, and two chosen pre-trained models, multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT). 
The MUSE model is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. BERT is, in our case, a monolingual model based on transformer neural networks. It uses masked language modeling and next-sentence prediction tasks that allow the bidirectional training of the transformers. During the training phase of the architecture, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since the experiments showed that their contribution to model performance was insignificant, even though Turkish is a highly agglutinative and inflective language. The results show that the usage of deep learning methods with pre-trained models and fine-tuning achieves about an 11% improvement over SVM for OM. The BERT model achieved around 94% prediction accuracy, while the MUSE model achieved around 88% and the SVM around 83%. The MUSE multilingual model shows better results than the SVM, but it still performs worse than the monolingual BERT model.
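The bag-of-n-grams representation fed to the SVM baseline can be sketched in a few lines of Python (in practice a library vectorizer would be used; the whitespace tokenization here is a naive stand-in):

```python
from collections import Counter

def bag_of_ngrams(tokens, n_range=(1, 2)):
    """Bag-of-n-grams features: counts of contiguous token n-grams."""
    bag = Counter()
    for n in range(n_range[0], n_range[1] + 1):
        for i in range(len(tokens) - n + 1):
            bag[" ".join(tokens[i:i + n])] += 1
    return bag

feats = bag_of_ngrams("the service was not good".split())
print(feats["not good"])  # -> 1 : the bigram that lets an SVM capture negation
```

Such sparse count vectors are then used as the input features of the SVM classifier, whereas the pre-trained models consume the raw (sub)word sequence directly.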

Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish

Procedia PDF Downloads 124
28743 Web Development in Information Technology with Javascript, Machine Learning and Artificial Intelligence

Authors: Abdul Basit Kiani, Maryam Kiani

Abstract:

Online developers now have the tools necessary to create online apps that are not only reliable but also highly interactive, thanks to the introduction of JavaScript frameworks and APIs. The objective is to give a broad overview of the recent advances in the area. The fusion of machine learning (ML) and artificial intelligence (AI) has expanded the possibilities for web development. Modern websites now include chatbots, clever recommendation systems, and customization algorithms built in. In the rapidly evolving landscape of modern websites, it has become increasingly apparent that user engagement and personalization are key factors for success. To meet these demands, websites now incorporate a range of innovative technologies. One such technology is chatbots, which provide users with instant assistance and support, enhancing their overall browsing experience. These intelligent bots are capable of understanding natural language and can answer frequently asked questions, offer product recommendations, and even help with troubleshooting. Moreover, clever recommendation systems have emerged as a powerful tool on modern websites. By analyzing user behavior, preferences, and historical data, these systems can intelligently suggest relevant products, articles, or services tailored to each user's unique interests. This not only saves users valuable time but also increases the chances of conversions and customer satisfaction. Additionally, customization algorithms have revolutionized the way websites interact with users. By leveraging user preferences, browsing history, and demographic information, these algorithms can dynamically adjust the website's layout, content, and functionalities to suit individual user needs. This level of personalization enhances user engagement, boosts conversion rates, and ultimately leads to a more satisfying online experience. 
In summary, the integration of chatbots, clever recommendation systems, and customization algorithms into modern websites is transforming the way users interact with online platforms. These advanced technologies not only streamline user experiences but also contribute to increased customer satisfaction, improved conversions, and overall website success.

Keywords: Javascript, machine learning, artificial intelligence, web development

Procedia PDF Downloads 60
28742 A Blockchain-Based Privacy-Preserving Physical Delivery System

Authors: Shahin Zanbaghi, Saeed Samet

Abstract:

The internet has transformed the way we shop. Previously, most of our purchases came in the form of shopping trips to a nearby store. Now, it’s as easy as clicking a mouse. But with great convenience comes great responsibility: we have to be constantly vigilant about our personal information. In this work, our proposed approach is to encrypt the information printed on the physical packages, which includes personal information in plain text, using a symmetric encryption algorithm, and then store that encrypted information in a Blockchain network rather than in companies' or corporations' centralized databases. We present, implement, and assess a blockchain-based system using Ethereum smart contracts. We present detailed algorithms that explain the details of our smart contract. We present the security, cost, and performance analysis of the proposed method. Our work indicates that the proposed solution is economically attainable and provides data integrity, security, transparency, and data traceability.
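The commit-reveal scheme mentioned in the keywords can be sketched as follows: a hash commitment to the sensitive delivery data is published, and the data itself is revealed only to authorized parties, who can verify it against the commitment. This stand-alone Python sketch illustrates the two phases (it is not the authors' Ethereum contract):

```python
import hashlib
import secrets

def commit(message: bytes):
    """Commit phase: publish H(nonce || message); keep the nonce secret."""
    nonce = secrets.token_bytes(16)
    digest = hashlib.sha256(nonce + message).hexdigest()
    return digest, nonce

def verify(digest: str, nonce: bytes, message: bytes) -> bool:
    """Reveal phase: recompute the hash and check it against the commitment."""
    return hashlib.sha256(nonce + message).hexdigest() == digest

address = b"recipient address + phone"   # the sensitive delivery data
digest, nonce = commit(address)
print(verify(digest, nonce, address))           # -> True
print(verify(digest, nonce, b"tampered data"))  # -> False
```

The random nonce hides the message from dictionary attacks on the public digest, while the digest binds the committer to the original data.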

Keywords: blockchain, Ethereum, smart contract, commit-reveal scheme

Procedia PDF Downloads 135
28741 Knowledge Management Strategies within a Corporate Environment

Authors: Daniel J. Glauber

Abstract:

Knowledge transfer between personnel could improve an organization’s competitive advantage in the marketplace through a strategic approach to knowledge management. The lack of information sharing between personnel could create knowledge transfer gaps while restricting the decision-making processes. Knowledge transfer between personnel can potentially improve information sharing based on an implemented knowledge management strategy. An organization’s capacity to gain more knowledge is aligned with the organization’s prior or existing captured knowledge. This case study attempted to understand the overall influence of a knowledge management strategy (KMS) within the corporate environment and the knowledge exchange between personnel. The significance of this study was to help understand how organizations can improve the Return on Investment (ROI) of a knowledge management strategy within a knowledge-centric organization. A qualitative descriptive case study was the research design selected for this study. Developing a knowledge management strategy acceptable at all levels of the organization requires cooperation in support of a common organizational goal. This involves working with management and executive members to develop a protocol in which knowledge transfer becomes a standard practice in multiple tiers of the organization. The knowledge transfer process can be made measurable by focusing on specific elements of the organizational process, including personnel transitions, to help reduce the time required to understand the job. The organization studied in this research acknowledged the need for improved knowledge management activities within the organization to help organize, retain, and distribute information throughout the workforce. 
Data produced from the study indicate three main themes identified by the participants: information management, organizational culture, and knowledge sharing within the workforce. These themes indicate a possible connection between an organization's KMS, the organization's culture, knowledge sharing, and knowledge transfer.

Keywords: knowledge transfer, management, knowledge management strategies, organizational learning, codification

Procedia PDF Downloads 428
28740 An Improved Parallel Algorithm of Decision Tree

Authors: Jiameng Wang, Yunfei Yin, Xiyu Deng

Abstract:

Parallel optimization is one of the important current research topics in data mining. Taking Classification and Regression Tree (CART) parallelization as an example, this paper proposes a parallel data mining algorithm based on SSP-OGini-PCCP. Aiming at the problem of choosing the best CART segmentation point, this paper designs an S-SP model without data association; and in order to calculate the Gini index efficiently, a parallel OGini calculation method is designed. In addition, in order to improve the efficiency of the pruning algorithm, a synchronous PCCP pruning strategy is proposed in this paper. The optimal segmentation calculation, Gini index calculation, and pruning algorithm, all important components of parallel data mining, are studied in depth. By constructing a distributed cluster simulation system based on SPARK, data mining methods based on SSP-OGini-PCCP are tested. Experimental results show that this method can increase the search efficiency of the best segmentation point by an average of 89%, increase the search efficiency of the Gini segmentation index by 3853%, and increase the pruning efficiency by 146% on average; and as the size of the data set increases, the performance of the algorithm remains stable, which meets the requirements of contemporary massive data processing.
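The Gini index that the OGini method parallelizes is itself simple to state. A minimal serial Python sketch (illustrative only, not the SSP-OGini-PCCP implementation) of the impurity and of scoring one candidate segmentation point:

```python
from collections import Counter

def gini_index(labels):
    """Gini impurity: 1 - sum(p_k^2) over the class proportions p_k."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gini(left, right):
    """Size-weighted Gini of a candidate CART segmentation point."""
    n = len(left) + len(right)
    return (len(left) / n) * gini_index(left) + (len(right) / n) * gini_index(right)

print(gini_index(["a", "a", "b", "b"]))    # -> 0.5 : maximally mixed two classes
print(split_gini(["a", "a"], ["b", "b"]))  # -> 0.0 : a perfect split
```

CART picks the segmentation point minimizing this weighted impurity; the parallel OGini method speeds up exactly this inner scoring loop across candidate points.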

Keywords: classification, Gini index, parallel data mining, pruning ahead

Procedia PDF Downloads 109
28739 Climate Changes Impact on Artificial Wetlands

Authors: Carla Idely Palencia-Aguilar

Abstract:

Artificial wetlands play an important role in the Guasca Municipality in Colombia, not only because they are used for agroindustry, but also because more than 45 species were found there, some of which are endemic or migratory birds. Remote sensing was used to determine the changes in the water-covered area of the artificial wetlands by means of Aster and Modis images for different time periods. Evapotranspiration was also determined by three methods: the Surface Energy Balance System-Su (SEBS) algorithm, the Surface Energy Balance-Bastiaanssen (SEBAL) algorithm, and Potential Evapotranspiration-FAO. Empirical equations were also developed to determine the relationship between the Normalized Difference Vegetation Index (NDVI) and net radiation, ambient temperature, and rain, with an obtained R² of 0.83. Groundwater level fluctuations on a daily basis were studied as well. Data from a piezometer placed next to the wetland were fitted to rain changes (with two weather stations located in the proximity of the wetlands) by means of multiple regression and time series analysis; the R² between the calculated and measured values was higher than 0.98. Information from nearby weather stations provided input for ordinary kriging as well as for the Digital Elevation Model (DEM) developed using PCI software. Standard models (exponential, spherical, circular, gaussian, linear) to describe spatial variation were tested. Ordinary cokriging between the height and rain variables was also tested, to determine whether the accuracy of the interpolation would increase. The results showed no significant differences, given that the mean result of the spherical function for the rain samples after ordinary kriging was 58.06 with a standard deviation of 18.06. 
The cokriging, using a spherical function for the rain variable, a power function for the height variable, and a spherical function for the cross variable (rain and height), had a mean of 57.58 and a standard deviation of 18.36. Threats of eutrophication were also studied, given the lack of awareness among neighbours and governmental deficiencies. Water quality was determined over the years; different parameters were studied to determine the chemical characteristics of the water. In addition, 600 pesticides were studied by gas and liquid chromatography. Results showed that coliforms, nitrogen, phosphorus, and prochloraz were the most significant contaminants.
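For reference, the spherical variogram model used in the kriging and cokriging fits has a standard closed form; the Python sketch below implements it (the nugget, sill, and range values are illustrative, not the study's fitted parameters):

```python
def spherical_variogram(h, nugget, sill, rng):
    """Spherical variogram: semivariance as a function of lag distance h."""
    if h == 0:
        return 0.0
    if h >= rng:
        return nugget + sill                       # flat beyond the range
    return nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

# Illustrative parameters: nugget 0.1, partial sill 1.0, range 100 (e.g. metres).
print(spherical_variogram(50.0, 0.1, 1.0, 100.0))   # ≈ 0.7875
print(spherical_variogram(150.0, 0.1, 1.0, 100.0))  # ≈ 1.1
```

In ordinary (co)kriging, this fitted semivariance supplies the covariances that determine the interpolation weights at each unsampled location.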

Keywords: DEM, evapotranspiration, geostatistics, NDVI

Procedia PDF Downloads 107
28738 Glaucoma Detection in Retinal Tomography Using the Vision Transformer

Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan

Abstract:

Glaucoma is a chronic eye condition that causes irreversible vision loss. Early detection and treatment are critical to prevent vision loss because the condition can be asymptomatic. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire extremely expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data are generated by augmenting ocular pictures. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this situation, as it allows the self-attention mechanism to exploit structural modeling. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
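The scaled dot-product self-attention at the heart of the ViT can be sketched in pure Python; the toy two-token example below (standing in for two image patches with 2-d embeddings) is illustrative only:

```python
import math

def self_attention(q, k, v):
    """Scaled dot-product self-attention: softmax(QK^T / sqrt(d)) V,
    written out for small lists-of-lists rather than tensors."""
    d = len(q[0])
    scores = [[sum(qi * ki for qi, ki in zip(qrow, krow)) / math.sqrt(d)
               for krow in k] for qrow in q]
    out = []
    for row in scores:
        m = max(row)                                # stabilised softmax
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        w = [e / z for e in exps]                   # attention weights
        out.append([sum(wi * vrow[j] for wi, vrow in zip(w, v))
                    for j in range(len(v[0]))])
    return out

x = [[1.0, 0.0], [0.0, 1.0]]          # two "patch" tokens
print(self_attention(x, x, x))        # each output row mixes both positions
```

Because every token attends to every other token, even distant patches of a fundus image influence each other's representation, which is the long-range-dependency property motivating the ViT here.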

Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning

Procedia PDF Downloads 176
28737 Analyzing Strategic Alliances of Museums: The Case of Girona (Spain)

Authors: Raquel Camprubí

Abstract:

Cultural tourism has been postulated as a relevant motivation for tourists all over the world during the last decades. In this context, museums are the main attraction for cultural tourists who seek to connect with the history and culture of the visited place. From the point of view of an urban destination, museums and other cultural resources are essential to a strong tourist supply at the destination, in order to be capable of catching the attention and interest of cultural tourists. In particular, museums' challenge is to be prepared to offer the best experience to their visitors without forgetting their mission, based mainly on the protection of their collections and other social goals. Thus, museums individually want to be competitive and well positioned to achieve their strategic goals. The life cycle of the destination and the level of maturity of its tourism product influence the need of tourism agents to cooperate and collaborate among themselves, in order to rejuvenate their product and become more competitive as a destination. Additionally, prior studies have considered different models of public and private partnership, and the collaborative and cooperative relations developed among the agents of a tourism destination. However, there are no studies that pay special attention to museums and the strategic alliances developed to obtain mutual benefits. Considering this background, the purpose of this study is to analyze to what extent the museums of a given urban destination have established strategic links and relations among themselves, in order to improve their competitive position at both the individual and the destination level. In order to achieve the aim of this study, the city of Girona (Spain) and the museums located in this city are taken as a case study. 
Data collection was conducted using in-depth interviews, in order to collect all the qualitative data related to the nature, strength, and purpose of the relational ties established among the museums of the city or with other relevant tourism agents of the city. To conduct the data analysis, a Social Network Analysis (SNA) approach was taken using the UCINET software. The position of the agents in the network and the structure of the network were analyzed, and qualitative data from the interviews were used to interpret the SNA results. Findings reveal the existence of strong ties among some of the museums of the city, particularly to create and promote joint products. Nevertheless, outsiders were detected who follow an individual strategy, without collaboration and cooperation with other museums or agents of the city. Results also show that some relational ties have an institutional origin, while others are the result of a long process of cooperation on common projects. Conclusions put in evidence that the collaboration and cooperation of museums have been positive in increasing the attractiveness of the museums and of the city as a cultural destination. Future research and managerial implications are also mentioned.
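A typical SNA position measure such as normalized degree centrality is easy to sketch; the museum names and ties below are invented for illustration and are not the study's data:

```python
def degree_centrality(edges, nodes):
    """Normalised degree centrality: ties per node divided by (n - 1),
    the kind of position measure reported by SNA tools such as UCINET."""
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {node: d / (n - 1) for node, d in deg.items()}

# Hypothetical ties: three museums co-promote joint products; one stays apart.
museums = ["M1", "M2", "M3", "M4"]
ties = [("M1", "M2"), ("M1", "M3"), ("M2", "M3")]
print(degree_centrality(ties, museums))  # M4 scores 0.0: an 'outsider'
```

An isolated node with centrality 0.0 corresponds to the "outsiders" the study detects, while densely connected nodes mark the cooperating core.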

Keywords: cultural tourism, competitiveness, museums, Social Network analysis

Procedia PDF Downloads 103
28736 Femtocell Stationed Flawless Handover in High Agility Trains

Authors: S. Dhivya, M. Abirami, M. Farjana Parveen, M. Keerthiga

Abstract:

The development of high-speed railways makes people’s lives more and more convenient; meanwhile, handover is the major problem for high-speed railway communication services. In order to overcome this drawback, the architecture of Long-Term Evolution (LTE) femtocell networks is used to improve network performance, and the deployment of femtocells is key to addressing bandwidth limitation and coverage issues in a conventional mobile network system. To increase handover performance, this paper proposes a multiple input multiple output (MIMO) assisted handoff (MAHO) algorithm. It is a technique used in mobile telecom to transfer a mobile phone to a new radio channel with stronger signal strength and improved channel quality.
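A handover decision of this kind is typically based on received signal strength (RSS) with a hysteresis margin, which is what prevents rapid "ping-pong" handovers on a fast-moving train. The Python sketch below illustrates the idea only; the 3 dB margin is an assumed example value, not a parameter from the paper:

```python
def should_handover(rss_serving_dbm, rss_target_dbm, hysteresis_db=3.0):
    """Hand over only when the target cell's RSS exceeds the serving
    cell's by a hysteresis margin, avoiding ping-pong handovers."""
    return rss_target_dbm > rss_serving_dbm + hysteresis_db

print(should_handover(-90.0, -88.0))  # -> False : within the hysteresis margin
print(should_handover(-90.0, -85.0))  # -> True  : target cell clearly stronger
```

In practice the margin and a time-to-trigger window are tuned against train speed, since a faster train crosses femtocell coverage areas in less time.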

Keywords: flawless handover, high-speed train, home evolved Node B, LTE, mobile femtocell, RSS

Procedia PDF Downloads 458