Search results for: adaptive fuzzy PI controller
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1036

616 A Resilience-Based Approach for Assessing Social Vulnerability in New Zealand's Coastal Areas

Authors: Javad Jozaei, Rob G. Bell, Paula Blackett, Scott A. Stephens

Abstract:

In the last few decades, Social Vulnerability Assessment (SVA) has been a favoured means of evaluating the susceptibility of social systems to drivers of change, including climate change and natural disasters. However, applying SVA to inform responsive, practical strategies for dealing with uncertain climate change impacts has always been challenging, and agencies typically revert to conventional risk/vulnerability assessment. These challenges include the complex nature of social vulnerability concepts, which limits their applicability; complications in identifying and measuring social vulnerability determinants; transitory social dynamics in a changing environment; and the unpredictability of the scenarios of change that shape the vulnerability regime (including contention over when these impacts might emerge). Research suggests that conventional quantitative approaches in SVA cannot appropriately address these problems; hence, the outcomes can be misleading and unfit for addressing the ongoing, uncertain rise in risk. The second phase of New Zealand’s Resilience to Nature’s Challenges (RNC2) is developing a forward-looking vulnerability assessment framework and methodology that informs decision-making and policy development for changing coastal systems and accounts for the complex dynamics of New Zealand’s coastal systems (socio-economic, environmental and cultural). RNC2 also requires the new methodology to consider plausible drivers of incremental and unknowable changes, to create mechanisms that enhance social and community resilience, and to fit New Zealand’s multi-layer governance system. This paper aims to analyse the conventional approaches and methodologies in SVA and to offer recommendations for more responsive approaches that inform adaptive decision-making and policy development in practice.
The research adopts a qualitative design to examine different aspects of conventional SVA processes; the methods used to achieve the research objectives are a systematic literature review and case studies. We found that the conventional quantitative, reductionist and deterministic mindset in SVA processes, with its focus on the impacts of rapid-onset stressors (e.g., tsunamis, floods), shows deficiencies in accounting for the complex dynamics of social-ecological systems (SES) and for the uncertain, long-term impacts of incremental drivers. The paper focuses on the links between resilience and vulnerability, and suggests how resilience theory and its underpinning notions, such as the adaptive cycle, panarchy and system transformability, could address these issues and thereby reshape how the vulnerability regime is perceived and assessed. In this regard, it will be argued that a paradigm shift from 'specific resilience', which focuses on adaptive capacity associated with the notion of 'bouncing back', to 'general resilience', which accounts for system transformability, regime shift and 'bouncing forward', can deliver more effective strategies in an era characterised by ongoing change and deep uncertainty.

Keywords: complexity, social vulnerability, resilience, transformation, uncertain risks

Procedia PDF Downloads 100
615 Learning from Flood: A Case Study of a Frequently Flooded Village in Hubei, China

Authors: Da Kuang

Abstract:

Resilience is a hotly debated topic in many research fields (e.g., engineering, ecology, society, psychology). In flood management studies, we are experiencing a paradigm shift from flood resistance to flood resilience. Flood resilience refers to the capacity to tolerate flooding through adaptation or transformation. It is increasingly argued that the city, as a social-ecological system, holds the ability to learn from experience and adapt to floods rather than simply resist them. This research investigates what adaptation knowledge a frequently flooded village has learned from past experience, and the advantages and limitations of that knowledge in coping with floods. The study area, Xinnongcun village, located west of Wuhan city, is a linear village that has suffered repeatedly from both flash floods and drainage floods over the past 30 years. We made a field trip to the site in June 2017 and conducted semi-structured interviews with local residents. Our research summarizes two types of adaptation knowledge that people have learned from past floods. Firstly, at the village scale, a collective urban form has developed that helps people live through both the flood and dry seasons. All houses and front yards are elevated about 2 m above the road, and all front yards in the village are linked with no barriers between them. During flood periods, people walk to their neighbours through the yards and travel by boat along the lower road to reach places outside the village. Secondly, at the individual scale, local people have acquired tacit knowledge of flood preparedness and emergency response. Regarding advantages and limitations, this adaptation knowledge effectively helps people live with floods and reduces the risk of injury. However, it cannot reduce local farmers’ losses on their agricultural land. After a flood, it is impossible for local people to recover to the pre-disaster state, as a flood occurring in June or July results in no harvest.
Therefore, we argue that learning from past flood experience can increase people’s adaptive capacity. However, once adaptive capacity can no longer reduce people’s losses, a transformation to a better regime is required.

Keywords: adaptation, flood resilience, tacit knowledge, transformation

Procedia PDF Downloads 333
614 Development of Adaptive Proportional-Integral-Derivative Feeding Mechanism for Robotic Additive Manufacturing System

Authors: Andy Alubaidy

Abstract:

In this work, a robotic additive manufacturing system (RAMS) capable of three-dimensional (3D) printing in six degrees of freedom (DOF), with very high accuracy and on virtually any surface, has been designed and built. One major shortcoming of existing 3D printer technology is the limitation to three DOF, which results in prolonged fabrication time: depending on the techniques used, it usually takes at least two hours to print small objects and several hours for larger ones. Another drawback is the size of the printed objects, which is constrained by the physical dimensions of most low-cost 3D printers, which are typically small. In such cases, large objects are produced by dividing them into smaller components that fit the printer’s workable area; these are then glued, bonded or otherwise attached to create the required object. A further shortcoming is material constraints and the need to fabricate a single part from different materials. With the flexibility of a six-DOF robot, the RAMS has been designed to overcome these problems. A feeding mechanism using an adaptive Proportional-Integral-Derivative (PID) controller is employed along with a National Instruments CompactRIO (NI cRIO), an ABB robot, and off-the-shelf sensors. The RAMS can 3D print virtually anywhere in six degrees of freedom with very high accuracy; it is equipped with an ABB IRB 120 robot to achieve this level of accuracy. To convert computer-aided design (CAD) files into a digital format acceptable to the robot, Hypertherm Robotic Software Inc.’s state-of-the-art slicing software “ADDMAN” is used. ADDMAN is capable of converting any CAD file into RAPID code (the programming language for ABB robots), which the robot uses to perform the 3D printing. To control the entire process, an NI CompactRIO (cRIO 9074) communicates with the robot and with a feeding mechanism that was designed and fabricated for this work.
The feeding mechanism consists of two major parts: the cold end and the hot end. The cold end is what is conventionally known as an extruder. Typically, a stepper motor is used to control the push on the material; however, for finer control, a DC motor is used instead. The hot end consists of a melt zone, a nozzle, and a heat break. The melt zone ensures thorough melting and consistent output from the nozzle. Nozzles are made of brass for thermal conductivity, while the melt zone comprises a heating block and a ceramic heating cartridge that transfers heat to the block. The heat break ensures there is no heat creep, which would swell the material and prevent consistent extrusion. A control system embedded in the cRIO is developed in NI LabVIEW, which uses an adaptive PID controller to govern the heating cartridge in conjunction with a thermistor. The thermistor sends temperature feedback to the cRIO, which increases or decreases the heating based on the system output. Since different materials have different melting points, the system allows the temperature to be adjusted as the material is varied.
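The temperature loop described above (thermistor feedback driving a heating cartridge through a PID law) can be sketched as follows. This is a minimal simulation, not the authors' LabVIEW implementation: the gains, the first-order thermal model, and the duty-cycle range are all illustrative assumptions.

```python
class PID:
    """Discrete PID controller with a clamped integral (anti-windup).
    The gains used below are illustrative, not the authors' tuning."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral = max(-100.0, min(100.0, self.integral + error * self.dt))
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(0.0, min(1.0, out))   # heater duty cycle in [0, 1]

def simulate_hotend(setpoint=200.0, steps=2000, dt=0.1):
    """First-order thermal model: heater power warms the block, ambient losses cool it."""
    pid = PID(kp=0.05, ki=0.002, kd=0.01, dt=dt)
    temp = 25.0                                    # start at ambient, deg C
    for _ in range(steps):
        duty = pid.update(setpoint, temp)          # 'thermistor' reading = model temp
        temp += dt * (duty * 30.0 - (temp - 25.0) * 0.02)
    return temp
```

Because the setpoint is a plain parameter, the same loop can be retargeted for materials with different melting points, which mirrors the adjustable-temperature requirement described above.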

Keywords: robotic, additive manufacturing, PID controller, cRIO, 3D printing

Procedia PDF Downloads 216
613 Prioritizing Biodiversity Conservation Areas based on the Vulnerability and the Irreplaceability Framework in Mexico

Authors: Alma Mendoza-Ponce, Rogelio Corona-Núñez, Florian Kraxner

Abstract:

Mexico is a megadiverse country, and it has nearly halved its natural vegetation in the last century due to agricultural and livestock expansion. The impacts of land use cover change and climate change are unevenly distributed, and spatial prioritization to minimize the effects on biodiversity is crucial. Global and national efforts at prioritizing biodiversity conservation indicate that ~33% to 45% of Mexico should be protected; the breadth of these targets makes it difficult to direct resources. We use a framework based on vulnerability and irreplaceability to prioritize conservation efforts in Mexico. Vulnerability considers exposure, sensitivity and adaptive capacity under two scenarios (business as usual, BAU, based on SSP2 and RCP 4.5, and a Green scenario, based on SSP1 and RCP 2.6). Exposure to land use is the magnitude of change from natural vegetation to anthropogenic covers, while exposure to climate change is the difference between current and future climate values for both scenarios. Sensitivity was taken as the number of endemic terrestrial vertebrate species that are critically endangered or endangered. Adaptive capacity is the ratio of the percentage of converted area (natural to anthropogenic) to the percentage of protected area at the municipality level. The results suggest that by 2050 between 11.6% and 13.9% of Mexico will show vulnerability ≥ 50%, and by 2070 between 12.0% and 14.8%, in the Green and BAU scenarios, respectively. From an ecosystem perspective, cloud forests, followed by tropical dry forests, natural grasslands and temperate forests, will be the most vulnerable (≥ 50%). Amphibians are the most threatened vertebrates: 62% of endemic amphibians are critically endangered or endangered, compared with 39%, 12% and 9% of mammals, birds and reptiles, respectively.
However, the distribution of these amphibians accounts for only 3.3% of the country, while the mammals, birds and reptiles in these categories represent 10%, 16% and 29% of Mexico. Of Mexico's 2,457 municipalities, just 5 contain 31% of the most vulnerable areas (vulnerability ≥ 70%), yet account for only 0.05% of the country. This multiscale approach can be used to direct resources to conservation targets such as ecosystems, municipalities or species, considering land use cover change, climate change and biodiversity uniqueness.
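As a minimal sketch of the framework above, the three components can be combined into a single score. The multiplicative combination rule below is an assumption for illustration only (the abstract does not give the actual model), and all inputs are taken as already normalized to [0, 1].

```python
def vulnerability(exposure, sensitivity, adaptive_capacity):
    """Illustrative score in [0, 1]: exposure and sensitivity raise vulnerability,
    adaptive capacity lowers it. The combination rule is an assumption."""
    score = exposure * sensitivity * (1.0 - adaptive_capacity)
    return max(0.0, min(1.0, score))

# Hypothetical municipalities: one under heavy land-use pressure with little
# protection, one with low conversion and substantial protected area.
high_pressure = vulnerability(exposure=0.8, sensitivity=0.9, adaptive_capacity=0.1)
well_protected = vulnerability(exposure=0.2, sensitivity=0.9, adaptive_capacity=0.7)
```

Ranking municipalities by such a score is what allows the small set of highest-priority areas described above to be isolated from the national total.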

Keywords: biodiversity, climate change, land use change, Mexico, vulnerability

Procedia PDF Downloads 164
612 Gestalt in Music and Brain: A Non-Linear Chaos Based Study with Detrended/Adaptive Fractal Analysis

Authors: Shankha Sanyal, Archi Banerjee, Sayan Biswas, Sourya Sengupta, Sayan Nag, Ranjan Sengupta, Dipak Ghosh

Abstract:

The term ‘gestalt’ is widely used in psychology to describe the human mind's tendency to perceive an object not as parts but as a 'unified' whole. Music, in general, is polyphonic, i.e., a combination of a number of pure tones (frequencies) mixed together in a manner that sounds harmonious. Studying human brain responses to different frequency groups of an acoustic signal can give excellent insight into the neural and functional architecture of brain functions; hence, the study of music cognition using neuro-biosensors is a rapidly emerging field of research. In this work, we analyze the effect of different frequency bands of music on the various frequency rhythms of the human brain obtained from EEG data. Four widely popular Rabindrasangeet clips were subjected to the wavelet transform method to extract five resonant frequency bands from the original music signal. These frequency bands were first analyzed with Detrended/Adaptive Fractal Analysis (DFA/AFA). A listening test was conducted on a pool of 100 respondents to assess the frequency band in which the music becomes unrecognizable. Next, these resonant frequency bands were presented to 20 subjects as auditory stimuli, with EEG signals recorded simultaneously at 19 different locations on the scalp. The recorded EEG signals were noise-cleaned and again subjected to the DFA/AFA technique on the alpha, theta and gamma frequency ranges. We thus obtained scaling exponents from the two methods in the alpha, theta and gamma EEG rhythms corresponding to the different frequency bands of music. From the analysis of the music signal, it is seen that loss of recognition is proportional to the loss of long-range correlation in the signal. From the EEG analysis, we obtain frequency-specific, arousal-based responses in different lobes of the brain, as well as in specific EEG bands, corresponding to the musical stimuli.
In this way, we seek to identify a specific frequency band beyond which the music becomes unrecognizable, and below which, even in the absence of the other bands, the music remains perceivable to the audience. This finding can be of immense importance to the field of cognitive music therapy and to researchers of creativity.
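The DFA step applied to both the music and EEG signals above can be illustrated with a minimal pure-Python sketch. This shows first-order DFA only (the adaptive variant, AFA, is not reproduced), and the window sizes and test signal are illustrative choices.

```python
import math
import random

def dfa_fluctuations(signal, window_sizes):
    """Detrended fluctuation analysis (DFA-1): integrate the mean-removed series,
    split the profile into non-overlapping windows, subtract a least-squares
    linear trend in each window, and return the RMS fluctuation F(n) per size n."""
    mean = sum(signal) / len(signal)
    profile, s = [], 0.0
    for x in signal:
        s += x - mean
        profile.append(s)
    fluctuations = []
    for n in window_sizes:
        sq_sum, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # Least-squares linear fit y = slope * t + intercept over t = 0..n-1
            t_mean = (n - 1) / 2.0
            y_mean = sum(seg) / n
            num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(seg))
            den = sum((t - t_mean) ** 2 for t in range(n))
            slope = num / den
            intercept = y_mean - slope * t_mean
            sq_sum += sum((y - (slope * t + intercept)) ** 2
                          for t, y in enumerate(seg))
            count += n
        fluctuations.append(math.sqrt(sq_sum / count))
    return fluctuations

# White noise should scale as F(n) ~ n^0.5, i.e. a scaling exponent near 0.5;
# loss of long-range correlation pulls the exponent toward this value.
random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
sizes = [8, 16, 32, 64]
F = dfa_fluctuations(noise, sizes)
alpha = (math.log(F[-1]) - math.log(F[0])) / (math.log(sizes[-1]) - math.log(sizes[0]))
```

The scaling exponent `alpha` is the log-log slope of F(n) versus n, which is the quantity compared across music bands and EEG rhythms in the study.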

Keywords: AFA, DFA, EEG, gestalt in music, Hurst exponent

Procedia PDF Downloads 331
611 Duo Lingo: Learning Languages through Play

Authors: Yara Bajnaid, Malak Zaidan, Eman Dakkak

Abstract:

This research explores the use of Artificial Intelligence in Duolingo, a popular mobile application for language learning. Duolingo's success hinges on its gamified approach and adaptive learning system, both heavily reliant on AI functionalities. The research also analyzes user feedback regarding Duolingo's AI functionalities. While a significant majority (70%) consider Duolingo a reliable tool for language learning, there's room for improvement. Overall, AI plays a vital role in personalizing the learning journey and delivering interactive exercises. However, continuous improvement based on user feedback can further enhance the effectiveness of Duolingo's AI functionalities.

Keywords: AI, Duolingo, language learning, application

Procedia PDF Downloads 46
610 An Interactive Methodology to Demonstrate the Level of Effectiveness of the Synthesis of Local-Area Networks

Authors: W. Shin, Y. Kim

Abstract:

This study focuses on disconfirming that wide-area networks can be made mobile, highly-available, and wireless. This methodological test shows that IPv7 and context-free grammar are mismatched. In the cases of robots, a similar tendency is also revealed. Further, we also prove that public-private key pairs could be built embedded, adaptive, and wireless. Finally, we disconfirm that although hash tables can be made distributed, interposable, and autonomous, XML and DNS can interfere to realize this purpose. Our experiments soon proved that exokernelizing our replicated Knesis keyboards was more significant than interrupting them. Our experiments exhibited degraded average sampling rate.

Keywords: collaborative communication, DNS, local-area networks, XML

Procedia PDF Downloads 185
609 The Immunology Evolutionary Relationship between Signal Transducer and Activator of Transcription Genes from Three Different Shrimp Species in Response to White Spot Syndrome Virus Infection

Authors: T. C. C. Soo, S. Bhassu

Abstract:

Unlike vertebrates, which possess both innate and adaptive immunity, crustaceans, and shrimps in particular, possess only innate immunity. This further emphasizes the importance of innate immunity in shrimps for pathogen resistance. Under pathogenic immune challenge, different shrimp species exhibit varying degrees of immune resistance to the same pathogen; furthermore, even within the same species, different batches of challenged shrimps can differ in the strength of their immune defence. Several important pathways are activated in shrimps during pathogenic infection. One of them is the JAK-STAT pathway, activated during bacterial, viral and fungal infections, of which the STAT (Signal Transducer and Activator of Transcription) gene is the core element. According to the central dogma of molecular biology, genetic information flows from DNA to RNA to protein; this study focuses on uncovering the evolutionary patterns present in the DNA (non-coding regions) and RNA (coding regions). The three shrimp species involved, Macrobrachium rosenbergii, Penaeus monodon and Litopenaeus vannamei, all have commercial significance. The species were challenged with white spot syndrome virus (WSSV), a well-known penaeid shrimp virus that can cause serious mortality. Tissue samples were collected at time intervals of 0 h, 3 h, 6 h, 12 h, 24 h, 36 h and 48 h. DNA and RNA were then extracted from the hepatopancreas tissue samples using conventional kits. PCR with conserved STAT gene primers was used to identify STAT coding sequences from RNA-converted cDNA samples, followed by characterization using various bioinformatics approaches, including the Ramachandran plot, ProtParam and SWISS-MODEL.
The varying levels of STAT gene activation in the three shrimp species during WSSV infection were confirmed by qRT-PCR; for each sample, three biological replicates with three technical replicates each were used. The DNA samples, in turn, were important for uncovering structural variation in the genomic region of the STAT gene, which greatly assists in understanding functional variation in the STAT protein. The partially-overlapping primers technique was used to sequence the genomic region. Evolutionary inferences and event predictions were then conducted by Bayesian inference using all the acquired coding and non-coding sequences, supplemented by conventional phylogenetic trees constructed with the maximum likelihood method. The results showed that adaptive evolution produced STAT gene sequence mutations between the shrimp species, leading to an evolutionary divergence event; the divergent sites were then correlated with the differing expression of the STAT gene. Ultimately, this study assists in understanding innate immune variability among shrimp species and in selecting disease-resistant shrimps for breeding. A deeper understanding of STAT gene evolution, from the perspective of both purifying and adaptive selection, not only provides better immunological insight among shrimp species but also serves as a useful reference for immunological studies in humans and other model organisms.
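Relative expression from qRT-PCR, as measured above, is commonly quantified with the 2^-ΔΔCt (Livak) method. The sketch below assumes that method and ~100% primer efficiency, since the abstract does not specify the quantification model; the Ct values are invented for illustration.

```python
def fold_change(ct_target_infected, ct_ref_infected,
                ct_target_control, ct_ref_control):
    """Relative expression via 2^-ddCt: normalize the target gene to a
    reference gene in each condition, then compare infected vs. control."""
    d_ct_infected = ct_target_infected - ct_ref_infected
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_infected - d_ct_control)

# Hypothetical Ct values: STAT crosses threshold 2 cycles earlier (relative to
# the reference gene) after WSSV challenge, i.e. roughly 4-fold up-regulation.
fc = fold_change(22.0, 18.0, 24.0, 18.0)
```

In practice the three biological replicates mentioned above would each yield a ΔCt, and the fold change would be reported with its spread across replicates.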

Keywords: gene evolution, JAK-STAT pathway, immunology, STAT gene

Procedia PDF Downloads 150
608 Improvement of the Numerical Integration's Quality in Meshless Methods

Authors: Ahlem Mougaida, Hedi Bel Hadj Salah

Abstract:

Several methods are suggested to improve numerical integration in the Galerkin weak form for meshless methods. Integrating without taking into account the characteristics of the shape functions reproduced by meshless methods (rational functions with compact support, etc.) causes a large integration error that degrades the PDE’s approximate solution. Different methods of numerical integration for rational functions are discussed and compared. The algorithms are implemented in Matlab. Finally, numerical results are presented to demonstrate the efficiency of our algorithms.
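The effect described above can be reproduced with a generic sketch: a fixed low-order rule integrates a sharply peaked rational function badly, while an adaptive subdivision scheme recovers accuracy. Adaptive Simpson is used here purely as an illustration; the abstract does not name the authors' specific quadrature rules.

```python
import math

def simpson(f, a, b):
    """Single-interval Simpson rule."""
    return (b - a) / 6.0 * (f(a) + 4.0 * f((a + b) / 2.0) + f(b))

def adaptive_simpson(f, a, b, tol=1e-8):
    """Recursively subdivide until the Richardson error estimate falls below tol."""
    whole = simpson(f, a, b)
    m = (a + b) / 2.0
    left, right = simpson(f, a, m), simpson(f, m, b)
    if abs(left + right - whole) < 15.0 * tol:
        return left + right + (left + right - whole) / 15.0
    return adaptive_simpson(f, a, m, tol / 2.0) + adaptive_simpson(f, m, b, tol / 2.0)

# Rational integrand with a sharp peak at 0, similar in character to the
# compact-support rational shape functions mentioned above.
f = lambda x: 1.0 / (1.0 + 25.0 * x * x)
exact = (2.0 / 5.0) * math.atan(5.0)      # closed-form integral of f over [-1, 1]
fixed = simpson(f, -1.0, 1.0)             # one fixed rule over the whole domain
adaptive = adaptive_simpson(f, -1.0, 1.0)
```

The fixed rule overshoots badly because a single parabola cannot follow the peak, while the adaptive rule concentrates subdivisions there, which is the same motivation behind adapting quadrature to meshless shape functions.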

Keywords: adaptive methods, meshless, numerical integration, rational quadrature

Procedia PDF Downloads 362
607 An Adaptive Conversational AI Approach for Self-Learning

Authors: Airy Huang, Fuji Foo, Aries Prasetya Wibowo

Abstract:

In recent years, the focus of Natural Language Processing (NLP) development has gradually shifted from the semantics-based approach to a deep learning one, which performs faster with fewer resources. Although it performs well in many applications, the deep learning approach, lacking semantic understanding, has difficulty recognizing and expressing a novel business case within a pre-defined scope. To meet the requirements of specific robotic services, the deep learning approach is very labor-intensive and time-consuming: it is difficult to improve the capabilities of a conversational AI in a short time, and even more difficult for it to self-learn from experience to deliver the same service in a better way. In this paper, we present an adaptive conversational AI algorithm that combines semantic knowledge and deep learning to address this issue by learning new business cases through conversations. After self-learning from experience, the robot adapts to business cases originally out of scope. The idea is to build new or extended robotic services in a systematic and fast-training manner with self-configured programs and constructed dialog flows. In every cycle in which a chatbot (conversational AI) delivers a given set of business cases, it is set up to self-measure its performance and revisit every unknown dialog flow, improving the service by retraining on those new business cases. If the training process reaches a bottleneck or runs into difficulties, human personnel are informed for further instruction; they may retrain the chatbot with newly configured programs or with new dialog flows for new services. One approach employs semantic analysis to learn the dialogues for new business cases and then establish the ontology needed for the new service.
With the newly learned programs, it completes its understanding of the reaction behavior and finally uses dialog flows to connect all the understanding results and programs, achieving the goal of the self-learning process. We have developed a chatbot service mounted on a kiosk, with a camera for facial recognition and a directional microphone array for voice capture; the chatbot serves as a concierge, making polite conversation with visitors. As a proof of concept, we have demonstrated completion of 90% of reception services with limited self-learning capability.
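The self-measuring cycle described above can be sketched as a toy loop. The keyword matcher, the unknown-utterance queue, and the escalation threshold are all illustrative stand-ins for the paper's semantic-analysis and retraining machinery, not its actual implementation.

```python
class Chatbot:
    """Known business cases are answered directly; unknown utterances are
    queued for retraining, and a human is flagged when the unknown rate
    exceeds a threshold. All names and thresholds here are illustrative."""
    def __init__(self, intents, escalation_threshold=0.3):
        self.intents = intents              # keyword -> canned response
        self.unknown_queue = []             # utterances to retrain on later
        self.handled = 0
        self.missed = 0
        self.escalation_threshold = escalation_threshold

    def reply(self, utterance):
        for keyword, response in self.intents.items():
            if keyword in utterance.lower():
                self.handled += 1
                return response
        # Unknown business case: self-measure the miss and queue it
        self.missed += 1
        self.unknown_queue.append(utterance)
        return "Let me get back to you on that."

    def needs_human(self):
        """Bottleneck signal: too many unknown dialog flows."""
        total = self.handled + self.missed
        return total > 0 and self.missed / total > self.escalation_threshold

    def retrain(self, new_intents):
        """A human (or a semantic-analysis step) maps queued utterances to intents."""
        self.intents.update(new_intents)
        self.unknown_queue.clear()

bot = Chatbot({"hours": "We are open 9-5.", "location": "Level 2, main lobby."})
```

Each delivery cycle would end with a check of `needs_human()` followed by `retrain()`, which is the loop through which originally out-of-scope cases become in-scope.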

Keywords: conversational AI, chatbot, dialog management, semantic analysis

Procedia PDF Downloads 135
606 Governance of Climate Adaptation Through Artificial Glacier Technology: Lessons Learnt from Leh (Ladakh, India) In North-West Himalaya

Authors: Ishita Singh

Abstract:

The social dimension of climate change is no longer peripheral to Science, Technology and Innovation (STI). Indeed, STI is being mobilized to address small farmers’ vulnerability and adaptation to climate change. Experiences from the cold desert of Leh (Ladakh) in the North-West Himalaya illustrate the potential of STI to address the challenges of climate change and the needs of small farmers through artificial glacier techniques. Small farmers there have a unique water-harvesting technique to augment irrigation, called “artificial glaciers”: an intricate network of water channels and dams along the upper slope of a valley, located closer to villages and at lower altitudes than natural glaciers. These start to melt much earlier and provide supplemental irrigation, improving small farmers’ livelihoods. The issues of vulnerability, adaptive capacity and adaptation strategy therefore need to be analyzed in the local context of the communities and regions where people live, and Leh (Ladakh) provides a case study for exploring how adaptation to climate change is taking place at the community scale using artificial glacier technology. Against this backdrop, an attempt has been made to analyze rural poor households’ vulnerability and adaptation practices using this technology, thereby drawing lessons on vulnerability-livelihood interactions in the cold desert of Leh (Ladakh). The study is based on primary data and information collected from 675 households in 27 villages of Leh (Ladakh). It reveals that 61.18% of the population derives its livelihood from agriculture and allied activities. With the increased irrigation potential provided by artificial glaciers, food security has been assured for 77.56% of households, and health vulnerability has been reduced in 31% of households.
Seasonal migration as a livelihood diversification mechanism has declined in nearly two-thirds of households, improving livelihood strategies. The use of tactical adaptations by small farmers in response to persistent droughts, such as selling livestock, expanding agricultural land, and relying on relief cash and food, has declined to 20.44%, 24.74% and 63% of households, respectively; however, these measures are unsustainable in the long term. The role of policymakers and societal stakeholders becomes important in this context. To address livelihood challenges, technology is critical within a multidisciplinary approach involving multilateral collaboration among different stakeholders. The presence of social entrepreneurs and new actors on the adaptation scene is necessary to bring forth adaptation measures. Better linkage between science and technology policies and other policies should be encouraged. Better health care, access to safe drinking water, better sanitary conditions, and improved standards of education and infrastructure are effective measures for enhancing a community’s adaptive capacity; however, social transfers supporting climate adaptive capacity require significant additional investment. Developing institutional mechanisms for specific adaptation interventions can be one of the most effective ways of implementing a plan to enhance adaptation and build resilience.

Keywords: climate change, adaptation, livelihood, stakeholders

Procedia PDF Downloads 69
605 Transformation of the Traditional Landscape of Kabul Old City: A Study for Its Conservation

Authors: Mohammad Umar Azizi, Tetsuya Ando

Abstract:

This study investigates the transformation of the traditional landscape of Kabul Old City through an examination of five case study areas. Based on physical observation, three types of houses are found: traditional, mixed and modern. Firstly, characteristics of the houses are described according to construction materials and the number of stories. Secondly, internal and external factors are considered in order to implement a conservation plan. Finally, an adaptive conservation plan is suggested to protect the traditional landscape of Kabul Old City.

Keywords: conservation, district 1, Kabul Old City, landscape, transformation, traditional houses

Procedia PDF Downloads 219
604 Cost Overruns in Mega Projects: Project Progress Prediction with Probabilistic Methods

Authors: Yasaman Ashrafi, Stephen Kajewski, Annastiina Silvennoinen, Madhav Nepal

Abstract:

Mega projects, whether in construction, urban development or energy, are among the key drivers that build the foundation of wealth and modern civilization in regions and nations. Such projects require economic justification and substantial capital investment, often derived from individual and corporate investors as well as governments. Cost overruns and time delays in these mega projects demand a new approach to predicting project costs more accurately and establishing realistic financial plans. The significance of this paper is that it improves the cost efficiency of megaprojects and reduces cost overruns, assisting Project Managers (PMs) to make timely and appropriate decisions about both the cost and the outcomes of ongoing projects. The research therefore examines the oil and gas industry, where most mega projects apply the classic Cost Performance Index (CPI) and Schedule Performance Index (SPI) methods and rely on project data to forecast cost and time. Because these projects almost always overrun in cost and time, even in the early phases, the probabilistic methods of Monte Carlo Simulation (MCS) and Bayesian adaptive forecasting were used to predict project cost at completion. Current theoretical and mathematical models that forecast the total expected cost and completion date during the execution phase of an ongoing project are evaluated. The Earned Value Management (EVM) method is unable to predict cost at completion accurately, owing to the lack of sufficiently detailed project information, especially in the early phase of a project. During the execution phase, the Bayesian adaptive forecasting method incorporates predictions into the actual performance data from earned value management and revises pre-project cost estimates, making full use of the available information. The outcome of this research is improved accuracy of both cost prediction and final duration.
This research also provides a warning method to identify when current project performance deviates from planned performance and creates an unacceptable gap between preliminary planning and actual performance; this warning method supports project managers in taking corrective actions in time.
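A Monte Carlo estimate of cost at completion, of the kind discussed above, can be sketched with the standard EVM relation EAC = AC + (BAC - EV) / CPI. The normal distribution for CPI, its parameters, and the example figures are illustrative assumptions, not the paper's calibrated model.

```python
import random
import statistics

def simulate_eac(bac, actual_cost, earned_value, cpi_mean, cpi_sd, runs=20000):
    """Monte Carlo sketch of Estimate At Completion: draw CPI from a normal
    distribution fitted to performance to date, propagate it through
    EAC = AC + (BAC - EV) / CPI, and report summary percentiles."""
    random.seed(42)
    samples = []
    for _ in range(runs):
        cpi = max(0.1, random.gauss(cpi_mean, cpi_sd))  # floor avoids division blow-ups
        samples.append(actual_cost + (bac - earned_value) / cpi)
    samples.sort()
    return {
        "mean": statistics.mean(samples),
        "p50": samples[runs // 2],
        "p80": samples[int(runs * 0.8)],  # 80% chance the final cost is below this
    }

# Illustrative project: 40% of the work earned, 500 spent against a 1000 budget,
# so CPI to date is 0.8 and an overrun is already likely.
forecast = simulate_eac(bac=1000.0, actual_cost=500.0, earned_value=400.0,
                        cpi_mean=0.8, cpi_sd=0.1)
```

A warning rule of the kind described above could then compare the p80 forecast against the approved budget and trigger corrective action when the gap exceeds a tolerance.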

Keywords: cost forecasting, earned value management, project control, project management, risk analysis, simulation

Procedia PDF Downloads 402
603 Collaborative and Experimental Cultures in Virtual Reality Journalism: From the Perspective of Content Creators

Authors: Radwa Mabrook

Abstract:

Virtual Reality (VR) content creation is a complex and expensive process that requires multi-disciplinary teams of content creators. Grant schemes from technology companies help media organisations explore VR’s potential in journalism and factual storytelling. Media organisations try to do as much as they can in-house, but they may outsource due to time constraints and skill availability. Journalists, game developers, sound designers and creative artists work together and bring in new cultures of work. This study explores the collaborative, experimental nature of VR content creation by tracing every actor involved in the process and examining their perceptions of VR work. The study builds on Actor Network Theory (ANT), which decomposes phenomena into their basic elements and traces the interrelations among them. The researcher conducted 22 semi-structured interviews with VR content creators between November 2017 and April 2018. Purposive and snowball sampling techniques allowed the researcher to recruit fact-based VR content creators from production studios and media organisations, as well as freelancers. Interviews lasted up to three hours and were a mix of Skype calls and in-person interviews. Participants consented to their interviews being recorded and to their names being revealed in the study. The researcher coded the interview transcripts in NVivo software, looking for key themes corresponding to the research questions. The study revealed that VR content creators must be adaptive to change, open to learning and comfortable with mistakes. The VR content creation process is highly iterative because VR has no established workflow or visual grammar. Multi-disciplinary VR team members often speak different professional languages, making communication hard. However, adaptive content creators perceive VR work as a fun experience and an opportunity to learn.
The traditional sense of competition and the strive for information exclusivity are now replaced by a strong drive for knowledge sharing. VR content creators are open to share their methods of work and their experiences. They target to build a collaborative network that aims to harness VR technology for journalism and factual storytelling. Indeed, VR is instilling collaborative and experimental cultures in journalism.

Keywords: collaborative culture, content creation, experimental culture, virtual reality

Procedia PDF Downloads 127
602 Electrocardiogram Signal Denoising Using a Hybrid Technique

Authors: R. Latif, W. Jenkal, A. Toumanari, A. Hatim

Abstract:

This paper presents an efficient method of electrocardiogram signal denoising based on a hybrid approach. Two techniques are combined to create an efficient denoising process. The first is an Adaptive Dual Threshold Filter (ADTF) and the second is the Discrete Wavelet Transform (DWT). The presented approach consists of three denoising steps: DWT decomposition, the ADTF step, and a highest-peaks correction step. The paper presents applications of the approach to electrocardiogram signals from the MIT-BIH database. The results of these applications are promising compared to other recently published techniques.
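The three-step pipeline (DWT decomposition, adaptive dual thresholding, peak correction) can be illustrated in miniature. The sketch below is not the authors' ADTF: it uses a single-level Haar transform in place of the full DWT, and a hypothetical fixed dual-threshold rule (`low` and `high` are illustrative placeholders, where the real filter adapts its thresholds to the signal):

```python
import math

def haar_dwt(signal):
    """Single-level Haar DWT: returns (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse single-level Haar DWT."""
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

def dual_threshold(coeffs, low, high):
    """Illustrative dual-threshold rule: zero out small detail coefficients
    (treated as noise), soft-threshold mid-range ones, and keep large ones
    (e.g. QRS peaks) intact."""
    out = []
    for c in coeffs:
        if abs(c) < low:
            out.append(0.0)
        elif abs(c) < high:
            out.append(math.copysign(abs(c) - low, c))
        else:
            out.append(c)
    return out

def denoise(signal, low=0.1, high=1.0):
    approx, detail = haar_dwt(signal)
    return haar_idwt(approx, dual_threshold(detail, low, high))
```

A real implementation would use several decomposition levels and data-driven thresholds; this sketch only shows the structure of threshold-in-the-wavelet-domain denoising.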

Keywords: hybrid technique, ADTF, DWT, thresholding, ECG signal

Procedia PDF Downloads 318
601 An Efficient Strategy for Relay Selection in Multi-Hop Communication

Authors: Jung-In Baik, Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song

Abstract:

This paper proposes an efficient relaying algorithm that obtains diversity to improve the reliability of a signal. The algorithm achieves time or space diversity gain by delivering multiple versions of the same signal through two routes. Relays are placed between a source and a destination. The routes between the source and destination are set adaptively in order to deal with different channels and noises. Each route consists of one or more relays, and the source transmits its signal to the destination through the routes. The signals from the relays are combined and detected at the destination. The proposed algorithm provides better performance than conventional algorithms in terms of bit error rate (BER).
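The BER benefit of combining two relayed copies of the same signal can be demonstrated with a toy simulation. This is a generic diversity-combining sketch under assumed BPSK modulation and Gaussian noise, not the authors' relay-selection algorithm; the noise level and equal-gain combiner are illustrative choices:

```python
import random

def transmit(bits, noise_std, rng):
    """BPSK over one route: each bit maps to ±1 plus Gaussian noise."""
    return [(1.0 if b else -1.0) + rng.gauss(0.0, noise_std) for b in bits]

def detect(samples):
    """Hard decision at the destination."""
    return [s > 0.0 for s in samples]

def combine(route_a, route_b):
    """Equal-gain combining of the two copies received via separate routes."""
    return [(a + b) / 2.0 for a, b in zip(route_a, route_b)]

rng = random.Random(1)
bits = [rng.random() > 0.5 for _ in range(5000)]
r1 = transmit(bits, 1.0, rng)   # copy via route 1
r2 = transmit(bits, 1.0, rng)   # copy via route 2 (independent noise)
ber_single = sum(d != b for d, b in zip(detect(r1), bits)) / len(bits)
ber_combined = sum(d != b for d, b in zip(detect(combine(r1, r2)), bits)) / len(bits)
```

Because the two routes carry independent noise, averaging halves the noise variance, so `ber_combined` comes out well below `ber_single`, which is the diversity gain the abstract refers to.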

Keywords: multi-hop, OFDM, relay, relaying selection

Procedia PDF Downloads 443
600 Using Probe Person Data for Travel Mode Detection

Authors: Muhammad Awais Shafique, Eiji Hato, Hideki Yaginuma

Abstract:

Recently, GPS data has been used in many studies to automatically reconstruct travel patterns for trip surveys. The aim is to minimize the use of questionnaire surveys and travel diaries so as to reduce their negative effects. In this paper, data acquired from the GPS receiver and accelerometer embedded in smartphones is utilized to predict the mode of transportation used by the phone carrier. For prediction, Support Vector Machine (SVM) and Adaptive Boosting (AdaBoost) are employed. Moreover, a unique method to improve the prediction results from these algorithms is also proposed. Results suggest that the prediction accuracy of AdaBoost after improvement is better than that of the other methods.
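As a rough illustration of the boosting side of such a method, the sketch below trains a minimal AdaBoost ensemble of decision stumps on hypothetical two-feature samples (mean speed, acceleration variability). The features, data and round count are assumptions for illustration, not the paper's actual setup:

```python
import math

def stump_predict(x, feature, threshold, polarity):
    """A decision stump: compare one feature against a threshold."""
    return polarity if x[feature] > threshold else -polarity

def train_stump(X, y, weights):
    """Exhaustively pick the stump minimising weighted classification error."""
    best = None
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            for pol in (1, -1):
                err = sum(w for row, label, w in zip(X, y, weights)
                          if stump_predict(row, f, t, pol) != label)
                if best is None or err < best[0]:
                    best = (err, f, t, pol)
    return best

def adaboost(X, y, rounds=10):
    """AdaBoost over decision stumps; labels are -1/+1."""
    n = len(X)
    weights = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, f, t, pol = train_stump(X, y, weights)
        err = max(err, 1e-10)
        if err >= 0.5:
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, f, t, pol))
        # up-weight misclassified samples, then renormalise
        weights = [w * math.exp(-alpha * label * stump_predict(row, f, t, pol))
                   for row, label, w in zip(X, y, weights)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, f, t, p) for a, f, t, p in ensemble)
    return 1 if score >= 0 else -1
```

With features like mean speed in m/s, even this toy ensemble separates walking from driving; the paper's contribution lies in the smartphone feature pipeline and the post-hoc improvement step, which are not shown here.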

Keywords: accelerometer, AdaBoost, GPS, mode prediction, support vector machine

Procedia PDF Downloads 358
599 A Digital Twin Approach to Support Real-time Situational Awareness and Intelligent Cyber-physical Control in Energy Smart Buildings

Authors: Haowen Xu, Xiaobing Liu, Jin Dong, Jianming Lian

Abstract:

Emerging smart buildings often employ cyberinfrastructure, cyber-physical systems, and Internet of Things (IoT) technologies to increase the automation and responsiveness of building operations for better energy efficiency and lower carbon emissions. These operations include the control of Heating, Ventilation, and Air Conditioning (HVAC) and lighting systems, which are often considered a major source of energy consumption in both commercial and residential buildings. Developing energy-saving control models for optimizing HVAC operations usually requires the collection of high-quality instrumental data from iterations of in-situ building experiments, which can be time-consuming and labor-intensive. This abstract describes a digital twin approach to automate building energy experiments for optimizing HVAC operations through the design and development of an adaptive web-based platform. The platform is created to enable (a) automated data acquisition from a variety of IoT-connected HVAC instruments, (b) real-time situational awareness through domain-based visualizations, (c) adaptation of HVAC optimization algorithms based on experimental data, (d) sharing of experimental data and model predictive controls through web services, and (e) cyber-physical control of individual instruments in the HVAC system using outputs from different optimization algorithms. Through the digital twin approach, we aim to replicate a real-world building and its HVAC systems in an online computing environment to automate the development of building-specific model predictive controls and collaborative experiments in buildings located in different climate zones in the United States. We present two case studies to demonstrate our platform's capability for real-time situational awareness and cyber-physical control of the HVAC systems in the flexible research platforms on the Oak Ridge National Laboratory (ORNL) main campus. 
Our platform is developed using adaptive and flexible architecture design, rendering the platform generalizable and extendable to support HVAC optimization experiments in different types of buildings across the nation.

Keywords: energy-saving buildings, digital twins, HVAC, cyber-physical system, BIM

Procedia PDF Downloads 107
598 State of the Art on the Recommendation Techniques of Mobile Learning Activities

Authors: Nassim Dennouni, Yvan Peter, Luigi Lancieri, Zohra Slama

Abstract:

The objective of this article is to provide a bibliographic study on the recommendation of mobile learning activities used as part of field trip scenarios. Indeed, recommendation systems are widely used in mobile contexts because they can suggest suitable learning activities. These systems should take into account the history of visits and the teacher's pedagogy to provide adaptive learning according to the instantaneous position of the learner. To achieve this objective, we review the existing literature on field trip scenarios that recommend mobile learning activities.
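A minimal position-aware recommender conveys the core idea of suggesting an activity from the learner's current location and visit history. Everything here (the POI names, the nearest-unvisited heuristic) is an illustrative assumption, not a method drawn from the surveyed literature:

```python
import math

def recommend(position, pois, visited):
    """Recommend the nearest point of interest the learner has not visited yet.
    `pois` maps POI name -> ((x, y), learning activity); all names are
    illustrative placeholders."""
    candidates = {name: info for name, info in pois.items() if name not in visited}
    if not candidates:
        return None
    name = min(candidates, key=lambda n: math.dist(position, candidates[n][0]))
    return name, candidates[name][1]
```

Real systems surveyed in this area layer collaborative filtering or ACO-style routing on top of such a spatial filter, and would also weigh the teacher's pedagogical ordering of activities.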

Keywords: mobile learning, field trip, mobile learning activities, collaborative filtering, recommendation system, point of interest, ACO algorithm

Procedia PDF Downloads 444
597 Privacy Policy Prediction for Uploaded Image on Content Sharing Sites

Authors: Pallavi Mane, Nikita Mankar, Shraddha Mazire, Rasika Pashankar

Abstract:

Content sharing sites are very useful for sharing information and images. However, with the increasing popularity of content sharing sites, privacy and security concerns have also increased. There is a need to develop a tool for controlling user access to shared content. Therefore, we are developing an Adaptive Privacy Policy Prediction (A3P) system which helps users create privacy settings for their images. We propose a two-level framework which assigns the best available privacy policy to a user's images according to the user's available history on the site.
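The two-level idea, classify the image first, then derive a policy from the user's history for that class, can be sketched as follows. The policy labels and fallback order are hypothetical, not the A3P system's actual rules:

```python
from collections import Counter

def predict_policy(image_class, history, default="private"):
    """Two-level sketch: level one has already classified the image into
    `image_class`; level two picks the policy the user most often applied
    to that class, falling back to their overall most common policy, then
    to a safe default. `history` is a list of (image_class, policy) pairs."""
    class_policies = [p for c, p in history if c == image_class]
    if class_policies:
        return Counter(class_policies).most_common(1)[0][0]
    if history:
        return Counter(p for _, p in history).most_common(1)[0][0]
    return default
```

Defaulting to the most restrictive setting when no history exists mirrors the privacy-conservative stance such a tool would need; the real system additionally mines image content and metadata.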

Keywords: online information services, prediction, security and protection, web based services

Procedia PDF Downloads 356
596 A Method for Processing Unwanted Target Caused by Reflection in Secondary Surveillance Radar

Authors: Khanh D. Do, Loi V. Nguyen, Thanh N. Nguyen, Thang M. Nguyen, Vu T. Tran

Abstract:

Along with the development of Secondary Surveillance Radar (SSR) in air traffic surveillance systems, multipath phenomena have always been a noticeable problem. This article discusses the geometrical and power aspects of multipath interference caused by reflection in SSR and proposes a method to deal with these unwanted multipath targets (ghosts) by predicting false-target positions and adaptively suppressing them. A field-experiment example is presented at the end of the article to demonstrate the efficiency of this measure.
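The geometric half of such an approach, predicting where a reflection ghost should appear, can be sketched by mirroring a confirmed target across the reflecting surface. The 2-D geometry and distance tolerance below are illustrative assumptions, not the paper's field parameters:

```python
import math

def mirror(point, plane_point, normal):
    """Reflect `point` across the plane through `plane_point` with unit `normal`."""
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, normal))
    return tuple(p - 2 * d * n for p, n in zip(point, normal))

def is_ghost(detection, real_targets, plane_point, normal, tol=50.0):
    """Flag a detection as a multipath ghost if it lies near the mirror image
    of a confirmed real target (tolerance in metres, illustrative)."""
    for t in real_targets:
        g = mirror(t, plane_point, normal)
        if math.dist(detection, g) < tol:
            return True
    return False
```

A practical suppressor would also compare reply power and timing, since a ghost arrives via the longer reflected path with reduced signal strength.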

Keywords: multipath, secondary surveillance radar, digital signal processing, reflection

Procedia PDF Downloads 159
595 Souk Waqif in Old Doha, Qatar: Cultural Heritage, Urban Regeneration, and Sustainability

Authors: Djamel Boussaa

Abstract:

Cultural heritage and tourism have become, during the last two decades, dynamic areas of development in the world. The idea of heritage is crucial to the critical decision-making process as to how irreplaceable resources are to be utilized by people of the present or conserved for future generations in a fast-changing world. In view of the importance of heritage to the development of a tourist destination, the need to develop appropriate adaptive reuse strategies cannot be overemphasized. In October 1999, the 12th general assembly of ICOMOS in Mexico stated that, in the context of sustainable development, two interrelated issues need urgent attention: cultural tourism, and historic towns and cities. These two issues underscore the fact that historic resources are non-renewable and belong to all of humanity. Without adequate adaptive reuse actions to ensure a sustainable future, these historic resources may vanish completely. The growth of tourism and its role in bringing cultural heritage to everyone is developing rapidly. According to the World Tourism Organization, natural and cultural heritage resources are and will remain motivating factors for travel in the foreseeable future. According to the experts, people choose travel destinations where they can learn about traditional and distinct cultures in their historic context. Qatar's rich urban heritage is now being recognized as a valuable resource for future development. This paper discusses the role of cultural heritage and tourism in regenerating Souk Waqif and, consequently, the city of Doha. Therefore, in order to use cultural heritage wisely, it will be necessary to position heritage as an essential element of sustainable development, giving particular attention to cultural heritage and tourism. The research methodology is based on an empirical survey of the situation, drawing on several visits, meetings and interviews with the local heritage players. 
The rehabilitation project initiated in 2004 is examined and assessed, providing a basis for proposing directions for a sustainable future for this historic landmark. Conservation for the sake of conservation appears to be an outdated concept. Many irreplaceable natural and cultural sites are being compromised because local authorities do not give economic consideration to the value of rehabilitating such sites. The question raised here is: 'How can cultural heritage be used wisely for tourism without compromising its social sustainability within the emerging global world?'

Keywords: cultural heritage, tourism, regeneration, economy, social sustainability

Procedia PDF Downloads 418
594 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows

Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid

Abstract:

Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics can be very demanding in both time and cost, and computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged in such a way that the model parameters can be evaluated from measured data. However, this approach is not always possible, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, an adaptive-control Ensemble Kalman Filter is implemented to optimally combine the observation data in order to obtain an accurate estimation of the topography. The main features of this method are, on the one hand, its ability to handle different complex geometries with no need for any rearrangement of the original model to rewrite it in an explicit form and, on the other hand, its strong stability in simulations of flows in different regimes containing shocks or discontinuities over any geometry. 
Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and observation locations. The obtained results demonstrate the high reliability and accuracy of the proposed techniques.
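The second-stage update can be illustrated with a bare-bones Ensemble Kalman Filter analysis step for a single scalar observation. The state layout and the observation operator `h` are placeholders, not the paper's shallow-water setup:

```python
import random

def enkf_update(ensemble, obs, obs_std, h, rng):
    """One EnKF analysis step for a scalar observation.
    `ensemble` is a list of state vectors (lists, e.g. bed elevations);
    `h` maps a state vector to its predicted observation (e.g. free-surface
    height). Shapes and names are illustrative."""
    n = len(ensemble)
    dim = len(ensemble[0])
    preds = [h(x) for x in ensemble]
    mean_x = [sum(x[i] for x in ensemble) / n for i in range(dim)]
    mean_p = sum(preds) / n
    # cross-covariance between state components and the predicted observation
    cov_xp = [sum((x[i] - mean_x[i]) * (p - mean_p)
                  for x, p in zip(ensemble, preds)) / (n - 1) for i in range(dim)]
    var_p = sum((p - mean_p) ** 2 for p in preds) / (n - 1)
    gain = [c / (var_p + obs_std ** 2) for c in cov_xp]  # Kalman gain
    # shift every member toward a perturbed copy of the observation
    return [[xi + g * (obs + rng.gauss(0.0, obs_std) - p) for xi, g in zip(x, gain)]
            for x, p in zip(ensemble, preds)]
```

Repeating this step as new free-surface observations arrive pulls the ensemble mean toward the bed topography consistent with the data, which is the role the filter plays in the second stage described above.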

Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil

Procedia PDF Downloads 128
593 Layouting Phase II of New Priok Using Adaptive Port Planning Frameworks

Authors: Mustarakh Gelfi, Tiedo Vellinga, Poonam Taneja, Delon Hamonangan

Abstract:

The development of New Priok/Kalibaru as an expansion of the old port is being carried out by IPC (Indonesia Port Corporation) together with its subsidiary, the port developer PT Pengembangan Pelabuhan Indonesia. Of the two phases proposed in the master plan, Phase I has taken shape, and Container Terminal 1 has been in operation since 2016. The development was principally planned as Phase I (2013-2018), consisting of 3 container terminals and 2 product terminals, and Phase II (2018-2023), consisting of 4 container terminals. In fact, the master plan has had to be changed due to some major uncertainties that escaped prediction. This study focuses on the design scenario of Phase II (2035 onwards) to deal with future uncertainty. The outcome is a robust design of Phase II of the Kalibaru Terminal that takes future changes into account. Flexibility has to be a major goal in such a large infrastructure project as New Priok in order to manage future uncertainty. The phasing of the project needs to be adapted and reviewed frequently before it becomes irrelevant to future challenges. One framework that has been developed by experts in port planning is Adaptive Port Planning (APP) with scenario-based planning. The idea behind the APP framework is that adaptation might be needed at any moment in answer to a challenge. It is a continuous procedure that basically aims to increase the lifespan of waterborne transport infrastructure by increasing flexibility in the planning, contracting and design phases. Other methods used in this study are brainstorming with the port authority, desk study, interviews and site visits to the project. The result of the study is expected to give the port authority of Tanjung Priok insight into the future outlook and how it will impact the design of the port, together with guidelines for designing in an uncertain environment. 
Flexibility solutions can be divided into: (1) physical solutions, covering all items related to hard infrastructure in the project, commonly involving modularity, standardization, multi-functionality, shorter or longer design lifetimes, reusability, etc.; and (2) non-physical solutions, usually related to the planning processes, decision making and management of the project. To conclude, the APP framework appears robust enough to deal with the problem of designing Phase II of the New Priok project over such a long period.

Keywords: Indonesia port, port's design, port planning, scenario-based planning

Procedia PDF Downloads 238
592 IP Management Tools, Strategies, Best Practices, and Business Models for Pharmaceutical Products

Authors: Nerella Srinivas

Abstract:

This study investigates the role of intellectual property (IP) management in pharmaceutical development, focusing on tools, strategies, and business models for leveraging IP effectively. Using a mixed-methods approach, we conducted case studies and qualitative analyses of IP management frameworks within the pharmaceutical sector. Our methodology included a review of IP tools tailored for pharmaceutical applications, strategic IP models for maximizing competitive advantages, and best practices for organizational efficiency. Findings emphasize the importance of understanding IP law and adopting adaptive strategies, illustrating how IP management can drive industry growth.

Keywords: intellectual property management, pharmaceutical products, IP tools, IP strategies, best practices, business models, innovation

Procedia PDF Downloads 11
591 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and predict the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to compute efficiently. Therefore, such models might not be applicable for the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the model must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general since it generates models for any system, detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables, not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. 
time, energy consumption, operational costs or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error of less than one percent. Moreover, the automatic identification of correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model and optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
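The distinguishing point, regressing on products of variables as well as single variables, can be sketched with an ordinary least-squares fit over the basis {1, a, b, a·b}. This toy normal-equations solver stands in for the authors' series-expansion machinery; the basis choice is an illustrative assumption:

```python
def design_row(a, b):
    """Feature vector including the product term a*b, not only single variables."""
    return [1.0, a, b, a * b]

def solve(A, y):
    """Least squares via the normal equations and Gaussian elimination."""
    m = len(A[0])
    # normal matrix N = A^T A and right-hand side g = A^T y
    N = [[sum(r[i] * r[j] for r in A) for j in range(m)] for i in range(m)]
    g = [sum(r[i] * yi for r, yi in zip(A, y)) for i in range(m)]
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(N[r][col]))
        N[col], N[piv] = N[piv], N[col]
        g[col], g[piv] = g[piv], g[col]
        for r in range(col + 1, m):
            f = N[r][col] / N[col][col]
            for c in range(col, m):
                N[r][c] -= f * N[col][c]
            g[r] -= f * g[col]
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = sum(N[r][c] * coeffs[c] for c in range(r + 1, m))
        coeffs[r] = (g[r] - s) / N[r][r]
    return coeffs
```

Fitting data generated from y = 2 + 3a - b + 0.5ab recovers all four coefficients, including the interaction term a plain single-variable regression would miss; the described method additionally selects which product terms to include automatically.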

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 408
590 Building and Tree Detection Using Multiscale Matched Filtering

Authors: Abdullah H. Özcan, Dilara Hisar, Yetkin Sayar, Cem Ünsalan

Abstract:

In this study, an automated building and tree detection method is proposed using DSM data and a true orthophoto image. Multiscale matched filtering is applied to the DSM data. First, the watershed transform is applied. Then, Otsu's thresholding method is used as an adaptive threshold to segment each watershed region. Detected objects are masked with the NDVI to separate buildings from trees. The proposed method is able to detect buildings and trees without requiring any elevation threshold. We tested our method on the ISPRS semantic labeling dataset and obtained promising results.
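Of the pipeline's steps, Otsu's thresholding is the most self-contained. A plain-Python version over a list of values (think: elevations within one watershed region; the bin count is an arbitrary choice) might look like this:

```python
def otsu_threshold(values, bins=256):
    """Otsu's method: choose the threshold that maximises between-class
    variance of the value histogram."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best, best_t = -1.0, 0
    w0 = sum0 = 0
    for t in range(bins):
        w0 += hist[t]               # weight of the lower class
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        w1 = total - w0             # weight of the upper class
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best:
            best, best_t = var_between, t
    return lo + (best_t + 0.5) * width
```

Because the threshold is recomputed per region, no global elevation cut-off is needed, which matches the abstract's claim; in the actual pipeline this runs on each watershed segment of the DSM.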

Keywords: building detection, local maximum filtering, matched filtering, multiscale

Procedia PDF Downloads 318
589 Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation

Authors: Mohammad Abu-Shaira, Weishi Shi

Abstract:

Adaptive learning, a commonly employed solution to drift, involves updating predictive models online during their operation to react to concept drifts, thereby serving as a critical component and natural extension for online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM (Long Short-Term Memory Stream Cruise Control Method), a drift adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, performing short-term optimization and model recalibration for immediate adjustments, and, when necessary, conducting long-term model recalibration to ensure deeper enhancements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, assessing their performance across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after significant degradation to model performance caused by drifts. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it an ideal choice for managing high-dimensional datasets and efficiently handling large-scale data. 
Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, and applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, rendering it a valuable solution for online regression models to address concept drift.
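As a point of contrast with LSTM-SCCM's proactive adaptation, a conventional reactive detector simply watches the error stream. The sketch below (not the authors' method) flags drift once the recent mean error exceeds a multiple of an initial baseline; the window and factor are arbitrary choices:

```python
from collections import deque

def drift_monitor(errors, window=30, factor=2.0):
    """Generic reactive drift detection: establish a baseline mean error on
    the first full window, then flag drift whenever the rolling mean error
    exceeds `factor` times that baseline."""
    recent = deque(maxlen=window)
    baseline = None
    flags = []
    for e in errors:
        recent.append(e)
        if baseline is None and len(recent) == window:
            baseline = sum(recent) / window
        drift = baseline is not None and (sum(recent) / len(recent)) > factor * baseline
        flags.append(drift)
    return flags
```

The weakness this illustrates is exactly what the abstract criticises: the flag only fires after performance has already degraded for much of a window, whereas a proactive framework quantifies and adapts to drift before that accumulated loss.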

Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression

Procedia PDF Downloads 10
588 Investigating the Influence of Activation Functions on Image Classification Accuracy via Deep Convolutional Neural Network

Authors: Gulfam Haider, Sana Danish

Abstract:

Convolutional Neural Networks (CNNs) have emerged as powerful tools for image classification, and the choice of optimizers profoundly affects their performance. The study of optimizers and their adaptations remains a topic of significant importance in machine learning research. While numerous studies have explored and advocated for various optimizers, the efficacy of these optimization techniques is still subject to scrutiny. This work aims to address the challenges surrounding the effectiveness of optimizers by conducting a comprehensive analysis and evaluation. The primary focus of this investigation lies in examining the performance of different optimizers when employed in conjunction with the popular activation function, Rectified Linear Unit (ReLU). By incorporating ReLU, known for its favorable properties in prior research, the aim is to bolster the effectiveness of the optimizers under scrutiny. Specifically, we evaluate the adjustment of these optimizers with both the original Softmax activation function and the modified ReLU activation function, carefully assessing their impact on overall performance. To achieve this, a series of experiments are conducted using a well-established benchmark dataset for image classification tasks, namely the Canadian Institute for Advanced Research dataset (CIFAR-10). The selected optimizers for investigation encompass a range of prominent algorithms, including Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD). The performance analysis encompasses a comprehensive evaluation of the classification accuracy, convergence speed, and robustness of the CNN models trained with each optimizer. Through rigorous experimentation and meticulous assessment, we discern the strengths and weaknesses of the different optimization techniques, providing valuable insights into their suitability for image classification tasks. 
By conducting this in-depth study, we contribute to the existing body of knowledge surrounding optimizers in CNNs, shedding light on their performance characteristics for image classification. The findings gleaned from this research serve to guide researchers and practitioners in making informed decisions when selecting optimizers and activation functions, thus advancing the state-of-the-art in the field of image classification with convolutional neural networks.
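The optimizers being compared differ chiefly in how they scale the gradient. The sketch below contrasts plain SGD with Adam on a one-dimensional quadratic; the learning rates and iteration count are arbitrary illustrative choices, not the paper's experimental settings:

```python
import math

def sgd_step(w, grad, lr=0.1):
    """Plain stochastic gradient descent: a fixed step along the gradient."""
    return w - lr * grad

def adam_step(w, grad, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update; `state` carries the running moment estimates (m, v, t)."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad            # first-moment estimate
    v = b2 * v + (1 - b2) * grad * grad     # second-moment estimate
    m_hat = m / (1 - b1 ** t)               # bias correction
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (math.sqrt(v_hat) + eps), (m, v, t)

# minimise f(w) = (w - 3)^2 with both optimizers, gradient f'(w) = 2(w - 3)
w_sgd, w_adam, state = 0.0, 0.0, (0.0, 0.0, 0)
for _ in range(200):
    w_sgd = sgd_step(w_sgd, 2 * (w_sgd - 3))
    w_adam, state = adam_step(w_adam, 2 * (w_adam - 3), state)
```

Adam's per-parameter scaling is what makes its interaction with the activation function (and hence the gradient statistics flowing back through ReLU versus Softmax layers) a meaningful variable in the study.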

Keywords: deep neural network, optimizers, RMsprop, ReLU, stochastic gradient descent

Procedia PDF Downloads 125
587 Cross-Layer Design of Event-Triggered Adaptive OFDMA Resource Allocation Protocols with Application to Vehicle Clusters

Authors: Shaban Guma, Naim Bajcinca

Abstract:

We propose an event-triggered algorithm for the solution of a distributed optimization problem by means of the projected subgradient method. Thereby, we invoke an OFDMA resource allocation scheme by applying an event-triggered sensitivity analysis at the access point. The optimal resource assignment of the subcarriers to the involved wireless nodes is carried out by considering the sensitivity analysis of the overall objective function as defined by the control of vehicle clusters with respect to the information exchange between the nodes.
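The two core primitives, a projected subgradient step and an event-triggering test, can be sketched as follows. The box constraints and change threshold are stand-ins for the actual OFDMA feasible set and the sensitivity-based trigger at the access point:

```python
def projected_subgradient_step(x, subgrad, step, lower, upper):
    """One projected subgradient step: move against the subgradient, then
    project back onto box constraints (an illustrative stand-in for the
    feasible resource-allocation set)."""
    return [min(max(xi - step * g, lo), hi)
            for xi, g, lo, hi in zip(x, subgrad, lower, upper)]

def should_trigger(x_new, x_last_sent, threshold):
    """Event-triggered communication: broadcast the new iterate to the other
    nodes only when it has changed significantly since the last transmission."""
    return max(abs(a - b) for a, b in zip(x_new, x_last_sent)) > threshold
```

Skipping transmissions when `should_trigger` is false is what saves wireless resources between the vehicle-cluster nodes, at the cost of running the optimization on slightly stale iterates.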

Keywords: consensus, cross-layer, distributed, event-triggered, multi-vehicle, protocol, resource, OFDMA, wireless

Procedia PDF Downloads 329