Search results for: fuzzy genetic network programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7319

4019 IEEE 802.15.4e Based Scheduling Mechanisms and Systems for Industrial Internet of Things

Authors: Ho-Ting Wu, Kai-Wei Ke, Bo-Yu Huang, Liang-Lin Yan, Chun-Ting Lin

Abstract:

With recent technological advances, the wireless sensor network (WSN) has become one of the most promising candidates for implementing the wireless industrial internet of things (IIoT) architecture. However, legacy IEEE 802.15.4 based WSN technologies such as Zigbee cannot meet the stringent QoS requirements of low-power, real-time, and highly reliable transmission imposed by the IIoT environment. Recently, the IEEE developed the IEEE 802.15.4e Time Slotted Channel Hopping (TSCH) access mode to serve this purpose. Furthermore, the IETF 6TiSCH working group has proposed standards to integrate IEEE 802.15.4e smoothly with the IPv6 protocol to form a complete protocol stack for IIoT. In this work, we develop key network technologies for an IEEE 802.15.4e based wireless IIoT architecture, focusing on practical design and system implementation. We realize an OpenWSN-based wireless IIoT system. The system architecture is divided into three main parts: web server, network manager, and sensor nodes. The web server provides the user interface, allowing the user to view the status of sensor nodes and instruct them to follow commands via a user-friendly browser. The network manager is responsible for the establishment, maintenance, and management of scheduling and topology information. It executes the centralized scheduling algorithm, sends the scheduling table to each node, and manages the sensing tasks of each device. Sensor nodes complete the assigned tasks and send the sensed data. Furthermore, to prevent scheduling errors due to packet loss, a schedule inspection mechanism is implemented to verify the correctness of the schedule table. In addition, when the network topology changes, the system generates a new schedule table based on the changed topology to ensure proper operation. To enhance the performance of such a system, we further propose dynamic bandwidth allocation and distributed scheduling mechanisms. The developed distributed scheduling mechanism enables each individual sensor node to build, maintain and manage the dedicated link bandwidth with its parent and child nodes based on locally observed information by exchanging Add/Delete commands via two processes. The first process, termed the schedule initialization process, allows each sensor node pair to identify the available idle slots and allocate the basic dedicated transmission bandwidth. The second process, termed the schedule adjustment process, enables each sensor node pair to adjust their allocated bandwidth dynamically according to the measured traffic loading. Such technology can satisfy the dynamic bandwidth requirements of frequently changing environments. Last but not least, we propose a packet retransmission scheme to enhance the performance of the centralized scheduling algorithm when the packet delivery rate (PDR) is low. The multi-frame retransmission mechanism allows every network node to resend each packet at least a predefined number of times, and the multi-frame architecture is built according to the number of layers of the network topology. Performance results via simulation reveal that such a retransmission scheme is able to provide sufficiently high transmission reliability while maintaining low packet transmission latency. Therefore, the QoS requirements of IIoT can be met.
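
As an illustration of the schedule adjustment process described above, the following minimal Python sketch shows how a node pair might issue Add/Delete commands based on measured traffic loading. The function name, thresholds and cell counts are illustrative assumptions, not the authors' OpenWSN implementation.

```python
# Simplified sketch of the schedule adjustment process described above. The thresholds,
# names and return convention are illustrative assumptions, not the OpenWSN code.

def adjust_schedule(allocated_cells, queued_packets, high_util=0.75, low_util=0.25):
    """Return an ('ADD'|'DELETE'|'KEEP', n_cells) command for one parent-child link.

    allocated_cells : dedicated TX cells currently scheduled on this link
    queued_packets  : packets generated for this link per slotframe (measured load)
    """
    if allocated_cells == 0:
        return ("ADD", 1)                      # always keep at least one dedicated cell
    utilization = queued_packets / allocated_cells
    if utilization > high_util:                # link is saturating: request more cells
        return ("ADD", max(1, queued_packets - allocated_cells))
    if utilization < low_util and allocated_cells > 1:
        return ("DELETE", 1)                   # over-provisioned: release one cell
    return ("KEEP", 0)

# Example: 4 cells allocated but 7 packets per slotframe measured -> ask for 3 more cells.
print(adjust_schedule(allocated_cells=4, queued_packets=7))
```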

Keywords: IEEE 802.15.4e, industrial internet of things (IIoT), scheduling mechanisms, wireless sensor networks (WSN)

Procedia PDF Downloads 144
4018 Target and Biomarker Identification Platform to Design New Drugs against Aging and Age-Related Diseases

Authors: Peter Fedichev

Abstract:

We studied fundamental aspects of aging to develop a mathematical model of the gene regulatory network. We show that aging manifests itself as an inherent instability of the gene network, leading to an exponential accumulation of regulatory errors with age. To validate our approach, we studied age-dependent omics data, such as transcriptomes and metabolomes, of different model organisms and humans. We built a computational platform based on our model to identify targets and biomarkers of aging in order to design new drugs against aging and age-related diseases. As biomarkers of aging, we choose the rate of aging and the biological age, since they completely determine the state of the organism. Since the rate of aging changes rapidly in response to external stress, this kind of biomarker can be useful as a tool for quantitative efficacy assessment of drugs and their combinations, dose optimization, chronic toxicity estimation, selection of personalized therapies, assessment of clinical endpoint achievement (within clinical research), and death risk assessment. Based on our model, we propose a method for identifying targets for further interventions against aging and age-related diseases. Being a biotech company, we offer a complete pipeline to develop an anti-aging drug candidate.

Keywords: aging, longevity, biomarkers, senescence

Procedia PDF Downloads 263
4017 Identification of 332G>A Polymorphism in Exon 3 of the Leptin Gene and Its Partial Effects on Body Size and Tail Dimension in Sanjabi Sheep

Authors: Roya Bakhtiar, Alireza Abdolmohammadi, Hadi Hajarian, Zahra Nikousefat, Davood Kalantar-Neyestanaki

Abstract:

The objective of the present study was to determine the polymorphism in the leptin gene (332G>A) and its association with biometric traits in Sanjabi sheep. For this purpose, blood samples from 96 rams were taken, and tail length, tail width, tail circumference, body length, body width, and height were recorded at the same time. PCR was performed using specific primers to amplify a 463 bp fragment including exon 3 of the leptin gene, and PCR products were digested with the CaiI restriction enzyme. The 332G>A substitution (at the 332nd nucleotide of exon 3 of the leptin gene), which causes an amino acid change from Arg to Gln, was detected by the CaiI (CAGNNNCTG) endonuclease, as the endonuclease cannot cut this region if a G nucleotide is located at this position. Three genotypes, GG (463 bp), GA (463, 360 and 103 bp) and AA (360 and 103 bp), were identified after digestion. The estimated frequencies of the GG, GA, and AA genotypes at the 332G>A locus were 0.68, 0.29 and 0.03, and the allele frequencies were 0.18 and 0.82 for A and G, respectively. In the current study, a chi-square test indicated that the 332G>A position did not deviate from Hardy–Weinberg (HW) equilibrium. The most likely reason for this agreement is that the samples used in this study belong to three large local herds with a traditional breeding system, random mating, and no selection. The Shannon index was calculated and represents an average level of genetic variation in Sanjabi rams. Also, the heterozygosity estimated by Nei's index indicated that the genetic diversity of this mutation in the leptin gene is moderate. The 332G>A leptin gene polymorphism had a significant effect on body length (P<0.05), and individuals with the GA genotype had significantly greater body length than other individuals. Although animals with the GA genotype had greater body width, this difference was not statistically significant (P>0.05). This non-synonymous SNP results in an amino acid change at codon position 111 (R/Q). As leptin activity is localized, at least in part, in domains between amino acid residues 106-140, it is speculated that the detected SNP at position 332 may affect the activity of leptin and may lead to different biological functions. Based on our results, and given the significant effect of the leptin gene polymorphism on body size traits, this gene may be used as a candidate gene for improving these traits.
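
The Hardy–Weinberg test reported above can be reproduced in outline with a short script. The genotype counts below are back-calculated from the reported frequencies (0.68, 0.29, 0.03 of 96 rams) and are therefore approximate; the sketch only illustrates the chi-square procedure, not the study's exact analysis.

```python
# Chi-square test for Hardy-Weinberg equilibrium at the 332G>A locus. Genotype counts
# are back-calculated from the reported frequencies (approximate).
from scipy.stats import chi2

obs = {"GG": 65, "GA": 28, "AA": 3}            # ~0.68, 0.29, 0.03 of 96 rams
n = sum(obs.values())
p_G = (2 * obs["GG"] + obs["GA"]) / (2 * n)    # ~0.82, matching the reported G frequency
p_A = 1 - p_G                                  # ~0.18

exp = {"GG": p_G**2 * n, "GA": 2 * p_G * p_A * n, "AA": p_A**2 * n}
chi_sq = sum((obs[g] - exp[g]) ** 2 / exp[g] for g in obs)
p_value = chi2.sf(chi_sq, df=1)                # df = genotypes - alleles = 3 - 2 = 1

print(f"G = {p_G:.2f}, A = {p_A:.2f}, chi2 = {chi_sq:.3f}, p = {p_value:.3f}")
# A p-value above 0.05 is consistent with the reported agreement with HW equilibrium.
```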

Keywords: body size, Leptin gene, PCR-RFLP, Sanjabi sheep

Procedia PDF Downloads 329
4016 Deep Learning for Recommender System: Principles, Methods and Evaluation

Authors: Basiliyos Tilahun Betru, Charles Awono Onana, Bernabe Batchakui

Abstract:

Recommender systems have become increasingly popular in recent years and are utilized in numerous areas. Nowadays, many web services provide a large amount of information to users, and recommender systems have been developed as a critical element of these web applications to predict user preferences and provide relevant recommendations. Given the advantage of deep learning in modeling different types of data, and because user preferences change dynamically, building a deep model can better capture user demand and further improve the quality of recommendation. In this paper, deep neural network models for recommender systems are evaluated. Most deep neural network models for recommender systems focus on the classical collaborative filtering user-item setting. Deep learning models have demonstrated that high-level features of complex data can be learned instead of relying on metadata, which can significantly improve recommendation accuracy. Even though deep learning has had a great impact in various areas, its application to recommender systems has not been fully exploited, and many improvements can still be made in both collaborative and content-based approaches while considering different contextual factors.
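
As a minimal sketch of the collaborative filtering user-item setting discussed above, the following PyTorch model learns user and item embeddings and scores pairs with a small MLP. It is a generic illustration; the layer sizes, names and implicit-feedback loss are assumptions and do not correspond to any specific architecture evaluated in the paper.

```python
# Minimal neural collaborative-filtering sketch (generic, not a model from the paper).
import torch
import torch.nn as nn

class NeuralCF(nn.Module):
    def __init__(self, n_users, n_items, dim=16):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)   # learned user representation
        self.item_emb = nn.Embedding(n_items, dim)   # learned item representation
        self.mlp = nn.Sequential(nn.Linear(2 * dim, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, users, items):
        x = torch.cat([self.user_emb(users), self.item_emb(items)], dim=-1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)   # predicted preference in [0, 1]

model = NeuralCF(n_users=1000, n_items=500)
users = torch.tensor([0, 1, 2])
items = torch.tensor([10, 20, 30])
labels = torch.tensor([1.0, 0.0, 1.0])                  # implicit feedback (clicked or not)
loss = nn.BCELoss()(model(users, items), labels)        # train with standard SGD/Adam
loss.backward()
```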

Keywords: big data, decision making, deep learning, recommender system

Procedia PDF Downloads 458
4015 Constitutive Flo1p Expression on Strains Bearing Deletions in Genes Involved in Cell Wall Biogenesis

Authors: Lethukuthula Ngobese, Abin Gupthar, Patrick Govender

Abstract:

The ability of yeast cell wall-derived mannoproteins (glycoproteins) to positively contribute to oenological properties has been a key factor stimulating research initiatives into these industrially important glycoproteins. In addition, and from a fundamental research perspective, yeast cell wall glycoproteins are involved in a wide range of biological interactions. To date, and to the best of our knowledge, our understanding of the fine molecular structure of these mannoproteins is fairly limited. Generally, the amino acid sequences of their protein moieties have been established from structural and functional analysis of the genomic sequences of these yeasts, whilst far less information is available on the glycosyl moieties of these mannoproteins. A novel strategy was devised in this study that entails the genetic engineering of yeast strains that over-express and release cell wall-associated glycoproteins into the liquid growth medium. To this end, the Flo1p mannoprotein was overexpressed in Saccharomyces cerevisiae laboratory strains bearing specific deletions in the KNR4 and GPI7 genes involved in cell wall biosynthesis, which have previously been shown to hyper-secrete cell wall-associated glycoproteins extracellularly. A polymerase chain reaction (PCR)-based cloning strategy was employed to generate transgenic yeast strains in which the native cell wall FLO1 glycoprotein-encoding gene is brought under the transcriptional control of the constitutive PGK1 promoter. The modified Helm's flocculation assay was employed to assess the flocculation intensities of Flo1p over-expressing wild-type and deletion-mutant strains as an indirect measure of their abilities to release the desired mannoprotein. The flocculation intensities of the transformed strains were assessed, and all the strains showed similar intensities (>98% flocculation). To assess whether mannoproteins were released into the growth medium, the supernatant of each strain was subjected to the BCA protein assay, and the transformed Δknr4 strain showed a considerable increase in protein levels. This study has the potential to produce mannoproteins in sufficient quantities that may be employed in future investigations to understand their molecular structures and mechanisms of interaction, to the benefit of both fundamental and industrial applications.

Keywords: glycoproteins, genetic engineering, flocculation, over-expression

Procedia PDF Downloads 400
4014 Innovations in the Lithium Value Chain

Authors: Fiúza A., Góis J., Leite M., Braga H., Lima A., Jorge P., Moutela P., Martins L., Futuro A.

Abstract:

Lepidolite is an important lithium mineral that, to the authors' best knowledge, has not been used to produce lithium hydroxide, which is needed for the transition to electric vehicles. Alkaline leaching of lithium concentrates allows a production flowsheet to be established that avoids most of the environmental drawbacks associated with the use of acid reagents. The tested processes involve a pretreatment by digestion at high temperatures with additives, followed by hot leaching at atmospheric pressure. The solutions obtained must be compatible with solutions from the leaching of spodumene concentrates, allowing the development of a common treatment flowsheet, an important accomplishment for the feasible exploitation of Portuguese resources. Statistical programming and interpretation techniques are used to minimize the laboratory effort required by conventional approaches and also to allow phenomenological comprehension.

Keywords: artificial intelligence, tailings free process, ferroelectric electrolyte battery, life cycle assessment

Procedia PDF Downloads 105
4013 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language

Authors: Wenjun Hou, Marek Perkowski

Abstract:

The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations of this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measures based on the number of oracle iterations, but to be able to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover's algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating Grover's algorithm with an oracle that finds a successively lower cost each time allows the decision problem to be transformed into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equiprobabilistic superposition by applying the Hadamard gate on each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, node index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including its special forms, the Feynman and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the Grover iteration an optimal number of times, a correct answer can be generated with very high probability. The oracle of Grover's algorithm is then modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
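
To make the resource estimate above concrete, the back-of-the-envelope sketch below computes the width of the edge-encoding register (N log₂ K qubits) and the usual Grover iteration count of roughly (π/4)·√(search space / solutions). It is a rough Python helper under assumed inputs, not the authors' Q# circuit.

```python
# Back-of-the-envelope resource estimate for Grover search on the bounded-degree TSP.
# This mirrors the encoding described above (N nodes, K-bounded degree); it is not
# the authors' Q# implementation.
import math

def grover_resources(n_nodes, k_degree, n_solutions=1):
    qubits = n_nodes * math.ceil(math.log2(k_degree))   # edge-choice register width
    search_space = 2 ** qubits                           # states in uniform superposition
    iterations = math.floor((math.pi / 4) * math.sqrt(search_space / n_solutions))
    return qubits, iterations

qubits, iters = grover_resources(n_nodes=6, k_degree=4)
print(f"{qubits} data qubits, ~{iters} Grover iterations per oracle threshold")
# Repeating the search with successively lower cost thresholds turns the decision
# oracle into a minimization procedure, as described in the abstract.
```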

Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language

Procedia PDF Downloads 171
4012 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems

Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas

Abstract:

This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly monitoring driver work have become a problem. In some situations, the number of vehicles passing a given point per unit of time can be evaluated for drivers. The collected data are mainly used to establish new trips. The flow of data is more complex in urban areas, where the movement of freight is reported in detail, including information at the street level. When traffic density is extremely high, as in congestion cases, and traffic speed is very low, data transmission reaches its peak. Different data sets are generated, depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last includes different modes, in particular railways and other networks. When freight delivery is switched from one of the above-stated network types to another, more data could be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers that includes the assessment of the multi-component infrastructure needed for freight delivery according to the network type. The construction of such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting. The results obtained show the potential of the proposed methodological approach to support management and decision-making processes, with the functionality of incorporating networking specifics, by helping to minimize the overloads in data reporting.

Keywords: transportation networks, freight delivery, data flow, monitoring, e-services

Procedia PDF Downloads 112
4011 Morphological Parameters and Selection of Turkish Edible Seed Pumpkins (Cucurbita pepo L.) Germplasm

Authors: Onder Turkmen, Musa Seymen, Sali Fidan, Mustafa Paksoy

Abstract:

There is a need in Turkey for registered edible seed pumpkin cultivars suitable for consumption. A total of 81 genotypes collected by the researchers in 2005, originating from Eskisehir, Konya, Nevsehir, Tekirdag, Sakarya, Kayseri and Kirsehir provinces, were utilized. The genetic materials were advanced to the S5 generation by the research groups between 2006 and 2010. In this research, some of the morphological features of the genotypes that reached the S5 stage are presented, and a scale for the selection of promising genotypes was generated. Results showed that the A-1 (420), A-7 (410), A-8 (420), A-32 (420), B-17 (410), B-24 (410), B-25 (420), B-33 (400), C-24 (420), C-25 (410), C-26 (410) and C-30 (420) genotypes are expected to be promising varieties.

Keywords: candidate cultivar, edible seed pumpkin, morphologic parameters, selection

Procedia PDF Downloads 358
4010 STML: Service Type-Checking Markup Language for Services of Web Components

Authors: Saqib Rasool, Adnan N. Mian

Abstract:

Web components are introduced as the latest HTML5 standard for writing modular web interfaces, ensuring maintainability through the isolated scope of each component. Reusability can also be achieved by sharing plug-and-play web components that can be used as off-the-shelf components by other developers. A web component encapsulates all the required HTML, CSS and JavaScript code as a standalone package, which must be imported in order to integrate the web component within an existing web interface. This is then followed by integrating the web component with web services for dynamically populating its content. Since web components are reusable as off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service's behavior can be verified through type checking, which is one of the popular solutions for improving code quality in many programming languages. However, HTML does not provide type checking, as it is a markup language and not a programming language. The contribution of this work is to introduce a new extension of HTML called Service Type-checking Markup Language (STML) for adding type-checking support in HTML for JSON-based REST services. STML can be used for defining the expected data types of responses from JSON-based REST services, which will be used for populating the content within the HTML elements of a web component. Although JSON has five data types, viz. string, number, boolean, object and array, STML supports only string, number and boolean, because both object and array are treated as strings when populated in HTML elements. In order to define the data type of any HTML element, the developer just needs to add the custom STML attributes st-string, st-number or st-boolean for string, number and boolean, respectively. These STML annotations are added by the developer who writes a web component, and they enable other developers to use automated type checking for ensuring the proper integration of their REST services with the same web component. Two utilities have been written for developers who are using STML-based web components. The first is used for automated type checking during the development phase; it uses the browser console to show an error description if an integrated web service does not return a response with the expected data type. The other utility is a Gulp-based command-line tool for removing the STML attributes before going into production, which ensures the delivery of STML-free web pages in the production environment. Both of these utilities have been tested for performing type checking of REST services through STML-based web components, and the results have confirmed the feasibility of evaluating service behavior through HTML only. Currently, STML is designed for automated type checking of integrated REST services, but it can be extended to introduce a complete service testing suite based on HTML only, transforming STML from a Service Type-checking Markup Language into a Service Testing Markup Language.
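
The development-time utility described above essentially compares a JSON response against the type declared by an st-* attribute. The sketch below re-implements that check in Python purely for illustration; the attribute names come from the abstract, but the function and its behaviour are assumptions rather than the actual browser or Gulp utilities.

```python
# Illustrative re-implementation of the STML type check in Python; the real utilities run
# in the browser console and as a Gulp task, so this sketch only mirrors the idea.
import json

ST_TYPES = {"st-string": str, "st-number": (int, float), "st-boolean": bool}

def check_binding(st_attribute, json_field, response_text):
    """Return True if the JSON field matches the type declared by the STML attribute."""
    value = json.loads(response_text).get(json_field)
    if st_attribute == "st-number":
        ok = isinstance(value, (int, float)) and not isinstance(value, bool)
    else:
        ok = isinstance(value, ST_TYPES[st_attribute])
    if not ok:
        print(f"STML type error: field '{json_field}' does not satisfy {st_attribute}")
    return ok

# Example: an element annotated with st-number bound to the 'price' field of a response.
check_binding("st-number", "price", '{"price": "19.99"}')   # fails: string, not number
check_binding("st-number", "price", '{"price": 19.99}')     # passes
```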

Keywords: REST, STML, type checking, web component

Procedia PDF Downloads 238
4009 Development of Structural Deterioration Models for Flexible Pavement Using Traffic Speed Deflectometer Data

Authors: Sittampalam Manoharan, Gary Chai, Sanaul Chowdhury, Andrew Golding

Abstract:

The primary objective of this paper is to present a simplified approach to developing structural deterioration models for flexible pavements using traffic speed deflectometer data. Maintaining assets merely to meet functional performance is neither economical nor sustainable in the long term, and it ends up requiring much greater investment from road agencies and extra costs for road users. Performance models have to include both structural and functional predictive capabilities in order to assess needs and the time frame of those needs. As such, structural modelling plays a vital role in the prediction of pavement performance. Structural condition is important for predicting the remaining life and overall health of a road network and also has a major influence on the valuation of road pavements. Therefore, the structural deterioration model is a critical input into a pavement management system for accurately predicting pavement rehabilitation needs. The Traffic Speed Deflectometer (TSD) is a vehicle-mounted Doppler laser system that is capable of continuously measuring the structural bearing capacity of a pavement whilst moving at traffic speeds. The device's high accuracy, high speed, and continuous deflection profiles are useful for network-level applications such as predicting road rehabilitation needs and remaining structural service life. The methodology adopted in this model utilizes time-series TSD maximum deflection (D0) data in conjunction with rutting, rutting progression, pavement age, subgrade strength and equivalent standard axle (ESA) data. Regression analyses were then undertaken to establish a correlation equation for structural deterioration as a function of rutting, pavement age, seal age and equivalent standard axles (ESA). This study developed a simple structural deterioration model which will enable available TSD structural data to be incorporated into pavement management systems for developing network-level pavement investment strategies. Therefore, the available funding can be used effectively to minimize the whole-of-life cost of the road asset and also improve pavement performance. This study will contribute to narrowing the knowledge gap in the use of structural data in network-level investment analysis and provide a simple methodology for using structural data effectively in the investment decision-making process for road agencies managing aging road assets.
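
The regression step described above, which relates the TSD maximum deflection D0 to rutting, pavement age, seal age and cumulative ESA, can be sketched with ordinary least squares as shown below. The data array is a placeholder; variable choices and units are assumptions, not the study's dataset or calibrated model.

```python
# Hedged sketch of the regression step: D0 as a function of rutting, pavement age,
# seal age and cumulative ESA. The numbers are placeholders, not the study's data.
import numpy as np

# columns: rutting (mm), pavement age (yr), seal age (yr), ESA (millions)
X = np.array([[3.1, 12, 4, 1.2],
              [5.4, 20, 9, 2.5],
              [2.2,  8, 3, 0.8],
              [6.8, 25, 11, 3.1],
              [4.0, 15, 6, 1.9]])
d0 = np.array([310, 520, 260, 610, 400])       # TSD maximum deflection (microns)

A = np.hstack([np.ones((X.shape[0], 1)), X])   # add intercept column
coef, *_ = np.linalg.lstsq(A, d0, rcond=None)  # ordinary least squares fit

print("intercept and coefficients:", np.round(coef, 2))
pred = A @ coef                                 # predicted D0 for the calibration data
print("R^2:", 1 - np.sum((d0 - pred)**2) / np.sum((d0 - d0.mean())**2))
```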

Keywords: adjusted structural number (SNP), maximum deflection (D0), equivalent standard axle (ESA), traffic speed deflectometer (TSD)

Procedia PDF Downloads 138
4008 Development of a Forecast-Supported Approach for the Continuous Pre-Planning of Mandatory Transportation Capacity for the Design of Sustainable Transport Chains: A Literature Review

Authors: Georg Brunnthaller, Sandra Stein, Wilfried Sihn

Abstract:

Transportation service providers are facing increasing volatility concerning future transport demand. Short-term planning horizons and planning uncertainties lead to reduced capacity utilization and increasing empty mileage. To overcome these challenges, a model is proposed to continuously pre-plan future transportation capacity in order to redesign and adjust the intermodal fleet accordingly. It is expected that the model will enable logistics service providers to organize more economically and ecologically sustainable transport chains in a more flexible way. To further describe these planning aspects, this paper gives a structured overview of transportation planning problems. The focus is on the strategic and tactical planning levels, comprising the relevant fleet-sizing, service-network-design and choice-of-carrier problems. Models and their solution techniques are presented, and the literature review is concluded with an outlook on our future research directions.

Keywords: freight transportation planning, multimodal, fleet-sizing, service network design, choice of transportation mode, review

Procedia PDF Downloads 300
4007 Social Structure of Corporate Social Responsibility Programme in Pantai Harapan Jaya Village, Bekasi Regency, West Java

Authors: Auliya Adzilatin Uzhma, Ismu Rini Dwi, I. Nyoman Suluh Wijaya

Abstract:

The Corporate Social Responsibility (CSR) programme in Pantai Harapan Jaya village consists of mangrove cultivation and the distribution of fishery capital; to achieve its goal, the CSR programme needs the participation of the community. Moeliono, in Fahrudin (2011), mentioned that community participation is based on intrinsic reasons coming from within a person and extrinsic reasons coming from others related to that person. The fundamental pattern of connections that places boundaries on the actions an organization can take is called the social structure. The purpose of this research is to determine the form of public participation and the social structure typology of the villagers and of the people who participated in the CSR programme. The key actors in the community and among the participants can also be identified. This research uses the Social Network Analysis method, based on the rate of participation, density and centrality. The results show that the people involved in the programme live in Dusun Pondok Dua and work in the fisheries sector. The density value among the participants is 0.516, which means that 51.6% of the participants are involved in the same steps of the CSR programme.
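
The network measures quoted above (density and centrality) can be computed with standard SNA tooling, as in the sketch below. The toy participant-to-activity ties are invented for illustration; only the interpretation of density mirrors the study.

```python
# Hedged sketch of the Social Network Analysis measures used above (density, centrality).
# The toy participant-to-programme-step ties are invented for illustration.
import networkx as nx

G = nx.Graph()
edges = [("A", "mangrove"), ("B", "mangrove"), ("C", "capital"),
         ("A", "capital"), ("D", "mangrove"), ("B", "capital")]
G.add_edges_from(edges)

print("density:", round(nx.density(G), 3))                 # share of possible ties present
print("degree centrality:", nx.degree_centrality(G))       # identifies key actors
# In the study, a density of 0.516 among participants means 51.6% of possible
# ties exist, i.e. participants tend to be involved in the same programme steps.
```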

Keywords: social structure, social network analysis, corporate social responsibility, public participation

Procedia PDF Downloads 465
4006 Bilateral Telecontrol of AutoMerlin Mobile Robot Using Time Domain Passivity Control

Authors: Aamir Shahzad, Hubert Roth

Abstract:

This paper presents the bilateral telecontrol of the AutoMerlin mobile robot in the presence of communication delay. Passivity observers have been designed to monitor the net energy at both ports of a two-port network; if one or both ports become active, making the net energy negative, the passivity controllers dissipate the appropriate amount of energy to keep the overall system passive in the presence of time delay. The environment force is modeled and sent back to the human operator so that s/he can feel it and gain additional information about the environment in the vicinity of the mobile robot. Experimental results are presented to show the performance and stability of the bilateral controller. The results show that whenever the passivity observers detect active behavior, the passivity controllers come into action to neutralize it and keep the overall system passive.
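
A minimal sketch of the time-domain passivity observer and controller described above is given below for a single port. The variable names, sample data and damping form are illustrative assumptions based on the standard formulation, not the AutoMerlin implementation.

```python
# Hedged sketch of a time-domain passivity observer and controller for one port.
def passivity_observer(forces, velocities, dt):
    """Net energy observed at a port up to the current sample (positive = passive)."""
    return dt * sum(f * v for f, v in zip(forces, velocities))

def passivity_controller(energy, velocity, dt):
    """Adaptive damping that dissipates exactly the observed active energy."""
    if energy < 0 and abs(velocity) > 1e-9:
        return -energy / (dt * velocity**2)   # extra damping coefficient alpha
    return 0.0                                # port is passive: no correction needed

dt = 0.001
forces = [1.0, 1.2, -0.8, -1.5]               # placeholder force samples (N)
velocities = [0.2, 0.25, 0.3, 0.35]           # placeholder velocity samples (m/s)
E = passivity_observer(forces, velocities, dt)
alpha = passivity_controller(E, velocities[-1], dt)
print(f"observed energy {E:.5f} J, added damping {alpha:.3f} N.s/m")
```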

Keywords: bilateral control, human operator, haptic device, communication network, time domain passivity control, passivity observer, passivity controller, time delay, mobile robot, environment force

Procedia PDF Downloads 371
4005 Examining the Performance of Three Multiobjective Evolutionary Algorithms Based on Benchmarking Problems

Authors: Konstantinos Metaxiotis, Konstantinos Liagkouras

Abstract:

The objective of this study is to examine the performance of three well-known multiobjective evolutionary algorithms for solving optimization problems. The first algorithm is the Non-dominated Sorting Genetic Algorithm-II (NSGA-II), the second is the Strength Pareto Evolutionary Algorithm 2 (SPEA-2), and the third is the Multiobjective Evolutionary Algorithm based on Decomposition (MOEA/D). The examined multiobjective algorithms are analyzed and tested on the ZDT set of test functions using three performance metrics. The results indicate that NSGA-II performs better than the other two algorithms on these performance metrics.
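
For reference, the sketch below defines the ZDT1 benchmark used in such comparisons and its analytical Pareto front; the three MOEAs would be run against functions of this family and scored with metrics such as hypervolume or inverted generational distance. The code is a generic illustration, not the authors' experimental setup.

```python
# Hedged sketch: the ZDT1 benchmark and its analytical Pareto front.
import numpy as np

def zdt1(x):
    """ZDT1 test function: x is a vector in [0, 1]^n; returns objectives (f1, f2)."""
    f1 = x[0]
    g = 1 + 9 * np.mean(x[1:])
    f2 = g * (1 - np.sqrt(f1 / g))
    return f1, f2

# True Pareto front of ZDT1: g = 1, so f2 = 1 - sqrt(f1) for f1 in [0, 1].
f1 = np.linspace(0, 1, 5)
front = [(v, 1 - np.sqrt(v)) for v in f1]
print("sample point:", zdt1(np.random.rand(30)))
print("true front samples:", [(round(a, 2), round(b, 2)) for a, b in front])
```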

Keywords: MOEAs, multiobjective optimization, ZDT test functions, evolutionary algorithms

Procedia PDF Downloads 450
4004 Networking: The Biggest Challenge in Hybrid Cloud Deployment

Authors: Aishwarya Shekhar, Devesh Kumar Srivastava

Abstract:

Cloud computing has emerged as a promising direction for cost-efficient and reliable service delivery across data communication networks. The dynamic location of service facilities and the virtualization of hardware and software elements are stressing communication networks and protocols, especially when data centres are interconnected through the internet. Although the computing aspects of cloud technologies have been widely investigated, less attention has been devoted to the networking services. Cloud computing has enabled elastic and transparent access to infrastructure services without involving IT operating overhead. Virtualization has been a key enabler for cloud computing. While resource virtualization and service abstraction have been widely investigated, networking in the cloud remains a difficult puzzle. Even though the network plays a significant role in facilitating hybrid cloud scenarios, it had not received much attention in the research community until recently. We propose Network as a Service (NaaS), which forms the basis for unifying public and private clouds. In this paper, we identify various challenges in the adoption of hybrid clouds and discuss the design and implementation of a cloud platform.

Keywords: cloud computing, networking, infrastructure, hybrid cloud, OpenStack, NaaS

Procedia PDF Downloads 405
4003 ‘Nature Will Slow You Down for a Reason’: Virtual Elder-Led Support Services during COVID-19

Authors: Grandmother Roberta Oshkawbewisens, Elder Isabelle Meawasige, Lynne Groulx, Chloë Hamilton, Lee Allison Clark, Dana Hickey, Wansu Qiu, Jared Leedham, Nishanthini Mahendran, Cameron Maclaine

Abstract:

In March 2020, the world suddenly shifted with the onset of the COVID-19 pandemic; in-person programs and services were unavailable, and a scramble to shift to virtual service delivery began. The Native Women’s Association of Canada (NWAC) established virtual programming through the Resiliency Lodge model and connected with Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people across Turtle Island and Inuit Nunangat through programs that provide a safe space to slow down and reflect on their lives, environment, and well-being. To continue to grow the virtual Resiliency Lodge model, NWAC needed to develop an understanding of three questions: how COVID-19 affects Elder-led support services, how Elder-led support services have adapted during the pandemic, and what Wise Practices need to be implemented to continue to develop, refine, and evaluate virtual Elder-led support services specifically for Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people. Through funding from the Canadian Institutes of Health Research (CIHR), NWAC gained deeper insight into these questions and developed a series of key findings and recommendations that are outlined throughout this report. The goals of this project are to contribute to a more robust participatory analysis that reflects the complexities of Elder-led virtual cultural responses and the impacts of COVID-19 on Elder-led support services; to develop culturally and contextually meaningful virtual protocols and wise practices for virtual Indigenous-led support; and to develop an evaluation strategy to improve the capacity of the Resiliency Lodge model. Significant findings from the project include the following: Resiliency Lodge programs, especially crafting and business sessions, have provided participants with a sense of community and contributed to healing and wellness; Elder-led support services need greater and more stable funding to offer more workshops to more Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people; and Elder- and Indigenous-led programs play a significant role in healing and in building a sense of purpose and belonging among Indigenous people. Ultimately, the findings and recommendations outlined in this research project help to guide future Elder-led virtual support services and emphasize the critical need to increase access to Elder-led programming for Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people.

Keywords: Indigenous women, traditional healing, virtual programs, COVID-19

Procedia PDF Downloads 114
4002 South Asia’s Political Landscape: Precipitating Terrorism

Authors: Saroj Kumar Rath

Abstract:

India's Muslims represent 15 percent of the nation's population, the world's third largest Muslim population after Indonesia and Pakistan. Extremist groups like the Islamic State, Al Qaeda, the Taliban and the Haqqani network increasingly view India as a target. Several trends explain the rise: terrorism threats in South Asia are linked and mobile; if one source is batted down, jihadists relocate to find another Islamic cause. As NATO withdraws from Afghanistan, some jihadists will eye India. Pakistan regards India as a top enemy, and some officials even encourage terrorists to target areas like Kashmir or Mumbai. Meanwhile, a stream of Wahhabi preachers has visited India offering hard-line messages, extremist groups like Al Qaeda and the Islamic State compete for influence, and militants even pay jihadists. Muslims as a minority population in India could offer fertile ground for extremist recruiters. This paper argues that there is an urgent need for the Indian government to profile militants and examine social media sites to counter Wahhabi indoctrination while supporting education and entrepreneurship for all of India's citizens.

Keywords: Al Qaeda, terrorism, Islamic State, India, Haqqani network, Pakistan, Taliban

Procedia PDF Downloads 599
4001 Human Action Recognition Using Wavelets of Derived Beta Distributions

Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel

Abstract:

In the framework of enhancing human-machine interaction systems, we focus in this paper on human behavior analysis and action recognition. Human behavior is characterized by a duality of actions and reactions (movements, psychological changes, verbal and emotional expression). It is worth noting that much information is hidden behind gestures, sudden motions, point trajectories and speeds, and many research works have treated this as an information retrieval issue. In our work, we focus on motion extraction, tracking and action recognition using wavelet network approaches. Our contribution uses human silhouette extraction by a Gaussian Mixture Model (GMM) and body movement analysis through trajectory models of motion constructed from a Kalman filter. These models allow noise to be removed by extracting the main motion features and constitute a stable basis for identifying the evolution of human activity. Each modality is then used to recognize a human action using the wavelets of derived beta distributions approach. The proposed approach has been validated successfully on subsets of the KTH and UCF Sports databases.
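
The first two stages described above, GMM-based foreground extraction and Kalman-filter tracking of the body centroid, can be sketched with OpenCV as shown below. The video path, noise covariances and centroid-based measurement are assumptions for illustration; the wavelet-network classifier itself is not shown.

```python
# Sketch of the first two stages: GMM foreground extraction and Kalman-filter tracking
# of the subject's centroid with OpenCV. "video.avi" and the noise settings are placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("video.avi")
bg = cv2.createBackgroundSubtractorMOG2()            # GMM-based background model

kf = cv2.KalmanFilter(4, 2)                          # state: x, y, vx, vy; measured: x, y
kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

trajectory = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                           # foreground (moving subject) mask
    prediction = kf.predict()                        # predicted centroid for this frame
    ys, xs = np.nonzero(mask > 200)
    if len(xs):                                      # correct with the measured centroid
        kf.correct(np.array([[np.float32(xs.mean())], [np.float32(ys.mean())]]))
    trajectory.append((float(prediction[0, 0]), float(prediction[1, 0])))
cap.release()
# The filtered trajectory would then feed the wavelet-based action classifier.
```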

Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet

Procedia PDF Downloads 399
4000 Component-Based Approach in Assessing Sewer Manholes

Authors: Khalid Kaddoura, Tarek Zayed

Abstract:

Sewer networks are constructed to protect communities and the environment from any contact with sewage. Pipelines, whether laterals or sewer mains, and manholes form a huge underground infrastructure in every urban city. Owing to the importance of sewer networks, the infrastructure asset management field has seen extensive advancement in condition assessment and rehabilitation decision models. However, most of the focus has been devoted to pipelines, with little attention given to manhole condition assessment. Only recently have studies started to emerge in this area to preserve manholes from malfunction. Therefore, the main objective of this study is to propose a condition assessment model for sewer manholes. The model divides the manhole into several components and determines the relative importance weight of each component using the Analytic Network Process (ANP) decision-making method. The condition of the manhole is then computed by aggregating the condition of each component with its corresponding weight. Accordingly, the proposed assessment model provides decision-makers with a final index suggesting the overall condition of the manhole, together with a backward analysis to check the condition of each component. Consequently, better decisions can be made concerning maintenance, rehabilitation, and replacement actions.
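
The aggregation step described above can be illustrated with a short sketch in which ANP-derived weights combine component conditions into a single manhole index. The component names, weights and the 1-5 condition scale below are illustrative assumptions, not the paper's calibrated values.

```python
# Hedged sketch of the aggregation step: component conditions (1 = excellent, 5 = failed)
# are combined with ANP-derived relative importance weights. Names and weights are
# illustrative assumptions.
component_weights = {"cover": 0.10, "frame": 0.10, "chimney": 0.15,
                     "wall": 0.30, "bench": 0.15, "channel": 0.20}   # must sum to 1

def manhole_condition_index(component_conditions, weights=component_weights):
    """Weighted aggregation of component conditions into one manhole index."""
    assert abs(sum(weights.values()) - 1.0) < 1e-6
    return sum(weights[c] * component_conditions[c] for c in weights)

inspected = {"cover": 2, "frame": 1, "chimney": 3, "wall": 4, "bench": 2, "channel": 3}
index = manhole_condition_index(inspected)
print(f"overall condition index: {index:.2f} (backward analysis: wall contributes "
      f"{component_weights['wall'] * inspected['wall']:.2f})")
```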

Keywords: Analytic Network Process (ANP), condition assessment, decision-making, manholes

Procedia PDF Downloads 334
3999 Cluster-Based Exploration of System Readiness Levels: Mathematical Properties of Interfaces

Authors: Justin Fu, Thomas Mazzuchi, Shahram Sarkani

Abstract:

A key factor in technological immaturity in defense weapons acquisition is a lack of understanding of critical integrations at the subsystem and component level. To address this shortfall, recent research combines the integration readiness level (IRL) with the technology readiness level (TRL) to form a system readiness level (SRL). The SRL can be enriched with more robust quantitative methods to provide the program manager with a useful tool prior to committing to major weapons acquisition programs. This research harnesses previous mathematical models based on graph theory, Petri nets, and tropical algebra and proposes a modification of the desirable SRL mathematical properties such that a tightly integrated subsystem (with a multitude of interfaces) can display a lower SRL than an inherently less coupled subsystem. The synthesis of these methods informs an improved decision tool for the program manager before committing to expensive technology development. This research also ties the separately developed manufacturing readiness level (MRL) into the network representation of the system and addresses shortfalls in previous frameworks, including the lack of integration weighting and the over-importance of a single extremely immature component. Tropical algebra (based on the minimum of a set of TRLs or IRLs) allows one low IRL or TRL value to diminish the SRL of the entire system, which may not reflect reality if that component is not critical or tightly coupled. Integration connections can therefore be weighted according to importance, and readiness levels are modified to a cardinal scale (based on an analytic hierarchy process). The importance of an integration arc depends on the connected nodes and on the additional integration arcs connected to those nodes. Lack of integration is not represented by zero, but by a perfect integration maturity value; naturally, the importance (or weight) of such an arc would be zero. To further explore the impact of grouping subsystems, a multi-objective genetic algorithm is then used to find clusters or communities that can be optimized for the most representative subsystem SRL. This novel calculation is then benchmarked through simulation and using past defense acquisition program data, focusing on the newly introduced Middle Tier of Acquisition (rapid prototyping and fielding). The model remains a relatively simple, accessible tool, but with higher fidelity, validated with past data, for the program manager to use in deciding major defense acquisition program milestones.
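
The contrast between a purely tropical (minimum-based) SRL and the weighted scheme described above can be illustrated with a small sketch. The readiness values, arcs and weights below are invented; the point is only that an immature but lightly weighted interface no longer dominates the aggregate.

```python
# Hedged sketch contrasting a pure tropical (min-based) SRL with a weighted aggregation
# over integration arcs, as described above. All values are invented for illustration.
trl = {"A": 7, "B": 4, "C": 8}                      # component technology readiness
irl = {("A", "B"): 6, ("B", "C"): 3, ("A", "C"): 9} # integration readiness per arc
arc_weight = {("A", "B"): 0.5, ("B", "C"): 0.1, ("A", "C"): 0.4}  # AHP-style importance

# Tropical-algebra view: one immature element drags the whole subsystem down.
srl_tropical = min(min(trl.values()), min(irl.values()))

# Weighted view: the immature but unimportant B-C arc has limited influence.
arc_scores = {a: min(irl[a], trl[a[0]], trl[a[1]]) for a in irl}
srl_weighted = sum(arc_weight[a] * arc_scores[a] for a in arc_scores)

print("tropical SRL:", srl_tropical)     # 3, dominated by the weakest arc
print("weighted SRL:", round(srl_weighted, 2))
```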

Keywords: readiness, maturity, system, integration

Procedia PDF Downloads 72
3998 Improving Cryptographically Generated Address Algorithm in IPv6 Secure Neighbor Discovery Protocol through Trust Management

Authors: M. Moslehpour, S. Khorsandi

Abstract:

As the transition to widespread use of IPv6 addresses has gained momentum, IPv6 has been shown to be vulnerable to certain security attacks, such as those targeting the Neighbor Discovery Protocol (NDP), which provides the address resolution functionality in IPv6. To protect this protocol, Secure Neighbor Discovery (SEND) was introduced. This protocol uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography as a defense against threats to the integrity and identity of NDP. Although SEND protects NDP against attacks, it is computationally intensive due to the Hash2 condition in CGA. To improve the CGA computation speed, we parallelized the CGA generation process and used the available resources in a trusted network. Furthermore, we focused on the influence of the presence of malicious nodes on the overall load of non-malicious ones in the network. According to the evaluation results, malicious nodes have an adverse impact on the average CGA generation time and on the average number of tries. We utilized a trust management system that is capable of detecting and isolating malicious nodes to remove possible incentives for malicious behavior. We have demonstrated the effectiveness of the trust management system in detecting malicious nodes and hence improving overall system performance.
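
The Hash2 search that makes CGA generation expensive, and the parallelization idea described above, can be sketched as follows. The sketch follows the RFC 3972 condition (the 16×Sec leftmost bits of a SHA-1 digest over the modifier, nine zero octets and the public key must be zero) over disjoint modifier ranges; the key material, range sizes and worker count are placeholders, not the authors' implementation.

```python
# Hedged sketch of the Hash2 brute-force search (RFC 3972) parallelized over disjoint
# modifier ranges; placeholders only, not the authors' implementation.
import hashlib
from multiprocessing import Pool

PUBLIC_KEY = b"example-der-encoded-public-key"     # placeholder key material
SEC = 1                                            # 16*SEC leading zero bits required

def search_range(start, count=200_000, sec=SEC, pubkey=PUBLIC_KEY):
    for modifier in range(start, start + count):
        digest = hashlib.sha1(modifier.to_bytes(16, "big") + b"\x00" * 9 + pubkey).digest()
        if int.from_bytes(digest, "big") >> (160 - 16 * sec) == 0:
            return modifier                        # 16*sec leftmost bits are zero
    return None

if __name__ == "__main__":
    with Pool(4) as pool:                          # 4 workers scan disjoint ranges
        results = pool.map(search_range, [0, 200_000, 400_000, 600_000])
    print("first valid modifier found:", next((m for m in results if m is not None), None))
```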

Keywords: CGA, ICMPv6, IPv6, malicious node, modifier, NDP, overall load, SEND, trust management

Procedia PDF Downloads 173
3997 Estimating the Traffic Impacts of Green Light Optimal Speed Advisory Systems Using Microsimulation

Authors: C. B. Masera, M. Imprialou, L. Budd, C. Morton

Abstract:

Even though signalised intersections are necessary for urban road traffic management, they can act as bottlenecks and disrupt traffic operations. Interrupted traffic flow causes congestion, delays, stop-and-go conditions (i.e. excessive acceleration/deceleration) and longer journey times. Vehicle and infrastructure connectivity offers the potential to provide improved new services with additional driver-assistance functions. This paper focuses on one application of vehicle-to-infrastructure communication, namely Green Light Optimal Speed Advisory (GLOSA). To assess the effectiveness of GLOSA in an urban road network, an integrated microscopic traffic simulation framework is built in the VISSIM software. Vehicle movements and vehicle-infrastructure communications are simulated through the External Driver Model interface. A control algorithm is developed for recommending an optimal speed that is continuously updated at every time step for all vehicles approaching a signal-controlled point. This algorithm allows vehicles to pass a traffic signal without stopping or to minimise stopping times at a red phase. This study is performed with connected vehicles at a 100% penetration rate. Conventional vehicles are also simulated in the same network as a reference. A straight road segment composed of two opposite directions with two traffic lights per lane is studied. The simulation is implemented under traffic volumes of 150 and 200 vehicles per hour per lane to identify how different traffic densities influence the benefits of GLOSA. The results indicate that traffic flow is improved by the application of GLOSA. According to this study, vehicles passed through the traffic lights more smoothly, and waiting times were reduced by up to 28 seconds. Average delays decreased for the entire network by 86.46% and 83.84% under traffic densities of 150 and 200 vehicles per hour per lane, respectively.
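
The advisory logic at the core of GLOSA can be sketched with a simple single-cycle calculation: given the distance to the stop line and the green window, recommend a speed at which the vehicle arrives on green. The function below is an illustrative assumption (fixed green window, simple speed clipping), not the control algorithm implemented in the VISSIM External Driver Model.

```python
# Hedged single-cycle sketch of the GLOSA advisory idea described above.
def glosa_advisory(distance_m, t_now, green_start, green_end,
                   v_max=13.9, v_min=2.0):
    """Return an advisory speed (m/s) so the vehicle reaches the signal during green."""
    if green_start <= t_now + distance_m / v_max <= green_end:
        return v_max                                  # current maximum speed already works
    if t_now < green_start:                           # too early: slow down to avoid stopping
        v = distance_m / (green_start - t_now)
        return max(v_min, min(v, v_max))
    return v_min                                      # green missed: crawl toward the queue

# Vehicle 150 m away, 10 s into the cycle, green from t = 25 s to t = 55 s.
print(f"advised speed: {glosa_advisory(150, 10, 25, 55):.1f} m/s")
```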

Keywords: connected vehicles, GLOSA, intelligent transport systems, vehicle-to-infrastructure communication

Procedia PDF Downloads 150
3996 The Scientific Study of the Relationship Between Physicochemical and Microstructural Properties of Ultrafiltered Cheese: Protein Modification and Membrane Separation

Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh

Abstract:

The loss of curd cohesiveness and syneresis are two common problems in the ultrafiltered cheese industry. In this study, by using membrane technology and protein modification, a modified cheese was developed and its properties were compared with a control sample. In order to decrease the lactose content and adjust the protein, acidity, dry matter and milk minerals, a combination of ultrafiltration, nanofiltration and reverse osmosis technologies was employed. For protein modification, a two-stage chemical and enzymatic reaction was employed before and after ultrafiltration. The physicochemical and microstructural properties of the modified ultrafiltered cheese were compared with the control. Results showed that the modified protein significantly enhanced the functional properties of the final cheese (p-value < 0.05), even though its protein content was 50% lower than that of the control. The modified cheese showed 21 ± 0.70%, 18 ± 1.10% and 25 ± 1.65% higher hardness, cohesiveness and water-holding capacity values, respectively, than the control sample. This behavior could be explained by the developed microstructure of the gel network. Furthermore, chemical-enzymatic modification of the milk protein induced significant changes in the network parameters of the final cheese. The indices of network linkage strength, network linkage density, and time scale of junctions were 10.34 ± 0.52%, 68.50 ± 2.10% and 82.21 ± 3.85% higher than in the control sample, whereas the distance between adjacent linkages was 16.77 ± 1.10% lower. These results were supported by the results of the textural analysis. A non-linear viscoelastic study showed a triangular stress waveform for the cheese containing the modified protein, while the control sample showed a rectangular stress waveform, which suggests better sliceability of the modified cheese. Moreover, to study the shelf life of the products, the acidity as well as the mold and yeast populations were determined over 120 days. It is worth mentioning that the lactose content of the modified cheese was adjusted to 2.5% before fermentation, while that of the control was 4.5%. The control sample had a shelf life of 8 weeks, while the shelf life of the modified cheese was 18 weeks in the refrigerator. Over 18 weeks, the acidity of the modified and control samples increased from 82 ± 1.50 to 94 ± 2.20 °D and from 88 ± 1.64 to 194 ± 5.10 °D, respectively. The mold and yeast populations over time followed the semicircular shape model (R² = 0.92, adjusted R² = 0.89, RMSE = 1.25). Furthermore, the mold and yeast counts and their growth rates in the modified cheese were lower than those of the control; this result could be explained by the shortage of an energy source for the microorganisms in the modified cheese. The lactose content of the modified sample was less than 0.2 ± 0.05% at the end of fermentation, while it was 3.7 ± 0.68% in the control sample.

Keywords: non-linear viscoelastic, protein modification, semicircular shape model, ultrafiltered cheese

Procedia PDF Downloads 62
3995 Portfolio Selection with Constraints on Trading Frequency

Authors: Min Dai, Hong Liu, Shuaijie Qian

Abstract:

We study a portfolio selection problem for an investor who faces constraints on rebalancing frequency, which is common in pension fund investment. We formulate it as a multiple optimal stopping problem and utilize the dynamic programming principle. By numerically solving the corresponding Hamilton-Jacobi-Bellman (HJB) equation, we find a series of free boundaries characterizing the optimal strategy and show that the constraints significantly impact it. Even in the absence of transaction costs, there is a no-trading region, which depends on the number of remaining trading chances. We also find that the equivalent wealth loss caused by the constraints is large. In conclusion, our model clarifies the impact of constraints on trading frequency on the optimal strategy.

Keywords: portfolio selection, rebalancing frequency, optimal strategy, free boundary, optimal stopping

Procedia PDF Downloads 67
3994 Evaluation of Tumor Microenvironment Using Molecular Imaging

Authors: Fakhrosadat Sajjadian, Ramin Ghasemi Shayan

Abstract:

The tumor microenvironment plays a fundamental role in tumor initiation, progression, metastasis, and treatment resistance. It differs from normal tissue in terms of its extracellular matrix, vascular and lymphatic networks, and physiological conditions. The clinical application of molecular cancer imaging is often hindered by the high commercialization costs of targeted imaging agents as well as the limited clinical applications and small market size of some agents. Since numerous cancer types share comparable tumor microenvironment characteristics, the ability to target these biomarkers has the potential to provide clinically translatable molecular imaging technologies for many cancer types and broad clinical applications. Noteworthy progress has been made in targeting the tumor microenvironment for molecular cancer imaging. In this review, we summarize the principles and strategies of recent advances in molecular imaging of the tumor microenvironment, using different imaging modalities for early detection and diagnosis of cancer. To conclude, the tumor microenvironment (TME) surrounding tumor cells is a highly dynamic and heterogeneous composition of immune cells, fibroblasts, precursor cells, endothelial cells, signaling molecules and extracellular matrix (ECM) components.

Keywords: molecular, imaging, TME, medicine

Procedia PDF Downloads 30
3993 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphics processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) model. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, considering all the external forces. The previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes the deficiencies of the previous LBM-CA models and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model for simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4 and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, another in-house serial CPU code was also developed, coupling the LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
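
The redistribution rule described above, in which particle numbers move to neighbouring nodes in proportion to the local velocity rather than uniformly, can be illustrated in a reduced 2D setting. The sketch below is conceptual only; the lattice, weights and random-number handling are assumptions and do not reproduce the D3Q27 CUDA kernels.

```python
# Much-simplified 2D illustration of the probabilistic CA redistribution idea: particles
# at a node are split among neighbours with probabilities weighted by the local fluid
# velocity instead of uniformly. Not the D3Q27 CUDA implementation.
import numpy as np

# D2Q9-style lattice directions (rest + 8 neighbours)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def redistribute(n_particles, u, rest_weight=0.2, rng=np.random.default_rng(0)):
    """Split n_particles among lattice directions according to local velocity u."""
    w = np.maximum(c @ u, 0.0)              # favour directions aligned with the flow
    w[0] = rest_weight * (w.sum() + 1e-12)  # some particles stay at the node
    p = w / w.sum()
    return rng.multinomial(n_particles, p)  # particle counts sent along each direction

counts = redistribute(n_particles=100, u=np.array([0.08, 0.02]))
for ci, ni in zip(c, counts):
    if ni:
        print(f"direction {tuple(ci)}: {ni} particles")
```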

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 191
3992 Balancing and Synchronization Control of a Two Wheel Inverted Pendulum Vehicle

Authors: Shiuh-Jer Huang, Shin-Ham Lee, Sheam-Chyun Lin

Abstract:

A two-wheel inverted pendulum (TWIP) vehicle was built with two hub DC motors for motion control evaluation. An Arduino Nano microcontroller was chosen as the control kernel for this electric test plant. Accelerometer and gyroscope sensors are built in to measure the tilt angle and angular velocity of the inverted pendulum vehicle. Since the TWIP has a significant hub motor dead zone and nonlinear system dynamics, the vehicle system is difficult to control with a traditional model-based controller. An intelligent model-free fuzzy sliding mode controller (FSMC) was therefore employed as the main control algorithm, and intelligent controllers were designed for TWIP balance control and two-wheel synchronization control.

Keywords: balance control, synchronization control, two-wheel inverted pendulum, TWIP

Procedia PDF Downloads 374
3991 Optimizing Heavy-Duty Green Hydrogen Refueling Stations: A Techno-Economic Analysis of Turbo-Expander Integration

Authors: Christelle Rabbat, Carole Vouebou, Sary Awad, Alan Jean-Marie

Abstract:

Hydrogen has been proven to be a viable alternative to standard fuels, as it is easy to produce and generates only water vapour, with zero carbon emissions. However, despite these benefits, the widespread adoption of hydrogen fuel cell vehicles and hydrogen internal combustion engine vehicles is impeded by several challenges. The lack of refueling infrastructure remains one of the main hindering factors due to the high costs associated with the design, construction, and operation of refueling stations. Besides, the lack of hydrogen vehicles on the road diminishes the economic viability of investing in refueling infrastructure, while the absence of accessible refueling stations discourages consumers from adopting hydrogen vehicles, perpetuating a cycle of limited market uptake. To address these challenges, the implementation of adequate policies incentivizing the use of hydrogen vehicles and the reduction of the investment and operating costs of hydrogen refueling stations (HRS) are essential to put both investors and customers at ease. Even though the transition to hydrogen cars has been rather slow, public transportation companies have shown a keen interest in this highly promising fuel, and their hydrogen demand is easier to predict and regulate than that of personal vehicles. Given the reduced complexity of designing a suitable hydrogen supply chain for public vehicles, this sub-sector could be a great starting point to facilitate the adoption of hydrogen vehicles. Consequently, this study focuses on designing a chain of on-site green HRS for the public transportation network in Nantes Metropole, leveraging the latest relevant technological advances and aiming to reduce costs while ensuring reliability, safety, and ease of access. To reduce the cost of HRS and encourage their widespread adoption, a network of 7 H35-T40 HRS has been designed, replacing the conventional J-T valves with turbo-expanders. Each station in the network has a daily capacity of 1,920 kg; the HRS network can thus produce up to 12.5 tH2 per day. The detailed cost analysis revealed a CAPEX per station of 16.6 M euros, leading to a network CAPEX of 116.2 M euros. The proposed station siting prioritized Nantes Metropole's 5 bus depots and included 2 city-centre locations. Thanks to the turbo-expander technology, the cooling capacity of the proposed HRS is 19% lower than that of a conventional station equipped with J-T valves, resulting in significant CAPEX savings estimated at 708,560 € per station, thus nearly 5 million euros for the whole HRS network. Besides, the turbo-expander power generation ranges from 7.7 to 112 kW, so the power produced can be used within the station or sold as electricity to the main grid, which would in turn maximize the station's profit. Despite the substantial initial investment required, the environmental benefits, cost savings, and energy efficiencies realized through the transition to hydrogen fuel cell buses and the deployment of HRS equipped with turbo-expanders offer considerable advantages for both TAN and Nantes Metropole. These initiatives underscore their enduring commitment to fostering green mobility and combatting climate change in the long term.

Keywords: green hydrogen, refueling stations, turbo-expander, heavy-duty vehicles

Procedia PDF Downloads 30
3990 Second Order Cone Optimization Approach to Two-stage Network DEA

Authors: K. Asanimoghadam, M. Salahi, A. Jamalian

Abstract:

Data envelopment analysis (DEA) is an approach for measuring the efficiency of decision-making units with multiple inputs and outputs. Many decision-making units also contain decision-making subunits that are not considered in most DEA models. In addition, the inputs and outputs of decision-making units are usually considered desirable, while in some real-world problems some inputs or outputs are undesirable. In this work, we study the efficiency evaluation of two-stage decision-making units where some outputs are undesirable, using two non-radial models, the SBM and ASBM models. We formulate the nonlinear ASBM model as a second-order cone optimization problem. Finally, we compare the two models under both external and internal evaluation approaches on two real-world examples in the presence of undesirable outputs. The results show that, in both the external and internal evaluations, the overall efficiency of the ASBM model is greater than or equal to that of the SBM model, and in the internal evaluation, the ASBM model is more flexible than the SBM model.
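
For context, the sketch below solves a basic single-stage, input-oriented envelopment (CCR) model with a convex solver; the paper's two-stage ASBM formulation with undesirable outputs is a richer second-order cone program. The data and the radial model are illustrative assumptions, shown only to indicate the envelopment form that the SBM and ASBM models generalize.

```python
# Context sketch: a basic input-oriented envelopment (CCR) model for each DMU, solved
# with cvxpy. Not the paper's two-stage ASBM SOCP; data below are invented.
import cvxpy as cp
import numpy as np

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs of 3 DMUs
Y = np.array([[1.0], [1.5], [1.2]])                  # (desirable) outputs

def ccr_efficiency(o):
    lam = cp.Variable(X.shape[0], nonneg=True)       # intensity variables
    theta = cp.Variable()                            # radial contraction of DMU o's inputs
    constraints = [X.T @ lam <= theta * X[o],
                   Y.T @ lam >= Y[o]]
    cp.Problem(cp.Minimize(theta), constraints).solve()
    return theta.value

for o in range(X.shape[0]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```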

Keywords: network DEA, conic optimization, undesirable output, SBM

Procedia PDF Downloads 181