Search results for: returning-oriented programming attacks
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1518


738 Innovations in the Lithium Value Chain

Authors: Fiúza A., Góis J. Leite M., Braga H., Lima A., Jorge P., Moutela P., Martins L., Futuro A.

Abstract:

Lepidolite is an important lithium mineral that, to the authors’ best knowledge, has not been used to produce lithium hydroxide, which is necessary for the energy transition to electric vehicles. Alkaline leaching of lithium concentrates allows the establishment of a production diagram that avoids most of the environmental drawbacks associated with the use of acid reagents. The tested processes involve a pretreatment by digestion at high temperature with additives, followed by hot leaching at atmospheric pressure. The solutions obtained must be compatible with the solutions from the leaching of spodumene concentrates, allowing the development of a common treatment diagram, an important accomplishment for the feasible exploitation of Portuguese resources. Statistical programming and interpretation techniques are used to minimize the laboratory effort required by conventional approaches and also to allow phenomenological comprehension.

Keywords: artificial intelligence, tailings free process, ferroelectric electrolyte battery, life cycle assessment

Procedia PDF Downloads 117
737 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language

Authors: Wenjun Hou, Marek Perkowski

Abstract:

The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measure based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time transforms the decision problem into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into a uniform (equiprobabilistic) superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge-weight adder, node-index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including their special cases, the Feynman (CNOT) and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the Grover iteration an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is then modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
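
To make the superposition-oracle-diffusion loop described above concrete, the following is a minimal NumPy sketch of Grover's search over an abstract index space; the TSP-specific oracle (edge-weight adder, uniqueness checker, cost comparator) is collapsed into a single marked basis state, so this illustrates the algorithm's skeleton, not the authors' Q# circuit.

```python
import numpy as np

def grover_search(n_qubits: int, marked: int) -> int:
    """Return the most probable measurement outcome after the optimal number of Grover iterations."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))                    # Hadamard on every qubit: uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))    # optimal iteration count for one marked state
    for _ in range(iterations):
        state[marked] *= -1                               # oracle: phase-flip the "good" state
        state = 2 * state.mean() - state                  # diffusion: inversion about the mean
    return int(np.argmax(state ** 2))

# e.g. 6 qubits could hold a small bounded-degree edge encoding; here index 42 stands in for
# the lowest-cost valid tour.
print(grover_search(6, marked=42))                        # prints 42 with high probability
```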

Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language

Procedia PDF Downloads 187
736 STML: Service Type-Checking Markup Language for Services of Web Components

Authors: Saqib Rasool, Adnan N. Mian

Abstract:

Web components are introduced as the latest HTML5 standard for writing modular web interfaces, ensuring maintainability through the isolated scope of web components. Reusability can also be achieved by sharing plug-and-play web components that can be used as off-the-shelf components by other developers. A web component encapsulates all the required HTML, CSS, and JavaScript code as a standalone package, which must be imported to integrate the web component into an existing web interface. This is followed by the integration of the web component with web services for dynamically populating its content. Since web components are reusable as off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service’s behavior can be verified through type checking, one of the popular solutions for improving code quality in many programming languages. However, HTML does not provide type checking, as it is a markup language and not a programming language. The contribution of this work is a new extension of HTML called Service Type-checking Markup Language (STML), which adds support for type checking in HTML for JSON-based REST services. STML can be used to define the expected data types of responses from JSON-based REST services, which are used to populate the content of the HTML elements of a web component. Although JSON has five data types, viz. string, number, boolean, object, and array, STML supports only string, number, and boolean, because both objects and arrays are treated as strings when populated in HTML elements. In order to define the data type of any HTML element, the developer just needs to add the custom STML attributes st-string, st-number, and st-boolean for string, number, and boolean, respectively. These STML annotations are added by the developer writing a web component, enabling other developers to use automated type checking to ensure the proper integration of their REST services with the same web component. Two utilities have been written for developers using STML-based web components. The first is used for automated type checking during the development phase; it uses the browser console to show an error description if the integrated web service does not return a response with the expected data type. The other is a Gulp-based command-line utility that removes the STML attributes before going into production, ensuring the delivery of STML-free web pages in the production environment. Both utilities have been tested for type checking REST services through STML-based web components, and the results have confirmed the feasibility of evaluating service behavior through HTML alone. Currently, STML is designed for automated type checking of integrated REST services, but it can be extended into a complete HTML-only service testing suite, transforming STML from a Service Type-checking Markup Language into a Service Testing Markup Language.
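
As an illustration of the kind of check such a utility performs, here is a small Python sketch (not the authors' browser-console or Gulp tool) that reads st-string / st-number / st-boolean attributes from a component's markup and validates a JSON response against them; the convention that the attribute's value names the JSON field is an assumption of this sketch.

```python
import json
from html.parser import HTMLParser

EXPECTED = {"st-string": str, "st-number": (int, float), "st-boolean": bool}

class STMLParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.bindings = {}                      # maps a JSON field name -> expected Python type
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        for st_attr, py_type in EXPECTED.items():
            if st_attr in attrs:                # e.g. <span st-number="price"></span>
                self.bindings[attrs[st_attr]] = py_type

def type_check(component_html: str, response_json: str) -> list:
    parser = STMLParser()
    parser.feed(component_html)
    data = json.loads(response_json)
    errors = []
    for field, py_type in parser.bindings.items():
        if not isinstance(data.get(field), py_type):
            errors.append(f"field '{field}' is not of the declared STML type")
    return errors

# Hypothetical usage: reports one error because "price" arrives as a string, not a number.
print(type_check('<span st-number="price"></span>', '{"price": "9.99"}'))
```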

Keywords: REST, STML, type checking, web component

Procedia PDF Downloads 249
735 ‘Nature Will Slow You Down for a Reason’: Virtual Elder-Led Support Services during COVID-19

Authors: Grandmother Roberta Oshkawbewisens, Elder Isabelle Meawasige, Lynne Groulx, Chloë Hamilton, Lee Allison Clark, Dana Hickey, Wansu Qiu, Jared Leedham, Nishanthini Mahendran, Cameron Maclaine

Abstract:

In March of 2020, the world suddenly shifted with the onset of the COVID-19 pandemic; in-person programs and services were unavailable, and a scramble to shift to virtual service delivery began. The Native Women’s Association of Canada (NWAC) established virtual programming through the Resiliency Lodge model and connected with Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people across Turtle Island and Inuit Nunangat through programs that provide a safe space to slow down and reflect on their lives, environment, and well-being. To continue to grow the virtual Resiliency Lodge model, NWAC needed to develop an understanding of three questions: how COVID-19 affects Elder-led support services, how Elder-led support services have adapted during the pandemic, and what Wise Practices need to be implemented to continue to develop, refine, and evaluate virtual Elder-led support services specifically for Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people. Through funding from the Canadian Institutes of Health Research (CIHR), NWAC gained deeper insight into these questions and developed a series of key findings and recommendations that are outlined throughout this report. The goals of this project are to contribute to a more robust participatory analysis that reflects the complexities of Elder-led virtual cultural responses and the impacts of COVID-19 on Elder-led support services; develop culturally and contextually meaningful virtual protocols and wise practices for virtual Indigenous-led support; and develop an Evaluation Strategy to improve the capacity of the Resiliency Lodge model. Significant findings from the project include the following: Resiliency Lodge programs, especially crafting and business sessions, have provided participants with a sense of community and contributed to healing and wellness; Elder-led support services need greater and more stable funding to offer more workshops to more Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people; and Elder- and Indigenous-led programs play a significant role in healing and building a sense of purpose and belonging among Indigenous people. Ultimately, the findings and recommendations outlined in this research project help to guide future Elder-led virtual support services and emphasize the critical need to increase access to Elder-led programming for Indigenous women, girls, Two-Spirit, transgender, and gender-diverse people.

Keywords: indigenous women, traditional healing, virtual programs, covid-19

Procedia PDF Downloads 135
734 The Proactive Approach of Digital Forensics Methodology against Targeted Attack Malware

Authors: Mohamed Fadzlee Sulaiman, Mohd Zabri Adil Talib, Aswami Fadillah Mohd Ariffin

Abstract:

Each organization has its own mechanism for building up cyber defense capability to protect its information infrastructure from data breaches and cyber espionage. However, we cannot deny the possibility of failing to detect and stop cyber attacks, especially those targeting credential information and intellectual property (IP). In this paper, we share a modern approach to effective digital forensic methodology that identifies artifacts and traces the trails of evidence while mitigating the infection on the target machine(s). The proposed approach suits digital forensic investigations that must be conducted while resuming business-critical operations, after mitigating the infection and minimizing the risk of the identified attack transpiring again. Traditional digital forensics methodology therefore has to be improved to become proactive, focusing not only on discovering the root cause and the threat actor but also on developing a relevant mitigation plan to prevent the same attack from recurring.

Keywords: digital forensic, detection, eradication, targeted attack, malware

Procedia PDF Downloads 269
733 Portfolio Selection with Constraints on Trading Frequency

Authors: Min Dai, Hong Liu, Shuaijie Qian

Abstract:

We study a portfolio selection problem for an investor who faces constraints on rebalancing frequency, which are common in pension fund investment. We formulate it as a multiple optimal stopping problem and utilize the dynamic programming principle. By numerically solving the corresponding Hamilton-Jacobi-Bellman (HJB) equation, we find a series of free boundaries characterizing the optimal strategy and show that the constraints significantly impact it. Even in the absence of transaction costs, there is a no-trading region, which depends on the number of remaining trading chances. We also find that the equivalent wealth loss caused by the constraints is large. In conclusion, our model clarifies the impact of trading-frequency constraints on the optimal strategy.
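
As a point of reference for the structure described above, a generic variational-inequality form of the HJB system for a multiple optimal stopping problem reads as follows; the generator L, the intervention operator M, and the admissible post-trade states A(x) are placeholders rather than the authors' exact specification, with V_n the value function when n trading chances remain.

```latex
% Generic multiple-stopping HJB variational inequality (illustrative form only):
% continuation (no-trade) region where the first term binds, trading region where the second binds.
\max\Big\{ \partial_t V_n(x,t) + \mathcal{L} V_n(x,t),\;
           \mathcal{M} V_{n-1}(x,t) - V_n(x,t) \Big\} = 0,
\qquad
\mathcal{M} V_{n-1}(x,t) = \sup_{x' \in \mathcal{A}(x)} V_{n-1}(x',t),
\qquad n = 1, 2, \dots
```

Here V_0 is the value function when no trading chances remain, and the free boundaries separate the region where the maximum is attained by the first (continuation) term from the region where it is attained by the second (trading) term.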

Keywords: portfolio selection, rebalancing frequency, optimal strategy, free boundary, optimal stopping

Procedia PDF Downloads 80
732 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphical processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) model. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, taking all external forces into account. Previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes the deficiencies of the previous LBM-CA models and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4, and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, an in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
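
The core of the CA step described above, moving particle counts to neighbouring nodes with probabilities set by the local fluid velocity, can be sketched in a few lines; the 2D four-neighbour lattice, the time step, and the drift field below are toy assumptions standing in for the paper's D3Q27 GPU implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def redistribute(counts, ux, uy, dt=1.0):
    """counts, ux, uy: 2D arrays on the same lattice; returns updated particle counts."""
    new_counts = np.zeros_like(counts)
    nx, ny = counts.shape
    for i in range(nx):
        for j in range(ny):
            # movement probabilities toward +x, -x, +y, -y derived from the local velocity
            p = np.array([max(ux[i, j], 0), max(-ux[i, j], 0),
                          max(uy[i, j], 0), max(-uy[i, j], 0)]) * dt
            p = np.append(p, max(1.0 - p.sum(), 0.0))             # probability of staying put
            p /= p.sum()
            moved = rng.multinomial(counts[i, j], p)              # stochastic split of the particles
            targets = [((i + 1) % nx, j), ((i - 1) % nx, j),
                       (i, (j + 1) % ny), (i, (j - 1) % ny), (i, j)]
            for (ti, tj), n in zip(targets, moved):
                new_counts[ti, tj] += n
    return new_counts

counts = rng.integers(0, 5, size=(8, 8))                          # hypothetical particle field
ux = np.full((8, 8), 0.2); uy = np.zeros((8, 8))                  # uniform drift in +x
print(redistribute(counts, ux, uy).sum() == counts.sum())         # particle number is conserved
```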

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 204
731 A Design System for Complex Profiles of Machine Members Using a Synthetic Curve

Authors: N. Sateesh, C. S. P. Rao, K. Satyanarayana, C. Rajashekar

Abstract:

This paper proposes the development of a CAD/CAM system for complex profiles of various machine members using a synthetic curve, i.e., the B-spline. Conventional methods of designing and manufacturing complex profiles are tedious and time-consuming, and even programming them on a computer numerical control (CNC) machine can be difficult because of the complexity of the profiles. The system developed provides a graphical and numerical B-spline representation of the profile for any given input. In this paper, the system is applied to represent a cam profile with a B-spline, and an attempt is made to improve the follower motion.
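
As a minimal illustration of the representation (not the authors' CAD/CAM system), the sketch below fits a cubic B-spline through a handful of hypothetical rise-dwell-return-dwell (R-D-R-D) follower positions and evaluates the smooth profile and its derivative.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Hypothetical R-D-R-D key points: cam angle (deg) vs. follower lift (mm)
theta = np.array([0, 30, 60, 90, 150, 180, 210, 270, 330, 360], dtype=float)
lift = np.array([0, 4, 9, 10, 10, 6, 0, 0, 0, 0], dtype=float)

spline = make_interp_spline(theta, lift, k=3)     # cubic B-spline through the key points
fine = np.linspace(0, 360, 721)
displacement = spline(fine)                       # smooth profile for plotting or CNC post-processing
velocity = spline.derivative(1)(fine)             # d(lift)/d(theta), a proxy for follower velocity

print(round(float(displacement.max()), 2), round(float(np.abs(velocity).max()), 3))
```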

Keywords: plate-cams, cam profile, b-spline, computer numerical control (CNC), computer aided design and computer aided manufacturing (CAD/CAM), R-D-R-D (rise-dwell-return-dwell)

Procedia PDF Downloads 608
730 Risk of Plastic Shrinkage Cracking in Recycled Aggregate Concrete

Authors: M. Eckert, M. Oliveira

Abstract:

The intensive use of natural aggregates near cities and towns, combined with the growth of the global population, leads to their depletion and increases transport distances. The uncontrolled deposition of construction and demolition waste in landfills and on city outskirts causes pollution and takes up space. The use of recycled aggregates in concrete preparation would contribute to mitigating the problem. However, a problem arises: the high water absorption of recycled aggregate decreases the bleeding rate of the concrete, and when the bleeding rate falls below the evaporation rate, plastic shrinkage cracking occurs. This phenomenon can be particularly problematic in hot and windy curing environments. Cracking facilitates the flow of liquid and gas into the concrete, which attacks the reinforcement and degrades the concrete. These factors reduce the durability of concrete structures and consequently the lifetime of buildings. A ring test, cured in a wind tunnel, was used to evaluate the plastic shrinkage cracking sensitivity of recycled aggregate concrete, in order to implement preventive means to control this phenomenon. The role of several aggregate properties in the segregation and cracking mechanisms of the concrete is also discussed.
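
The risk criterion above (evaporation rate exceeding bleeding rate) can be turned into a simple screening check; the sketch below estimates the evaporation rate with Uno's single-formula approximation of the ACI 305 nomograph, which is an assumption of this sketch rather than part of the paper, and all input values are hypothetical.

```python
def evaporation_rate(concrete_temp_c, air_temp_c, rel_humidity, wind_kmh):
    """Approximate surface evaporation rate in kg/m^2/h (Uno-style approximation; an assumption here)."""
    return 5.0 * ((concrete_temp_c + 18) ** 2.5
                  - rel_humidity * (air_temp_c + 18) ** 2.5) * (wind_kmh + 4) * 1e-6

def cracking_risk(bleeding_rate, concrete_temp_c, air_temp_c, rel_humidity, wind_kmh):
    evap = evaporation_rate(concrete_temp_c, air_temp_c, rel_humidity, wind_kmh)
    return evap > bleeding_rate, evap            # risk flag: evaporation outpaces bleeding

# Hypothetical hot, windy curing environment with a low bleeding rate (recycled aggregate mix)
at_risk, evap = cracking_risk(bleeding_rate=0.3, concrete_temp_c=32,
                              air_temp_c=30, rel_humidity=0.40, wind_kmh=25)
print(f"evaporation ~ {evap:.2f} kg/m2/h -> cracking risk: {at_risk}")
```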

Keywords: recycled aggregate, plastic shrinkage cracking, wind tunnel, durability

Procedia PDF Downloads 418
729 Studying Relationship between Local Geometry of Decision Boundary with Network Complexity for Robustness Analysis with Adversarial Perturbations

Authors: Tushar K. Routh

Abstract:

If inputs are engineered in certain manners, they can influence deep neural networks’ (DNN) performance by inducing misclassifications, a phenomenon well known as adversarial attacks, which calls networks’ robustness into question. Recent studies have unfolded the relationship between the vulnerability of such networks and their complexity. In this paper, the distinctive influence of additional convolutional layers on the decision boundaries of several DNN architectures was investigated. To engineer inputs from widely known image datasets like MNIST, Fashion MNIST, and CIFAR-10, we applied the One Step Spectral Attack (OSSA) and Fast Gradient Method (FGM) techniques. The effect of adding layers on the robustness of the architectures was then analyzed. For reasoning, the separation width from linear class partitions and the local geometry (curvature) near the decision boundary were examined. The results reveal that model complexity plays a significant role in adjusting the relative distances from margins, as well as the local features of decision boundaries, which impact robustness.
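
For reference, the Fast Gradient Method used above to engineer inputs can be sketched in a few lines of PyTorch; the toy model, epsilon, and input shapes are placeholders rather than the paper's architectures.

```python
import torch
import torch.nn as nn

def fgm_attack(model, x, y, epsilon=0.05):
    """Return an adversarial example x + epsilon * sign(grad_x loss)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + epsilon * x_adv.grad.sign()).clamp(0, 1).detach()

# Hypothetical usage with a toy classifier on MNIST-sized inputs
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
x = torch.rand(8, 1, 28, 28)                    # batch of images in [0, 1]
y = torch.randint(0, 10, (8,))
x_adv = fgm_attack(model, x, y)
print((x_adv - x).abs().max())                  # perturbation magnitude is bounded by epsilon
```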

Keywords: DNN robustness, decision boundary, local curvature, network complexity

Procedia PDF Downloads 71
728 A Fuzzy Linear Regression Model Based on Dissemblance Index

Authors: Shih-Pin Chen, Shih-Syuan You

Abstract:

Fuzzy regression models are useful for investigating the relationship between explanatory variables and responses in fuzzy environments. To overcome the deficiencies of previous models and increase the explanatory power of fuzzy data, the graded mean integration (GMI) representation is applied to determine representative crisp regression coefficients. A fuzzy regression model is then constructed based on the modified dissemblance index (MDI), which can precisely measure the actual total error. Compared with previous studies based on a distance criterion, results from commonly used test examples show that the proposed fuzzy linear regression model, built on the MDI, has higher explanatory power and forecasting accuracy.
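
To make the GMI step concrete, the sketch below converts triangular fuzzy responses into their graded mean integration representations, (l + 4m + u)/6, and fits crisp coefficients to them; the data are hypothetical and the paper's full MDI-based model is not reproduced here.

```python
import numpy as np

def gmi(triangular):
    """GMI representation of a triangular fuzzy number (l, m, u): (l + 4m + u) / 6."""
    l, m, u = triangular
    return (l + 4 * m + u) / 6

# Hypothetical fuzzy observations: crisp inputs x, triangular fuzzy responses y = (l, m, u)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_fuzzy = [(1.5, 2.0, 2.6), (3.1, 3.8, 4.4), (5.2, 6.1, 6.7), (7.0, 7.9, 8.8), (9.1, 10.0, 10.8)]
y_crisp = np.array([gmi(t) for t in y_fuzzy])

# Crisp regression coefficients fitted to the GMI-represented data
A = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(A, y_crisp, rcond=None)
print(beta)          # intercept and slope of the representative crisp regression line
```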

Keywords: dissemblance index, fuzzy linear regression, graded mean integration, mathematical programming

Procedia PDF Downloads 431
727 Contemporary Arabic Novel Probing the Self and the Other: A Contrapuntal Study of Identity, Sexuality, and Fundamentalism

Authors: Jihan Mahmoud

Abstract:

This paper examines the role played by Arabic novelists in revolutionary change in the Arab world, discussing themes of identity, sexuality, and fundamentalism as portrayed in a selection of modern and contemporary Arabic novels that are either written in English or translated from Arabic into English. It particularly focuses on the post-Naguib Mahfouz era. Taking its cue from current political changes in the Arab world, starting with the 9/11 terrorist attacks in the USA and the UK, the ‘Arab Spring’ revolutions, the rise of political Islam, and the emergence of ISIS, the Islamic State in Iraq and the Levant, the study analyses the differences in the ways contemporary Arab novelists from different Arab countries represent the interaction between identity, sexual politics, and fundamentalist ideas in the Arab world, with a specific focus on the overlap between literature, religion, and international politics in the region. It argues that the post-Mahfouz era marked a new phase in the development of the political Arabic novel, not only as a force of resistance against political-religious oppression but also as a call for revolution. Thus, the Arabic novel reshapes values and prompts future action.

Keywords: Arabic novel, Islam, politics, sexuality

Procedia PDF Downloads 523
726 Measuring Energy Efficiency Performance of MENA Countries

Authors: Azam Mohammadbagheri, Bahram Fathi

Abstract:

DEA has become a very popular method of performance measurement, but it still suffers from some shortcomings. One of these shortcomings is the issue of multiple optimal solutions for the weights of efficient DMUs. Cross-efficiency evaluation, an extension of DEA, has been proposed to avoid this problem. Lam (2010) also proposed a mixed-integer linear programming (MILP) formulation based on linear discriminant analysis and the super-efficiency method to avoid multiple optimal weight solutions. In this study, we modify the MILP model to determine more suitable weight sets and evaluate the energy efficiency of MENA countries as an application of the proposed model.
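
For orientation, the plain input-oriented CCR multiplier model that underlies DEA and its cross-efficiency extensions can be solved as a small linear program, as sketched below with hypothetical data; this is the textbook building block, not the modified MILP model of this study.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o given inputs X (n x m) and outputs Y (n x s)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])                   # maximize weighted outputs of DMU o
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # weighted inputs of DMU o equal 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                                  # weighted outputs <= weighted inputs, all DMUs
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (s + m))
    return -res.fun

# Hypothetical data: 4 DMUs (countries), 2 inputs (energy, capital), 1 output (GDP index)
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])
```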

Keywords: data envelopment analysis, discriminant analysis, cross efficiency, MILP model

Procedia PDF Downloads 683
725 Developing a Systems Dynamics Model for Security Management

Authors: Kuan-Chou Chen

Abstract:

This paper will demonstrate a simulation model of an information security system using the system dynamics approach. The relationships in the system model are designed to be simple and functional and do not necessarily represent any particular information security environment. The purpose of the paper is to develop a generic system dynamics information security model with implications for information security research. The interrelated and interdependent relationships of five primary sectors in the system dynamics model will be presented in this paper. The integrated information security systems model will include (1) information security characteristics, (2) users, (3) technology, (4) business functions, and (5) policy and management. Environments, attacks, government, and social culture will be defined as the external sector. The interactions within each of these sectors will be depicted by system loop maps as well. The proposed system dynamics model will not only provide a conceptual framework for information security analysts and designers but will also allow information security managers to remove the incongruity between the management of risk incidents and the management of knowledge, and will further give information security managers and decision makers a foundation for managerial actions and policy decisions.
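
As a schematic illustration of what a system dynamics simulation of this kind computes, here is a toy stock-and-flow sketch in Python; the sectors, flows, and parameter values are assumptions of this sketch, not the author's model.

```python
def simulate(steps=52, dt=1.0):
    vulnerabilities, incidents = 100.0, 0.0        # stocks
    attack_rate, patch_rate, containment = 0.05, 0.08, 0.3
    history = []
    for _ in range(steps):
        new_incidents = attack_rate * vulnerabilities          # flow: attacks exploit vulnerabilities
        patched = patch_rate * vulnerabilities                 # flow: policy/management drains the stock
        resolved = containment * incidents                     # flow: response resolves incidents
        vulnerabilities += dt * (-patched)
        incidents += dt * (new_incidents - resolved)
        history.append((round(vulnerabilities, 1), round(incidents, 1)))
    return history

print(simulate(steps=5))
```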

Keywords: system thinking, information security systems, security management, simulation

Procedia PDF Downloads 421
724 The Applicability of International Humanitarian Law to Non-State Actors

Authors: Yin Cheung Lam

Abstract:

In 1949, the ratification of the Geneva Conventions heralded the international community’s adoption of a new universal and non-discriminatory approach to human rights in situations of conflict. However, with the proliferation of international terrorism after the 9/11 attacks on the United States (U.S.), the international community’s uneven and contradictory implementations of international humanitarian law (IHL) have called its agenda of universal human rights into question. Specifically, the derogation from IHL has never been so pronounced as in the U.S.-led ‘War on Terror’. While an extensive literature has ‘assessed the impact’ of the implementation of the Geneva Conventions, limited attention has been paid to interrogating the ways in which the Geneva Conventions and their resulting implementation have functioned to discursively reproduce certain understandings of human rights between states and non-state actors. Through a discursive analysis of the Geneva Conventions and the conceptualization of human rights in relation to terrorism, this thesis problematises the way in which the U.S. has understood and reproduced understandings of human rights. Using the U.S. ‘War on Terror’ as an example, it seeks to extend previous analyses of the U.S. practice of IHL through a qualitative discursive analysis of the human rights content that appears in the Geneva Conventions, in addition to the speeches and policy documents on the ‘War on Terror’.

Keywords: discursive analysis, human rights, non-state actors, war on terror

Procedia PDF Downloads 602
723 From Two-Way to Multi-Way: A Comparative Study for Map-Reduce Join Algorithms

Authors: Marwa Hussien Mohamed, Mohamed Helmy Khafagy

Abstract:

Map-Reduce is a programming model that is widely used to extract valuable information from enormous volumes of data, and it is designed to support heterogeneous datasets. Apache Hadoop Map-Reduce is used extensively to uncover hidden patterns in workloads like data mining, SQL processing, etc. The most important operation for data analysis is the join operation, but the Map-Reduce framework does not directly support join algorithms. This paper explains and compares two-way and multi-way Map-Reduce join algorithms; we also implement the MR join algorithms and show the performance of each phase. Our experimental results show that, among the two-way join algorithms, map-side join and map-merge join take the longest time because of the sorting of data in the preprocessing step, while among the multi-way join algorithms, the reduce-side cascade join takes the longest time.
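
For reference, one of the algorithm families compared above, the reduce-side (repartition) two-way join, can be sketched as plain map, shuffle, and reduce functions in Python; the tables and tags are hypothetical.

```python
from collections import defaultdict

def map_phase(records, table_tag, key_index):
    # emit (join_key, (table_tag, row)) for every input row
    return [(row[key_index], (table_tag, row)) for row in records]

def shuffle(mapped):
    groups = defaultdict(list)                   # the framework groups values by key
    for key, value in mapped:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    joined = []
    for key, values in groups.items():
        left = [row for tag, row in values if tag == "L"]
        right = [row for tag, row in values if tag == "R"]
        joined += [(key, l, r) for l in left for r in right]   # cross-product per key
    return joined

orders = [(1, "widget"), (2, "gadget")]          # hypothetical tables keyed on column 0
customers = [(1, "Alice"), (2, "Bob")]
mapped = map_phase(orders, "L", 0) + map_phase(customers, "R", 0)
print(reduce_phase(shuffle(mapped)))
```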

Keywords: Hadoop, MapReduce, multi-way join, two-way join, Ubuntu

Procedia PDF Downloads 478
722 Comparative Analysis of Simulation-Based and Mixed-Integer Linear Programming Approaches for Optimizing Building Modernization Pathways Towards Decarbonization

Authors: Nico Fuchs, Fabian Wüllhorst, Laura Maier, Dirk Müller

Abstract:

The decarbonization of building stocks necessitates the modernization of existing buildings. Key measures for this include reducing energy demands through insulation of the building envelope, replacing heat generators, and installing solar systems. Given limited financial resources, it is impractical to modernize all buildings in a portfolio simultaneously; instead, prioritization of buildings and modernization measures for a given planning horizon is essential. Optimization models for modernization pathways can assist portfolio managers in this prioritization. However, modeling and solving these large-scale optimization problems, often represented as mixed-integer problems (MIP), necessitates simplifying the operation of building energy systems, particularly with respect to system dynamics and transient behavior. This raises the question of which level of simplification remains sufficient to accurately account for realistic costs and emissions of building energy systems, ensuring a fair comparison of different modernization measures. This study addresses this issue by comparing a two-stage simulation-based optimization approach with a single-stage mathematical optimization in a mixed-integer linear programming (MILP) formulation. The simulation-based approach serves as a benchmark for realistic energy system operation but requires a restriction of the solution space to discrete choices of modernization measures, such as the sizing of heating systems. After calculating the operation of different energy systems in terms of the resulting final energy demands in simulation models in a first stage, the results serve as input for a second-stage MILP optimization, where the design of each building in the portfolio is optimized. In contrast to the simulation-based approach, the MILP-based approach can capture a broader variety of modernization measures due to the efficiency of MILP solvers but necessitates simplifying the building energy system operation. Both approaches are employed to determine the cost-optimal design and dimensioning of several buildings in a portfolio to meet climate targets within limited yearly budgets, resulting in a modernization pathway for the entire portfolio. The comparison reveals that the MILP formulation successfully captures design decisions of building energy systems, such as the selection of heating systems and the modernization of building envelopes. However, the results regarding the optimal dimensioning of heating technologies differ from the results of the two-stage simulation-based approach, as the MILP model tends to overestimate operational efficiency, highlighting the limitations of the MILP approach.
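
To make the single-stage formulation tangible, here is a deliberately small MILP sketch in PuLP with binary measure choices per building and year, a yearly budget cap, and a linearized emission objective; all names and numbers are placeholders, and the study's actual model is far richer (operational simplifications, heat generator sizing, solar systems).

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

buildings = ["B1", "B2"]
measures = {"insulation": (40_000, 8.0), "heat_pump": (25_000, 12.0)}  # cost [EUR], emission saving [t CO2/a]
years = [2025, 2026]
budget = {2025: 50_000, 2026: 50_000}
baseline = {"B1": 30.0, "B2": 25.0}                                    # t CO2/a before modernization

x = LpVariable.dicts("do", [(b, m, t) for b in buildings for m in measures for t in years], cat=LpBinary)

model = LpProblem("modernization_pathway", LpMinimize)
# objective: total emissions over the horizon (savings apply from the year of installation onward)
model += lpSum(baseline[b] - lpSum(measures[m][1] * x[(b, m, u)] for m in measures for u in years if u <= t)
               for b in buildings for t in years)
for t in years:                                                        # yearly budget limit
    model += lpSum(measures[m][0] * x[(b, m, t)] for b in buildings for m in measures) <= budget[t]
for b in buildings:                                                    # each measure at most once per building
    for m in measures:
        model += lpSum(x[(b, m, t)] for t in years) <= 1

model.solve()
print([(k, v.value()) for k, v in x.items() if v.value() == 1])        # chosen (building, measure, year) triples
```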

Keywords: building energy system optimization, model accuracy in optimization, modernization pathways, building stock decarbonization

Procedia PDF Downloads 24
721 Minimization of Denial of Service Attacks in Vehicular Ad Hoc Networking by Applying Different Constraints

Authors: Amjad Khan

Abstract:

The security of vehicular ad hoc networking is of great importance, as failures can involve serious threats to life. Thus, to provide secure communication amongst vehicles on the road, conventional security systems are not enough. It is necessary to prevent network resources from being wasted and to protect them against malicious nodes, so as to ensure data and bandwidth availability for the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints that minimize denial of service (DoS), especially of data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any of the tests fails, the node drops those data packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively verify the true or false status of the claim by using these constraints. Consequently, DoS attacks are minimized through the instant availability of data without wasting network resources.
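
The constraint chain described above can be sketched as a simple fail-fast filter; the individual checks below (rate limit, hop plausibility, position-claim plausibility) are hypothetical placeholders rather than the paper's exact constraints.

```python
import time

def rate_limit_ok(packet, state, max_per_second=50):
    now = time.time()
    window = [t for t in state.setdefault(packet["sender"], []) if now - t < 1.0]
    window.append(now)
    state[packet["sender"]] = window
    return len(window) <= max_per_second

def hop_count_ok(packet, state, max_hops=10):
    return packet.get("hops", 0) <= max_hops

def position_claim_ok(packet, state, max_distance_m=300):
    return abs(packet.get("claimed_distance_m", 0)) <= max_distance_m

CONSTRAINTS = [rate_limit_ok, hop_count_ok, position_claim_ok]

def handle_packet(packet, state):
    for check in CONSTRAINTS:
        if not check(packet, state):
            return "dropped"            # fail fast: do not forward suspicious traffic
    return "forwarded"

state = {}
print(handle_packet({"sender": "veh42", "hops": 3, "claimed_distance_m": 120}, state))
```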

Keywords: black hole attack, grey hole attack, intransient traffic tampering, networking

Procedia PDF Downloads 278
720 Evaluation of Rehabilitation in Ischemic Stroke

Authors: Amirmohammad Dahouri

Abstract:

Each year, more than 795,000 individuals in the United States suffer a stroke, and by 2030, it is predicted that 4% of the U.S. population will have had a stroke. Ischemic stroke, accounting for about 80% of all strokes, is one of the main causes of disability. The goal of stroke rehabilitation is to help patients regain physical and mental function and relearn the skills required for everyday living. Post-stroke impairment has an adverse effect on patients’ quality of life and affects their activities of daily living. In recent years, the rehabilitation of ischemic stroke has attracted more attention worldwide. We review the basic concepts of stroke rehabilitation that are worth stressing to all specialists who treat patients with stroke. Suggestions are made for patients on how to functionally manage daily activities after they have experienced a stroke. It is vital for home healthcare clinicians to understand the process from the acute event through medical stabilization and rehabilitation to adaptation. Different sources such as PubMed, Google Scholar, and ScienceDirect have been used, and various contemporary articles in this area have been analyzed. The care plan must also include concrete actions to protect against recurrent stroke, as stroke patients are generally at significant risk for further ischemic or hemorrhagic attacks. Here, we review the evidence on rehabilitation for treating post-stroke impairment.

Keywords: rehabilitation, stroke, ischemic, hemorrhagic, brain

Procedia PDF Downloads 157
719 Airport Investment Risk Assessment under Uncertainty

Authors: Elena M. Capitanul, Carlos A. Nunes Cosenza, Walid El Moudani, Felix Mora Camino

Abstract:

The construction of a new airport or the extension of an existing one requires massive investments, and public-private partnerships have often been considered in order to make such projects feasible. One characteristic of these projects is uncertainty with respect to financial and environmental impacts over the medium to long term. Another is the multistage nature of these types of projects. While many airport development projects have been a success, some others have turned into a nightmare for their promoters. This communication puts forward a new approach to airport investment risk assessment. The approach explicitly takes into account the degree of uncertainty in activity-level predictions and proposes milestones for the different stages of the project in order to minimize risk. Uncertainty is represented through fuzzy dual theory, and risk management is performed using dynamic programming. An illustration of the proposed approach is provided.

Keywords: airports, fuzzy logic, risk, uncertainty

Procedia PDF Downloads 405
718 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects

Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town

Abstract:

The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to support better decision-making in developing mining production and maintaining safety. This paper highlights the advantages of Power BI, a powerful business intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information. Its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques for improving decision-making; leveraging some of the most complex techniques in data science, it is used for everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, the specific visualizations required by geotechnical engineers may face limitations. This paper therefore studies the use of Python or R programming within the Power BI dashboard to enable advanced analytics, additional functionality, and customized visualizations. The dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data, including spatial representation on maps, field and lab test results, and subsurface rock and soil characteristics. Advanced visualizations like borehole logs and stereonets were implemented using Python programming within the Power BI dashboard, enhancing the understanding and communication of geotechnical information. Moreover, the dashboard’s flexibility allows for the incorporation of additional data and visualizations based on the project scope and available data, such as pit design, rockfall analyses, rock mass characterization, and drone data. This further enhances the dashboard’s usefulness in future projects, including the operation, development, closure, and rehabilitation phases, and minimizes the need to use multiple software programs in a project. This geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding informed decision-making and efficient project management throughout the various project stages. Its ability to generate dynamic reports and share them with clients in a collaborative manner further enhances decision-making processes and facilitates effective communication within geotechnical projects in the mining industry.
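
As a small illustration of the Python-inside-Power-BI workflow described above: in a Power BI Python visual, the fields dragged into the visual arrive as a pandas DataFrame named dataset, and the script renders a matplotlib figure. The column names below (borehole_id, depth_m, spt_n) are placeholders for a project's own fields, and the fallback table makes the sketch runnable outside Power BI.

```python
import matplotlib.pyplot as plt
import pandas as pd

try:
    df = dataset                                  # provided automatically inside a Power BI Python visual
except NameError:
    df = pd.DataFrame({"borehole_id": ["BH01"] * 5,
                       "depth_m": [2, 4, 6, 8, 10],
                       "spt_n": [8, 12, 18, 25, 31]})

fig, ax = plt.subplots(figsize=(4, 6))
for bh, group in df.groupby("borehole_id"):
    ax.plot(group["spt_n"], group["depth_m"], marker="o", label=bh)   # simple borehole-log style plot
ax.invert_yaxis()                                 # depth increases downwards
ax.set_xlabel("SPT N-value")
ax.set_ylabel("Depth (m)")
ax.legend()
plt.show()
```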

Keywords: geotechnical data analysis, power BI, visualization, decision-making, mining industry

Procedia PDF Downloads 87
717 Electrical Load Estimation Using Estimated Fuzzy Linear Parameters

Authors: Bader Alkandari, Jamal Y. Madouh, Ahmad M. Alkandari, Anwar A. Alnaqi

Abstract:

A new formulation of the fuzzy linear estimation problem is presented; it is formulated as a linear programming problem. The objective is to minimize the spread of the data points, taking into consideration the type of membership function of the fuzzy parameters, so as to satisfy the constraints at each measurement point and to ensure that the original membership is included in the estimated membership. Different models are developed for a triangular fuzzy membership. The proposed models are applied to different examples from the area of fuzzy linear regression and, finally, to different examples of estimating the electrical load on a busbar. It was found that the proposed technique is well suited to electrical load estimation, since the nature of the load is characterized by uncertainty and vagueness.
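
A minimal sketch of such a spread-minimizing linear program, using symmetric triangular fuzzy coefficients in the style of Tanaka's possibilistic regression, is given below; this is a generic textbook formulation with hypothetical busbar load data, not necessarily the authors' exact model.

```python
import numpy as np
from scipy.optimize import linprog

def fuzzy_lp_fit(X, y, h=0.5):
    n = X.shape[0]
    Xa = np.hstack([np.ones((n, 1)), X])              # add intercept column
    m = Xa.shape[1]
    # decision variables: coefficient centers a (free) followed by spreads c (non-negative)
    cost = np.concatenate([np.zeros(m), np.abs(Xa).sum(axis=0)])      # total spread to minimize
    A_ub = np.vstack([np.hstack([-Xa, -(1 - h) * np.abs(Xa)]),        # y must lie below the upper band
                      np.hstack([ Xa, -(1 - h) * np.abs(Xa)])])       # y must lie above the lower band
    b_ub = np.concatenate([-y, y])
    bounds = [(None, None)] * m + [(0, None)] * m
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:m], res.x[m:]                       # centers, spreads

# Hypothetical busbar load measurements (MW) vs. temperature (deg C)
X = np.array([[20.0], [24.0], [28.0], [32.0], [36.0]])
y = np.array([110.0, 118.0, 131.0, 145.0, 150.0])
centers, spreads = fuzzy_lp_fit(X, y)
print(centers, spreads)
```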

Keywords: fuzzy regression, load estimation, fuzzy linear parameters, electrical load estimation

Procedia PDF Downloads 533
716 Analyzing the Practicality of Drawing Inferences in Automation of Commonsense Reasoning

Authors: Chandan Hegde, K. Ashwini

Abstract:

Commonsense reasoning is the simulation of the human ability to make decisions in the situations that we encounter every day. Several decades have passed since the introduction of this subfield of artificial intelligence, yet it has barely made significant progress. Modern computing aids have also remained ineffective in this regard due to the absence of a strong methodology for the development of commonsense reasoning. Among the several reasons accountable for the lack of progress, drawing inferences from a commonsense knowledge base stands out. This review paper emphasizes a detailed analysis of the representation of reasoning uncertainties and the feasible prospects of programming aids for drawing inferences. The difficulties in deducing and systematizing commonsense reasoning, and the substantial progress made in reasoning that influences this study, are also discussed. Additionally, the paper discusses the possible impact of an effective inference technique on commonsense reasoning.

Keywords: artificial intelligence, commonsense reasoning, knowledge base, uncertainty in reasoning

Procedia PDF Downloads 182
715 Some Pertinent Issues and Considerations on CBSE

Authors: Anil Kumar Tripathi, Ratneshwer Gupta

Abstract:

All software engineering research and industry best practices aim at providing software products with a high degree of quality and functionality at low cost and in less time. These requirements are addressed by Component-Based Software Engineering (CBSE) as well. CBSE, which deals with software construction through the assembly of components, is a revolutionary extension of software engineering. CBSE must define and describe processes to assure the timely completion of high-quality software systems that are composed of a variety of pre-built software components. Though these features provide distinct and visible benefits in software design and programming, they also raise some challenging problems. The aim of this work is to summarize the pertinent issues and considerations in CBSE and to frame them as concepts and observations that may lead to the development of new ways of dealing with the problems and challenges in CBSE.

Keywords: software component, component based software engineering, software process, testing, maintenance

Procedia PDF Downloads 394
714 Umbrella Reinforcement Learning – A Tool for Hard Problems

Authors: Egor E. Nuzhin, Nikolay V. Brilliantov

Abstract:

We propose an approach for addressing Reinforcement Learning (RL) problems. It combines the idea of umbrella sampling, borrowed from the Monte Carlo techniques of computational physics and chemistry, with optimal control methods, and is realized on the basis of neural networks. This results in a powerful algorithm designed to solve hard RL problems: problems with long-delayed rewards, sticking in state traps, and a lack of terminal states. It outperforms prominent algorithms such as PPO, RND, iLQR, and VI, which are among the most efficient for hard problems. The new algorithm deals with a continuous ensemble of agents and an expected return that includes the ensemble entropy. This results in a quick and efficient search for the optimal policy in terms of the ‘exploration-exploitation trade-off’ in the state-action space.

Keywords: umbrella sampling, reinforcement learning, policy gradient, dynamic programming

Procedia PDF Downloads 4
713 Mathematical Modeling and Algorithms for the Capacitated Facility Location and Allocation Problem with Emission Restriction

Authors: Sagar Hedaoo, Fazle Baki, Ahmed Azab

Abstract:

In supply chain management, network design for scalable manufacturing facilities is an emerging field of research. Facility location allocation assigns facilities to customers to optimize the overall cost of the supply chain. To further optimize costs, the capacities of these facilities can be changed in accordance with customer demands. A mathematical model is formulated to fully express the problem at hand and to solve small-to-mid-range instances. A dedicated constraint has been developed to restrict emissions in line with the Kyoto Protocol. This problem is NP-hard; hence, a simulated annealing metaheuristic has been developed to solve larger instances. A case study on the USA-Canada border crossing is used.
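
As a skeleton of the metaheuristic stage (not the paper's implementation), the sketch below applies simulated annealing to a toy facility location/allocation instance; handling the capacity and emission restrictions as penalty terms is an assumption of this sketch, and all data are hypothetical.

```python
import math
import random

random.seed(1)

facilities = {"F1": {"capacity": 60, "emission": 10.0}, "F2": {"capacity": 60, "emission": 6.0}}
customers = {"C1": 30, "C2": 25, "C3": 20}                       # demands
ship_cost = {("F1", "C1"): 4, ("F1", "C2"): 6, ("F1", "C3"): 9,
             ("F2", "C1"): 7, ("F2", "C2"): 3, ("F2", "C3"): 4}
EMISSION_CAP = 14.0

def cost(assign):
    total = sum(ship_cost[(assign[c], c)] * d for c, d in customers.items())
    used = set(assign.values())
    for f in used:                                               # capacity violations penalized
        load = sum(d for c, d in customers.items() if assign[c] == f)
        total += 1000 * max(0, load - facilities[f]["capacity"])
    emissions = sum(facilities[f]["emission"] for f in used)
    total += 1000 * max(0.0, emissions - EMISSION_CAP)           # emission cap as a penalty term
    return total

def anneal(steps=2000, temp=50.0, cooling=0.995):
    assign = {c: random.choice(list(facilities)) for c in customers}
    best, best_cost = dict(assign), cost(assign)
    for _ in range(steps):
        cand = dict(assign)
        cand[random.choice(list(customers))] = random.choice(list(facilities))  # reassign one customer
        delta = cost(cand) - cost(assign)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            assign = cand
            if cost(assign) < best_cost:
                best, best_cost = dict(assign), cost(assign)
        temp *= cooling
    return best, best_cost

print(anneal())
```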

Keywords: emission, mixed integer linear programming, metaheuristic, simulated annealing

Procedia PDF Downloads 303
712 VANETs: Security Challenges and Future Directions

Authors: Jared Oluoch

Abstract:

Connected vehicles are equipped with wireless sensors that aid in Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication. These vehicles will, in the near future, improve road safety, improve transport efficiency, and reduce traffic congestion. One of the challenges for connected vehicles is how to ensure that information sent across the network is secure. If the security of the network is not guaranteed, several attacks can occur, thereby compromising the robustness, reliability, and efficiency of the network. This paper discusses existing security mechanisms and the unique properties of connected vehicles. The methodology employed in this work is exploratory. The paper reviews existing security solutions for connected vehicles. More concretely, it discusses the various cryptographic mechanisms available and suggests areas for improvement. The study proposes a combination of symmetric key encryption and public key cryptography to improve security. The study further proposes message aggregation as a technique to overcome message redundancy. This paper offers a comprehensive overview of connected vehicles technology, its applications, its security mechanisms, open challenges, and potential areas of future research.
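
As a concrete illustration of the proposed combination of symmetric-key encryption and public-key cryptography, the sketch below wraps a per-message AES-GCM session key with RSA-OAEP using the Python cryptography package; the message content and key sizes are illustrative, not a V2V protocol specification.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiving vehicle or roadside unit owns an RSA key pair
receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver_pub = receiver_key.public_key()

def encrypt_message(plaintext: bytes):
    session_key = AESGCM.generate_key(bit_length=256)            # fresh symmetric key per message
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
    wrapped_key = receiver_pub.encrypt(                          # protect the session key with RSA-OAEP
        session_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return wrapped_key, nonce, ciphertext

def decrypt_message(wrapped_key, nonce, ciphertext):
    session_key = receiver_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

wrapped, nonce, ct = encrypt_message(b"brake warning at km 42")
print(decrypt_message(wrapped, nonce, ct))
```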

Keywords: VANET, connected vehicles, 802.11p, WAVE, DSRC, trust, security, cryptography

Procedia PDF Downloads 309
711 Intelligent Rescheduling of Trains for Air Pollution Management

Authors: Kainat Affrin, P. Reshma, G. Narendra Kumar

Abstract:

Optimization of timetables is the need of the day for the rescheduling and routing of trains in real time. Trains are scheduled in parallel with road transport vehicles to the same destination. As the number of trains is restricted due to the single track, customers usually opt to use road transport frequently. Air pollution increases as the density of vehicles in road transport increases. The use of an alternative mode of transport, such as the train, helps in reducing air pollution. This paper mainly aims at attracting passengers to rail transport by proper rescheduling of trains using a hybrid of the stop-skip algorithm and an iterative convex programming algorithm. Bi-directional rescheduling of trains is achieved on a single track with dynamic dual time and varying stops. The introduction of more trains attracts customers to use rail transport frequently, thereby decreasing pollution. The results are simulated using Network Simulator (NS-2).

Keywords: air pollution, AODV, re-scheduling, WSNs

Procedia PDF Downloads 357
710 Object-Oriented Program Comprehension by Identification of Software Components and Their Connexions

Authors: Abdelhak-Djamel Seriai, Selim Kebir, Allaoua Chaoui

Abstract:

During the last decades, object-oriented programming has been massively used to build large-scale systems. However, the evolution and maintenance of such systems become a laborious task because object-oriented programming fails to offer a precise view of the functional building blocks of the system. This shortcoming is caused by the fine granularity of classes and objects. In this paper, we use a post-object-oriented technology, namely software components, to propose an approach based on the identification of the functional building blocks of an object-oriented system by analyzing its source code. These functional blocks are specified as software components, and the result is a multi-layer component-based software architecture.

Keywords: software comprehension, software component, object oriented, software architecture, reverse engineering

Procedia PDF Downloads 408
709 A Hybrid Digital Watermarking Scheme

Authors: Nazish Saleem Abbas, Muhammad Haris Jamil, Hamid Sharif

Abstract:

Digital watermarking is a technique that allows an individual to add and hide secret information, a copyright notice, or other verification messages inside digital audio, video, or images. Today, with the advancement of technology, modern healthcare systems in many countries manage patients’ diagnostic information digitally. When transmitted between hospitals over the internet, medical data become vulnerable to attacks and require security and confidentiality. Digital watermarking techniques are used to ensure the authenticity, security, and management of medical images and related information. This paper proposes a watermarking technique that embeds a watermark in medical images imperceptibly and securely. In this work, digital watermarking of medical images is carried out using the Least Significant Bit (LSB) method with the Discrete Cosine Transform (DCT). The proposed methods of embedding and extracting a watermark in a watermarked image operate in the frequency domain, using the LSB through an XOR operation. The quality of the watermarked medical image is measured by the peak signal-to-noise ratio (PSNR). It was observed that the watermarked medical image obtained by performing the XOR operation between the DCT and LSB survived a compression attack with a PSNR of up to 38.98.
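
As a simplified illustration of the ingredients named above (block DCT, LSB manipulation via XOR, and PSNR measurement), the following sketch operates on a synthetic image; the coefficient choice, block size, and one-bit-per-block allocation are assumptions of this sketch, and the paper's actual embedding and extraction procedure is more elaborate.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64)).astype(float)        # stand-in for a medical image
watermark_bits = rng.integers(0, 2, size=(64 // 8) ** 2)         # one bit per 8x8 block

def embed(img, bits):
    out = img.copy()
    k = 0
    for i in range(0, img.shape[0], 8):
        for j in range(0, img.shape[1], 8):
            block = dctn(img[i:i + 8, j:j + 8], norm="ortho")
            coeff = int(round(block[4, 4]))                       # a mid-frequency coefficient
            block[4, 4] = coeff ^ int(bits[k])                    # XOR the coefficient's LSB with the bit
            out[i:i + 8, j:j + 8] = idctn(block, norm="ortho")
            k += 1
    return out

def psnr(original, marked, peak=255.0):
    mse = np.mean((original - marked) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

marked = embed(image, watermark_bits)
print(f"PSNR of the watermarked image: {psnr(image, marked):.2f} dB")
```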

Keywords: watermarking, image processing, DCT, LSB, PSNR

Procedia PDF Downloads 44