Search results for: secure hashing algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4199

2279 Boundary Motion by Curvature: Accessible Modeling of Oil Spill Evaporation/Dissipation

Authors: Gary Miller, Andriy Didenko, David Allison

Abstract:

The boundary of a region in the plane shrinks according to its curvature. A simple spreadsheet algorithm based on this motion by curvature simulates the evaporation/dissipation behavior of oil spill boundaries.
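
As a rough illustration of the discrete scheme described above (the paper itself uses a spreadsheet), the following Python sketch moves each vertex of a closed polygonal boundary toward the average of its neighbours, a standard finite-difference approximation of motion by curvature; the step size and the test boundary are assumptions for illustration only.

```python
import numpy as np

def curvature_flow_step(points, dt=0.05):
    """One explicit step of discrete curve-shortening flow.

    points : (N, 2) array of vertices of a closed polygon (the spill boundary).
    Each vertex moves toward the average of its two neighbours, which
    approximates motion by curvature for a closed curve.
    """
    left = np.roll(points, 1, axis=0)
    right = np.roll(points, -1, axis=0)
    laplacian = left + right - 2.0 * points   # discrete curvature vector
    return points + dt * laplacian

# Example: a noisy circle shrinks and smooths under the flow.
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
boundary = np.c_[np.cos(theta), np.sin(theta)] * (1.0 + 0.1 * np.sin(5 * theta))

for _ in range(200):
    boundary = curvature_flow_step(boundary)

print("mean boundary radius after flow:", np.linalg.norm(boundary, axis=1).mean())
```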

Keywords: mathematical modeling, oil, evaporation, dissipation, boundary

Procedia PDF Downloads 506
2278 Very Large Scale Integration Architecture of Finite Impulse Response Filter Implementation Using Retiming Technique

Authors: S. Jalaja, A. M. Vijaya Prakash

Abstract:

Recursive combination of an algorithm based on Karatsuba multiplication is exploited to design a generalized transpose and parallel Finite Impulse Response (FIR) filter. Mid-range Karatsuba multiplication and a Carry Save Adder based on Karatsuba multiplication reduce the time complexity of higher-order multiplication, implemented up to n bits. On this basis, a modified N-tap transpose and parallel symmetric FIR filter structure using the Karatsuba algorithm is designed, and the mathematical formulation of the FFA filter is derived. The proposed architecture involves a significantly smaller area-delay product (ADP) than the existing block implementation, and by adopting the retiming technique the hardware cost is reduced further. The filter architecture is designed using a 90 nm technology library and implemented using the Cadence EDA tool. The synthesis results show better performance for different word lengths and block sizes. The design achieves switching-activity reduction and low power consumption, evaluated with and without retiming for different circuit combinations, and attains more than half of the power reduction of the earlier design structure. As a proof of concept, for block size 16 and filter length 64, the CKA method achieves 51% and 70% less power by applying the retiming technique, and the CSA method achieves 57% and 77% less power by applying the retiming technique, compared with the previously proposed design.
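
For readers unfamiliar with the underlying arithmetic, the sketch below shows plain recursive Karatsuba multiplication, the three-sub-product recursion that the hardware multipliers exploit; it is a software illustration, not the carry-save-adder VLSI structure proposed in the paper, and the base-case threshold is an arbitrary choice.

```python
def karatsuba(x, y):
    """Recursive Karatsuba multiplication of non-negative integers.

    Replaces the four sub-products of schoolbook multiplication with three,
    which is the property the hardware multipliers in the paper exploit.
    """
    if x < 16 or y < 16:          # small operands: multiply directly
        return x * y
    n = max(x.bit_length(), y.bit_length())
    half = n // 2
    hi_x, lo_x = x >> half, x & ((1 << half) - 1)
    hi_y, lo_y = y >> half, y & ((1 << half) - 1)
    z0 = karatsuba(lo_x, lo_y)                           # low  x low
    z2 = karatsuba(hi_x, hi_y)                           # high x high
    z1 = karatsuba(lo_x + hi_x, lo_y + hi_y) - z0 - z2   # cross terms
    return (z2 << (2 * half)) + (z1 << half) + z0

assert karatsuba(12345678, 87654321) == 12345678 * 87654321
```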

Keywords: carry save adder Karatsuba multiplication, mid range Karatsuba multiplication, modified FFA and transposed filter, retiming

Procedia PDF Downloads 229
2277 Application of Modal Analysis for Commissioning of a Ball Screw System

Authors: T. D. Tran, H. Schlegel, R. Neugebauer

Abstract:

Ball screws are an important component in machine tools. In mechatronic systems and machine tools, a ball screw usually has to work at high speed. However, the axial compliance of the ball screw, in combination with the inertia of the slide, the motor, the coupling and the screw, causes an oscillation resonance, which limits the system's bandwidth and consequently influences the performance of the motion controller. In this paper, the modal analysis method, in which the vibration parameters of the ball screw system are measured and analysed to determine the dynamic characteristics of the existing structure, is used. On the one hand, the results of this study were obtained by theoretical analysis and by modal testing of a ball screw test station, using an impact hammer and, alternatively, excitation by the motor. The experimental study showed the oscillating mode shapes of the ball screw at each frequency and yielded the eigenfrequencies of the ball screw system. On the other hand, a numerical modal analysis was carried out in order to simulate the oscillation and to find the eigenfrequencies of the ball screw system. Furthermore, model order reduction was carried out by modal reduction and also according to Guyan. On the basis of these results, a reliable and rapid commissioning of the control loops at their optimal operating point is targeted.
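
To make the eigenfrequency computation concrete, a minimal Python sketch of a lumped two-mass model of a ball screw drive is given below; the masses and stiffnesses are placeholder assumptions, not parameters of the test station, and the generalized eigenvalue problem K v = w² M v stands in for the full numerical modal analysis.

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative lumped-parameter model of a ball screw drive: motor/coupling
# inertia coupled to the slide through the axial stiffness of the screw.
# The numbers below are placeholders, not values from the paper.
m_slide = 120.0        # kg, slide mass
m_drive = 30.0         # kg, equivalent translated motor + screw inertia
k_screw = 2.0e7        # N/m, axial stiffness of screw/nut/bearing chain
k_mount = 5.0e7        # N/m, stiffness of the drive mounting

M = np.diag([m_drive, m_slide])
K = np.array([[k_mount + k_screw, -k_screw],
              [-k_screw,           k_screw]])

# Generalized eigenvalue problem K v = w^2 M v gives the eigenfrequencies.
eigvals, eigvecs = eigh(K, M)
frequencies_hz = np.sqrt(eigvals) / (2.0 * np.pi)
print("eigenfrequencies [Hz]:", np.round(frequencies_hz, 1))
```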

Keywords: modal analysis, ball screw, controller system, machine tools

Procedia PDF Downloads 455
2276 Research Analysis of Urban Area Expansion Based on Remote Sensing

Authors: Sheheryar Khan, Weidong Li, Fanqian Meng

Abstract:

The Urban Heat Island (UHI) effect is one of the foremost problems among the ecological and socioeconomic issues of urbanization. Because human-made urban areas replace the rural landscape with surfaces that increase thermal conductivity and urban warmth, the temperature in the city is higher than in the surrounding rural areas. To document this phenomenon in the Zhengzhou city area, temperature variations in the urban area were observed using the following scientific method. Landsat 8 satellite images from 2013 to 2015 were used to calculate the Urban Heat Island (UHI) effect, along with NPP-VIIRS night-time remote sensing data, for a better understanding of the center of the built-up area. To further support the evidence, the correlation between land surface temperature and the normalized difference vegetation index (NDVI) was calculated using the red band (Band 4) and the near-infrared band (Band 5) of the Landsat 8 data. The mono-window algorithm was applied to retrieve the land surface temperature (LST) distribution from Landsat 8 Bands 10 and 11 by converting digital numbers to top-of-atmosphere (TOA) radiance and then to satellite brightness temperature. Along with the Landsat 8 data, the NPP-VIIRS night-light data were preprocessed to obtain the study-area subset. Finally, the Landsat 8 and NPP night-light results were analysed jointly to compare the derived center of the built-up area of Zhengzhou city.
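
The NDVI relation used above can be stated compactly; the sketch below assumes the red and near-infrared bands have already been read into reflectance arrays (file handling and atmospheric correction are omitted).

```python
import numpy as np

def ndvi(red_band4, nir_band5):
    """NDVI = (NIR - Red) / (NIR + Red) for Landsat 8 bands 4 (red) and 5 (NIR).

    Inputs are reflectance arrays of equal shape; pixels where both bands are
    zero are returned as NaN to avoid division by zero.
    """
    red = red_band4.astype(np.float64)
    nir = nir_band5.astype(np.float64)
    denom = nir + red
    out = np.full(red.shape, np.nan)
    valid = denom != 0
    out[valid] = (nir[valid] - red[valid]) / denom[valid]
    return out

# Toy 2x2 example: bare soil (low NDVI) next to dense vegetation (high NDVI).
red = np.array([[0.20, 0.21], [0.04, 0.05]])
nir = np.array([[0.25, 0.26], [0.45, 0.50]])
print(ndvi(red, nir))
```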

Keywords: built-up area, land surface temperature, mono-window algorithm, NDVI, remote sensing, threshold method, Zhengzhou

Procedia PDF Downloads 134
2275 Applying Different Steganography Techniques in Cloud Computing Technology to Improve Cloud Data Privacy and Security Issues

Authors: Muhammad Muhammad Suleiman

Abstract:

Cloud Computing is a versatile concept that refers to a service that allows users to outsource their data without having to worry about local storage issues. However, the most pressing issue to be addressed is maintaining a secure and reliable data repository rather than relying on untrustworthy service providers. In this study, we look at how steganography approaches, in collaboration with digital watermarking, can greatly improve the system's effectiveness and data security when used for Cloud Computing. The main requirement of such frameworks, where data is transferred or exchanged between servers and users, is safe data management in cloud environments. Steganography in the cloud is among the most effective methods for safe communication. Steganography is a method of writing coded messages in such a way that only the sender and recipient can safely interpret and display the information hidden in the communication channel. This study presents a new text steganography method for hiding a secret English text file in a cover English text file to ensure data protection in cloud computing. Data protection, data hiding capacity, and time were all improved using the proposed technique.
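
The abstract does not specify the hiding scheme, so the sketch below illustrates the general idea of text-in-text steganography with a simple zero-width-character encoding; it is a toy illustration and not the authors' method.

```python
ZERO_WIDTH = {"0": "\u200b", "1": "\u200c"}   # zero-width space / non-joiner
REVERSE = {v: k for k, v in ZERO_WIDTH.items()}

def hide(cover: str, secret: str) -> str:
    """Append the secret as invisible zero-width characters to the cover text."""
    bits = "".join(f"{byte:08b}" for byte in secret.encode("utf-8"))
    return cover + "".join(ZERO_WIDTH[b] for b in bits)

def reveal(stego: str) -> str:
    """Recover the hidden text from the zero-width characters."""
    bits = "".join(REVERSE[ch] for ch in stego if ch in REVERSE)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego_text = hide("This looks like an ordinary sentence.", "cloud key: 42")
assert reveal(stego_text) == "cloud key: 42"
```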

Keywords: cloud computing, steganography, information hiding, cloud storage, security

Procedia PDF Downloads 182
2274 A Comparative Study of the Maximum Power Point Tracking Methods for PV Systems Using Boost Converter

Authors: M. Doumi, A. Miloudi, A.G. Aissaoui, K. Tahir, C. Belfedal, S. Tahir

Abstract:

Studies on photovoltaic systems are increasing extensively because they offer a large, secure, essentially inexhaustible and broadly available resource as a future energy supply. However, the output power of the photovoltaic modules is influenced by the solar irradiance, the temperature of the solar cells, and so on. Therefore, to maximize the efficiency of the photovoltaic system, it is necessary to track the maximum power point of the PV array; for this purpose, Maximum Power Point Tracking (MPPT) techniques are used. The algorithms considered are based on the Perturb and Observe, Incremental Conductance and Fuzzy Logic methods. These techniques differ in many aspects, such as simplicity, convergence speed, digital or analogue implementation, required sensors, cost and range of effectiveness. This paper presents a comparative study of three widely adopted MPPT algorithms; their performance is evaluated from the energy point of view, using the simulation tool Simulink® and considering different solar irradiance variations. MPPT using fuzzy logic shows superior performance and more reliable control than the other methods for this application.
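
As an illustration of the simplest of the compared techniques, a minimal Perturb and Observe (P&O) update rule is sketched below; the PV power curve and step size are toy assumptions used only to show how the operating voltage walks toward the maximum power point.

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
    """One Perturb and Observe (P&O) update of the PV operating voltage.

    If the last perturbation increased power, keep moving in the same
    direction; otherwise reverse. Returns the next voltage reference.
    """
    dp = p - p_prev
    dv = v - v_prev
    if dp == 0:
        return v                     # at (or oscillating around) the MPP
    if (dp > 0) == (dv > 0):
        return v + step              # same direction was beneficial
    return v - step                  # reverse direction

# Toy PV curve: power peaks at 30 V. The reference walks toward the peak.
pv_power = lambda v: max(0.0, -0.5 * (v - 30.0) ** 2 + 450.0)
v_prev, v = 20.0, 20.5
p_prev = pv_power(v_prev)
for _ in range(40):
    p = pv_power(v)
    v_next = perturb_and_observe(v, p, v_prev, p_prev)
    v_prev, p_prev, v = v, p, v_next
print(f"tracked voltage = {v:.1f} V (true MPP at 30 V)")
```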

Keywords: photovoltaic system, MPPT, perturb and observe (P&O), incremental conductance (INC), Fuzzy Logic (FLC)

Procedia PDF Downloads 405
2273 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges

Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh

Abstract:

For decades, the misuse of personal data has been a critical issue. Malaysia has accepted responsibility by implementing the Malaysian Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is set to be revised by the current PDPA 2023 Amendment Bill to align with the world's key personal data protection regulations, such as the European Union General Data Protection Regulation (GDPR). Among the suggested adjustments is the Data User's appointment of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 criteria. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that there will be a slew of additional concerns associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues that the DPO will encounter and how the Personal Data Protection Department should respond. The study's results were produced using a qualitative technique based on an examination of the current literature. This research reveals probable obstacles for the DPO, and thus there should be a definite, clear guideline in place to aid DPOs in executing their tasks. It is argued that appointing a DPO is a wise measure for ensuring that the legal data security requirements are met.

Keywords: guideline, law, data protection officer, personal data

Procedia PDF Downloads 75
2272 Omni-Modeler: Dynamic Learning for Pedestrian Redetection

Authors: Michael Karnes, Alper Yilmaz

Abstract:

This paper presents the application of the Omni-Modeler to pedestrian redetection. The pedestrian redetection task creates several challenges for deep neural networks (DNNs) due to the variation of pedestrian appearance with camera position, the variety of environmental conditions, and the specificity required to recognize one pedestrian from another. DNNs require significant training sets and are not easily adapted to changes in class appearance or changes in the set of classes held in their knowledge domain. Pedestrian redetection requires an algorithm that can actively manage its knowledge domain as individuals move in and out of the scene, as well as learn individual appearances from a few frames of a video. The Omni-Modeler is a dynamically learning few-shot visual recognition algorithm developed for tasks with limited training data availability. The Omni-Modeler adapts the knowledge domain of pre-trained deep neural networks to novel concepts with a calculated localized language encoder. The Omni-Modeler knowledge domain is generated by creating a dynamic dictionary of concept definitions, which are directly updatable as new information becomes available. Query images are identified through nearest-neighbor comparison to the learned object definitions. The study presented in this paper evaluates its performance in re-identifying individuals as they move through a scene in both single-camera and multi-camera tracking applications. The results demonstrate that the Omni-Modeler shows potential for cross-camera pedestrian redetection and is highly effective for single-camera redetection, with 93% accuracy across 30 individuals using 64 example images per individual.
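
The Omni-Modeler internals are not given in the abstract, so the following sketch only illustrates the dictionary-plus-nearest-neighbour idea: identities can be enrolled or forgotten on the fly, and a query embedding is matched by cosine similarity; the feature embeddings are assumed to come from a pre-trained network.

```python
import numpy as np

class DynamicGallery:
    """Minimal sketch of a dynamically updatable identity dictionary.

    Each identity is represented by the mean of its (assumed, pre-extracted)
    feature embeddings; queries are matched by cosine similarity.
    """
    def __init__(self):
        self.definitions = {}          # identity -> list of embeddings

    def enroll(self, identity, embedding):
        self.definitions.setdefault(identity, []).append(np.asarray(embedding, float))

    def forget(self, identity):
        self.definitions.pop(identity, None)   # person left the scene

    def query(self, embedding):
        q = np.asarray(embedding, float)
        q = q / np.linalg.norm(q)
        best, best_sim = None, -1.0
        for identity, examples in self.definitions.items():
            proto = np.mean(examples, axis=0)
            proto = proto / np.linalg.norm(proto)
            sim = float(q @ proto)
            if sim > best_sim:
                best, best_sim = identity, sim
        return best, best_sim

gallery = DynamicGallery()
gallery.enroll("person_A", [0.9, 0.1, 0.0])
gallery.enroll("person_B", [0.0, 0.8, 0.6])
print(gallery.query([0.85, 0.15, 0.05]))   # -> ('person_A', similarity)
```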

Keywords: dynamic learning, few-shot learning, pedestrian redetection, visual recognition

Procedia PDF Downloads 72
2271 Natural Interaction Game-Based Learning of Elasticity with Kinect

Authors: Maryam Savari, Mohamad Nizam Ayub, Ainuddin Wahid Abdul Wahab

Abstract:

Game-Based Learning (GBL) is an alternative that provides learners with an opportunity to experience a volatile environment in a safe and secure place. A volatile environment requires a different technique to facilitate learning and prevent injury and other hazards. Subjects involving elasticity are always considered hazardous and can cause injuries, for instance a bouncing ball. Elasticity is a topic that necessitates hands-on practice for learners to experience the effects of elastic objects. In this paper, the scope is to investigate the natural interaction between learners and elastic objects in a safe environment using GBL. During interaction, the potential of natural interaction in the learning process was explored and the gestures exhibited during the learning process were identified. The GBL application was developed using Kinect technology to teach elasticity to primary school children aged 7 to 12. The system detects body gestures and defines the meanings of the motions exhibited during the learning process. A qualitative approach was deployed to constantly monitor the interaction between the student and the system. Based on the results, it was found that Natural Interaction GBL (Ni-GBL) is engaging for students, making their learning experience more active and joyful.

Keywords: elasticity, Game-Based Learning (GBL), kinect technology, natural interaction

Procedia PDF Downloads 481
2270 Dynamic Programming Based Algorithm for the Unit Commitment of the Transmission-Constrained Multi-Site Combined Heat and Power System

Authors: A. Rong, P. B. Luh, R. Lahdelma

Abstract:

High penetration of intermittent renewable energy sources (RES), such as solar power and wind power, into the energy system has caused temporal and spatial imbalances between electric power supply and demand in some countries and regions. This brings about the critical need to coordinate power production and power exchange between regions. Compared with power-only systems, combined heat and power (CHP) systems provide additional flexibility for utilizing RES by exploiting the interdependence of power and heat production in the CHP plant. In a CHP system, power production can be influenced by adjusting the heat production level, and electric power can be used to satisfy heat demand via an electric boiler or heat pump in conjunction with heat storage, which is much cheaper than electric storage. This paper addresses multi-site CHP systems without considering RES, which lays a foundation for handling the penetration of RES. The problem under study is the unit commitment (UC) of transmission-constrained multi-site CHP systems. We solve the problem by combining linear relaxation of ON/OFF states with sequential dynamic programming (DP) techniques, where the relaxed states are used to reduce the dimension of the UC problem and DP improves the solution quality. Numerical results for daily scheduling with realistic models and data show that the DP-based algorithm is from a few to a few hundred times faster than CPLEX (standard commercial optimization software) with good solution accuracy (less than 1% relative gap from the optimal solution on average).
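
As a much-reduced illustration of the sequential DP idea (a single unit, no heat side, no transmission constraints), the toy program below chooses ON/OFF states per period with a startup cost; all costs and demands are assumptions and bear no relation to the CHP model in the paper.

```python
# Minimal dynamic program over ON/OFF states of a single generation unit.
# This is a toy reduction of the paper's sequential DP: costs, demands and
# the single-unit setting are illustrative assumptions, not the CHP model.
demand    = [30, 80, 90, 40]     # MW per period
capacity  = 100                  # MW when the unit is ON
fuel_cost = 20.0                 # $/MWh produced
startup   = 500.0                # $ per OFF -> ON transition
penalty   = 1000.0               # $/MWh of unserved demand (e.g. imports)

INF = float("inf")
# best[s] = minimal cost so far, ending in state s (0 = OFF, 1 = ON).
best = {0: 0.0, 1: INF}          # the unit is assumed to start OFF

for d in demand:
    stage_cost = {
        0: penalty * d,                                            # OFF
        1: fuel_cost * min(d, capacity) + penalty * max(0, d - capacity),  # ON
    }
    new_best = {}
    for s in (0, 1):
        candidates = []
        for prev in (0, 1):
            transition = startup if (prev == 0 and s == 1) else 0.0
            candidates.append(best[prev] + transition + stage_cost[s])
        new_best[s] = min(candidates)
    best = new_best

print("minimal total cost:", min(best.values()))
```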

Keywords: dynamic programming, multi-site combined heat and power system, relaxed states, transmission-constrained generation unit commitment

Procedia PDF Downloads 359
2269 Finding Optimal Operation Condition in a Biological Nutrient Removal Process with Balancing Effluent Quality, Economic Cost and GHG Emissions

Authors: Seungchul Lee, Minjeong Kim, Iman Janghorban Esfahani, Jeong Tai Kim, ChangKyoo Yoo

Abstract:

It is hard to maintain the effluent quality of wastewater treatment plants (WWTPs) under fixed types of operational control because of continuously changing influent flow rate and pollutant load. The aim of this study is the development of a multi-loop multi-objective control (ML-MOC) strategy at plant-wide scope targeting four objectives: 1) maximization of nutrient removal efficiency, 2) minimization of operational cost, 3) maximization of CH4 production in anaerobic digestion (AD) for reuse of CH4 as a heat and energy source, and 4) minimization of N2O gas emission to cope with global warming. First, the benchmark simulation model is modified to describe N2O dynamics in the biological process, yielding the benchmark simulation model for greenhouse gases (BSM2G). Then, three single-loop proportional-integral (PI) controllers, for DO, NO3, and CH4, are implemented, and their optimal set-points are found using a multi-objective genetic algorithm (MOGA). Finally, the multi-loop MOC is implemented and evaluated in BSM2G. Compared with the reference case, the ML-MOC with the optimal set-points showed the best control performance, improving effluent quality, CH4 productivity, and N2O emission by 34%, 5%, and 79%, respectively, while decreasing operational cost by 65%.
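
A minimal sketch of one of the single-loop PI controllers is given below; the gains, set-point and the toy first-order DO model are assumptions, whereas in the paper the set-points are tuned by the multi-objective genetic algorithm.

```python
class PIController:
    """Discrete PI controller, e.g. for the dissolved-oxygen (DO) loop.

    The gains and set-point below are illustrative; in the paper the
    set-points of the DO, NO3 and CH4 loops are tuned by MOGA rather
    than fixed by hand.
    """
    def __init__(self, kp, ki, setpoint, dt):
        self.kp, self.ki, self.setpoint, self.dt = kp, ki, setpoint, dt
        self.integral = 0.0

    def update(self, measurement):
        error = self.setpoint - measurement
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral   # e.g. aeration KLa

# Toy first-order DO process driven by the controller output.
do_controller = PIController(kp=25.0, ki=20.0, setpoint=2.0, dt=0.01)  # mg/L
do = 0.5
for _ in range(1000):
    kla = max(0.0, do_controller.update(do))
    do += 0.01 * (kla * (8.0 - do) / 100.0 - 1.5 * do)   # aeration vs. uptake
print(f"DO after simulation = {do:.2f} mg/L (set-point 2.0)")
```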

Keywords: Benchmark simulation model for greenhouse gas, multi-loop multi-objective controller, multi-objective genetic algorithm, wastewater treatment plant

Procedia PDF Downloads 497
2268 1-D Convolutional Neural Network Approach for Wheel Flat Detection for Freight Wagons

Authors: Dachuan Shi, M. Hecht, Y. Ye

Abstract:

With the trend of digitalization in railway freight transport, a large number of freight wagons in Germany have been equipped with telematics devices, commonly placed on the wagon body. A telematics device contains a GPS module for tracking and a 3-axis accelerometer for shock detection. Beyond these basic functions, it is desirable to use the integrated accelerometer for condition monitoring without any additional sensors. Wheel flats, a common type of failure on the wheel tread, cause large impacts on wagons and infrastructure as well as impulsive noise. A large wheel flat may even cause safety issues such as derailments. In this context, this paper proposes a machine learning approach for wheel flat detection using car body accelerations. Due to the suspension systems, impulsive signals caused by wheel flats are damped significantly and can thus be buried in signal noise and disturbances. Therefore, it is very challenging to detect wheel flats using car body accelerations. The proposed algorithm considers the envelope spectrum of the car body accelerations to eliminate the effect of noise and disturbances. Subsequently, a 1-D convolutional neural network (CNN), a well-known deep learning method, is constructed to automatically extract features in the envelope-frequency domain and conduct classification. The constructed CNN is trained and tested on field test data measured on the underframe of a tank wagon with a wheel flat of 20 mm length under operational conditions. The test results demonstrate the good performance of the proposed algorithm for real-time fault detection.
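
The preprocessing step described above can be sketched compactly: the envelope spectrum is obtained from the analytic signal, and a placeholder 1-D CNN (layer sizes are assumptions, not the architecture used in the paper) classifies it.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import hilbert

def envelope_spectrum(acc, fs):
    """Envelope spectrum of a car-body acceleration signal.

    The analytic signal's magnitude (envelope) demodulates the damped,
    repetitive wheel-flat impacts; its FFT exposes the impact repetition
    frequency even when the raw spectrum is dominated by noise.
    """
    env = np.abs(hilbert(acc - np.mean(acc)))
    spec = np.abs(np.fft.rfft(env - np.mean(env)))
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    return freqs, spec

# Placeholder 1-D CNN classifier over the envelope spectrum
# (layer sizes are assumptions, not the architecture from the paper).
class WheelFlatCNN(nn.Module):
    def __init__(self, n_bins):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(8, 16, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(16 * (n_bins // 16), 2)  # flat / no flat

    def forward(self, x):                 # x: (batch, 1, n_bins)
        z = self.features(x)
        return self.classifier(z.flatten(1))

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
signal = np.random.randn(t.size) * 0.1
signal[::250] += 3.0                      # synthetic periodic impacts (4 Hz)
freqs, spec = envelope_spectrum(signal, fs)
model = WheelFlatCNN(n_bins=len(spec))
logits = model(torch.tensor(spec, dtype=torch.float32).view(1, 1, -1))
print(logits.shape)                        # torch.Size([1, 2])
```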

Keywords: fault detection, wheel flat, convolutional neural network, machine learning

Procedia PDF Downloads 127
2267 Design of Two-Channel Quadrature Mirror Filter Banks Using a Transformation Approach

Authors: Ju-Hong Lee, Yi-Lin Shieh

Abstract:

Two-dimensional (2-D) quadrature mirror filter (QMF) banks have been widely considered for high-quality coding of image and video data at low bit rates. Without implementing subband coding, a 2-D QMF bank is required to have an exactly linear-phase response without magnitude distortion, i.e., the perfect reconstruction (PR) characteristics. The design problem of 2-D QMF banks with the PR characteristics has been considered in the literature for many years. This paper presents a transformation approach for designing 2-D two-channel QMF banks. Under a suitable one-dimensional (1-D) to two-dimensional (2-D) transformation with a specified decimation/interpolation matrix, the analysis and synthesis filters of the QMF bank are composed of 1-D causal and stable digital allpass filters (DAFs) and possess the 2-D doubly complementary half-band (DC-HB) property. This facilitates the design problem of the two-channel QMF banks by finding the real coefficients of the 1-D recursive DAFs. The design problem is formulated based on the minimax phase approximation for the 1-D DAFs. A novel objective function is then derived to obtain an optimization for 1-D minimax phase approximation. As a result, the problem of minimizing the objective function can be simply solved by using the well-known weighted least-squares (WLS) algorithm in the minimax (L∞) optimal sense. The novelty of the proposed design method is that the design procedure is very simple and the designed 2-D QMF bank achieves perfect magnitude response and possesses satisfactory phase response. Simulation results show that the proposed design method provides much better design performance and much less design complexity as compared with the existing techniques.

Keywords: Quincunx QMF bank, doubly complementary filter, digital allpass filter, WLS algorithm

Procedia PDF Downloads 221
2266 Internet Optimization by Negotiating Traffic Times

Authors: Carlos Gonzalez

Abstract:

This paper describes a system to optimize the use of the internet by clients requiring the downloading of videos at peak hours. The system consists of a web server belonging to a provider of video content, a provider of internet communications, and a software application running on a client's computer. The client using the application software communicates to the video provider a list of the client's future video demands. The video provider calculates which videos are going to be most in demand for download in the immediate future and proceeds to request from the internet provider the most optimal hours to do the downloading. The download times are sent to the application software, which uses the information on pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos are saved in a special protected section of the user's hard disk, which is only accessed by the application software on the client's computer. When the client is ready to watch a video, the application searches the list of videos already stored in that area of the hard disk; if the video exists, it is used directly without the need for internet access. We found that the best way to optimize the download traffic of videos is by negotiation between the internet communication provider and the video content provider.

Keywords: internet optimization, video download, future demands, secure storage

Procedia PDF Downloads 134
2265 The Fusion of Blockchain and AI in Supply Chain Finance: Scalability in Distributed Systems

Authors: Wu You, Burra Venkata Durga Kumar

Abstract:

This study examines the promising potential of integrating Blockchain and Artificial Intelligence (AI) technologies to address scalability in distributed systems within the field of supply chain finance. The finance industry is continually confronted with scalability challenges in its distributed systems, particularly within the supply chain finance sector, impacting efficiency and security. Blockchain, with its inherent attributes of high scalability and a secure distributed ledger, coupled with AI's strengths in optimizing data processing and decision-making, holds the key to innovating the industry's approach to these issues. This study elucidates the synergistic interplay between Blockchain and AI, detailing how their fusion can drive a significant transformation in the distributed systems of the supply chain finance sector. It offers specific use cases within this field to illustrate the practical implications and potential benefits of this technological convergence. The study also discusses future possibilities and current challenges in implementing this groundbreaking approach within the context of supply chain finance. It concludes that the intersection of Blockchain and AI could ignite a new epoch of enhanced efficiency, security, and transparency in the distributed systems of supply chain finance within the financial industry.

Keywords: blockchain, artificial intelligence (AI), scaled distributed systems, supply chain finance, efficiency and security

Procedia PDF Downloads 85
2264 Experimental Investigation on Effects of Carrier Solvent and Oxide Fluxes in Activated TIG Welding of Reduced Activation Ferritic/Martensitic Steel

Authors: Jay J. Vora, Vishvesh J. Badheka

Abstract:

This work investigates the effect of oxide fluxes on 6 mm thick reduced activation ferritic/martensitic (RAFM) steel during activated TIG (A-TIG) welding. Six different fluxes (Al₂O₃, Co₃O₄, CuO, HgO, MoO₃, and NiO) were mixed with methanol to form a paste, and bead-on-plate experiments were then carried out. This study systematically investigates the influence of the oxide-based flux powder and the carrier solvent composition on the weld bead shape and geometry and on the dominant depth-enhancing mechanism in tungsten inert gas (TIG) welding of RAFM steel. It was inferred from the study that the fluxes Co₃O₄ and MoO₃ imparted full and secure (more than 6 mm) penetration with methanol, owing to the dual mechanism of reversed Marangoni convection and arc constriction. The use of methanol imparted good spreadability and coverability, and ultimately higher peak temperatures were observed with methanol than with acetone for the same oxide fluxes and welding conditions, owing to stronger depth-enhancing mechanisms.

Keywords: A-TIG, flux, oxides, penetration, RAFM, temperature, welding

Procedia PDF Downloads 204
2263 Pareto System of Optimal Placement and Sizing of Distributed Generation in Radial Distribution Networks Using Particle Swarm Optimization

Authors: Sani M. Lawal, Idris Musa, Aliyu D. Usman

Abstract:

The Pareto approach, which represents a set of non-dominated solutions in the search space of multi-objective optimization problems, is adopted in this paper. The paper aims at presenting the optimal placement of Distributed Generation (DG) in radial distribution networks, with optimal sizing, for the minimization of power loss and voltage deviation and the maximization of the voltage profile of the networks. These problems are formulated using particle swarm optimization (PSO) as a constrained nonlinear optimization problem with both the locations and the sizes of the DG being continuous. The objective functions adopted are the total active power loss function and the voltage deviation function. The multiple nature of the problem made it necessary to form a multi-objective function in search of a solution that consists of both the DG location and size. The proposed PSO algorithm is used to determine the optimal placement and size of DG in a distribution network. The output indicates that the PSO technique shows an edge over other types of search methods due to its effectiveness and computational efficiency. The proposed method is tested on the standard IEEE 34-bus distribution network and validated on the 33-bus test system. Results indicate that the sizing and location of DG are system dependent and should be optimally selected before installing the distributed generators in the system, and that an improvement in the voltage profile and a reduction in power loss are achieved.
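
A minimal PSO sketch over the continuous (location, size) variables is given below; the objective is a toy surrogate standing in for the load-flow-based power-loss and voltage-deviation evaluation, and the coefficients are commonly used standard values rather than those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    """Placeholder for the weighted sum of active power loss and voltage
    deviation returned by a load-flow run for DG position x[0] and size x[1].
    A real implementation would call a power-flow solver here."""
    pos, size = x
    return (pos - 17.0) ** 2 / 100.0 + (size - 1.2) ** 2   # toy surrogate

# Bounds: bus index treated as continuous in [1, 33], DG size in MW.
lower = np.array([1.0, 0.0])
upper = np.array([33.0, 3.0])

n_particles, n_iter = 30, 200
w, c1, c2 = 0.72, 1.49, 1.49                     # standard PSO coefficients

x = rng.uniform(lower, upper, size=(n_particles, 2))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lower, upper)
    vals = np.array([objective(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(f"best DG placement: bus {gbest[0]:.1f}, size {gbest[1]:.2f} MW")
```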

Keywords: distributed generation, pareto, particle swarm optimization, power loss, voltage deviation

Procedia PDF Downloads 360
2262 Historical Analysis of Nigeria Politics, 1960–2010

Authors: Abdulsalami Muyideen Deji

Abstract:

Nigeria as a nation gained independence from the British government in 1960, which allowed the indigenous people to form a self-government and rule themselves based on acceptable laws and orders provided by indigenes. All citizens saw it as a welcome development that gave them the opportunity to develop at their own pace. Certainly, this held true at first, up to the First Republic of 1963. But things became worse for the country when the first military coup of January 15, 1966 sowed an apple of discord among the three major tribes in Nigeria (Hausa, Yoruba and Igbo) as a result of the miscarriage of the well-conceived plan of the mastermind of that coup, Major Chukwuma Kaduna Nzeogwu. Arguments have emanated from different quarters that if Nigeria had been given the opportunity to develop at the pace it was going at that time, it would probably have been among the developed nations today; that ill-fated coup was a clog in the wheel of the nation's progress. The basis of this argument is that Nigeria's achievements after independence still depend on the work of the leaders who secured independence and directed the affairs of the nation within that short period of time. Since then, Nigeria has been grappling with different systems of government, yet the nation is still far from a solution. This paper analyzes Nigerian politics from independence and offers suggestions on the way forward. The study is strictly based on secondary sources: textbooks, newspapers, the internet and journals.

Keywords: politics, government, independence, development

Procedia PDF Downloads 318
2261 Numerical Iteration Method to Find New Formulas for Nonlinear Equations

Authors: Kholod Mohammad Abualnaja

Abstract:

A new algorithm is presented to derive new iterative methods for solving nonlinear equations F(x)=0 by using the variational iteration method. The efficiency of the considered method is illustrated by an example. The results show that the proposed iteration technique, without linearization or small perturbation, is very effective and convenient.
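
The abstract does not reproduce the derived formulas, so the sketch below shows the standard variational-iteration construction for an algebraic equation F(x)=0: a correction functional with a Lagrange multiplier chosen to make the correction stationary, which yields a Newton-type scheme; the test equation is illustrative.

```python
# Variational-iteration-style correction for F(x) = 0:
#     x_{n+1} = x_n + lam * F(x_n),
# where the Lagrange multiplier lam is chosen to make the correction
# stationary, giving lam = -1 / F'(x_n) (a Newton-type formula).
# The example equation below is an illustration, not one from the paper.

def iterate(F, dF, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        lam = -1.0 / dF(x)           # optimal Lagrange multiplier
        x_new = x + lam * F(x)       # correction functional
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

F = lambda x: x**3 - 2.0 * x - 5.0   # classic test equation
dF = lambda x: 3.0 * x**2 - 2.0
root = iterate(F, dF, x0=2.0)
print(root, F(root))                 # root is about 2.0945515, residual near 0
```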

Keywords: variational iteration method, nonlinear equations, Lagrange multiplier, algorithms

Procedia PDF Downloads 533
2260 Fast Switching Mechanism for Multicasting Failure in OpenFlow Networks

Authors: Alaa Allakany, Koji Okamura

Abstract:

Multicast technology is an efficient and scalable technology for data distribution that optimizes network resources. However, in the IP network, the responsibility for the management of multicast groups is distributed among network routers, which causes some limitations such as delays in processing group events, high bandwidth consumption and redundant tree calculation. Software Defined Networking (SDN), represented by OpenFlow, has been presented as a solution to many of these problems: in SDN, the control plane and data plane are separated by shifting control and management to a remote centralized controller, and the routers are used as forwarders only. In this paper, we propose a fast switching mechanism for handling link failure in the multicast tree, based on the Tabu Search heuristic algorithm and on modifying the functions of the OpenFlow switch so that it switches quickly to the backup sub-tree rather than reporting to the controller. In this work, we implement a multicasting OpenFlow controller; this centralized controller is a core part of our multicasting approach and is responsible for (1) constructing the multicast tree and (2) handling multicast group events and multicast state maintenance. Finally, the OpenFlow switch functions are modified for fast switching to backup paths. Forwarders forward the multicast packets based on multicast routing entries generated by the centralized controller. Tabu Search is used as the heuristic algorithm to construct a near-optimum multicast tree and to keep the tree near-optimum when members join or leave the multicast group (group events).

Keywords: multicast tree, software define networks, tabu search, OpenFlow

Procedia PDF Downloads 258
2259 Lean Airport Infrastructure Development: A Sustainable Solution for Integration of Remote Regions

Authors: Joeri N. Aulman

Abstract:

In the remote Indian region of Gulbarga, a case study of lean airport infrastructure development is getting 'cast in stone'; in April, the first turbo-props will land, and the optimized terminal building will process its first passengers, using minimal square meters in a facility that is based on a complete dress-down of the core operational processes. Yet the solution that resulted from this case study has such elegance in its simplicity that it has emboldened the local administration to invest in its construction and thus secure this remote region's connectivity to India's growth story. This paper provides further background to the Gulbarga case study and its relevance to remote-region connectivity, covering the demand that was identified, the practical application, the regulatory context, and the relevance for today's airport managers and local administrators; this constitutes the scope of the paper. In summary, the paper gives airport managers and regional authorities an overview of and background to innovative case studies of lean airport infrastructure development that combine optimized CAPEX and running costs/OPEX without losing sight of the aspirational nature of up-and-coming remote regions: a truly sustainable model.

Keywords: airport, CAPEX, lean, sustainable, air connectivity, remote regions

Procedia PDF Downloads 306
2258 Development of a Sequential Multimodal Biometric System for Web-Based Physical Access Control into a Security Safe

Authors: Babatunde Olumide Olawale, Oyebode Olumide Oyediran

Abstract:

A security safe is a place or building where classified documents and precious items are kept. To prevent unauthorised persons from gaining access to the safe, many technologies have been used. However, frequent reports of unauthorised persons gaining access to security safes and removing documents and items are pointers to the fact that there is still a security gap in the technologies currently used for access control. In this paper, we address this problem by developing a multimodal biometric system for physical access control into a security safe using face and voice recognition. The safe is accessed by a combination of face and speech pattern recognition, in that sequential order. User authentication is achieved through a camera/sensor unit and a microphone unit, both attached to the door of the safe. The user's face is captured by the camera/sensor, while the speech is captured by the microphone unit. The Scale Invariant Feature Transform (SIFT) algorithm was used to train images and form templates for the face recognition system, while the Mel-Frequency Cepstral Coefficients (MFCC) algorithm was used to train the speech recognition system to recognise authorised users' speech. Both algorithms were hosted on two separate web-based servers, and for automatic analysis the developed system was simulated in a MATLAB environment. The results obtained show that the developed system was able to grant access to authorised users while denying unauthorised persons access to the security safe.

Keywords: access control, multimodal biometrics, pattern recognition, security safe

Procedia PDF Downloads 324
2257 Information Technology and Occupational Safety and Health

Authors: Muhammad Afiq Anaqi Bin Baharudin, Muhammad Izamuddin Bin Mohd Nasir, Syarifuddin Bin Sujuanda, Muhammad Syahmi Rusyaidi Bin Sham Suddin, Danish Hakimi Bin Kamaruzaman, Muhammad Haqimi Nazim Bin Hasmanizam, Mohammad Akmal Zakwan Bin Amran, Muhammad Alparizi Bin Latif

Abstract:

By improving efficiency and production, information technology (IT) has transformed working environments, but it has also created new threats to occupational safety and health (OSH). This study evaluates the literature that has already been written on the subject of IT and OSH, identifies major findings and discussion points, and highlights gaps in the material that call for additional research. The study's findings, which look at how IT affects OSH in a sizable multinational organization, are also presented in the report. According to the report, IT poses a number of OSH problems, such as ergonomic dangers, eye strain, dangers related to cybersecurity, and psychological hazards. The report suggests using tactics like providing comfortable workstations, encouraging a healthy balance between work and life, and putting strong cybersecurity safeguards in place to reduce these dangers. The implications of these findings for OSH and IT are discussed in the paper's conclusion, and it emphasizes the need for more study and action to address these dangers and promote healthy and secure working environments in the age of digitization.

Keywords: information technology, occupational safety and health (OSH), ergonomics, hazards, workplace

Procedia PDF Downloads 122
2256 User Authentication Using Graphical Password with Sound Signature

Authors: Devi Srinivas, K. Sindhuja

Abstract:

This paper presents an architecture to improve surveillance applications based on the usage of the service-oriented paradigm, with smartphones as user terminals, allowing dynamic application composition and increasing the flexibility of the system. Building on moving-object-detection research on video sequences, the movement of people is tracked using video surveillance. The moving object is identified using the image subtraction method: the background image is subtracted from the foreground image, and from that the moving object is derived. A background subtraction algorithm and a threshold value are used; by background subtraction the moving frame is identified, and then, by the threshold value, the movement of the frame is identified and tracked. Hence, the movement of the object is identified accurately. This paper deals with a low-cost, intelligent, mobile-phone-based wireless video surveillance solution using moving object recognition technology. The proposed solution can be useful in various security systems and in environmental surveillance. The fundamental rule of moving object detection is given in the paper; then, a self-adaptive background representation that updates automatically and in a timely manner to adapt to the slow and slight changes of normal surroundings is detailed. When the difference between the currently captured image and the background exceeds a certain threshold, a moving object is deemed to be in the current view, and the mobile phone automatically notifies the central control unit or the user through SMS (Short Message Service). The main advantage of this system is that when an unknown image is captured, the system alerts the user automatically by sending an SMS to the user's mobile.
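
A minimal sketch of the background-subtraction-plus-threshold rule described above is given below, using synthetic grayscale frames; the threshold, pixel count, and running-average update rate are illustrative assumptions.

```python
import numpy as np

def detect_motion(frame, background, threshold=25, min_pixels=50):
    """Flag motion when the frame-vs-background difference exceeds a threshold.

    frame, background : grayscale uint8 arrays of the same shape.
    Returns (motion_detected, foreground_mask).
    """
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = diff > threshold
    return mask.sum() >= min_pixels, mask

def update_background(background, frame, alpha=0.05):
    """Self-adaptive background: slow running average toward the current frame."""
    return ((1.0 - alpha) * background + alpha * frame).astype(np.uint8)

# Synthetic example: a static scene, then a bright 'object' enters.
background = np.full((120, 160), 100, dtype=np.uint8)
frame = background.copy()
frame[40:60, 70:100] = 200                      # moving object region

moving, mask = detect_motion(frame, background)
if moving:
    print("moving object detected -> send SMS alert to the user")
background = update_background(background, frame)
```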

Keywords: security, graphical password, persuasive cued click points

Procedia PDF Downloads 529
2255 The Impact of Introspective Models on Software Engineering

Authors: Rajneekant Bachan, Dhanush Vijay

Abstract:

The visualization of operating systems has refined the Turing machine, and current trends suggest that the emulation of 32 bit architectures will soon emerge. After years of technical research into Web services, we demonstrate the synthesis of gigabit switches, which embodies the robust principles of theory. Loam, our new algorithm for forward-error correction, is the solution to all of these challenges.

Keywords: software engineering, architectures, introspective models, operating systems

Procedia PDF Downloads 531
2254 Automated Parking System

Authors: N. Arunraj, C. P. V. Paul, D. M. D. Jayawardena, W. N. D. Fernando

Abstract:

Traffic congestion with increased numbers of vehicles is already a serious issue for many countries. The absence of sufficient parking spaces adds to the issue. Motorists are forced to wait in long queues to park their vehicles. This adds to the inconvenience faced by a motorist kept waiting for a slot allocation, which is done manually along with the parking payment calculation. In Sri Lanka, parking systems nowadays use barcode technology to identify the vehicles at both the entrance and the exit points, and customer management is handled manually. A parking space is generally permanently subdivided according to vehicle type. Here, again, is an issue: parking spaces are not utilized to the maximum, and the current arrangement leaves room for unutilized parking spaces. Accordingly, there is a need to manage the parking space dynamically: as a vehicle enters the parking area, available space has to be assigned to the vehicle according to the vehicle type. The proposed system, the Automated Parking System (APS), provides an automated solution using RFID technology to identify the vehicles, while an algorithm manages the space allocation dynamically. With this system, there is no permanent parking slot allocation for a vehicle type. A desktop application manages the customers, a web application manages external users and their reservations, and an Android application shows the nearest parking area from the current location. APS is built using Java and PHP. It uses LED panels to guide the user inside the parking area to find the allocated parking slot accurately. The system ensures efficient performance, saving precious time for the customer. Compared with current parking systems, APS interacts with users and increases customer satisfaction as well.

Keywords: RFID, android, web based system, barcode, algorithm, LED panels

Procedia PDF Downloads 595
2253 Statistical Randomness Testing of Some Second Round Candidate Algorithms of CAESAR Competition

Authors: Fatih Sulak, Betül A. Özdemir, Beyza Bozdemir

Abstract:

In order to advance symmetric-key research, several competitions have been arranged by organizations such as the National Institute of Standards and Technology (NIST) and the International Association for Cryptologic Research (IACR). In recent years, the importance of authenticated encryption has rapidly increased because of the necessity of simultaneously providing integrity, confidentiality and authenticity. Therefore, in January 2013, IACR announced the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR competition), which will select secure and efficient algorithms for authenticated encryption. Cryptographic algorithms are expected to behave like random mappings; hence, it is important to apply statistical randomness tests to the outputs of the algorithms. In this work, the statistical randomness tests in the NIST Test Suite and other recently designed randomness tests are applied to six second-round algorithms of the CAESAR competition. It is observed that AEGIS achieves randomness after 3 rounds, the Ascon permutation function after 1 round, the Joltik encryption function after 9 rounds, the Morus state update function after 3 rounds, Pi-cipher after 1 round, and Tiaoxin after 1 round.
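
As an example of the kind of test applied, the NIST frequency (monobit) test can be written in a few lines; the bit strings below are synthetic examples, not outputs of the CAESAR candidates.

```python
import math
import random

def monobit_frequency_test(bits):
    """NIST SP 800-22 frequency (monobit) test.

    bits : string of '0'/'1' characters (e.g. cipher output of a candidate).
    Returns the p-value; p >= 0.01 is conventionally taken as 'random enough'.
    """
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2.0))

# Example: a strongly biased sequence fails, a balanced-looking one passes.
random.seed(1)
biased   = "1" * 900 + "0" * 100
balanced = "".join(random.choice("01") for _ in range(1000))
print(f"biased   p = {monobit_frequency_test(biased):.3g}")
print(f"balanced p = {monobit_frequency_test(balanced):.3g}")
```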

Keywords: authenticated encryption, CAESAR competition, NIST test suite, statistical randomness tests

Procedia PDF Downloads 312
2252 Machine Learning in Patent Law: How Genetic Breeding Algorithms Challenge Modern Patent Law Regimes

Authors: Stefan Papastefanou

Abstract:

Artificial intelligence (AI) is an interdisciplinary field of computer science with the aim of creating intelligent machine behavior. Early approaches to AI were configured to operate in very constrained environments where the behavior of the AI system was determined in advance by formal rules. Knowledge was presented as a set of rules that allowed the AI system to determine the results for specific problems: a structure of if-else rules that could be traversed to find a solution to a particular problem or question. However, such rule-based systems have typically not been able to generalize beyond the knowledge provided. All over the world, and especially in IT-heavy jurisdictions such as the United States, the European Union, Singapore, and China, machine learning has developed into an immense asset, and its applications are becoming more and more significant. It has to be examined how such products of machine learning models can and should be protected by IP law, and for the purpose of this paper by patent law specifically, since it is the IP law regime closest to technical inventions and computing methods in technical applications. Genetic breeding models are currently less popular than recursive neural network methods and deep learning, but this approach can be more easily described by reference to the evolution of natural organisms, and with increasing computational power, the genetic breeding method, as a subset of evolutionary algorithm models, is expected to regain popularity. The research method focuses on the patentability (according to the world's most significant patent law regimes, such as China, Singapore, the European Union, and the United States) of AI inventions and machine learning. Questions of the technical nature of the problem to be solved, the inventive step as such, and the state of the art and the associated obviousness of the solution arise in current patenting processes. Most importantly, the key focus of this paper is the problem of patenting inventions that are themselves developed through machine learning. The inventor of a patent application must be a natural person or a group of persons under the current legal situation in most patent law regimes. In order to be considered an 'inventor', a person must actually have developed part of the inventive concept. The mere application of machine learning or an AI algorithm to a particular problem should not be construed as the algorithm contributing part of the inventive concept. However, when machine learning or the AI algorithm has contributed to a part of the inventive concept, there is currently a lack of clarity regarding the ownership of artificially created inventions. Since not only all European patent law regimes but also the Chinese and Singaporean patent law approaches include identical terms, this paper ultimately offers a comparative analysis of the most relevant patent law regimes.

Keywords: algorithms, inventor, genetic breeding models, machine learning, patentability

Procedia PDF Downloads 105
2251 Evaluating the Impact of Expansion on Urban Thermal Surroundings: A Case Study of Lahore Metropolitan City, Pakistan

Authors: Usman Ahmed Khan

Abstract:

Urbanization directly affects the existing infrastructure, landscape modification, environmental contamination, and traffic pollution, especially if there is a lack of urban planning. Recently, rapid urban sprawl has resulted in fewer developed green areas and has had devastating environmental consequences. This study aimed to examine past urban expansion rates and to measure LST from satellite data. Land use land cover (LULC) maps for the years 1996, 2010, 2013, and 2017 were generated using Landsat satellite images. Four main classes, i.e., water, urban, bare land, and vegetation, were identified using unsupervised classification with the iterative self-organizing data analysis (ISODATA) technique. LST can be derived from satellite thermal data through several procedures: atmospheric and radiometric calibration, surface emissivity correction, and classification of spatial variability in land cover. Different methods and formulas were used in the algorithm that retrieves the land surface temperature to help study the thermal environment of the ground surface. To verify the algorithm, the land surface temperature and the near-surface air temperature were compared. The results showed that, from 1996 to 2017, urban areas increased considerably, by about 48%. A few areas of the city also showed a reduction in LST from 1996 to 2017; these had just begun their transition from rural to urban LULC. The mean temperature of the city increased on average by about 1 °C each year in the month of October. The green and vegetated areas decreased, while the number of pixels in the urban class increased.
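
The first step of the LST retrieval mentioned above, converting Band 10 digital numbers to TOA radiance and then to at-sensor brightness temperature, can be sketched as follows; the rescaling factors and calibration constants are typical Landsat 8 values used as assumptions, since the exact ones come from each scene's MTL metadata.

```python
import numpy as np

# Landsat 8 TIRS Band 10 constants (typical MTL metadata values; the exact
# multiplicative/additive rescaling factors come from each scene's MTL file).
RADIANCE_MULT = 3.342e-4     # ML
RADIANCE_ADD  = 0.1          # AL
K1 = 774.8853                # W/(m^2 sr um)
K2 = 1321.0789               # Kelvin

def brightness_temperature(dn_band10):
    """Digital numbers -> TOA spectral radiance -> at-sensor brightness temperature.

    This is the standard first step of mono-window LST retrieval; emissivity
    and atmospheric corrections would follow to obtain the actual LST.
    """
    radiance = RADIANCE_MULT * dn_band10.astype(np.float64) + RADIANCE_ADD
    t_kelvin = K2 / np.log(K1 / radiance + 1.0)
    return t_kelvin - 273.15                     # degrees Celsius

dn = np.array([[24000, 26000], [28000, 30000]])  # toy Band 10 digital numbers
print(np.round(brightness_temperature(dn), 1))
```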

Keywords: LST, LULC, isodata, urbanization

Procedia PDF Downloads 99
2250 Multi-Criteria Inventory Classification Process Based on Logical Analysis of Data

Authors: Diana López-Soto, Soumaya Yacout, Francisco Ángel-Bello

Abstract:

Although inventories are considered stocks of money sitting on shelves, they are needed in order to secure constant and continuous production. Therefore, companies need to have control over the amount of inventory in order to find the balance between excess and shortage of inventory. The classification of items according to certain criteria, such as the price, the usage rate and the lead time before arrival, allows any company to concentrate its investment in inventory according to a certain ranking or priority of items. This makes the decision-making process for inventory management easier and more justifiable. The purpose of this paper is to present a new approach for the classification of new items based on already existing criteria. This approach, called Logical Analysis of Data (LAD), is used in this paper to assist the process of ABC item classification based on multiple criteria. LAD is a data mining technique based on Boolean theory that is used for pattern recognition. This technique has been tested in medicine, industry, credit risk analysis, and engineering with remarkable results. An application to ABC inventory classification is presented for the first time, and the results are compared with those obtained when using the well-known AHP technique and the ANN technique. The results show that LAD presented very good classification accuracy.
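
For context, a baseline multi-criteria ABC classification by a weighted normalized score is sketched below; this is not the LAD method itself (which learns Boolean patterns from labelled items), and the criteria weights and item data are illustrative assumptions.

```python
import numpy as np

# Baseline multi-criteria ABC classification by a weighted normalized score.
# NOTE: this is not LAD; it is the kind of simple multi-criteria scoring that
# a LAD-based classifier can be compared against. Weights and item data are
# illustrative assumptions.
items = {
    # item: (unit price, annual usage rate, lead time in days)
    "I1": (120.0, 900, 30),
    "I2": (15.0, 4000, 5),
    "I3": (300.0, 150, 60),
    "I4": (2.5, 10000, 3),
    "I5": (75.0, 1200, 20),
}
weights = np.array([0.4, 0.4, 0.2])            # price, usage, lead time

data = np.array(list(items.values()), dtype=float)
norm = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0))
scores = norm @ weights

ranking = sorted(zip(items, scores), key=lambda kv: kv[1], reverse=True)
n = len(ranking)
for rank, (item, score) in enumerate(ranking):
    cls = "A" if rank < 0.2 * n else ("B" if rank < 0.5 * n else "C")
    print(f"{item}: score={score:.2f} -> class {cls}")
```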

Keywords: ABC multi-criteria inventory classification, inventory management, multi-class LAD model, multi-criteria classification

Procedia PDF Downloads 875