Search results for: security analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29092

26992 Stakeholder Engagement to Address Urban Health Systems Gaps for Migrants

Authors: A. Chandra, M. Arthur, L. Mize, A. Pomeroy-Stevens

Abstract:

Background: Lower and middle-income countries (LMICs) in Asia face rapid urbanization resulting in both economic opportunities (the urban advantage) and emerging health challenges. Urban health risks are magnified in informal settlements and include infectious disease outbreaks, inadequate access to health services, and poor air quality. Over the coming years, urban spaces in Asia will face accelerating public health risks related to migration, climate change, and environmental health. These challenges are complex and require multi-sectoral and multi-stakeholder solutions. The Building Health Cities (BHC) program is funded by the United States Agency for International Development (USAID) to work with smart city initiatives in the Asia region. BHC approaches urban health challenges by addressing policies, planning, and services through a health equity lens, with a particular focus on informal settlements and migrant communities. The program works to develop data-driven decision-making, build inclusivity through stakeholder engagement, and facilitate the uptake of appropriate technology. Methodology: The BHC program has partnered with the smart city initiatives of Indore in India, Makassar in Indonesia, and Da Nang in Vietnam. Implementing partners support municipalities to improve health delivery and equity using two key approaches: political economy analysis and participatory systems mapping. Political economy analyses evaluate barriers to collective action, including corruption, security, accountability, and incentives. Systems mapping evaluates community health challenges using a cross-sectoral approach, analyzing the impact of economic, environmental, transport, security, health system, and built environment factors. The mapping exercise draws on the experience and expertise of a diverse cohort of stakeholders, including government officials, municipal service providers, and civil society organizations. Results: Systems mapping and political economy analyses identified significant barriers for health care in migrant populations. In Makassar, migrants are unable to obtain the necessary card that entitles them to subsidized health services. This finding is being used to engage with municipal governments to mitigate the barriers that limit migrant enrollment in the public social health insurance scheme. In Indore, the project identified poor drainage of storm and wastewater in migrant settlements as a cause of poor health. Unsafe and inadequate infrastructure placed residents of these settlements at risk for both waterborne diseases and injuries. The program also evaluated the capacity of urban primary health centers serving migrant communities, identifying challenges related to their hours of service and shortages of health workers. In Da Nang, the systems mapping process has only recently begun, with the formal partnership launched in December 2019. Conclusion: This paper explores lessons learned from BHC’s systems mapping, political economy analyses, and stakeholder engagement approaches. The paper shares progress related to the health of migrants in informal settlements. Case studies feature barriers identified and mitigating steps, including governance actions, taken by local stakeholders in partner cities. The paper includes an update on ongoing progress from Indore and Makassar and experience from the first six months of program implementation from Da Nang.

Keywords: informal settlements, migration, stakeholder engagement mapping, urban health

Procedia PDF Downloads 104
26991 A Review Paper for Detecting Zero-Day Vulnerabilities

Authors: Tshegofatso Rambau, Tonderai Muchenje

Abstract:

Zero-day attacks (ZDA) are increasing steadily, and many of the vulnerabilities they exploit have existed in systems and software for decades. Companies keep discovering vulnerabilities in their systems and software and work to release patches and updates. A zero-day vulnerability is a software fault that is not widely known and is unknown to the vendor, and attackers move very quickly to exploit it. Zero-day attacks are major security threats with a high success rate because businesses lack the essential safeguards to detect and prevent them. This study focuses on the factors and techniques that can help detect zero-day attacks. Various detection methods and techniques exist, and several security companies offer penetration testing and smart vulnerability management solutions. As part of the study, we undertake a literature review of zero-day attacks and detection methods, as well as modeling approaches and simulations.

Keywords: zero-day attacks, exploitation, vulnerabilities

Procedia PDF Downloads 87
26990 A Social Network Analysis of the Palestinian Feminist Network Tal3at

Authors: Maath M. Musleh

Abstract:

This research aims to study recent trends in the Palestinian feminist movement through the case study of Tal3at. The study uses social network analysis (SNA) as its primary method to analyze Twitter data and interprets the results through the lens of network theories and Parsons' AGIL paradigm. The study reveals major structural weaknesses in the Tal3at network. Our findings suggest that the movement will decline soon as sentiments of alienation amongst Palestinian women increase. These findings were validated by two central actors in the network. This study contributes an SNA approach to the understanding of understudied Palestinian feminism.
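
The abstract does not include the underlying computation; a minimal sketch of the kind of centrality analysis typically applied to such Twitter data is shown below. The edge list and the choice of metrics are illustrative assumptions, not data or methods from the study.

```python
# Minimal SNA sketch (illustrative data, not the study's dataset):
# build a retweet/mention network and rank central actors.
import networkx as nx

# Hypothetical directed edges: (source_user, target_user) = source retweeted/mentioned target
edges = [
    ("user_a", "hub_1"), ("user_b", "hub_1"), ("user_c", "hub_1"),
    ("user_c", "hub_2"), ("user_d", "hub_2"), ("hub_1", "hub_2"),
]

G = nx.DiGraph()
G.add_edges_from(edges)

# In-degree centrality highlights accounts that attract interactions;
# betweenness highlights brokers holding the network together.
in_deg = nx.in_degree_centrality(G)
betw = nx.betweenness_centrality(G)

for node in sorted(G.nodes, key=lambda n: in_deg[n], reverse=True):
    print(f"{node}: in-degree={in_deg[node]:.2f}, betweenness={betw[node]:.2f}")

# A low density or a handful of dominant brokers would be one quantitative
# symptom of the 'structural weakness' discussed in the abstract.
print("density:", nx.density(G))
```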

Keywords: feminism, Palestine, social network analysis, Tal3at

Procedia PDF Downloads 251
26989 The Implications of Some Social Variables in Increasing the Unemployed in Egypt

Authors: Mohamed Elkhouli

Abstract:

This research sets out to identify social factors or variables that may need to be controlled in order to decrease the number of unemployed people in Egypt. It also investigates the relationship between a set of social variables and unemployment in Egypt, with the aim of determining the most important social variables influencing the rise in unemployment over the target period (2002-2012). Unemployment is becoming an increasingly important topic in countries throughout the world as they expand their globalization efforts. In general, the study tries to determine which social priorities the Egyptian government should adopt seriously in order to solve the unemployment problem. The results show that, over the target period, the low value of small projects and the low total value of disbursed social security both have a significant impact on increasing the number of unemployed people in Egypt.

Keywords: Egypt, social status, unemployment, unemployed

Procedia PDF Downloads 310
26988 Vibration Propagation in Body-in-White Structures Through Structural Intensity Analysis

Authors: Jamal Takhchi

Abstract:

Understanding vibration propagation in complex structures such as an automotive body in white remains a challenging issue in car design with regard to NVH performance. Current analysis is limited to the low-frequency range, where modal concepts are dominant. Higher frequencies, between 200 and 1000 Hz, will become critical with the rise of electrification: the most annoying EV sounds are whines created by gears or e-motors between 300 Hz and 2 kHz. Structural intensity analysis was trialed a few years ago on finite element models. The application was promising but limited by the fact that the propagating 3D intensity vector field is masked by a rotational intensity field. This rotational field should be filtered out using a differential operator, but the expression of this operator in the framework of finite element modeling is not yet known. The aim of the proposed work is to implement this operator in the dynamic solver (NASTRAN) currently used at Stellantis and to develop the expected methodology for the mid-frequency structural analysis of electrified vehicles.

Keywords: structural intensity, NVH, body in white, irrotational intensity

Procedia PDF Downloads 141
26987 Response Analysis of a Steel Reinforced Concrete High-Rise Building during the 2011 Tohoku Earthquake

Authors: Naohiro Nakamura, Takuya Kinoshita, Hiroshi Fukuyama

Abstract:

The 2011 off the Pacific Coast of Tohoku Earthquake caused considerable damage to wide areas of eastern Japan. A large number of earthquake observation records were obtained at various places. To design more earthquake-resistant buildings and improve earthquake disaster prevention, it is necessary to utilize these data to analyze and evaluate the behavior of a building during an earthquake. This paper presents an earthquake response simulation analysis (hereafter a seismic response analysis) that was conducted using data recorded during the main earthquake (hereafter the main shock) as well as the earthquakes before and after it. The data were obtained at a high-rise steel-reinforced concrete (SRC) building in the bay area of Tokyo. We first give an overview of the building, along with the characteristics of the earthquake motion and the building during the main shock. The data indicate that there was a change in the natural period before and after the earthquake. Next, we present the results of our seismic response analysis. First, the analysis model and conditions are shown, and then the analysis result is compared with the observational records. Using the analysis result, we then study the effect of soil-structure interaction on the response of the building. By identifying the characteristics of the building during the earthquake (i.e., the first natural period and the first damping ratio) with the Auto-Regressive eXogenous (ARX) model, we compare the analysis result with the observational records so as to evaluate the accuracy of the response analysis. In this study, a lumped-mass SR model was used to conduct a seismic response analysis using observational data as input waves. The main results of this study are as follows: 1) The observational records of the 3/11 main shock put it between a level 1 and level 2 earthquake. The result of the ground response analysis showed that the maximum shear strain in the ground was about 0.1% and that the possibility of liquefaction occurring was low. 2) During the 3/11 main shock, the observed wave showed that the eigenperiod of the building became longer; this behavior could be generally reproduced in the response analysis. This prolonged eigenperiod was due to the nonlinearity of the superstructure, and the effect of the nonlinearity of the ground seems to have been small. 3) For the 4/11 aftershock, a continuous analysis was conducted in which the aftershock wave was input after the 3/11 main shock. The analyzed values generally corresponded well with the observed values. This means that the effect of the nonlinearity caused by the main shock was retained by the building, and it is important to consider this when conducting the response evaluation. 4) The first period and the damping ratio during a vibration were evaluated by an ARX model. Our results show that the response analysis model in this study is generally good at estimating changes in the response of the building during a vibration.
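
As a pointer to how the ARX identification step can work in practice, the sketch below fits a low-order ARX model to simulated response data and converts the discrete poles into a natural period and damping ratio; the signals, sampling rate, and model order are illustrative assumptions, not the building's records or the study's model.

```python
# Illustrative ARX(2,2) identification of natural period and damping
# from simulated base-excitation data (not the building's records).
import numpy as np

dt = 0.02                                # sampling interval [s] (assumed)
t = np.arange(0, 60, dt)
rng = np.random.default_rng(0)
u = rng.standard_normal(t.size)          # "ground" input (white-noise stand-in)

# Simulate a 1-DOF response as the stand-in measured output (T = 1.2 s, zeta = 3%)
wn, zeta = 2 * np.pi / 1.2, 0.03
y = np.zeros_like(t)
v = 0.0
for k in range(1, t.size):               # simple semi-implicit integration
    a = -2 * zeta * wn * v - wn**2 * y[k - 1] + u[k - 1]
    v += a * dt
    y[k] = y[k - 1] + v * dt

# ARX(2,2): y[k] = a1*y[k-1] + a2*y[k-2] + b0*u[k] + b1*u[k-1]
Y = y[2:]
Phi = np.column_stack([y[1:-1], y[:-2], u[2:], u[1:-1]])
a1, a2, b0, b1 = np.linalg.lstsq(Phi, Y, rcond=None)[0]

# Discrete poles -> continuous-time frequency and damping
poles = np.roots([1.0, -a1, -a2])
s = np.log(poles[0]) / dt                # continuous-time pole
wn_est = abs(s)
zeta_est = -s.real / wn_est
print(f"estimated period: {2 * np.pi / wn_est:.2f} s, damping ratio: {zeta_est:.3f}")
```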

Keywords: ARX model, response analysis, SRC building, the 2011 off the Pacific Coast of Tohoku Earthquake

Procedia PDF Downloads 153
26986 Rescaled Range Analysis of Seismic Time-Series: Example of the Recent Seismic Crisis of Alhoceima

Authors: Marina Benito-Parejo, Raul Perez-Lopez, Miguel Herraiz, Carolina Guardiola-Albert, Cesar Martinez

Abstract:

Persistence, long-term memory, and randomness are intrinsic properties of earthquake time series. Rescaled Range Analysis (R/S analysis) was introduced by Hurst in 1956 and modified by Mandelbrot and Wallis in 1964. The method is a simple and elegant analysis that determines the range of variation of a natural property (here, the seismic energy released) over a time interval. Despite its simplicity, there is complexity inherent in the property measured: the cumulative curve of the energy released in time has the well-known fractal geometry of a devil’s staircase. This geometry is used to determine the maximum and minimum values of the range, which is normalized by the standard deviation. The rescaled range obeys a power law in time, and the exponent is the Hurst value. Depending on this value, a time series can be classified as having long-term or short-term memory. An algorithm has therefore been developed to compute the R/S analysis for daily earthquake time series; completeness of the time distribution and local stationarity of the time series are required. The interest of this analysis lies in its application to complex seismic crises in which different earthquakes occur in clusters over a short period. Accordingly, the Hurst exponent has been obtained for the seismic crisis of Alhoceima (Mediterranean Sea) of January-March 2016, in which at least five medium-sized earthquakes were triggered. According to the Hurst exponent values obtained for each cluster, different mechanical origins can be detected, corroborated by the focal mechanisms calculated by the official institutions. This type of analysis therefore not only offers a route to a greater understanding of a seismic series but also makes it possible to discern different types of seismic origins.
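
For readers unfamiliar with the procedure, a minimal R/S sketch is given below; it uses a synthetic daily energy series and equal-length windows, which simplifies the paper's actual algorithm and omits its completeness and stationarity checks.

```python
# Minimal rescaled-range (R/S) sketch for a daily energy-release series
# (synthetic stand-in data; not the Alhoceima catalogue).
import numpy as np

rng = np.random.default_rng(1)
energy = rng.lognormal(mean=3.0, sigma=1.0, size=2048)  # stand-in daily released energy

def rescaled_range(x):
    """R/S statistic of one window."""
    z = np.cumsum(x - x.mean())          # mean-adjusted cumulative sum
    r = z.max() - z.min()                # range of the 'devil's staircase'
    s = x.std(ddof=1)                    # standard deviation
    return r / s if s > 0 else np.nan

sizes = [16, 32, 64, 128, 256, 512]
avg_rs = []
for n in sizes:
    windows = energy[: (len(energy) // n) * n].reshape(-1, n)
    avg_rs.append(np.nanmean([rescaled_range(w) for w in windows]))

# Hurst exponent = slope of log(R/S) versus log(window size)
H, _ = np.polyfit(np.log(sizes), np.log(avg_rs), 1)
print(f"Hurst exponent: {H:.2f}  (>0.5 persistent, <0.5 anti-persistent)")
```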

Keywords: Alhoceima crisis, earthquake time series, Hurst exponent, rescaled range analysis

Procedia PDF Downloads 309
26985 Context-Aware Point-Of-Interests Recommender Systems Using Integrated Sentiment and Network Analysis

Authors: Ho Yeon Park, Kyoung-Jae Kim

Abstract:

Recently, users' interest in location-based social network services has increased with advances in the social web and location-based technologies. Recommending preferred items becomes easier if users' preferences, context, and social network information can be used simultaneously. In this study, we propose a context-aware POI (point-of-interest) recommender system that uses location-based network analysis and sentiment analysis and considers context, social network information, and implicit user preference scores. The proposed system consists of three sub-modules and an integrated recommendation system built on them. First, we develop a recommendation module based on network analysis; this module combines social network analysis and cluster-indexing collaborative filtering. Next, we develop a recommendation module using social singular value decomposition (SVD) and implicit SVD, which can reflect implicit feedback in collaborative filtering and estimate preference scores based on the frequency of a user's POI visits. Finally, we develop a recommendation module using opinion mining and sentiment analysis on data such as POI reviews extracted from location-based social networks. We then propose an integration algorithm that combines the results of the three recommendation modules. Experimental results show the usefulness of the proposed model with respect to recommendation performance.
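
A minimal sketch of the core idea behind the SVD-based module is shown below; the visit-frequency matrix and factor count are toy assumptions and do not reproduce the social or implicit SVD variants described above.

```python
# Illustrative latent-factor scoring of POIs from a visit-frequency matrix
# (toy data; not the social/implicit SVD variants proposed in the paper).
import numpy as np

# Rows = users, columns = POIs, values = visit counts (implicit feedback)
visits = np.array([
    [5, 3, 0, 1, 0],
    [4, 0, 0, 1, 0],
    [1, 1, 0, 5, 4],
    [0, 0, 5, 4, 4],
    [0, 1, 5, 4, 0],
], dtype=float)

# Truncated SVD keeps k latent factors and smooths out missing visits
k = 2
U, s, Vt = np.linalg.svd(visits, full_matrices=False)
scores = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Recommend, per user, the highest-scoring POI they have not yet visited
for user in range(visits.shape[0]):
    unvisited = np.where(visits[user] == 0)[0]
    best = unvisited[np.argmax(scores[user, unvisited])]
    print(f"user {user}: recommend POI {best} (score {scores[user, best]:.2f})")
```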

Keywords: sentiment analysis, network analysis, recommender systems, point-of-interests, business analytics

Procedia PDF Downloads 235
26984 Application of Molecular Markers for Crop Improvement

Authors: Monisha Isaac

Abstract:

The use of molecular markers for selecting plants with desired traits began long ago. Because they are heritable, molecular markers are useful for the identification and characterization of specific genotypes. The study covers the various types of molecular markers used to select multiple desired characters in plants, their properties, and their advantages for improving crop productivity under adverse climatological conditions, with the aim of providing food security to a fast-growing global population. The study shows that genetic similarities obtained from molecular markers provide more accurate information and that genetic diversity can be better estimated from the genetic relationships shown in a dendrogram. The information obtained from marker-assisted characterization is particularly suitable for crops of economic importance such as sugarcane.
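
As an illustration of the dendrogram-based diversity estimate mentioned above, a minimal sketch follows; the binary marker matrix and the choice of Jaccard distance with UPGMA linkage are assumptions for demonstration, not the study's protocol.

```python
# Illustrative genetic-similarity dendrogram from binary marker scores
# (presence/absence of marker bands; toy genotypes, not study data).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

genotypes = ["G1", "G2", "G3", "G4"]
markers = np.array([
    [1, 0, 1, 1, 0, 1],
    [1, 0, 1, 0, 0, 1],
    [0, 1, 0, 1, 1, 0],
    [0, 1, 1, 1, 1, 0],
])

# Jaccard distance between marker profiles, UPGMA (average) clustering
dist = pdist(markers, metric="jaccard")
tree = linkage(dist, method="average")

# Text rendering of the dendrogram structure (no plotting backend needed)
dn = dendrogram(tree, labels=genotypes, no_plot=True)
print("genotype order in dendrogram:", dn["ivl"])
print("merge heights:", tree[:, 2].round(2))
```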

Keywords: molecular markers, crop productivity, genetic diversity, genotype

Procedia PDF Downloads 496
26983 Sensitivity Analysis during the Optimization Process Using Genetic Algorithms

Authors: M. A. Rubio, A. Urquia

Abstract:

Genetic algorithms (GA) are applied to the solution of high-dimensional optimization problems. Additionally, sensitivity analysis (SA) is usually carried out to determine the effect of changes in the parameter values of the objective function on optimal solutions. These two analyses (i.e., optimization and sensitivity analysis) are computationally intensive when applied to high-dimensional functions. The approach presented in this paper consists of performing the SA during the GA execution by statistically analyzing the data obtained from running the GA. The advantage is that, in this case, SA does not require additional evaluations of the objective function; consequently, the proposed approach requires less computational effort than conducting optimization and SA in two consecutive steps.
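
A minimal sketch of this idea follows: a small real-coded GA logs every evaluation it performs, and a correlation-based sensitivity indicator is then computed from that log with no extra objective-function calls. The test function, GA settings, and the use of correlation as the SA statistic are illustrative assumptions.

```python
# Sketch: reuse the evaluations produced by a toy real-coded GA to get a
# correlation-based sensitivity indicator, with no extra function calls.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Toy function: parameter 0 matters a lot, parameter 2 hardly at all
    return 10 * (x[0] - 0.3) ** 2 + 2 * (x[1] - 0.7) ** 2 + 0.1 * x[2] ** 2

dim, pop_size, generations = 3, 30, 40
pop = rng.random((pop_size, dim))
log_x, log_f = [], []

for _ in range(generations):
    fitness = np.array([objective(ind) for ind in pop])
    log_x.append(pop.copy())
    log_f.append(fitness)
    # Tournament selection
    a, b = rng.integers(0, pop_size, pop_size), rng.integers(0, pop_size, pop_size)
    parents = pop[np.where(fitness[a] < fitness[b], a, b)]
    # Uniform crossover + Gaussian mutation
    mates = parents[rng.permutation(pop_size)]
    mask = rng.random((pop_size, dim)) < 0.5
    children = np.where(mask, parents, mates) + 0.05 * rng.standard_normal((pop_size, dim))
    pop = np.clip(children, 0.0, 1.0)

X = np.vstack(log_x)
F = np.concatenate(log_f)

# Sensitivity from the logged data: |correlation| between each parameter and fitness
sensitivity = [abs(np.corrcoef(X[:, j], F)[0, 1]) for j in range(dim)]
print("best objective value:", F.min().round(4))
print("sensitivity per parameter:", np.round(sensitivity, 2))
```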

Keywords: optimization, sensitivity, genetic algorithms, model calibration

Procedia PDF Downloads 424
26982 ISIS Resurgence in the Era of COVID-19

Authors: Stacey Pollard, Henry Baraket, Girish Ganesan, Natalie Kim

Abstract:

One year after U.S.-led coalition operations liberated ISIS-held territories in Iraq and Syria and killed ISIS core leader Abu Bakr al-Baghdadi, ISIS is resurging. Taking a page from its old playbook, the organization is capitalizing on social unrest and a rapidly deteriorating security environment—exacerbated by the COVID-19 pandemic—to reconstitute in permissive areas of Iraq and Syria. This Short examines ISIS’s pandemic-era ground and information operations through the lens of its state- and nation-making efforts to help analysts and decisionmakers better understand the imminence and scope of the threat. ISIS is rapidly overcoming U.S.-supported counterterrorism gains and, without direct pressure to reverse these advances, is poised for recovery.

Keywords: terrorism, COVID-19, Islamic State, instability, Iraq, Syria, global, resurgence

Procedia PDF Downloads 58
26981 Human Error Analysis in the USA Marine Accidents Reports

Authors: J. Sánchez-Beaskoetxea

Abstract:

The analysis of accidents, such as marine accidents, is one of the most useful instruments for avoiding future accidents. In the case of marine accidents, from a simple collision of a small boat in a port to the wreck of a gigantic tanker, the study of the causes of accidents underpins a large part of international maritime legislation. Some countries have official institutions that investigate all accidents involving a ship flying their flag. In the case of the USA, the National Transportation Safety Board (NTSB) is responsible for these investigations. After a thorough investigation of each accident, the NTSB publishes a Marine Accident Report with the possible cause of the accident. This paper analyses all the Marine Accident Reports published by the NTSB and focuses especially on the human errors that led to the reported accidents. In this research, the different human errors made by crew members are catalogued in 10 groups. After a complete analysis of all the reports, a statistical analysis of the human error typology in marine accidents is presented so that it can be used as a tool to avoid the same errors in the future.

Keywords: human error, marine accidents, ship crew, USA

Procedia PDF Downloads 405
26980 Human-Induced Vibration and Degree of Human Comfortability Analysis of Intersection Pedestrian Bridge

Authors: Yaowen Sheng, Jiuxian Liu

Abstract:

In order to analyze the dynamic characteristics and degree of human comfortability of a pedestrian bridge, the finite element method and the live-load time history method are used to calculate the dynamic response of the bridge. The example bridge is located at a three-way intersection with three side blocks, and an intersection bridge is designed to help people cross the streets. The finite element model of the bridge is established in the Midas/Civil software and analyzed, and the strength, stiffness, and stability checks are completed. Apart from the static analysis, a dynamic analysis of the bridge is also carried out to avoid problems resulting from vibrations. The results show that the pedestrian bridge has different dynamic characteristics compared with ordinary bridges, and that the degree of human comfortability satisfies the requirements of the Chinese and British specifications. The live-load time history method can be used to calculate the dynamic response of the bridge.
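
To make the live-load time history idea concrete, the sketch below integrates a single-degree-of-freedom stand-in for one footbridge mode under a harmonic walking load and checks the peak acceleration against a comfort limit; the modal properties, load model, and 0.7 m/s² limit are illustrative assumptions, not values from the bridge studied here.

```python
# Illustrative SDOF time-history check of pedestrian-induced vibration
# (Newmark average-acceleration integration; assumed modal properties).
import numpy as np

m, f_n, zeta = 40_000.0, 2.0, 0.005          # modal mass [kg], frequency [Hz], damping
k = m * (2 * np.pi * f_n) ** 2
c = 2 * zeta * np.sqrt(k * m)

G, alpha, f_p = 700.0, 0.4, 2.0              # walker weight [N], dynamic factor, pace [Hz]
dt = 0.005
t = np.arange(0, 20, dt)
p = alpha * G * np.sin(2 * np.pi * f_p * t)  # first-harmonic walking force

# Newmark-beta, average acceleration (beta = 1/4, gamma = 1/2), incremental form
beta, gamma = 0.25, 0.5
u = v = 0.0
a = (p[0] - c * v - k * u) / m
acc = np.zeros_like(t)
k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
for i in range(1, t.size):
    dp_eff = (p[i] - p[i - 1]
              + (m / (beta * dt) + gamma / beta * c) * v
              + (m / (2 * beta) + dt * (gamma / (2 * beta) - 1) * c) * a)
    du = dp_eff / k_eff
    dv = gamma / (beta * dt) * du - gamma / beta * v + dt * (1 - gamma / (2 * beta)) * a
    da = du / (beta * dt**2) - v / (beta * dt) - a / (2 * beta)
    u, v, a = u + du, v + dv, a + da
    acc[i] = a

limit = 0.7                                  # assumed comfort limit [m/s^2]
peak = acc.max()
print(f"peak acceleration: {peak:.3f} m/s^2 -> {'OK' if peak <= limit else 'exceeds limit'}")
```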

Keywords: pedestrian bridge, steel box girder, human-induced vibration, finite element analysis, degree of human comfortability

Procedia PDF Downloads 145
26979 Axle Load Estimation of Moving Vehicles Using BWIM Technique

Authors: Changgil Lee, Seunghee Park

Abstract:

Although vehicle driving tests are necessary for the development of a BWIM system, they require considerable cost and time, and it is difficult to apply various driving conditions. A numerical simulation method is therefore needed to resolve the cost and time problems of vehicle driving tests and to provide a way of measuring the bridge response under various driving conditions. Using a precision analysis model that reflects the dynamic characteristics of the bridge contributes to increased accuracy in the numerical simulation. In this paper, we conduct a numerical simulation that applies a precision analysis model reflecting the dynamic characteristics of the bridge to the Bridge Weigh-in-Motion technique, and we suggest an overload vehicle enforcement technology based on this precision analysis model.
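
The abstract does not detail the estimation step itself, but a common BWIM formulation recovers axle loads by least-squares fitting the measured response to the bridge influence line; the sketch below illustrates that idea with a simply supported span, an assumed midspan influence line, and synthetic data, none of which come from the paper's precision analysis model.

```python
# Illustrative axle-load estimation: fit a measured midspan response to the
# influence line of a simply supported span (synthetic data, assumed setup).
import numpy as np

L, speed, dt = 30.0, 20.0, 0.01              # span [m], vehicle speed [m/s], sampling [s]
axle_spacing = [0.0, 4.0, 9.0]               # axle positions behind the first axle [m]
true_loads = np.array([60.0, 110.0, 100.0])  # axle loads [kN]

def influence_midspan(x):
    """Midspan bending-moment influence line of a simply supported beam."""
    inside = (x >= 0) & (x <= L)
    return np.where(inside, np.where(x < L / 2, x / 2, (L - x) / 2), 0.0)

t = np.arange(0, (L + axle_spacing[-1]) / speed + dt, dt)
# Columns: contribution of a unit load on each axle at every time step
A = np.column_stack([influence_midspan(speed * t - d) for d in axle_spacing])

rng = np.random.default_rng(0)
measured = A @ true_loads + rng.normal(0.0, 0.5, t.size)   # noisy "strain" record

estimated, *_ = np.linalg.lstsq(A, measured, rcond=None)
print("estimated axle loads [kN]:", estimated.round(1))
print("gross weight [kN]:", estimated.sum().round(1))
```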

Keywords: bridge weigh-in-motion(BWIM) system, precision analysis model, dynamic characteristic of bridge, numerical simulation

Procedia PDF Downloads 277
26978 Hydrochemical Contamination Profiling and Spatial-Temporal Mapping with the Support of Multivariate and Cluster Statistical Analysis

Authors: Sofia Barbosa, Mariana Pinto, José António Almeida, Edgar Carvalho, Catarina Diamantino

Abstract:

The aim of this work was to test a methodology able to generate spatial-temporal maps that simultaneously synthesize the trends of distinct hydrochemical indicators in an old radium-uranium tailings dam deposit. Dimensionality reduction by principal component analysis, followed by data aggregation through cluster analysis, makes it possible to identify distinct hydrochemical behavioural profiles and to generate synthetic evolutionary hydrochemical maps.
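
A minimal sketch of the "K-means of PCA scores" step named in the keywords is shown below; the indicator table, number of components, and number of clusters are illustrative assumptions rather than the deposit's monitoring data.

```python
# Illustrative PCA-then-K-means profiling of hydrochemical indicators
# (synthetic monitoring data; component and cluster counts are assumptions).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows = samples (monitoring point x campaign); columns = indicators,
# e.g. pH, sulfate, uranium, radium, electrical conductivity
clean = rng.normal([7.0, 150, 5, 0.1, 400], [0.3, 30, 2, 0.05, 50], (40, 5))
plume = rng.normal([4.5, 900, 60, 1.2, 1500], [0.4, 120, 15, 0.3, 200], (20, 5))
X = StandardScaler().fit_transform(np.vstack([clean, plume]))

scores = PCA(n_components=2).fit_transform(X)          # dimensionality reduction
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# Each cluster is a candidate hydrochemical behavioural profile that can be
# mapped back onto the monitoring points through space and time.
for c in range(2):
    print(f"profile {c}: {np.sum(labels == c)} samples")
```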

Keywords: contamination plume migration, K-means of PCA scores, groundwater and mine water monitoring, spatial-temporal hydrochemical trends

Procedia PDF Downloads 209
26977 Applicability of Linearized Model of Synchronous Generator for Power System Stability Analysis

Authors: J. Ritonja, B. Grcar

Abstract:

For synchronous generator simulation and analysis, and for power system stabilizer design and synthesis, a mathematical model of the synchronous generator is needed. The model has to describe the dynamics of the oscillations accurately while at the same time being transparent enough for analysis and sufficiently simplified for control system design. To study the oscillations of the synchronous generator against the rest of the power system, a model of the synchronous machine connected to an infinite bus through a transmission line having resistance and inductance is needed. In this paper, the linearized reduced-order dynamic model of the synchronous generator connected to the infinite bus is presented and analysed in detail. This model accurately describes the dynamics of the synchronous generator only in a small vicinity of an equilibrium state; as operation digresses from the selected equilibrium point, the accuracy of the model decreases considerably. In this paper, the equation descriptions and parameter determinations for the linearized reduced-order mathematical model of the synchronous generator are explained and summarized, and they represent a useful starting point for work in the areas of synchronous generator dynamic behaviour analysis and synchronous generator control system design and synthesis. The main contribution of this paper is the detailed analysis of the accuracy of the linearized reduced-order dynamic model over the entire operating range of the synchronous generator. The borders of the areas in which the linearized reduced-order mathematical model represents an accurate description of the synchronous generator's dynamics are determined with a systematic numerical analysis. A thorough eigenvalue analysis of the linearized models over the entire operating range is performed. In the paper, the parameters of the linearized reduced-order dynamic model of a laboratory salient-pole synchronous generator were determined and used for the analysis. The theoretical conclusions were confirmed by the agreement of experimental and simulation results.
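
As a pointer to what the eigenvalue analysis involves, the sketch below scans the electromechanical mode of a classical second-order machine-infinite-bus linearization across a few operating angles; the state-space form and all parameter values are textbook-style assumptions, not the reduced-order model or laboratory generator described above.

```python
# Illustrative small-signal eigenvalue scan of a classical generator vs.
# infinite-bus linearization (assumed textbook parameters).
import numpy as np

H, D, f0 = 3.5, 2.0, 50.0          # inertia [s], damping [pu], system frequency [Hz]
omega0 = 2 * np.pi * f0
E, V, X = 1.2, 1.0, 0.65           # internal/bus voltages and total reactance [pu]

for delta0 in np.radians([20, 45, 70]):          # operating (rotor) angles
    Ks = E * V / X * np.cos(delta0)              # synchronizing torque coefficient
    # States: [rotor angle deviation, speed deviation]
    A = np.array([[0.0, omega0],
                  [-Ks / (2 * H), -D / (2 * H)]])
    eig = np.linalg.eigvals(A)
    lam = eig[np.argmax(eig.imag)]               # oscillatory electromechanical mode
    freq = lam.imag / (2 * np.pi)
    zeta = -lam.real / abs(lam)
    print(f"delta0={np.degrees(delta0):4.0f} deg: "
          f"mode {lam.real:+.3f} {lam.imag:+.2f}j -> {freq:.2f} Hz, zeta={zeta:.3f}")
```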

Keywords: eigenvalue analysis, mathematical model, power system stability, synchronous generator

Procedia PDF Downloads 233
26976 NextCovps: Design and Stress Analysis of Dome Composite Overwrapped Pressure Vessels using Geodesic Trajectory Approach

Authors: Ammar Maziz, Prateek Gupta, Thiago Vasconcellos Birro, Benoit Gely

Abstract:

Hydrogen as a sustainable fuel has the highest energy density per mass as compared to conventional non-renewable sources. As the world looks to move towards sustainability, especially in the sectors of aviation and automotive, it becomes important to address the issue of storage of hydrogen as compressed gas in high-pressure tanks. To improve the design for the efficient storage and transportation of Hydrogen, this paper presents the design and stress analysis of Dome Composite Overwrapped Pressure Vessels (COPVs) using the geodesic trajectory approach. The geodesic trajectory approach is used to optimize the dome design, resulting in a lightweight and efficient structure. Python scripting is employed to implement the mathematical modeling of the COPV, and after validating the model by comparison to the published paper, stress analysis is conducted using Abaqus commercial code. The results demonstrate the effectiveness of the geodesic trajectory approach in achieving a lightweight and structurally sound dome design, as well as the accuracy and reliability of the stress analysis using Abaqus commercial code. This study provides insights into the design and analysis of COPVs for aerospace applications, with the potential for further optimization and application in other industries.
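
The geodesic condition underlying the dome design can be sketched directly: along an axisymmetric dome, Clairaut's relation keeps r·sin(α) constant, so the winding angle follows from the local radius alone. The dome profile, polar-opening radius, and cylinder radius below are illustrative assumptions, and the block is not the paper's own Python implementation.

```python
# Illustrative geodesic winding angle on an axisymmetric COPV dome using
# Clairaut's relation r*sin(alpha) = r0 (assumed geometry, not the paper's).
import numpy as np

R_cyl = 0.20          # cylinder radius [m] (assumed)
r0 = 0.05             # polar-opening radius [m] (assumed)

# Hemispherical dome profile sampled from the equator towards the pole
phi = np.linspace(0.0, np.arccos(r0 / R_cyl), 8)   # meridian angle
r = R_cyl * np.cos(phi)                            # local parallel radius

# Geodesic condition: sin(alpha) = r0 / r  (alpha measured from the meridian)
alpha = np.degrees(np.arcsin(np.clip(r0 / r, -1.0, 1.0)))

for ri, ai in zip(r, alpha):
    print(f"r = {ri * 1000:6.1f} mm -> winding angle = {ai:5.1f} deg")
# At the cylinder the angle equals asin(r0/R_cyl); it grows to 90 deg at the
# polar opening, where the fibre turns around.
```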

Keywords: composite overwrapped pressure vessels, carbon fiber, geodesic trajectory approach, dome design, stress analysis, plugin python

Procedia PDF Downloads 82
26975 Introducing, Testing, and Evaluating a Unified JavaScript Framework for Professional Online Studies

Authors: Caspar Goeke, Holger Finger, Dorena Diekamp, Peter König

Abstract:

Online-based research has recently gained increasing attention from various fields of research in the cognitive sciences. Technological advances in the form of online crowdsourcing (Amazon Mechanical Turk), open data repositories (Open Science Framework), and online analysis (IPython notebook) offer rich possibilities to improve, validate, and speed up research. However, until today there is no cross-platform integration of these subsystems. Furthermore, implementation of online studies still suffers from implementation complexity (server infrastructure, database programming, security considerations, etc.). Here we propose and test a new JavaScript framework that enables researchers to conduct any kind of behavioral research in the browser without the need to program a single line of code. In particular, our framework offers the possibility to manipulate and combine the experimental stimuli via a graphical editor, directly in the browser. Moreover, we included an action-event system that can be used to handle user interactions, interactively change stimulus properties, or store participants’ responses. Besides traditional recordings such as reaction time and mouse and keyboard presses, the tool offers webcam-based eye- and face-tracking. On top of these features, our framework also takes care of participant recruitment via crowdsourcing platforms such as Amazon Mechanical Turk. Furthermore, the built-in Google Translate functionality ensures automatic translation of the experimental content. Thereby, thousands of participants from different cultures and nationalities can be recruited literally within hours. Finally, the recorded data can be visualized and cleaned online and then exported into the desired formats (csv, xls, sav, mat) for statistical analysis. Alternatively, the data can also be analyzed online within our framework using the integrated IPython notebook. The framework was designed such that studies can be used interchangeably between researchers. This not only supports the idea of open data repositories but also makes it possible to share and reuse experimental designs and analyses, improving the validity of the paradigms. In particular, sharing and integrating experimental designs and analyses will lead to increased consistency of experimental paradigms. To demonstrate the functionality of the framework, we present the results of a pilot study in the field of spatial navigation that was conducted using the framework. Specifically, we recruited over 2000 subjects with various cultural backgrounds and analyzed performance differences as a function of culture, gender, and age. Overall, our results demonstrate a strong influence of cultural factors on spatial cognition. Such an influence has not been reported before and would not have been possible to show without the massive amount of data collected via our framework. In fact, these findings shed new light on cultural differences in spatial navigation. We conclude that our new framework offers a wide range of advantages for online research and constitutes a methodological innovation through which new insights can be revealed on the basis of massive data collection.

Keywords: cultural differences, crowdsourcing, JavaScript framework, methodological innovation, online data collection, online study, spatial cognition

Procedia PDF Downloads 242
26974 Automatic Aggregation and Embedding of Microservices for Optimized Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservices are a software development methodology in which applications are built by composing a set of independently deployable, small, modular services. Each service runs as a unique process and is instantiated and deployed on one or more machines (we assume that different microservices are deployed on different machines). Microservices are becoming the de facto standard for developing distributed cloud applications due to their reduced release cycles. In principle, the responsibility of a microservice can be as simple as implementing a single function, which can lead to the following issues: resource fragmentation due to the virtual machine boundary, and poor communication performance between microservices. Two composition techniques can be used to optimize resource fragmentation and communication performance: aggregation and embedding of microservices. Aggregation allows the deployment of a set of microservices on the same machine using a proxy server. Aggregation helps to reduce resource fragmentation and is particularly useful when the aggregated services have a similar scalability behavior. Embedding deals with communication performance by deploying on the same virtual machine those microservices that require a communication channel (localhost bandwidth is reported to be about 40 times faster than cloud vendor local networks, and it offers better reliability). Embedding can also reduce dependencies on load balancer services, since the communication takes place on a single virtual machine. For example, assume that microservice A has two instances, a1 and a2, and it communicates with microservice B, which also has two instances, b1 and b2. One embedding deploys a1 and b1 on machine m1, and a2 and b2 on a different machine m2. This deployment configuration allows each pair (a1-b1), (a2-b2) to communicate using the localhost interface without the need for a load balancer between microservices A and B. Aggregation and embedding techniques are complex, since different microservices might have incompatible runtime dependencies that forbid them from being installed on the same machine. There is also a security concern, since the attack surface between microservices can be larger. Luckily, container technology allows several processes to run on the same machine in an isolated manner, solving the incompatibility of runtime dependencies and the previous security concern, and thus greatly simplifying aggregation/embedding implementations by simply deploying a microservice container on the same machine as the aggregated/embedded microservice container. Therefore, a wide variety of deployment configurations can be described by combining aggregation and embedding to create an efficient and robust microservice architecture. This paper presents a formal method that receives a declarative definition of a microservice architecture and proposes different optimized deployment configurations by aggregating/embedding microservices. The first prototype is based on i2kit, a deployment tool also submitted to ICWS 2018. The proposed prototype optimizes the following parameters: network/system performance, resource usage, resource costs, and failure tolerance.
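
The abstract stops short of showing what a proposed deployment configuration might look like; the sketch below is a simple greedy heuristic that embeds the most talkative service pairs on the same machine subject to a toy capacity limit, purely to illustrate the idea. The services, traffic figures, and capacity rule are invented, and this is not the paper's formal method or i2kit.

```python
# Greedy embedding sketch: co-locate the chattiest service pairs, subject to
# a toy capacity limit of two services per machine. Names and traffic figures
# are invented; this is not the paper's formal method or i2kit.
from itertools import count

# (service_a, service_b, traffic in requests/s between them)
traffic = [("frontend", "api", 900), ("api", "auth", 700), ("api", "billing", 150)]

machine_of, members, machine_ids = {}, {}, count(1)

def machine_for(svc):
    """Return the machine currently hosting svc, creating one if needed."""
    if svc not in machine_of:
        mid = f"m{next(machine_ids)}"
        machine_of[svc] = mid
        members[mid] = [svc]
    return machine_of[svc]

# Consider the heaviest communication links first (best embedding candidates)
for a, b, _ in sorted(traffic, key=lambda e: e[2], reverse=True):
    ma, mb = machine_for(a), machine_for(b)
    if ma != mb and len(members[ma]) + len(members[mb]) <= 2:
        for svc in members.pop(mb):          # embed: merge mb's services into ma
            machine_of[svc] = ma
            members[ma].append(svc)

for mid, svcs in members.items():
    print(mid, "->", ", ".join(svcs))        # e.g. m1 -> frontend, api
```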

Keywords: aggregation, deployment, embedding, resource allocation

Procedia PDF Downloads 189
26973 Discriminant Analysis as a Function of Predictive Learning to Select Evolutionary Algorithms in Intelligent Transportation System

Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, Daniel Vélez-Díaz, Edith Olaco García

Abstract:

In this paper, we present the use of discriminant analysis to select the evolutionary algorithm that best solves instances of the vehicle routing problem with time windows. We use instance indicators as independent variables to obtain the classification criteria, and the best algorithm among the generic genetic algorithm (GA), random search (RS), steady-state genetic algorithm (SSGA), and sexual genetic algorithm (SXGA) as the dependent variable for the classification. The discriminant classifier was trained with classic instances of the vehicle routing problem with time windows obtained from the Solomon benchmark. The discriminant analysis achieved a classification accuracy of 66.7%.
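
A minimal sketch of this selection-by-classification idea using scikit-learn's linear discriminant analysis is shown below; the instance indicators, labels, and data are synthetic stand-ins for the Solomon-benchmark features and algorithm results used above.

```python
# Illustrative discriminant analysis that predicts which algorithm label
# "wins" an instance from instance indicators (synthetic stand-in data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
algorithms = np.array(["GA", "RS", "SSGA", "SXGA"])

# Indicators per instance, e.g. customer count, time-window tightness, spread
n = 120
X = rng.random((n, 3))
# Synthetic rule plus noise standing in for 'best algorithm per instance'
y = algorithms[(2 * X[:, 0] + X[:, 1] + rng.normal(0, 0.2, n) > 1.5).astype(int)
               + 2 * (X[:, 2] > 0.5).astype(int)]

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated classification accuracy: {scores.mean():.1%}")
```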

Keywords: Intelligent Transportation Systems, data-mining techniques, evolutionary algorithms, discriminant analysis, machine learning

Procedia PDF Downloads 452
26972 TGA Analysis on the Decomposition of Active Material of Aquilaria malaccensis

Authors: Nurshafika Adira Bt Audi Ashraf, Habsah Alwi

Abstract:

This study describes the series of analyses conducted after vacuum far-infrared drying. The parameters included a constant drying temperature of 40°C with different pressures (-400 bar, -500 bar, and -600 bar) and a constant drying pressure of -400 bar with different temperatures (40°C, 50°C, and 60°C). The leaves dried at constant temperature and at constant pressure were compared with fresh leaves via several analyses, including TGA, FTIR, and chromameter measurements. The results indicated that the fresh leaves show three degradation stages, whereas the constant-temperature samples show four stages and the samples dried at a constant pressure of -400 bar show five stages. However, at constant temperature with a pressure of -500 bar, five degradation stages are identified, and at constant pressure with a temperature of 40°C, three degradation stages are present. This is assumed to be due to differences in sample size: as the particle size decreases, the peak temperature shown in the TG curves also decreases, which leads to rapid ignition. Based on the FTIR analysis, the fresh leaves show a strong presence of O-H and C=O groups, whereas both sets of constant-parameter samples lack them due to drying effects. In the color analysis, both constant drying parameters (pressure and temperature) show that as the temperature increases, the average total color change also increases.

Keywords: chromameter, FTIR, TGA, vacuum far-infrared drying

Procedia PDF Downloads 348
26971 Model Estimation and Error Level for Okike’s Merged Irregular Transposition Cipher

Authors: Okike Benjamin, Garba E. J. D.

Abstract:

The researcher has developed a new encryption technique known as the Merged Irregular Transposition Cipher. In this encryption method, the message to be encrypted is split into parts, and each part is encrypted separately. Before the encrypted message is transmitted to the recipient(s), the positions of the splits in the encrypted message can be swapped to provide additional security. This work seeks to develop a model by considering the split number S and the average number of characters per split, L, as the message under consideration is split into 2 through 10 parts. After developing the model, the error level in the model is determined.
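
The exact algorithm is not specified in the abstract, so the sketch below is only one plausible reading of "split, encrypt each part, then swap the parts": a keyed columnar transposition applied per split, followed by a permutation of the split order. The split count, key, and permutation are illustrative choices, not the published scheme.

```python
# One possible illustration of a split-and-transpose cipher: split the
# message into S parts, apply a keyed columnar transposition to each part,
# then permute the order of the encrypted parts. This is an interpretation
# for demonstration, not the published Merged Irregular Transposition Cipher.

def columnar_encrypt(text, key):
    """Write text row-wise into len(key) columns, read columns in key order."""
    cols = len(key)
    rows = [text[i:i + cols] for i in range(0, len(text), cols)]
    order = sorted(range(cols), key=lambda c: key[c])
    return "".join("".join(row[c] for row in rows if c < len(row)) for c in order)

def encrypt(message, num_splits=3, key="3142", part_order=(2, 0, 1)):
    # Split into roughly equal parts (split number S = num_splits)
    size = -(-len(message) // num_splits)          # ceiling division
    parts = [message[i:i + size] for i in range(0, len(message), size)]
    encrypted = [columnar_encrypt(p, key) for p in parts]
    # Swap the positions of the encrypted splits before transmission
    return [encrypted[i] for i in part_order], part_order

ciphertext_parts, order = encrypt("TRANSPOSITIONCIPHERDEMO")
print("transmitted parts:", ciphertext_parts)
print("part order used:", order)
```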

Keywords: merged irregular transposition, error level, model estimation, message splitting

Procedia PDF Downloads 296
26970 Optimization of Electrical Discharge Machining Parameters in Machining AISI D3 Tool Steel by Grey Relational Analysis

Authors: Othman Mohamed Altheni, Abdurrahman Abusaada

Abstract:

This study presents the optimization of multiple performance characteristics [material removal rate (MRR), surface roughness (Ra), and overcut (OC)] of hardened AISI D3 tool steel in electrical discharge machining (EDM) using the Taguchi method and grey relational analysis. The machining process parameters selected were pulse current Ip, pulse-on time Ton, pulse-off time Toff, and gap voltage Vg. Based on ANOVA, pulse current is found to be the most significant factor affecting the EDM process. The optimized process parameters, which simultaneously lead to a higher MRR, lower Ra, and lower OC, are then verified through a confirmation experiment. The validation experiment shows improved MRR, Ra, and OC when the Taguchi method and grey relational analysis are used.
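
For readers unfamiliar with how the three responses are folded into a single grade, a minimal grey relational analysis sketch follows; the experimental values and the distinguishing coefficient of 0.5 are illustrative, not the study's data.

```python
# Illustrative grey relational grade for multi-response EDM optimization
# (toy experimental values; distinguishing coefficient zeta = 0.5).
import numpy as np

# Columns: MRR (larger is better), Ra and OC (smaller is better)
responses = np.array([
    [12.5, 3.2, 0.08],
    [18.1, 4.0, 0.11],
    [15.3, 2.9, 0.07],
    [20.4, 4.6, 0.13],
])

def normalize(col, larger_is_better):
    lo, hi = col.min(), col.max()
    return (col - lo) / (hi - lo) if larger_is_better else (hi - col) / (hi - lo)

norm = np.column_stack([
    normalize(responses[:, 0], True),
    normalize(responses[:, 1], False),
    normalize(responses[:, 2], False),
])

zeta = 0.5
delta = 1.0 - norm                                  # deviation from the ideal sequence
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = grc.mean(axis=1)                            # grey relational grade per run

best = int(np.argmax(grade))
print("grey relational grades:", grade.round(3))
print("best experimental run:", best + 1)
```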

Keywords: EDM parameters, grey relational analysis, Taguchi method, ANOVA

Procedia PDF Downloads 283
26969 A Contrastive Analysis on Hausa and Yoruba Adjectival Phrases

Authors: Abubakar Maikudi

Abstract:

Contrastive analysis is the method of analyzing the structure of any two languages with a view to determining the possible differential aspects of their systems, irrespective of their genetic affinity or level of development. Contrastive analysis of two languages becomes useful when it adequately describes the sound structure and grammatical structure of the two languages, with comparative statements emphasizing the compatible items in the two systems. This research work uses contrastive analysis theory to analyze adjectives and adjectival phrases in the Hausa and Yorùbá languages. The Hausa language belongs to the Chadic family of the Afro-Asiatic phylum, while the Yorùbá language belongs to the Benue-Congo family of the Niger-Congo phylum. The findings of the research clearly demonstrate that there are significant similarities in the adjectival phrase constructions of the two languages, i.e., nominal (head) and post-nominal (post-head) use of the adjective, the predicative function of the adjective, the use of reduplicative adjectives, the use of comparative and superlative adjectives, etc. However, the two languages differ in their adjectival phrases with respect to gender/number agreement and the pre-nominal (pre-head) use of adjectives.

Keywords: genetic affinity, contrastive analysis, phylum, pre-head, post-head

Procedia PDF Downloads 212
26968 An Application of Content Analysis, SWOT Analysis, and the TOPSIS Method: A Case Study of the 'Tourism Ambassador' Program in Indonesia

Authors: Gilang Maulana Majid

Abstract:

If a government program remains scientifically uncontested for a long time, it is likely that its effects will be far from expected as there is no concrete evaluation of the steps being taken. This article identifies how such a theory aptly describes the case of the 'tourism ambassador' program in Indonesia. Being set out as one of the tourism promotional means of many regional governments in Indonesia, this program is heavily criticized for being ineffective despite a large number of budgets being spent on an annual basis. Taking the program as a case study, this article applies content analysis, SWOT analysis, and TOPSIS as data analysis methods, with a total of 56 tourism ambassadors invited to become coders, respondents, and/or interviewees in this research. The study reveals the SWOT of the program, recognizes four strategies that can be taken to optimize the program's effects and prioritizes a strategy based on the preferences of the involved tourism ambassadors using TOPSIS. It is found that incorporation of technology such as the creation of an online platform is, among others, the most expected approach to be taken to solve the problems concerning tourism ambassador program. However, based on the costs and benefits of each strategy presented in the current study, each alternative appears to have trade-offs between one and another.
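
To make the TOPSIS prioritization step concrete, a minimal sketch is given below; the four strategy names, three criteria, weights, and scores are placeholders rather than the values elicited from the tourism ambassadors.

```python
# Minimal TOPSIS sketch for ranking strategies (placeholder decision matrix).
import numpy as np

strategies = ["online platform", "training", "funding reform", "selection reform"]
# Criteria: benefit, feasibility (larger is better), cost (smaller is better)
X = np.array([
    [9.0, 6.0, 7.0],
    [7.0, 8.0, 4.0],
    [6.0, 5.0, 8.0],
    [5.0, 7.0, 3.0],
])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, False])

# Vector normalization and weighting
V = weights * X / np.sqrt((X ** 2).sum(axis=0))

# Ideal and anti-ideal solutions per criterion
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(strategies, closeness), key=lambda p: -p[1]):
    print(f"{name}: closeness = {c:.3f}")
```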

Keywords: Indonesia, optimization strategies, 'Tourism Ambassador' program, SWOT-TOPSIS

Procedia PDF Downloads 147
26967 Insider Theft Detection in Organizations Using Keylogger and Machine Learning

Authors: Shamatha Shetty, Sakshi Dhabadi, Prerana M., Indushree B.

Abstract:

About 66% of firms claim that insider attacks are more likely to happen, and the frequency of insider incidents has increased by 47% in the last two years. The goal of this work is to prevent dangerous employee behavior by using keyloggers and a Machine Learning (ML) model. Every keystroke that the user enters is recorded by the keylogging program (keystroke logging), which is used to deter improper use of the system. This enables us to collect all textual data, save it in a CSV file, and analyze it using an ML algorithm and the VirusTotal API. Many large companies use keylogging to methodically monitor how their employees use computers, the internet, and email. We use the SVM algorithm and the VirusTotal API to improve the overall efficiency and accuracy of identifying suspicious patterns and words, and to automate and deliver reports for improved monitoring.
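
A minimal sketch of the SVM classification step is shown below, training on labeled keystroke-derived text snippets; the snippets, labels, and features are toy stand-ins, and the VirusTotal lookup stage is deliberately omitted rather than guessed at.

```python
# Illustrative SVM classification of keystroke-derived text snippets
# (toy labeled data; the VirusTotal API step is not reproduced here).
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

snippets = [
    "quarterly report draft for finance team",
    "meeting notes client onboarding schedule",
    "copy customer database to personal usb drive",
    "upload payroll export to private email account",
    "lunch order for the office on friday",
    "download full source repository to external disk",
]
labels = ["benign", "benign", "suspicious", "suspicious", "benign", "suspicious"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(snippets, labels)

new_entry = "copy payroll database to personal usb"
print(new_entry, "->", model.predict([new_entry])[0])
```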

Keywords: cyber security, machine learning, cyclic process, email notification

Procedia PDF Downloads 41
26966 Introducing Information and Communication Technologies in Prison: A Proposal in Favor of Social Reintegration

Authors: Carmen Rocio Fernandez Diaz

Abstract:

This paper focuses on the relevance of information and communication technologies (hereinafter referred to as 'ICTs') as an essential part of the day-to-day life of all societies nowadays, as they provide the setting in which an immense number of behaviors that previously took place in the physical world are now performed. In this context, areas of reality that have remained outside the so-called 'information society' are hardly imaginable. Nevertheless, it is possible to identify one area that continues to lag behind this reality: the penitentiary system as it concerns inmates' rights, since security aspects in prison have already been improved by new technologies. Introducing ICTs in prisons still meets strong resistance. The study of comparative penitentiary systems worldwide shows that most of them use ICTs only for educational aspects of life in prison and that communication with the outside world generally relies on traditional means. These are only two examples of the huge range of activities where ICTs can produce positive results within prison. Those positive results have to do with the social reintegration of persons serving a prison sentence. Deprivation of liberty entails contact with the prison subculture and its harmful effects, leading, in cases of long-term sentences, to the so-called phenomenon of 'prisonization'. This negative effect of imprisonment could be reduced if ICTs were used inside prisons in the different areas where they can have an impact, which are addressed in this research: (1) access to information and culture, (2) basic and advanced training, (3) employment, (4) communication with the outside world, (5) treatment, and (6) leisure and entertainment. The content of all of these areas could be improved if ICTs were introduced in prison, as shown by the experience of some prisons in Belgium, the United Kingdom, and the United States. However, resistance to introducing ICTs in prisons stems from the fact that they could also carry risks concerning security and the commission of new offences. Considering these risks, the aim of this paper is to offer a realistic proposal for introducing ICTs in prison while avoiding those risks. This would take advantage of the possibilities that ICTs offer to all inmates, and mainly to those close to release, to start building a life outside that is far from delinquency. Reforming prisons in this sense is considered by the author an opportunity to offer inmates progressive resettlement into life in freedom, with a higher likelihood of obeying the law and escaping recidivism. The value that new technologies would add to the education, employment, communications, or treatment of a person deprived of liberty constitutes a way of humanizing prisons in the 21st century.

Keywords: deprivation of freedom, information and communication technologies, imprisonment, social reintegration

Procedia PDF Downloads 151
26965 Development of Enhanced Data Encryption Standard

Authors: Benjamin Okike

Abstract:

There is a need to hide information travelling along the information superhighway. Today, information critical to the survival of individuals, organizations, and government agencies is transmitted from one point to another. Adversaries are always watching the superhighway to intercept any information that would enable them to inflict psychological 'injuries' on their victims. With information encryption, this can be prevented completely or at worst reduced to the barest minimum. Many encryption techniques have been proposed, and some of them are already implemented; however, adversaries keep discovering loopholes in them to perpetuate their plans. In this work, we propose the enhanced data encryption standard (EDES), which deploys randomly generated numbers as an encryption method. Each time encryption is to be carried out, a new set of random numbers is generated, thereby making it almost impossible for cryptanalysts to decrypt any information encrypted with this newly proposed method.
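
The abstract does not give the EDES construction, so the block below only illustrates the general idea of fresh random numbers as per-message keying material, in the form of a random keystream XORed with the plaintext; it is an assumption-laden sketch, not the proposed EDES algorithm, and the keystream must still be shared securely with the recipient.

```python
# Illustration of per-message random keying material: a fresh random
# keystream is XORed with the plaintext. This is NOT the proposed EDES
# algorithm, only a sketch of "new random numbers for every encryption";
# the keystream itself must reach the recipient over a secure channel.
import secrets

def encrypt(plaintext: bytes):
    keystream = secrets.token_bytes(len(plaintext))   # new random numbers each time
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    return ciphertext, keystream

def decrypt(ciphertext: bytes, keystream: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, keystream))

message = b"transfer schedule 09:00"
ciphertext, keystream = encrypt(message)
print("ciphertext:", ciphertext.hex())
print("recovered :", decrypt(ciphertext, keystream).decode())
```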

Keywords: encryption, enhanced data encryption, encryption techniques, information security

Procedia PDF Downloads 133
26964 Using Social Network Analysis for Cyber Threat Intelligence

Authors: Vasileios Anastopoulos

Abstract:

Cyber threat intelligence assists organizations in understanding the threats they face and helps them make educated decisions on preparing their defenses. Sharing of threat intelligence and threat information is increasingly leveraged by organizations and enterprises, and various software solutions are already available, with the open-source malware information sharing platform (MISP) being a popular one. In this work, a methodology for the production of cyber threat intelligence using the threat information stored in MISP is proposed. The methodology leverages the discipline of social network analysis and the diamond model, a model used for intrusion analysis, to produce cyber threat intelligence. The workings are demonstrated with a case study on a production MISP instance of a real organization. The paper concludes with a discussion of the proposed methodology and possible directions for further research.
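
A minimal sketch of applying network analysis to shared threat information follows; the event records are invented stand-ins for MISP attribute data (no MISP or PyMISP API calls are shown), and degree centrality is used as just one example metric.

```python
# Illustrative SNA over shared threat information: link events that share
# indicators and rank the most connective indicators. The event records are
# invented stand-ins for MISP attributes; no MISP/PyMISP API is used here.
import networkx as nx

events = {
    "event-1001": {"203.0.113.7", "bad-domain.example", "hash-9f2c1a"},
    "event-1002": {"203.0.113.7", "mal-loader.exe"},
    "event-1003": {"bad-domain.example", "mal-loader.exe", "198.51.100.23"},
}

G = nx.Graph()
for event_id, indicators in events.items():
    for ind in indicators:
        G.add_edge(event_id, ind)          # bipartite: events <-> indicators

centrality = nx.degree_centrality(G)
indicators_only = [n for n in G if n not in events]

print("most connective indicators:")
for ind in sorted(indicators_only, key=lambda n: centrality[n], reverse=True)[:3]:
    linked = sorted(G.neighbors(ind))
    print(f"  {ind}: appears in {len(linked)} events ({', '.join(linked)})")
```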

Keywords: cyber threat intelligence, diamond model, malware information sharing platform, social network analysis

Procedia PDF Downloads 148
26963 Implementing Fault Tolerance with Proxy Signature on the Improvement of RSA System

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

Fault tolerance and data security are two important issues in modern communication systems. During the transmission of data between the sender and receiver, errors may occur frequently. The sender must therefore re-transmit the data to the receiver in order to correct these errors, which weakens the system considerably. To improve the scalability of the scheme, we present a proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on the improved RSA system. Authenticated key agreement protocols play an important role in building a secure communications network between the two parties.
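
The abstract does not define the improved RSA system or the proxy delegation step, so the block below only recalls the textbook RSA signing and verification primitive that such schemes build on, using tiny demonstration parameters; it is not the authors' proxy signature, key agreement, or fault-tolerance mechanism.

```python
# Textbook RSA sign/verify with toy parameters, shown only as the primitive
# that proxy signature schemes build on; NOT the authors' improved RSA,
# proxy delegation, or fault-tolerance mechanism.
p, q, e = 61, 53, 17                 # toy primes and public exponent
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                  # private exponent (Python 3.8+)

def sign(message_digest: int) -> int:
    return pow(message_digest, d, n)

def verify(message_digest: int, signature: int) -> bool:
    return pow(signature, e, n) == message_digest

digest = 1234 % n                    # stand-in for a hashed message
sig = sign(digest)
print("signature:", sig, "valid:", verify(digest, sig))
```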

Keywords: fault tolerance, improved RSA, key agreement, proxy signature

Procedia PDF Downloads 407