Search results for: information warfare techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16307

15107 Analysis of the Significance of Multimedia Channels Using Sparse PCA and Regularized SVD

Authors: Kourosh Modarresi

Abstract:

The abundance of media channels and devices gives users a variety of options to extract, discover, and explore information in the digital world. Since there is often a long and complicated path that a typical user may traverse before taking any significant action (such as purchasing goods and services), it is critical to know how each node (media channel) in the user's path has contributed to the final action. In this work, the significance of each media channel is computed using statistical analysis and machine learning techniques. More specifically, “Regularized Singular Value Decomposition” and “Sparse Principal Component” analysis have been used to compute the significance of each channel toward the final action. The results of this work are a considerable improvement over present approaches.
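
To make the channel-scoring idea concrete, here is a minimal sketch (the journey matrix, channel names, and the loading-magnitude score are illustrative assumptions, not the authors' implementation):

```python
import numpy as np
from sklearn.decomposition import SparsePCA, TruncatedSVD

# Hypothetical data: rows are user journeys, columns are media channels,
# entries are touch counts before the final action.
rng = np.random.default_rng(0)
X = rng.poisson(1.0, size=(500, 6)).astype(float)
channels = ["search", "display", "email", "social", "video", "referral"]

# Sparse PCA drives most loadings to zero, leaving a few dominant channels.
spca = SparsePCA(n_components=2, alpha=1.0, random_state=0)
spca.fit(X)

# A regularized/truncated SVD gives a low-rank view of the same matrix.
svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(X)

# Score each channel by its loading magnitude in the leading sparse components.
significance = np.abs(spca.components_).sum(axis=0)
for name, score in sorted(zip(channels, significance), key=lambda t: -t[1]):
    print(f"{name:10s} {score:.3f}")
```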

Keywords: multimedia attribution, sparse principal component, regularization, singular value decomposition, feature significance, machine learning, linear systems, variable shrinkage

Procedia PDF Downloads 306
15106 The Impact of the Covid-19 Crisis on the Information Behavior in the B2B Buying Process

Authors: Stehr Melanie

Abstract:

The availability of apposite information is essential for the decision-making process of organizational buyers. Due to the constraints of the Covid-19 crisis, information channels that emphasize face-to-face contact (e.g. sales visits, trade shows) have been unavailable, and usage of digitally driven information channels (e.g. videoconferencing, platforms) has skyrocketed. This paper explores the question of the areas in which the pandemic-induced shift in the use of information channels could be sustainable and those in which it is a temporary phenomenon. While information and buying behavior in B2C purchases have been studied regularly in the last decade, the last fundamental model of organizational buying behavior in B2B was introduced by Johnston and Lewin (1996), before the advent of the internet. Subsequently, research efforts in B2B marketing shifted from organizational buyers and their decision and information behavior to the business relationships between sellers and buyers. This study builds on the extensive literature on situational factors influencing organizational buying and information behavior and uses the economics of information theory as a theoretical framework. The research focuses on the German woodworking industry, which before the Covid-19 crisis was characterized by a rather low level of digitization of information channels. By focusing on an industry with traditional communication structures, a shift in information behavior induced by an exogenous shock offers a ripe research setting. The study is exploratory in nature. The primary data source is 40 in-depth interviews based on the repertory grid method. From these, 120 typical buying situations in the woodworking industry, and the information and channels relevant to them, are identified. The results are combined into clusters, each of which shows similar information behavior in the procurement process. In the next step, the clusters are analyzed in terms of pre- and post-Covid-19 crisis behavior, identifying stable and dynamic aspects of information behavior. Initial results show that, for example, clusters representing search goods with low risk and complexity suggest a sustainable rise in the use of digitally driven information channels, whereas in clusters containing trust goods with high significance and novelty, an increased return to face-to-face information channels can be expected after the Covid-19 crisis. The results are interesting from both a scientific and a practical point of view. This study is one of the first to apply the economics of information theory to organizational buyers and their decision and information behavior in the digital information age. The focus on the dynamic aspects of information behavior after an exogenous shock in particular might contribute new impulses to theoretical debates related to the economics of information theory. For practitioners - especially suppliers' marketing managers and intermediaries such as publishers or trade show organizers in the woodworking industry - the study shows wide-ranging starting points for a future-oriented segmentation of their marketing program by highlighting the dynamic and stable preferences of the elaborated clusters in their choice of information channels.

Keywords: B2B buying process, crisis, economics of information theory, information channel

Procedia PDF Downloads 182
15105 A Sui Generis Technique to Detect Pathogens in Post-Partum Breast Milk Using Image Processing Techniques

Authors: Yogesh Karunakar, Praveen Kandaswamy

Abstract:

Mother's milk provides the most superior source of nutrition for a child; there is no substitute for it. Postpartum secretions such as breast milk can be analyzed on the go to test for the presence of any harmful pathogen before a mother feeds the child or donates the milk to a milk bank. Since breastfeeding is one of the main routes of disease transmission to the newborn, it is essential to test the secretions. In this paper, we describe the detection of pathogens such as E. coli, Human Immunodeficiency Virus (HIV), Hepatitis B (HBV), Hepatitis C (HCV), Cytomegalovirus (CMV), Zika, and Ebola virus through an innovative method in which we are developing a unique chip for testing the mother's milk sample. The chip will contain an antibody specific to the target pathogen that shows a color change if the fluid contains enough pathogens to be considered dangerous. A smartphone camera will then acquire an image of the strip, and using various image processing techniques, we will detect the color development due to the antigen-antibody interaction within 5 minutes, thereby adding no delay before the newborn is fed or prior to the collection of the milk for the milk bank. If the test for the target pathogen is positive, the health care provider can provide adequate treatment to bring down the pathogen count. This will reduce the postpartum mortality and morbidity that arise from feeding infectious breast milk to one's own child.
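
A minimal sketch of the strip-reading step, assuming a hue band for the reaction color and an arbitrary coverage threshold (the function name, region of interest, and all threshold values are hypothetical, not the paper's calibration):

```python
import numpy as np
import cv2  # OpenCV

def strip_positive(image_bgr, roi, hue_lo=35, hue_hi=85, frac=0.30):
    """Decide whether the test region of a colorimetric strip has developed
    color. All thresholds here are illustrative, not validated values."""
    x, y, w, h = roi
    patch = image_bgr[y:y + h, x:x + w]
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    # Mask pixels whose hue falls in the expected reaction-color band
    # and that are sufficiently saturated to rule out the blank strip.
    mask = cv2.inRange(hsv, (hue_lo, 60, 60), (hue_hi, 255, 255))
    developed = np.count_nonzero(mask) / mask.size
    return developed >= frac, developed

# Usage with a hypothetical smartphone photo of the strip:
# img = cv2.imread("strip.jpg")
# positive, fraction = strip_positive(img, roi=(120, 200, 80, 40))
# print("pathogen detected" if positive else "negative", fraction)
```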

Keywords: postpartum, fluids, camera, HIV, HCV, CMV, Zika, Ebola, smart-phones, breast milk, pathogens, image processing techniques

Procedia PDF Downloads 219
15104 The Use and Management of Knowledge Management and Information Technologies in the Competitive Strategy of a Self-Propelling Industry

Authors: Guerrero Ramírez Sandra, Ramos Salinas Norma Maricela, Muriel Amezcua Vanesa

Abstract:

This article presents the beginning of a wider study that intends to demonstrate how, within organizations of the automotive industry in the city of Querétaro, knowledge management and technological management are required, together with people's initiative and the interaction embedded in the organization, in an appropriate environment that facilitates information conversion with a wide range of information technologies management (ITM). A company was identified for the pilot study of this research, from which descriptive and inferential research information was obtained. The results of the pilot suggest that some respondents did not identify the knowledge management topic; even though staff have access to information technology (IT) that serves to enhance access to knowledge (through the internet, email, databases, external and internal company personnel, suppliers, customers, and competitors), this implies that there are Knowledge Management (KM) problems. The data show that even academically well-prepared organizations often do not recognize the importance of knowledge in the business or in its implementation, which in the end greatly influences how knowledge is managed; this should guide the company toward greater insight in its search for a competitive strategy, given that the company has an excellent technological infrastructure yet KM was not exploited. Cultural diversity is another factor that was observed among the staff.

Keywords: Knowledge Management (KM), Technological Knowledge Management (TKM), Technology Information Management (TI), access to knowledge

Procedia PDF Downloads 496
15103 A Location-Based Search Approach According to Users’ Application Scenario

Authors: Shih-Ting Yang, Chih-Yun Lin, Ming-Yu Li, Jhong-Ting Syue, Wei-Ming Huang

Abstract:

The global positioning system (GPS) has become increasingly precise in recent years, and location-based services (LBS) have developed rapidly. Take the example of finding a parking lot with a parking app: the location-based service can offer immediate information about a nearby parking lot, including the number of remaining parking spaces. However, it cannot provide search results tailored to users' situational requirements. For that reason, this paper develops a “Location-based Search Approach according to Users' Application Scenario,” based on location-based search and demand determination, to help users obtain information consistent with their requirements. The approach consists of one mechanism and three kernel modules. First, in the Information Pre-processing Mechanism (IPM), the cosine theorem is used to categorize the locations of users. Then, in the Information Category Evaluation Module (ICEM), the k-Nearest Neighbor (kNN) algorithm is employed to classify the browsing records of users. After that, in the Information Volume Level Determination Module (IVLDM), a comparison is made between the number of users clicking the information at different locations and the average number of users clicking the information at a specific location, so as to evaluate the urgency of demand; a two-dimensional space is then used to estimate the application scenarios of users. In the last step, in the Location-based Search Module (LBSM), all search results are compared against the average number of characters of the search results, the results are categorized with the Manhattan distance, and results are selected according to the user's application scenario. Additionally, a Web-based system is developed according to this methodology to demonstrate its practical application. The application scenario-based estimate and the location-based search are used to evaluate the type and abundance of the information expected by the public at a specific location, so that information demanders can obtain information consistent with their application scenarios at that location.
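
As an illustration only: the paper applies kNN to browsing records and the Manhattan metric to search results; the sketch below simply combines both in one classifier, with invented feature columns and category labels:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical browsing records: [clicks, dwell time (s), distance to POI (m)]
# labelled with an information category (0 = parking, 1 = dining, 2 = transit).
X = np.array([[5, 40, 120], [2, 10, 900], [7, 65, 80],
              [1, 5, 1500], [6, 50, 200], [3, 20, 700]], dtype=float)
y = np.array([0, 2, 0, 2, 1, 1])

# kNN classification with the Manhattan distance (p=1 in the Minkowski family).
knn = KNeighborsClassifier(n_neighbors=3, p=1)
knn.fit(X, y)
print(knn.predict([[4, 30, 300]]))  # predicted information category
```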

Keywords: data mining, knowledge management, location-based service, user application scenario

Procedia PDF Downloads 118
15102 Reliving Historical Events Using Augmented Reality Techniques

Authors: Josep Domenech Mingot, Francisco Javier Esclapes Jover

Abstract:

The arrival of the information age and new technologies has allowed humanity to see what the future has in store, but occasionally it also brings the opportunity to look through a window to the past: an opportunity to relive history. This paper introduces a prototype of a digital system that lets us peek into our past by making use of augmented reality technologies. A 3D scene is modeled and animated based on an old image depicting an event of historical significance. From this scene, a video is rendered, recreating the events that took place at the time. A smartphone app is also created. The app detects the original image with the smartphone's camera, overlays the rendered video so that it fully covers the image, and tracks the detected image so that the overlaid video keeps covering it. The recreation of the bombing of Alicante's Central Market during the Spanish Civil War is presented as a case study.
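
One plausible way to implement the detect-and-overlay step is sketched below, assuming OpenCV with ORB features and a RANSAC homography (file names are placeholders, the render is assumed to match the reference image's resolution, and the actual app may use a dedicated AR framework instead):

```python
import cv2
import numpy as np

# Reference: the historical photograph; frame: the current camera image.
ref = cv2.imread("old_photo.jpg", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

orb = cv2.ORB_create(2000)
k1, d1 = orb.detectAndCompute(ref, None)
k2, d2 = orb.detectAndCompute(gray, None)

# Match descriptors and keep the best correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:80]

src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp one frame of the rendered video (same resolution as the reference
# photo) onto the detected image region of the camera frame.
video_frame = cv2.imread("render_frame.jpg")
h, w = frame.shape[:2]
warped = cv2.warpPerspective(video_frame, H, (w, h))
mask = cv2.warpPerspective(
    np.full(video_frame.shape[:2], 255, np.uint8), H, (w, h))
out = np.where(mask[..., None] > 0, warped, frame)
cv2.imwrite("augmented.jpg", out)
```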

Keywords: augmented reality, digital heritage, history, multimedia, smartphone

Procedia PDF Downloads 216
15101 A Spatial Information Network Traffic Prediction Method Based on Hybrid Model

Authors: Jingling Li, Yi Zhang, Wei Liang, Tao Cui, Jun Li

Abstract:

Compared with terrestrial networks, the traffic of a spatial information network has both self-similarity and short-correlation characteristics. By studying traffic prediction methods, the resource utilization of a spatial information network can be improved, and such methods can provide an important basis for its traffic planning. In this paper, considering the accuracy and complexity of the algorithm, the spatial information network traffic is decomposed into an approximate component with long correlation and detail components with short correlation, and a time series hybrid prediction model based on wavelet decomposition is proposed to predict the traffic. First, the original traffic data are decomposed into approximate and detail components using a wavelet decomposition algorithm. According to the tailing and truncation characteristics of the autocorrelation and partial autocorrelation functions of each component, the corresponding model (AR/MA/ARMA) of each detail component can be established directly, while the approximate component can be modeled with an ARIMA model after smoothing. Finally, the prediction results of the multiple models are combined to obtain the prediction for the original data. The method not only considers the self-similarity of a spatial information network but also takes into account the short correlation caused by bursty network information, and it is verified using measured data from a backbone network released by the MAWI working group in 2018. Compared with typical time series models, the predictions of the hybrid model are closer to the real traffic data and have a smaller relative root mean square error, making it more suitable for a spatial information network.
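
A compact sketch of the decompose-model-recombine pipeline, assuming PyWavelets and statsmodels, synthetic traffic, and illustrative model orders (the study uses measured MAWI traces and chooses orders from the correlation structure):

```python
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in for spatial-network traffic: trend plus periodicity
# plus noise gives both long- and short-correlation parts.
rng = np.random.default_rng(1)
t = np.arange(512)
traffic = 50 + 0.05 * t + 5 * np.sin(t / 20) + rng.normal(0, 2, t.size)

# Decompose, then reconstruct each component back to full length so that
# every component is a time series that can be modeled on its own.
coeffs = pywt.wavedec(traffic, "db4", level=3)
components = []
for i in range(len(coeffs)):
    keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(keep, "db4")[:t.size])

# ARIMA with differencing for the smooth approximation (long correlation),
# ARMA for the detail components (short correlation).
steps = 8
orders = [(2, 1, 1)] + [(1, 0, 1)] * (len(components) - 1)
forecasts = [ARIMA(c, order=o).fit().forecast(steps)
             for c, o in zip(components, orders)]

# The hybrid prediction is the sum of the per-component forecasts.
prediction = np.sum(forecasts, axis=0)
print(prediction)
```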

Keywords: spatial information network, traffic prediction, wavelet decomposition, time series model

Procedia PDF Downloads 140
15100 Evaluation of the Performance of Solar Stills as an Alternative for Brine Treatment Applying the Monte Carlo Ray Tracing Method

Authors: B. E. Tarazona-Romero, J. G. Ascanio-Villabona, O. Lengerke-Perez, A. D. Rincon-Quintero, C. L. Sandoval-Rodriguez

Abstract:

Desalination offers solutions for the shortage of water in the world; however, the process of eliminating salts generates a by-product known as brine, generally discharged into the environment through techniques that mitigate its impact. Brine treatment techniques are vital to developing an environmentally sustainable desalination process. Consequently, this work evaluates three different geometric configurations of solar stills as an alternative for brine treatment, to be integrated into a small-scale desalination process. The geometric scenarios studied were selected because they have characteristics that fit the concept of appropriate technology: low cost, intensive use of local labor and materials for manufacturing, modularity, and simplicity of construction. Additionally, the conceptual design of the collectors was carried out, and the ray tracing methodology was applied through the open-access software SolTrace and Tonatiuh. The simulation process used 600,000 rays and varied two input parameters: direct normal irradiance (DNI) and reflectance. In summary, for the scenarios evaluated, the ladder-type still presented higher efficiency values than the pyramid-type and single-slope collectors. Finally, the efficiency of the collectors studied was directly related to their geometry; that is, large geometries allow them to receive a greater number of solar rays along various paths, affecting the efficiency of the device.
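
To make the Monte Carlo idea concrete, here is a toy Bernoulli ray model of a still's optical efficiency; every parameter value and the fixed capture probability are illustrative assumptions, whereas SolTrace and Tonatiuh trace rays through real 3D geometry:

```python
import numpy as np

def mc_still_efficiency(n_rays=600_000, dni=850.0, reflectance=0.08,
                        capture_prob=0.75, rng=None):
    """Crude Monte Carlo estimate of a still cover's optical efficiency:
    each ray either reflects off the glazing or continues, and a transmitted
    ray reaches the basin with a geometry-dependent probability. All values
    are illustrative, not the paper's inputs."""
    rng = rng or np.random.default_rng(0)
    reflected = rng.random(n_rays) < reflectance   # lost at the cover
    captured = rng.random(n_rays) < capture_prob   # stand-in geometry factor
    hits = np.count_nonzero(~reflected & captured)
    optical_eff = hits / n_rays
    absorbed_flux = optical_eff * dni              # W/m^2 on the basin
    return optical_eff, absorbed_flux

eff, flux = mc_still_efficiency()
print(f"optical efficiency ~ {eff:.3f}, absorbed flux ~ {flux:.0f} W/m^2")
```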

Keywords: appropriate technology, brine treatment techniques, desalination, monte carlo ray tracing

Procedia PDF Downloads 66
15099 The Use of Degradation Measures to Design Reliability Test Plans

Authors: Stephen V. Crowder, Jonathan W. Lane

Abstract:

With short product development times, there is an increased need to demonstrate product reliability relatively quickly with minimal testing. In such cases, there may be few if any observed failures, so it may be difficult to assess reliability using traditional reliability test plans that measure only time (or cycles) to failure. For many components, degradation measures contain important information about performance and reliability. These measures can be used to design a minimal test plan, in terms of the number of units placed on test and the duration of the test, necessary to demonstrate a reliability goal. In this work, we present a case study involving an electronic component subject to degradation. The data, consisting of 42 degradation paths measured in cycles to failure, are first used to estimate a reliability function. Bootstrapping techniques are then used to perform power studies and develop a minimal reliability test plan for future production of this component.
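
A minimal sketch of the bootstrap step, using Weibull draws as a stand-in for the 42 observed failure times (the real data and the reliability goal are not given in the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical cycles-to-failure inferred from 42 degradation paths.
cycles = rng.weibull(2.0, size=42) * 10_000

def bootstrap_reliability(data, t, n_boot=5000):
    """Bootstrap the reliability R(t) = P(T > t): resample the failure
    times and return the point estimate plus a 95% lower bound."""
    n = data.size
    stats = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(data, size=n, replace=True)
        stats[b] = np.mean(resample > t)
    return np.mean(data > t), np.percentile(stats, 5)

r_hat, r_lb = bootstrap_reliability(cycles, t=4000)
print(f"R(4000) = {r_hat:.3f}, 95% lower bound = {r_lb:.3f}")
# Test planning: repeat with increasing assumed sample sizes until the
# lower bound clears the reliability goal; that sample size is the
# minimal number of units to place on test.
```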

Keywords: degradation measure, time to failure distribution, bootstrap, computational science

Procedia PDF Downloads 528
15098 Continuance Intention to Use E-administration Information Portal by Non-teaching Staff in Selected Universities, Southwest, Nigeria

Authors: Adebayo Muritala Adegbore

Abstract:

E-administration is increasingly recognized as an important phenomenon in the 21st century, and its place in society, at both the public and private levels, cannot be downplayed. Of particular interest is how these platforms are adopted and used in academia, given academia's role in shaping the overall development of society, and particularly in the administrative activities of non-teaching staff in universities, since little has been done to establish the continuance intention to use e-administration information portals among such staff. This study therefore investigates the continuance intention of senior non-teaching staff in selected universities in southwest Nigeria to use e-administration information portals. The study's design was a correlational survey using simple random sampling to select three hundred and fifty-two (352) senior non-teaching staff in the selected universities. A standardized questionnaire was used for data collection, while data were analyzed using the descriptive statistics of frequency counts, percentages, means, and standard deviations for the research questions, and the Pearson Product Moment Correlation for the hypothesis. Findings revealed that the continuance intention of senior non-teaching staff to use the e-administration information portal is positive (x̄ = 3.13), that the university portal is one of the most utilized e-administration tools (83.4%), and that there was a significant inverse relationship between continuance intention to use and use of the e-administration information portal (r = -.254; p < 0.05; N = 320).
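
For readers unfamiliar with the test, a small sketch of the correlation analysis on invented Likert-style responses (only r = -.254 and N = 320 come from the study; the data below are synthetic):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical 5-point Likert scores for 320 respondents: continuance
# intention vs. reported use of the e-administration portal.
rng = np.random.default_rng(7)
intention = rng.integers(1, 6, size=320).astype(float)
use = np.clip(4.5 - 0.3 * intention + rng.normal(0, 1, 320), 1, 5)

# Pearson Product Moment Correlation and its two-sided p-value.
r, p = pearsonr(intention, use)
print(f"r = {r:.3f}, p = {p:.4f}")   # the paper reports r = -.254, p < 0.05
print("mean intention =", round(intention.mean(), 2))
```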

Keywords: e-administration, e-portal, non-teaching staff, information systems, continuance intention, use of e-administration portals

Procedia PDF Downloads 190
15097 Numerical Modeling for Water Engineering and Obstacle Theory

Authors: Mounir Adal, Baalal Azeddine, Afifi Moulay Larbi

Abstract:

Numerical analysis is a branch of mathematics devoted to the development of iterative matrix calculation techniques. The objective is to optimize such operations so as to solve systems of equations of order n while saving time and energy on the computers tasked with analyzing big data through matrix equations. This discipline produces results with a stated margin of approximation error. The results obtained from numerical analysis techniques run in software such as MATLAB or Simulink offer a preliminary diagnosis of the state of the environment or of the target space. From this, we can derive the technical procedures needed for engineering or scientific studies that water engineers can exploit.
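
As a concrete instance of the iterative matrix techniques described, here is a minimal Jacobi solver; the abstract names MATLAB and Simulink, so this Python sketch only illustrates the class of method:

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=10_000):
    """Jacobi iteration for Ax = b; converges for diagonally dominant
    systems, trading exact factorization for cheap repeated updates."""
    x = np.zeros_like(b, dtype=float)
    D = np.diag(A)
    R = A - np.diagflat(D)  # off-diagonal part
    for k in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# A small diagonally dominant system of order n = 3.
A = np.array([[10.0, 2.0, 1.0], [1.0, 8.0, 2.0], [2.0, 1.0, 12.0]])
b = np.array([13.0, 11.0, 15.0])
x, iters = jacobi(A, b)
print(x, "in", iters, "iterations")  # matches np.linalg.solve(A, b)
```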

Keywords: numerical analysis methods, obstacles solving, engineering, simulation, numerical modeling, iteration, computer, MATLAB, water, underground, velocity

Procedia PDF Downloads 459
15096 Quantitative Comparisons of Different Approaches for Rotor Identification

Authors: Elizabeth M. Annoni, Elena G. Tolkacheva

Abstract:

Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia and a known prognostic marker for stroke, heart failure, and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping system has been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors using unique characteristics of the electrical recordings at the pivot point and that can be applied to clinical intracardiac mapping. Recently, we have developed four new signal analysis approaches – Shannon entropy (SE), kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize cardiac signal characteristics other than local activation to uncover the intrinsic complexity of the electrical activity in the rotors, which is not taken into account in current mapping methods. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts was possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two examples of rotors were used, showing 3-sec episodes of a single stationary rotor and of figure-8 reentry with one rotor stationary and one meandering. Movies were captured at a rate of 600 frames per second for 3 sec at 64x64 pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of the SE, Kt, MSF, and MSE techniques with respect to the following clinical limitations: different durations of recordings, different spatial resolutions, and the presence of meandering rotors. To quantitatively compare the results, the SE, Kt, MSF, and MSE techniques were compared to the “true” rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF, and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, but Kt was the best at identifying the pivot point of the meandering rotor. Artifacts mildly affect the performance of the Kt, MSF, and MSE techniques but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF, and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients to see if these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures.
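
A minimal sketch of the Shannon entropy (SE) map on synthetic data shaped like the experiments (1800 frames at 600 fps, 64x64 pixels); the bin count and the argmax pivot rule are illustrative assumptions, not the validated pipeline:

```python
import numpy as np

def entropy_map(movie, n_bins=32):
    """Per-pixel Shannon entropy of an optical-mapping movie shaped
    (frames, height, width). Higher entropy is expected near a rotor's
    pivot, where the signal is most complex."""
    T, H, W = movie.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            hist, _ = np.histogram(movie[:, i, j], bins=n_bins)
            p = hist / hist.sum()
            p = p[p > 0]
            out[i, j] = -np.sum(p * np.log2(p))
    return out

# Synthetic stand-in: 3 s at 600 fps, 64x64 pixels, as in the experiments.
rng = np.random.default_rng(3)
movie = rng.normal(size=(1800, 64, 64))
se = entropy_map(movie)
pivot = np.unravel_index(np.argmax(se), se.shape)
print("candidate pivot pixel:", pivot)
```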

Keywords: atrial fibrillation, optical mapping, signal processing, rotors

Procedia PDF Downloads 321
15095 Cirrhosis Mortality Prediction as Classification using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-stage Liver Disease (MELD) score's prediction of mortality serves as a comparator. All of our models statistically significantly outperform the MELD-score model, showing an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with ensemble learning further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model alone does not yield significant improvements. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
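
A sketch of the baseline-versus-ensemble comparison on a synthetic cohort (all data below are simulated; the feature set, model, and effect sizes are assumptions, not the study's):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the cohort: a MELD-like score plus extra features
# (labs, comorbidities, FSM-derived pattern flags would appear here).
rng = np.random.default_rng(0)
n = 2322
meld = rng.normal(15, 6, n)
extra = rng.normal(size=(n, 10))
logit = 0.15 * (meld - 15) + extra[:, 0] - 0.5 * extra[:, 1]
died = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([meld, extra])
X_tr, X_te, y_tr, y_te, m_tr, m_te = train_test_split(
    X, died, meld, test_size=0.3, random_state=0)

# Baseline: MELD score alone, via a univariate logistic model.
base = LogisticRegression().fit(m_tr.reshape(-1, 1), y_tr)
auc_meld = roc_auc_score(y_te, base.predict_proba(m_te.reshape(-1, 1))[:, 1])

# Ensemble model over the full feature space.
ens = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc_ens = roc_auc_score(y_te, ens.predict_proba(X_te)[:, 1])
print(f"MELD-only AUC {auc_meld:.3f} vs ensemble AUC {auc_ens:.3f}")
```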

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

Procedia PDF Downloads 132
15094 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, though present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effectively each data representation reduces the cost of the correct correspondence relative to other possible matches.
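
A minimal local matcher with a sum-of-absolute-differences cost is sketched below; the window size, disparity range, and winner-take-all selection are illustrative choices, and the paper's comparison concerns the data representations fed into such a cost rather than this particular matcher:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sad_disparity(left, right, max_disp=32, window=7):
    """Local stereo matching with a sum-of-absolute-differences cost.
    Inputs are rectified, same-size grayscale float images; trying other
    data representations (RGB channels, gradients) just changes the
    arrays fed in."""
    H, W = left.shape
    cost = np.full((max_disp, H, W), np.inf)
    for d in range(max_disp):
        diff = np.abs(left[:, d:] - right[:, :W - d])
        # Aggregate the per-pixel cost over a local window.
        cost[d, :, d:] = uniform_filter(diff, size=window)
    return np.argmin(cost, axis=0)  # winner-take-all disparity map

# Usage with two hypothetical rectified frames loaded as float arrays:
# disp = sad_disparity(left_gray, right_gray)
```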

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 366
15093 Iterative Segmentation and Application of Hausdorff Dilation Distance in Defect Detection

Authors: S. Shankar Bharathi

Abstract:

Inspection of surface defects on metallic components has always been challenging due to their specular reflectivity. Defects such as scratches, rust, and pitting commonly occur on metallic surfaces during the manufacturing process. These defects, if unchecked, can hamper performance and reduce the lifetime of such components. Many conventional image processing algorithms for detecting surface defects involve segmentation techniques based on thresholding, edge detection, watershed segmentation, or textural segmentation, and later employ other suitable algorithms based on morphology, region growing, shape analysis, or neural networks for classification. In this paper, the work focuses only on detecting scratches. Global and other thresholding techniques were tried for extracting the defects but proved inaccurate on their own. This paper does not, however, focus on a comparison of different segmentation techniques; rather, it describes a novel approach to segmentation combined with the Hausdorff dilation distance. The proposed algorithm is based on the distribution of intensity levels, that is, whether a certain gray level is concentrated or evenly distributed, and works by extracting such concentrated pixels. Defective images showed a higher concentration of certain gray levels, whereas in non-defective images the gray levels were evenly distributed, with no concentration. This formed the basis for detecting defects in the proposed algorithm. The Hausdorff dilation distance, based on mathematical morphology, was used to strengthen the segmentation of the defects.
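
A minimal sketch of a dilation-based Hausdorff distance between two binary masks (the toy masks are invented, and the paper's use of the measure within segmentation is more involved):

```python
import numpy as np
from scipy.ndimage import binary_dilation

def hausdorff_dilation(A, B, max_iter=500):
    """Hausdorff distance between two binary masks computed morphologically:
    dilate each set until it covers the other; the larger of the two
    dilation counts is the dilation-based Hausdorff distance."""
    def covers_in(src, dst):
        grown = src.copy()
        for n in range(max_iter):
            if np.all(grown[dst]):       # dst entirely inside grown src
                return n
            grown = binary_dilation(grown)
        return max_iter

    return max(covers_in(A, B), covers_in(B, A))

# Toy defect masks: a segmented scratch vs. a reference segmentation.
img = np.zeros((64, 64), bool); img[30:34, 10:50] = True   # detected scratch
ref = np.zeros((64, 64), bool); ref[31:33, 12:48] = True   # reference region
print("Hausdorff dilation distance:", hausdorff_dilation(img, ref))
```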

Keywords: metallic surface, scratches, segmentation, hausdorff dilation distance, machine vision

Procedia PDF Downloads 424
15092 External Retinal Prosthesis Image Processing System Using One-Cue Saliency Map Based on DSP

Authors: Yili Chen, Jixiang Fu, Zhihua Liu, Zhicheng Zhang, Rongmao Li, Nan Fu, Yaoqin Xie

Abstract:

A retinal prosthesis is designed to help the blind regain some sight. It is made up of an internal part and an external part. The external part comprises a camera, image processing, and an RF transmitter; the internal part comprises an RF receiver, an implant chip, and micro-electrodes. The image captured by the camera must be processed with suitable strategies so that it corresponds to the stimulus delivered by the electrodes. Nowadays the number of micro-electrodes runs into the hundreds, and the mechanism by which the electrodes stimulate the optic nerve is not fully known; a simple working hypothesis is that each pixel in the image corresponds to an electrode. The question is therefore how to extract the important information from the image captured by the camera. Many strategies have been tried to extract the most important information as quickly as possible, given the real-time nature of the system. Region of interest (ROI) extraction is a useful algorithm for this purpose. This paper explains the principles and functions of ROI in detail. Based on this, we simplified the ROI algorithm and used it in the external DSP-based image processing system of the retinal prosthesis. Results show that our image processing strategies are suitable for a real-time retinal prosthesis, cutting redundant information and helping useful information to be expressed in the low-resolution image.

Keywords: image processing, region of interest, saliency map, low-resolution image, useful information expression, redundant information reduction

Procedia PDF Downloads 279
15091 Understanding Informal Settlements: The Role of Geo-Information Tools

Authors: Musyimi Mbathi

Abstract:

Information regarding the social, political, demographic, economic, and other attributes of human settlements is important for decision makers at all levels of planning, as they have to grapple with the dynamic environments often associated with settlements. At the local level, it is particularly important for both communities and urban managers to have accurate and reliable information regarding all planning attributes. Settlement mapping, and in particular informal settlement mapping in Kenya, has over the past few years been carried out using modern tools such as geographic information systems (GIS) and remote sensing for spatial data analysis and planning. GIS tools offer a platform for the integration of spatial and non-spatial data as well as visualization of the settlements. The capabilities offered by these tools have enabled communities to participate, especially in the planning and management of new infrastructure as well as settlement upgrading. Land tenure projects within informal settlements have also relied on GIS and related tools with considerable success. Additionally, the adoption of participatory approaches and the use of geo-information tools has helped provide a basis for all-inclusive planning, thus promoting accountability, transparency, legitimacy, and other dimensions of governance within human settlement planning. The paper examines the context and application of geo-information tools for planning within low-income settlements in Kenya. A case study of the Kiambiu settlement is used to demonstrate how the tools have been applied for planning and decision-making purposes.

Keywords: informal settlements, GIS, governance, modern tools

Procedia PDF Downloads 490
15090 Optical Flow Technique for Supersonic Jet Measurements

Authors: Haoxiang Desmond Lim, Jie Wu, Tze How Daniel New, Shengxian Shi

Abstract:

This paper outlines the development of a novel experimental technique for quantifying supersonic jet flows, in an attempt to avoid the seeding-particle problems frequently associated with particle image velocimetry (PIV) techniques at high Mach numbers. Based on optical flow algorithms, the idea behind the technique is to use high-speed cameras to capture Schlieren images of the supersonic jet shear layers, and then to subject these images to an adapted optical flow algorithm based on the Horn-Schunck method to determine the associated flow fields. The proposed method is capable of offering full-field unsteady flow information with potentially higher accuracy and resolution than existing point measurements or PIV techniques. A preliminary study via numerical simulations of a circular de Laval jet nozzle successfully reveals flow and shock structures typically associated with supersonic jet flows, which serve as useful data for subsequent validation of the optical flow based experimental results. For the experimental technique, a Z-type Schlieren setup is proposed, with the supersonic jet operated in cold mode at a stagnation pressure of 8.2 bar and an exit velocity of Mach 1.5. High-speed single-frame or double-frame cameras are used to capture successive Schlieren images. As the application of optical flow techniques to supersonic flows remains rare, the current focus revolves around methodology validation through synthetic images. The results of the validation tests offer valuable insight into how the optical flow algorithm can be further improved in robustness and accuracy. Details of the methodology employed and the challenges faced will be elaborated in the final conference paper should the abstract be accepted. Despite these challenges, however, this novel supersonic flow measurement technique may potentially offer a simpler way to identify and quantify the fine spatial structures within the shock shear layer.
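
For reference, a minimal implementation of the classic Horn-Schunck scheme is sketched below; the paper adapts this method for Schlieren image pairs, and the stencils and iteration here are the textbook version, not the authors' modification:

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Minimal Horn-Schunck optical flow between two grayscale frames
    (float arrays in [0, 1]); returns the (u, v) flow field."""
    # Spatio-temporal derivatives (standard HS stencils).
    kx = np.array([[-1, 1], [-1, 1]]) * 0.25
    ky = np.array([[-1, -1], [1, 1]]) * 0.25
    kt = np.ones((2, 2)) * 0.25
    Ix = convolve(im1, kx) + convolve(im2, kx)
    Iy = convolve(im1, ky) + convolve(im2, ky)
    It = convolve(im2, kt) - convolve(im1, kt)

    avg = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]]) / 12.0
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_bar = convolve(u, avg)
        v_bar = convolve(v, avg)
        # Update from the HS variational equations with smoothness alpha.
        num = Ix * u_bar + Iy * v_bar + It
        den = alpha**2 + Ix**2 + Iy**2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v

# Usage with two consecutive Schlieren frames loaded as float arrays:
# u, v = horn_schunck(frame1, frame2)
```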

Keywords: Schlieren, optical flow, supersonic jets, shock shear layer

Procedia PDF Downloads 311
15089 Software Evolution Based Activity Diagrams

Authors: Zine-Eddine Bouras, Abdelouaheb Talai

Abstract:

During the last two decades, the software evolution community has intensively tackled the software merging issue, whose main objective is to merge different versions of software in a consistent way in order to obtain a new version. Well-established approaches, mainly based on dependence analysis techniques, have been used to provide suitable solutions. These approaches operate on source code or software architectures; however, such solutions are expensive due to the complexity and size of these artifacts. In this paper, we overcome this problem by operating at a higher level of abstraction. The objective of this paper is to investigate software merging at the level of UML activity diagrams, which is a new and interesting issue: its purpose is to merge activity diagrams instead of source code. The proposed approach, based on dependence analysis techniques, is illustrated through an appropriate case study.

Keywords: activity diagram, activity diagram slicing, dependency analysis, software merging

Procedia PDF Downloads 323
15088 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates on one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing and noise data. When we apply this data mining tool in real applications, running speed is important; the software therefore employs table look-up techniques to achieve reasonable running speed, as confirmed by performance testing. We also added several advanced features for application in an industrial chip design.
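
An illustrative sketch of the extract-and-represent idea (the report format, regular expression, and slack threshold are invented for the example; real static timing analysis reports differ by tool):

```python
import re

# Hypothetical timing-report lines; real formats vary by EDA tool, so the
# regex below is an assumption for illustration only.
report = """\
path: clk_a -> u_core/reg_12/D  slack: -0.042 ns
path: clk_a -> u_io/reg_3/D    slack:  0.115 ns
path: clk_b -> u_core/reg_77/D slack: -0.008 ns
"""

row_re = re.compile(r"path:\s+(\S+)\s+->\s+(\S+)\s+slack:\s+(-?\d+\.\d+)")
rows = [(m[1], m[2], float(m[3])) for m in row_re.finditer(report)]

# Table look-up: index violating endpoints by clock for fast repeated queries.
by_clock = {}
for clk, endpoint, slack in rows:
    if slack < 0:
        by_clock.setdefault(clk, []).append((endpoint, slack))

# Emit the critical paths as an HTML table for the design review page.
cells = "".join(f"<tr><td>{c}</td><td>{e}</td><td>{s:.3f}</td></tr>"
                for c, v in sorted(by_clock.items()) for e, s in v)
html = ("<table><tr><th>clock</th><th>endpoint</th><th>slack (ns)</th></tr>"
        + cells + "</table>")
print(html)
```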

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 251
15087 Trace Network: A Probabilistic Relevant Pattern Recognition Approach to Attribution Trace Analysis

Authors: Jian Xu, Xiaochun Yun, Yongzheng Zhang, Yafei Sang, Zhenyu Cheng

Abstract:

Network attack prevention is a critical research area of information security. Network attacks would be deterred if attribution techniques were capable of tracing back to the attackers after a hacking event. Attributing an attack to a particular identity therefore becomes one of the important tasks when analysts attempt to differentiate and profile the attacker behind a piece of attack trace. To assist analysts in exposing attackers behind the scenes, this paper investigates the connections between attribution traces and proposes probabilistic relevance based attribution patterns. This method facilitates the evaluation of the plausibility of relevance between different traceable identities. Furthermore, by analyzing the connections among traces, it can confirm the probability that a certain organization exists, as well as discover its affiliated partners, by drawing a relevance matrix from the attribution traces.

Keywords: attribution trace, probabilistic relevance, network attack, attacker identification

Procedia PDF Downloads 361
15086 Dynamic Ambulance Deployment to Reduce Ambulance Response Times Using Geographic Information Systems

Authors: Masoud Swalehe, Semra Günay

Abstract:

Developed countries are losing many lives to non-communicable diseases compared to their developing counterparts. The effects of these diseases are mostly sudden, manifesting a very short time before death or a dangerous attack, and this has consolidated the significance of emergency medical services (EMS) as one of the vital areas of healthcare service delivery. The primary objective of this research is to reduce the ambulance response times (RT) of the Eskişehir province EMS, since a number of studies have established a relationship between ambulance response times and the survival chances of patients, especially out-of-hospital cardiac arrest (OHCA) victims. It has been found that patients who receive out-of-hospital medical attention within a few (4) minutes of cardiac arrest, thanks to low ambulance response times, stand a higher chance of survival than those who wait longer (more than 12 minutes) for out-of-hospital medical care because of higher ambulance response times. The study makes use of geographic information system (GIS) technology to dynamically reallocate ambulance resources according to demand and time so as to reduce ambulance response times. The geospatial-time distribution of ambulance calls (demand) is used as the basis for optimal ambulance deployment, using a system status management (SSM) strategy to achieve greater demand coverage with the same number of ambulances and thereby reduce response times. Drive-time polygons are used to derive time-specific facility coverage areas and to suggest additional candidate facility sites to which ambulance resources can be moved to serve higher demand, making use of network analysis techniques. Emergency ambulance call data from 1st January 2014 to 31st December 2014, obtained from the Eskişehir province health directorate, are used in this study. The study focuses on the reduction of ambulance response times, a key emergency medical services performance indicator.

Keywords: emergency medical services, system status management, ambulance response times, geographic information system, geospatial-time distribution, out of hospital cardiac arrest

Procedia PDF Downloads 297
15085 The Impact of Strategic Information in Developing the Target Cost Approach to Achieve Competitive Advantage

Authors: Rizgar Abdullah Sabir Jaf, Bayan Sedeeq Azeez Hussin, Dler Moosa Ahmed Karim

Abstract:

Presently, economic and technological developments are advancing at an unparalleled pace. The result is innovation that is changing a great deal of assumptions, concepts, and transactions, and intensifying competition between companies all over the world. The subject of this thesis is one that attracts great concern in the financial and business world at present, because many competitive firms have appeared in regional and global markets amid rapid changes covering all fields of life. The subject is of special importance to business success in general and to industrial firms especially. Thus, the basic purpose of this study is to determine whether target costing is used in the costing application process for customer expectations, profit margins, cost and price determination, cost reduction, and management operations. In today's intensely competitive and highly volatile business environment, the consistent development of low-cost, high-quality products meeting functional requirements is key to a company's survival. Companies continuously strive to reduce costs while still producing quality products to stay ahead of the competition, and many have turned to target costing to achieve this objective. The results indicate that there is a significant positive relationship (at a significance level of less than 0.05) between the factors of competitive advantage and management accounting techniques in the sample firms studied.

Keywords: strategic information, target cost, competitive advantage, Iraqi soft drink firms

Procedia PDF Downloads 301
15084 Convolutional Neural Networks versus Radiomic Analysis for Classification of Breast Mammogram

Authors: Mehwish Asghar

Abstract:

Breast cancer (BC) is a common type of cancer among women. Screening is usually performed using different imaging modalities such as magnetic resonance imaging, mammography, X-ray, and CT. Among these modalities, the mammogram is considered a powerful tool for the diagnosis and screening of breast cancer. Sophisticated machine learning approaches have shown promising results in complementing human diagnosis. Generally, machine learning methods can be divided into two major classes: radiomics analysis (RA), in which image features are extracted manually, and convolutional neural networks (CNN), in which the computer learns to recognize image features on its own. This research aims to improve the rate of early detection, thus reducing the mortality caused by breast cancer, through the latest advancements in computer science in general and machine learning in particular. It also aims to ease the burden on doctors by improving and automating the process of breast cancer detection. The work is a comparative analysis of different techniques for implementing models that detect and classify breast cancer, with the main goal of providing a detailed view of the results and performance of the different techniques. The purpose of this paper is to explore the potential of a convolutional neural network (CNN) both as a feature extractor and as a classifier. A radiomics module is also added so that its results can be compared with those of the deep learning techniques.
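
A minimal sketch of the CNN-as-feature-extractor-plus-classifier idea (a tiny PyTorch model on hypothetical 64x64 grayscale patches; the paper's architecture and input pipeline are not specified in the abstract):

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Tiny CNN for binary mammogram patch classification; a sketch of the
    feature-extractor-plus-classifier split, not the paper's model."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(            # learned feature extractor
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(          # classification head
            nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, x):                         # x: (N, 1, 64, 64)
        return self.classifier(self.features(x))

model = SmallCNN()
dummy = torch.randn(4, 1, 64, 64)                 # stand-in patches
print(model(dummy).shape)                         # torch.Size([4, 2])
```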

Keywords: breast cancer (BC), machine learning (ML), convolutional neural network (CNN), radiomics, magnetic resonance imaging, artificial intelligence

Procedia PDF Downloads 220
15083 Non-Invasive Techniques for Management of Carious Primary Dentition Using Silver Diamine Fluoride and Moringa Extract as a Modification of the Hall Technique

Authors: Rasha F. Sharaf

Abstract:

The treatment of dental caries in young children is a great challenge for all dentists, especially with uncooperative children. Recently, non-invasive techniques have been highlighted, as they alleviate the need for local anesthesia and other painful procedures during the management of carious teeth and, at the same time, increase the success rate of the treatment. Silver diamine fluoride (SDF) is one of the most effective cariostatic materials: it arrests the progression of carious lesions and aids in remineralizing demineralized tooth structure. Both fluoride and silver ions have proved to have an antibacterial action and to aid in the precipitation of an insoluble layer that prevents further decay. Moringa extract has likewise proved to have an effective antibacterial action against different types of bacteria; it can therefore also be used as a non-invasive technique for the management of caries in children. One important approach to caries control is to deprive the cariogenic bacteria of nutrients, causing their starvation and death; this can be achieved by applying a stainless steel crown to primary molars whose carious lesions do not involve the pulp, a technique known as the Hall technique. The success rate of the Hall technique can be increased by first arresting the carious lesion with either SDF or Moringa and gaining the benefit of their antibacterial action. Multiple clinical cases with 1-year follow-up will be presented, comparing different treatment options and the various materials and techniques for non-invasive, pain-free management of carious primary teeth.

Keywords: SDF, hall technique, carious primary teeth, moringa extract

Procedia PDF Downloads 92
15082 Study of Education Learning Techniques and Game Genres

Authors: Khadija Al Farei, Prakash Kumar, Vikas Rao Naidu

Abstract:

Games of different genres have been developed for different age groups for many decades. In many places, educational games play a vital role in creating an active classroom environment and better learning among students. Educational games have now assumed an important place in the lives of children and teenagers. Their role is important for improving learning capability among students, especially those of the current generation, who live surrounded by electronic gadgets. Hence, it is now important to make sure that our educational system keeps up with such advancements in technology. Much research is already underway in this area of edutainment. This paper reviews around ten research papers to find the relation between educational learning techniques and games. The result of this review provides guidelines for enhanced teaching and learning solutions in education. In-house developed educational games proved to be more effective compared with those readily available in the market.

Keywords: education, education game, educational technology, edutainment, game genres, gaming in education

Procedia PDF Downloads 408
15081 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation

Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi

Abstract:

Medical image analysis is one of the major applications of computer image processing. It involves several processes, of which segmentation is one of the most challenging and important steps. In this paper, a segmentation method is proposed for dental radiograph images. A thresholding method is applied to simplify the images, and a morphological opening of the binary image is performed to eliminate unnecessary regions. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph images. Segmentation is then performed by applying the level set method to each extracted image. Experimental results with 90% accuracy demonstrate that the proposed method achieves high accuracy and promising results.
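
A minimal sketch of the projection-based tooth separation stage (the gap threshold and minimum span width are illustrative; the preceding thresholding and subsequent level set stages are not reproduced):

```python
import numpy as np
from scipy.ndimage import binary_opening

def split_teeth(binary, min_width=3):
    """Separate individual teeth from a thresholded radiograph using a
    vertical integral projection: columns whose foreground count drops to
    (near) zero are gaps between teeth."""
    cleaned = binary_opening(binary, np.ones((3, 3)))  # remove small specks
    profile = cleaned.sum(axis=0)                      # vertical projection
    gap = profile <= profile.max() * 0.05
    # Runs of non-gap columns are individual tooth spans.
    spans, start = [], None
    for x, is_gap in enumerate(gap):
        if not is_gap and start is None:
            start = x
        elif is_gap and start is not None:
            if x - start >= min_width:
                spans.append((start, x))
            start = None
    if start is not None:
        spans.append((start, len(gap)))
    return [cleaned[:, a:b] for a, b in spans]

# Usage with a hypothetical thresholded radiograph `bw` (bool array):
# teeth = split_teeth(bw)  # each element then goes to the level set stage
```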

Keywords: integral projection, level set method, morphological operation, segmentation

Procedia PDF Downloads 312
15080 Parametric Template-Based 3D Reconstruction of the Human Body

Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo, Linhang Zhu

Abstract:

This study proposes a 3D human body reconstruction method that integrates multi-view joint information into a single set of joints and processes it with a parametric human body template. First, we obtain human body image information captured from multiple perspectives; the multi-view information avoids self-occlusion and occlusion problems during the reconstruction process. Then, we use the MvP algorithm to integrate the multi-view joint information into a set of joints. Next, we use the parametric human body template SMPL-X to obtain more accurate three-dimensional human body reconstruction results. Compared with traditional single-view parametric template reconstruction, this method significantly improves the accuracy and stability of the reconstruction.

Keywords: parametric human body templates, reconstruction of the human body, multi-view, joint

Procedia PDF Downloads 73
15079 Modeling Route Selection Using Real-Time Information and GPS Data

Authors: William Albeiro Alvarez, Gloria Patricia Jaramillo, Ivan Reinaldo Sarmiento

Abstract:

Understanding the behavior of individuals and the different human factors that influence their choices when faced with a complex system such as transportation is one of the most complicated aspects to measure among the components that constitute route choice modeling, because various behaviors and driving modes directly or indirectly affect the choice. During the last two decades, with the development of information and communications technologies, new data collection techniques have emerged, such as GPS, geolocation with mobile phones, apps for choosing the route between origin and destination, and individual service transport applications, among others. This has generated interest in improving discrete choice models by incorporating these developments, as well as the psychological factors that affect decision making. This paper proposes and estimates a hybrid discrete choice model that integrates route choice models and latent variables, based on observations of the routes of a sample of public taxi drivers from the city of Medellín, Colombia, in relation to their behavior, personality, socioeconomic characteristics, and driving mode. The set of choice options includes the routes generated by the individual service transport applications versus the driver's own choice. The hybrid model consists of measurement equations that relate latent variables to measurement indicators and utilities to choice indicators, along with structural equations that link the observable characteristics of drivers to the latent variables and the explanatory variables to the utilities.

Keywords: behavior choice model, human factors, hybrid model, real time data

Procedia PDF Downloads 146
15078 Performance Evaluation of One and Two Dimensional Prime Codes for Optical Code Division Multiple Access Systems

Authors: Gurjit Kaur, Neena Gupta

Abstract:

In this paper, we analyze and compare the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e. time slots, whereas 2D coding techniques are unique not only in their time slots but also in their wavelengths. In this research, we evaluate and compare the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an Optical Code Division Multiple Access (OCDMA) system on a single platform. Analysis shows that 2D prime codes support a smaller number of active users than 1D codes, but they have a larger code family and are the most secure of the codes compared. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10⁻⁹.
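
For readers unfamiliar with the construction, here is a minimal sketch of the 1D prime sequence code family (the standard construction; the 2D wavelength/time extension compared in the paper is not shown):

```python
import numpy as np

def prime_codes(p=5):
    """Generate the 1D prime sequence code family for a prime p: p codewords
    of length p**2 and weight p, with chip j*p + (i*j mod p) set for each j."""
    codes = np.zeros((p, p * p), dtype=int)
    for i in range(p):
        for j in range(p):
            codes[i, j * p + (i * j) % p] = 1
    return codes

C = prime_codes(5)
print(C.shape)  # (5, 25): 5 users, code length 25

# Periodic cross-correlation between two codewords; prime codes keep this
# low (at most 2), which limits multiple-access interference.
def max_cross(a, b):
    return max(int(np.dot(a, np.roll(b, s))) for s in range(a.size))

print(max_cross(C[1], C[2]))
```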

Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa

Procedia PDF Downloads 334