Search results for: complexity measurement
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4138

3898 Improved Mutual Inductance of Rogowski Coil Using Hexagonal Core

Authors: S. Al-Sowayan

Abstract:

Rogowski coils are increasingly used for the measurement of AC and transient electric currents. Most Rogowski coils in use today have circular or rectangular cores. To increase measurement sensitivity and permit smooth wire winding, this paper studies the effect of core geometry on mutual inductance: it presents the calculation and simulation of a Rogowski coil with an equilateral hexagonal core and compares the resulting mutual inductance with that of the commonly used core shapes.
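
To make the comparison concrete: for a toroidal core around a straight conductor, the mutual inductance is M = (μ0·N/2π)·∫∫(1/r)dA over the core cross-section. The sketch below integrates this numerically for a circular and a hexagonal cross-section of the same circumradius; the dimensions and turn count are illustrative assumptions, not the paper's values.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def mutual_inductance(inside, r_lo, r_hi, z_half, turns=1000, n=1000):
    """M = (mu0 * turns / 2pi) * integral of dA / r over the cross-section,
    for a toroidal core around a straight conductor on the axis r = 0.
    `inside(r, z)` marks points belonging to the cross-section."""
    r = np.linspace(r_lo, r_hi, n)
    z = np.linspace(-z_half, z_half, n)
    R, Z = np.meshgrid(r, z)
    dA = (r[1] - r[0]) * (z[1] - z[0])
    return MU0 * turns / (2 * np.pi) * np.sum(inside(R, Z) / R) * dA

rc, s = 0.030, 0.005  # assumed mean coil radius and cross-section circumradius (m)

def circle(R, Z):      # circular cross-section of radius s
    return (R - rc) ** 2 + Z ** 2 <= s ** 2

def hexagon(R, Z):     # equilateral hexagon with two vertices along the z direction
    x, y = np.abs(R - rc), np.abs(Z)
    return (x <= s * np.sqrt(3) / 2) & (y <= s - x / np.sqrt(3))

for name, shape in [("circular", circle), ("hexagonal", hexagon)]:
    M = mutual_inductance(shape, rc - s, rc + s, s)
    print(f"{name:9s} core: M = {M * 1e6:.3f} uH")
```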

Keywords: Rogowski coil, mutual inductance, magnetic flux density, communication engineering

Procedia PDF Downloads 339
3897 Readiness of Intellectual Capital Measurement: A Review of the Property Development and Investment Industry

Authors: Edward C. W. Chan, Benny C. F. Cheung

Abstract:

In the knowledge economy, financial indicators are not the only instruments to gauge the performance of a company. The contribution of intellectual capital to company performance is increasing. To measure the company performance attributable to intellectual capital, the value-added intellectual coefficient (VAIC) model is adopted to measure the intellectual capital utilisation efficiency of the subject companies. The purpose of this study is to review the readiness of Hong Kong listed companies in the property development and property investment industry for intellectual capital measurement using the VAIC model. The study covers the financial reports of representative Hong Kong listed property development and property investment companies for the period 2014-2019. The findings indicate that the industry is ready for IC measurement employing the VAIC framework but not yet ready for the extended VAIC model.
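
As background, Pulic's VAIC is the sum of three efficiency ratios computed from items in the financial report. A minimal sketch of the base (non-extended) model follows; the input figures are hypothetical.

```python
def vaic(value_added, human_capital, capital_employed):
    """Pulic's Value Added Intellectual Coefficient (VAIC).

    value_added:      VA = operating profit + employee costs + depreciation/amortisation
    human_capital:    HC = total salaries and wages
    capital_employed: CE = book value of net assets
    """
    hce = value_added / human_capital                   # human capital efficiency
    sce = (value_added - human_capital) / value_added   # structural capital efficiency
    cee = value_added / capital_employed                # capital employed efficiency
    return hce + sce + cee

# Illustrative figures (HK$ millions, hypothetical):
print(vaic(value_added=1200.0, human_capital=400.0, capital_employed=5600.0))
```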

Keywords: intellectual capital, intellectual capital measurement, property development, property investment, Skandia navigator, VAIC

Procedia PDF Downloads 85
3896 Modified RSA in Mobile Communication

Authors: Nagaratna Rajur, J. D. Mallapur, Y. B. Kirankumar

Abstract:

Security in mobile communication is very different from security on the internet or in telecommunication because of the poor user interface and limited processing capacity of mobile devices, combined with complex network protocols. This poses a challenge: the security system must use little memory and run at low computational cost. Security involves all the activities undertaken to protect the value and ongoing usability of assets and the integrity and continuity of operations. An effective network security strategy requires identifying threats and then choosing the most effective set of tools to combat them. Cryptography is a simple and efficient way to provide security in communication. RSA is an asymmetric-key approach that is highly reliable and widely used in internet communication. However, it has not been efficiently implemented in mobile communication due to its computational complexity and large memory utilization. The proposed algorithm modifies the current RSA to be useful in mobile communication by reducing its computational complexity and memory utilization.
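
The abstract does not specify the modification itself, so as a baseline here is a minimal textbook RSA sketch (toy key size, no padding) that exposes where the cost lies: primality testing during key generation and modular exponentiation during encryption and decryption, the operations a mobile-friendly M-RSA must lighten.

```python
import random
from math import gcd

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d, r = d // 2, r + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits):
    while True:
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(n):
            return n

def keygen(bits=512, e=65537):
    """Toy key size for the demo; real deployments use >= 2048 bits."""
    while True:
        p, q = random_prime(bits // 2), random_prime(bits // 2)
        phi = (p - 1) * (q - 1)
        if p != q and gcd(e, phi) == 1:
            return (p * q, e), (p * q, pow(e, -1, phi))  # public, private

(n, e), (_, d) = keygen()
m = 42                       # message encoded as an integer < n
c = pow(m, e, n)             # encrypt: one modular exponentiation
assert pow(c, d, n) == m     # decrypt: the expensive private-key operation
```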

Keywords: M-RSA, sensor networks, sensor applications, security

Procedia PDF Downloads 317
3895 Challenges and Insights by Electrical Characterization of Large Area Graphene Layers

Authors: Marcus Klein, Martina GrießBach, Richard Kupke

Abstract:

The current advances in the research and manufacturing of large area graphene layers are promising for the introduction of this exciting material into the display industry and other applications that benefit from its excellent electrical and optical characteristics. New production technologies in the fabrication of flexible displays, touch screens, or printed electronics apply graphene layers on non-metal substrates and bring new challenges to the required metrology. Traditional measurement concepts for layer thickness, sheet resistance, and layer uniformity are difficult to apply to graphene production processes and are often harmful to the product layer. New non-contact sensor concepts are required to meet these challenges and the foreseeable inline production of large area graphene. Dedicated non-contact measurement sensors are a pioneering method to address these issues in a large variety of applications while significantly lowering the costs of development and process setup. Transferred and printed graphene layers can be characterized with high accuracy over a wide measurement range at very high resolution. Large area graphene mappings are applied for process optimization and for efficient quality control of transfer, doping, annealing, and stacking processes. Examples of doped, defective, and excellent graphene are presented as quality images, and the implications for manufacturers are explained.

Keywords: graphene, doping and defect testing, non-contact sheet resistance measurement, inline metrology

Procedia PDF Downloads 278
3894 Examining the Modular End of Line Control Unit Design Criteria for Vehicle Sliding Door System Slide Profile

Authors: Orhan Kurtuluş, Cüneyt Yavuz

Abstract:

End-of-line control of finished products is important in the automotive industry. Manual inspection of sliding door tracks is not sufficient: faulty products go undetected and reach the customer. In the scope of this study, the design criteria of a PLC-integrated modular end-of-line control unit have been examined, and the unit has been designed and manufactured to inspect 10 different track profiles for 2 different vehicles, with the objective of minimizing salvage costs by obtaining more sensitive, reliable, and accurate measurement results. The study started with a literature and patent review; the design inputs were then specified, the technical concept was developed, and computer-aided mechanical design, control system and automation design, design review, and design improvement were carried out. High-sensitivity analog laser sensors, probes, and modular blocks have been used in the unit. Measurements conducted with the system show that the results are more sensitive than those of the previous methods.

Keywords: control unit design, end of line, modular design, sliding door system

Procedia PDF Downloads 406
3893 Five Pitfalls in Defining a Health System and Implications for Research and Management

Authors: Macdonald Kanyangale, Sandram Naluso

Abstract:

Globally, researchers have struggled over time to adequately define the notion of a health system to inform research. This study is significant because it proposes an integrative framework for a robust definition of the health system. The objective of this article is to examine major pitfalls in the definitions of a health system used in prior literature and their implications for research and management. The study used the methodological steps of a scoping review proposed by Arksey and O'Malley to identify and examine 24 definitions of a health system in articles selected from six databases and web search engines. Thematic analysis was used to delineate and categorise the definitional pitfalls into broader themes. Five major pitfalls pervade the extant definitions of a health system and may easily scupper any unsuspecting researcher if not avoided or addressed in research. These definitional pitfalls are: reductionist assumptions which ignore dynamic and complex connections; an overly wide boundary and a lack of specification of levels in a health system; a limited focus on process in a health system; the tendency to treat different components of the health system as equal; and the simplification of the ontological complexity of the health system. Future scholars are advised to avoid or address these five major pitfalls if they are to develop robust definitions of a health system. The use of an integrative framework for a robust definition of a health system is recommended, while the implications of the pitfalls are discussed as a basis and catalyst for complexity-informed research and interactive management.

Keywords: complexity management, health system, pitfalls, reductionism, research

Procedia PDF Downloads 105
3892 Creation and Validation of a Measurement Scale of E-Management: An Exploratory and Confirmatory Study

Authors: Hamadi Khlif

Abstract:

This paper deals with the understanding of the concept of e-management and the development of a measuring instrument adapted to the new problems encountered in applying this practice within the modern enterprise. Two principal e-management factors were isolated in an exploratory study carried out among 260 participants. A confirmatory study, applied to a second sample of 270 participants, was then conducted to cross-validate the measurement scale. The paper presents the literature review specifically dedicated to e-management and the results of the exploratory and confirmatory phases of the development of this scale, which demonstrates satisfactory psychometric qualities. E-management has two dimensions: a managerial dimension and a technological dimension.
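
A minimal sketch of the exploratory step, assuming the factor_analyzer package and a two-factor varimax solution on hypothetical questionnaire data (n = 260, 12 items); the factor labels echo the paper's managerial/technological dimensions, but the data and item names are invented.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

# Hypothetical respondents x items matrix (n = 260, 12 Likert items)
items = pd.DataFrame(np.random.default_rng(3).normal(size=(260, 12)),
                     columns=[f"q{i}" for i in range(1, 13)])

kmo_per_item, kmo_total = calculate_kmo(items)   # sampling adequacy check
print("KMO:", round(kmo_total, 2))

fa = FactorAnalyzer(n_factors=2, rotation="varimax")  # two expected factors
fa.fit(items)
print(pd.DataFrame(fa.loadings_, index=items.columns,
                   columns=["managerial", "technological"]))
```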

Keywords: e-management, management, ICT deployment, mode of management

Procedia PDF Downloads 288
3891 Study of Variation of Winds Behavior on Micro Urban Environment with Use of Fuzzy Logic for Wind Power Generation: Case Study in the Cities of Arraial do Cabo and São Pedro da Aldeia, State of Rio de Janeiro, Brazil

Authors: Roberto Rosenhaim, Marcos Antonio Crus Moreira, Robson da Cunha, Gerson Gomes Cunha

Abstract:

This work provides details on wind speed behavior within the cities of Arraial do Cabo and São Pedro da Aldeia, located in the Lakes Region of the State of Rio de Janeiro, Brazil. This region has one of the best potentials for wind power generation. In the interurban layer, wind conditions are very complex and depend on the physical geography, the size and orientation of surrounding buildings and constructions, population density, and land use. In the same context, the fundamental surface parameter that governs the production of flow turbulence in urban canyons is the surface roughness. Such factors can influence the potential for power generation from the wind within the cities. Moreover, the use of wind on a small scale is not fully exploited due to the complexity of measuring wind flow inside the cities, which makes this type of resource difficult to predict accurately. This study demonstrates how fuzzy logic can facilitate the assessment of the complexity of the wind potential inside the cities. It presents a decision support tool and its ability to deal with inaccurate information using linguistic variables created by the heuristic method. It relies on previously published studies about the variables that influence wind speed in the urban environment. These variables were turned into the verbal expressions used in the computer system, which facilitated the establishment of rules for fuzzy inference and integration with a smartphone application used in the research. The first part of the study describes the challenges of sustainable development and the incentive policies for the use of renewable energy in Brazil. The next chapter presents the characteristics of the study area and the concepts of fuzzy logic. Data were collected in a field experiment using qualitative and quantitative assessment methods. As a result, a map of various points within the studied cities is presented, with their wind viability evaluated by a decision support system using multivariate classification based on fuzzy logic.
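
A minimal sketch of such a fuzzy inference system using scikit-fuzzy; the linguistic variables, membership functions, and rules below are illustrative assumptions, not the study's calibrated rule base.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Linguistic variables (universes and terms are assumptions)
wind = ctrl.Antecedent(np.arange(0, 15.1, 0.1), "mean_wind_speed")      # m/s
rough = ctrl.Antecedent(np.arange(0, 2.01, 0.01), "surface_roughness")  # m
viability = ctrl.Consequent(np.arange(0, 101, 1), "wind_viability")     # %

wind.automf(3, names=["low", "medium", "high"])
rough.automf(3, names=["low", "medium", "high"])
viability["poor"] = fuzz.trimf(viability.universe, [0, 0, 50])
viability["fair"] = fuzz.trimf(viability.universe, [25, 50, 75])
viability["good"] = fuzz.trimf(viability.universe, [50, 100, 100])

rules = [
    ctrl.Rule(wind["high"] & rough["low"], viability["good"]),
    ctrl.Rule(wind["medium"] & rough["medium"], viability["fair"]),
    ctrl.Rule(wind["low"] | rough["high"], viability["poor"]),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["mean_wind_speed"] = 7.5    # hypothetical measurement point
sim.input["surface_roughness"] = 0.3
sim.compute()
print(sim.output["wind_viability"])   # defuzzified viability score
```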

Keywords: behavior of winds, wind power, fuzzy logic, sustainable development

Procedia PDF Downloads 256
3890 Difficulty and Complexity in Dealing with Visual Pollution in the Historical Cities: The Historical City of Ibb-Yemen as a Case Study

Authors: Abdulfattah A. Q. Alwah, Wen Li, Mohammed A. Q. Alwah, Duc Thien Tran, Bing Xi Liu

Abstract:

The historical cities of the third world suffer from many environmental problems; one of them is the spread of visual pollution manifestations. These phenomena increase with low levels of public awareness and low per capita income. The historical city of Ibb suffers from a variety of visual pollution of the urban environment, so it has been chosen as a case study. This study aims to identify the difficulty and complexity of dealing with visual pollution manifestations in the historical city of Ibb and to provide appropriate solutions that suit the complex and contradictory circumstances. The study relies on an inductive approach to achieve its aims through two methods: the first is a visual survey of the visual pollution phenomenon based on images and researcher notes; the second is an analysis of the opinions and impressions of the city's residents and visitors gathered through interviews, in addition to interviews with officials of the competent authorities and some specialists in the field of the urban environment. Based on the results of the field study and a discussion of the interview results, this study presents an analysis of the phenomenon of visual distortion of the historical city of Ibb in terms of its manifestations and causes. Furthermore, the study provides appropriate solutions suited to the complex and contradictory circumstances. These solutions take two paths: the first is to stop the spread of visual distortions, and the second is to address the existing visual pollution.

Keywords: visual pollution, visual image, urban environment, difficulty, complexity, historical cities, the historical city of Ibb

Procedia PDF Downloads 114
3889 Efficiency Measurement of Turkish Universities via the Stochastic Frontier Model

Authors: Yeliz Mert Kantar, İsmail Yenilmez, Ibrahim Arik

Abstract:

In this study, an efficiency measurement of the top fifty Turkish universities has been conducted. The top fifty Turkish universities are listed every year by the Scientific and Technological Research Council of Turkey (TÜBİTAK) according to the Entrepreneur and Innovative University Index. Since 2018, the index has been calculated from four components: scientific and technological research competency, intellectual property pool, cooperation and interaction, and economic and social contribution. These four components comprise twenty-three sub-components. The 2021 list, announced in January 2022, is discussed in this study. The efficiency analysis has been carried out using the Stochastic Frontier Model. The statistical significance of the sub-components, which enter the index with certain weights, has been examined in terms of the efficiency measurement calculated through the Stochastic Frontier Model. The relationship between the efficiency ranking estimated by the Stochastic Frontier Model and the Entrepreneur and Innovative University Index ranking is discussed in detail.
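
A minimal sketch of a normal/half-normal stochastic production frontier estimated by maximum likelihood (the classic Aigner-Lovell-Schmidt form), run on synthetic data standing in for the sub-component scores; all data-generating values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(params, X, y):
    """Normal/half-normal stochastic frontier: y = X @ beta + v - u,
    v ~ N(0, sigma_v^2), u ~ |N(0, sigma_u^2)|; sigma^2 = sigma_v^2 + sigma_u^2,
    lambda = sigma_u / sigma_v (parameterized in logs for positivity)."""
    k = X.shape[1]
    beta, sigma, lam = params[:k], np.exp(params[k]), np.exp(params[k + 1])
    eps = y - X @ beta
    ll = (np.log(2 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

# Synthetic stand-in data: 50 units, intercept plus two inputs
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = (X @ np.array([1.0, 0.5, 0.3])
     + rng.normal(0, 0.2, n) - np.abs(rng.normal(0, 0.4, n)))

res = minimize(neg_loglik, x0=np.zeros(X.shape[1] + 2), args=(X, y), method="BFGS")
print("frontier coefficients:", res.x[:3])
```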

Keywords: efficiency, entrepreneur and innovative universities, Turkish universities, stochastic frontier model, TÜBİTAK

Procedia PDF Downloads 63
3888 Analysis of Tandem Detonator Algorithm Optimized by Quantum Algorithm

Authors: Tomasz Robert Kuczerski

Abstract:

The high complexity of the algorithm of the autonomous tandem detonator system creates an optimization problem due to the parallel operation of several machine states of the system. Many years of experience and classic analyses have led to a partially optimized model. Limitations on the energy resources of this class of autonomous systems make it necessary to search for more effective optimization methods. The use of the Quantum Approximate Optimization Algorithm (QAOA) in these studies shows the most promising results. With the help of multiple evaluations of a several-qubit quantum circuit, proper optimization of the variable parameters was obtained. In addition, it was observed that increasing the number of evaluations does not yield further efficiency gains, due to the increasing complexity of the variables being optimized. The tests confirmed the effectiveness of the QAOA optimization method.
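
For intuition, here is a self-contained statevector simulation of depth-1 QAOA on a toy MaxCut instance; the tandem detonator cost function is not public, so MaxCut stands in, and the graph and grid search are assumptions.

```python
import numpy as np
from itertools import product

# Tiny MaxCut instance on 4 nodes
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

# Diagonal of the cost Hamiltonian: cut size of each computational basis state
bits = np.array(list(product([0, 1], repeat=n)))
cost = np.array([sum(b[i] != b[j] for i, j in edges) for b in bits], float)

X = np.array([[0, 1], [1, 0]], complex)
I2 = np.eye(2, dtype=complex)

def rx_all(state, beta):
    """Apply exp(-i*beta*X) to every qubit (the transverse-field mixer)."""
    rx = np.cos(beta) * I2 - 1j * np.sin(beta) * X
    for q in range(n):
        t = state.reshape(2 ** q, 2, 2 ** (n - q - 1))
        state = np.einsum("ab,ibj->iaj", rx, t).reshape(-1)
    return state

def qaoa_expectation(params, p=1):
    gammas, betas = params[:p], params[p:]
    psi = np.full(2 ** n, 2 ** (-n / 2), complex)   # uniform |+...+> state
    for g, b in zip(gammas, betas):
        psi = np.exp(-1j * g * cost) * psi          # cost unitary (diagonal)
        psi = rx_all(psi, b)                        # mixer unitary
    return float(np.real(np.vdot(psi, cost * psi)))

# Coarse grid search over (gamma, beta) for p = 1; a real run would use an optimizer
grid = np.linspace(0, np.pi, 40)
best = max(((g, b) for g in grid for b in grid),
           key=lambda gb: qaoa_expectation(np.array(gb)))
print(best, qaoa_expectation(np.array(best)))
```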

Keywords: algorithm analysis, autonomous system, quantum optimization, tandem detonator

Procedia PDF Downloads 59
3887 Blood Glucose Measurement and Analysis: Methodology

Authors: I. M. Abd Rahim, H. Abdul Rahim, R. Ghazali

Abstract:

Numerous non-invasive blood glucose measurement techniques have been developed by researchers, and near infrared (NIR) spectroscopy is among the most promising nowadays. However, there is some disagreement on the optimal wavelength range to use as the reference for the glucose substance in the blood. This paper focuses on the experimental data collection technique and on the method used to analyze the data gained from the experiment. The selection of a suitable linear or non-linear model structure is essential in a prediction system, as the system developed needs to be conceivably accurate.
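
A minimal sketch of the linear modelling step using partial least squares, a common choice for NIR calibration; the spectra below are random placeholders, so the printed R² is meaningless until real NIR data and reference glucose values are substituted.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical data: NIR absorbance spectra (samples x wavelengths) and
# reference blood glucose values (mg/dL) from an invasive method.
rng = np.random.default_rng(4)
spectra = rng.normal(size=(120, 400))
glucose = rng.uniform(70, 180, size=120)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, glucose, test_size=0.25,
                                          random_state=0)
model = PLSRegression(n_components=8)   # linear latent-variable model
model.fit(X_tr, y_tr)
pred = model.predict(X_te).ravel()
print("R^2:", r2_score(y_te, pred))     # near zero here: placeholder data
```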

Keywords: linear, near-infrared (NIR), non-invasive, non-linear, prediction system

Procedia PDF Downloads 428
3886 Improved Impossible Differential Cryptanalysis of Midori64

Authors: Zhan Chen, Wenquan Bi, Xiaoyun Wang

Abstract:

The Midori family of lightweight block ciphers was proposed at ASIACRYPT 2015 and has attracted the attention of numerous cryptanalysts. There are two versions of Midori: Midori64, which has a 64-bit block size, and Midori128, whose block size is 128 bits. In this paper, an improved 10-round impossible differential attack on Midori64 is proposed. Pre-whitening keys are considered in this attack. A better impossible differential path is used to reduce the time complexity by decreasing the number of key bits guessed. A hash table is built in the pre-computation phase to reduce the computational complexity. A partial abort technique is used in the key sieving phase. The attack requires 2^59 chosen plaintexts, 2^14.58 blocks of memory, and 2^68.83 10-round Midori64 encryptions.

Keywords: cryptanalysis, impossible differential, lightweight block cipher, Midori

Procedia PDF Downloads 325
3885 Measurement of Reverse Flow Generated at Cold Exit of Vortex Tube

Authors: Mohd Hazwan bin Yusof, Hiroshi Katanoda

Abstract:

In order to clarify the structure of the cold flow discharged from the vortex tube (VT), the pressure of the cold flow was measured, and a simple flow visualization technique using a 0.75 mm-diameter needle and oily paint was applied to study the reverse flow at the cold exit. It is clear that negative and positive pressure regions exist at certain inlet pressures and cold fractions, and that a reverse flow is observed in the negative pressure region.

Keywords: flow visualization, pressure measurement, reverse flow, vortex tube

Procedia PDF Downloads 486
3884 Low Complexity Deblocking Algorithm

Authors: Jagroop Singh Sidhu, Buta Singh

Abstract:

A low-complexity deblocking filter with three frequency-related modes (smooth mode, intermediate mode, and non-smooth mode for low-frequency, mid-frequency, and high-frequency regions, respectively) is proposed. The suggested approach requires zero additions, zero subtractions, zero multiplications (for the intermediate region), no divisions (for the non-smooth region), and no comparisons. The suggested method thus keeps the computation low, making it suitable for block-based image coding systems. A comparison of the average number of operations for the smooth, non-smooth, and intermediate regions (per pixel vector for each block) between the filter suggested by Chen and the proposed filter shows that the proposed filter keeps the computation lower and is thus suitable for fast processing algorithms.
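
The abstract does not give the filter equations, so the following is a generic, heavily simplified illustration of mode-based boundary deblocking (activity-based classification, then shift-and-add smoothing); the thresholds and filters are assumptions, not the authors' design.

```python
import numpy as np

def boundary_mode(left, right, t1=2, t2=8):
    """Classify the boundary between two blocks by local activity.
    Thresholds t1/t2 are illustrative, not the authors' values."""
    act = np.mean(np.abs(left[:, -1].astype(int) - right[:, 0].astype(int)))
    if act < t1:
        return "smooth"
    return "intermediate" if act < t2 else "non-smooth"

def deblock_boundary(left, right):
    """Filter the two pixel columns meeting at a vertical block boundary."""
    a = left[:, -1].astype(int)
    b = right[:, 0].astype(int)
    mode = boundary_mode(left, right)
    if mode == "smooth":
        avg = (a + b) >> 1                 # strong smoothing across the edge
        left[:, -1], right[:, 0] = avg, avg
    elif mode == "intermediate":
        # light smoothing with shifts only: new = (3*a + b) / 4
        left[:, -1] = (a + ((a + b) >> 1)) >> 1
        right[:, 0] = (b + ((a + b) >> 1)) >> 1
    # non-smooth: leave pixels untouched to preserve genuine edges
    return left, right
```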

Keywords: blocking artifacts, computational complexity, non-smooth, intermediate, smooth

Procedia PDF Downloads 436
3883 Measurement Errors and Misclassifications in Covariates in Logistic Regression: Bayesian Adjustment of Main and Interaction Effects and the Sample Size Implications

Authors: Shahadut Hossain

Abstract:

Measurement errors in continuous covariates and/or misclassifications in categorical covariates are common in epidemiological studies. Regression analysis ignoring such mismeasurements seriously biases the estimated main and interaction effects of covariates on the outcome of interest. Thus, adjustments for such mismeasurements are necessary. In this research, we propose a Bayesian parametric framework for eliminating the deleterious impacts of covariate mismeasurements in logistic regression. The proposed adjustment method is unified and can thus be applied to any generalized linear or non-linear regression model. Furthermore, adjustment for covariate mismeasurements requires validation data, usually in the form of either gold-standard measurements or replicates of the mismeasured covariates on a subset of the study population. Initial investigation shows that the adequacy of such adjustment depends on the sizes of the main and validation samples, especially when the prevalences of the categorical covariates are low. Thus, we investigate the impact of the main and validation sample sizes on the adjusted estimates and provide a general guideline about these sample sizes based on simulation studies.
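
A minimal sketch of such a Bayesian measurement-error adjustment in PyMC for a continuous covariate in logistic regression, assuming the measurement-error standard deviation (here 0.5) is known from validation data; the priors and data are illustrative, not the paper's model.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(5)
n = 300
x_true = rng.normal(size=n)                  # unobserved true covariate
x_meas = x_true + rng.normal(0, 0.5, n)      # mismeasured surrogate
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 1.2 * x_true))))

with pm.Model():
    beta0 = pm.Normal("beta0", 0, 2)
    beta1 = pm.Normal("beta1", 0, 2)
    x_lat = pm.Normal("x_lat", 0, 1, shape=n)                  # latent truth
    pm.Normal("x_obs", mu=x_lat, sigma=0.5, observed=x_meas)   # measurement model
    p = pm.math.invlogit(beta0 + beta1 * x_lat)                # outcome model
    pm.Bernoulli("y", p=p, observed=y)
    idata = pm.sample(500, tune=500, chains=2)

# Naive logistic regression on x_meas would attenuate beta1 toward zero
print(float(idata.posterior["beta1"].mean()))
```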

Keywords: measurement errors, misclassification, mismeasurement, validation sample, Bayesian adjustment

Procedia PDF Downloads 384
3882 Vortex Separator for More Accurate Air Dry-Bulb Temperature Measurement

Authors: Ahmed N. Shmroukh, I. M. S. Taha, A. M. Abdel-Ghany, M. Attalla

Abstract:

The application of fog systems for cooling and humidification is still limited, although these systems require a lower initial cost than other cooling systems such as pad-and-fan systems. Undesirable relative humidity and air temperature inside the cooled or humidified space, resulting from the poor control of fog systems, are the main reasons for this limited use. Any accurate control system essentially needs the air dry-bulb temperature as an input parameter; therefore, the air dry-bulb temperature in the space needs to be measured accurately. The scope of the present work is the separation of fog droplets from the air in a fogged space so that the air dry-bulb temperature can be measured accurately. The separation is done in a small device inside which the sensor of the temperature-measuring instrument is positioned. A vortex separator was designed and used, and a reference device measured the air temperature without separation. A comparative study was performed to identify the device that leads to the most accurate measurement of the air dry-bulb temperature. The results showed that the proposed devices shifted the measured air dry-bulb temperature in the correct direction compared with the free junction. The vortex device was the best: it increased the temperature measured by the free junction by around 2 to 6°C for different fog on-off durations.

Keywords: fog systems, measuring air dry bulb temperature, temperature measurement, vortex separator

Procedia PDF Downloads 264
3881 Counting People Utilizing Space-Time Imagery

Authors: Ahmed Elmarhomy, K. Terada

Abstract:

An automated method for counting passersby using virtual vertical measurement lines is proposed. The space-time image represents the human regions, which are treated using a segmentation process. Different color spaces have been used to perform the template matching. Proper template matching has been achieved to determine the direction and speed of passing people. Distinguishing between one and two passersby has been investigated using the correlation between passerby speed and the human-pixel area. Finally, the effectiveness of the presented method has been experimentally verified.
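
A minimal sketch of the template-matching step with OpenCV, matching in HSV as one candidate colour space; the file names and detection threshold are assumptions.

```python
import cv2
import numpy as np

# Hypothetical inputs: the space-time image built from the virtual measurement
# line, and a template of a crossing person.
space_time = cv2.imread("space_time.png", cv2.IMREAD_COLOR)
template = cv2.imread("person_template.png", cv2.IMREAD_COLOR)

# Convert both to HSV before matching (one of the colour spaces compared)
st_hsv = cv2.cvtColor(space_time, cv2.COLOR_BGR2HSV)
tp_hsv = cv2.cvtColor(template, cv2.COLOR_BGR2HSV)

score = cv2.matchTemplate(st_hsv, tp_hsv, cv2.TM_CCOEFF_NORMED)
threshold = 0.7                       # assumed detection threshold
ys, xs = np.where(score >= threshold)
print(f"{len(xs)} candidate crossings")
```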

Keywords: counting people, measurement line, space-time image, segmentation, template matching

Procedia PDF Downloads 424
3880 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy

Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid

Abstract:

Brain electrical activity, as reflected in electroencephalography (EEG), has been analyzed and diagnosed using various techniques. Among them, complexity measures of nonlinearity, disorder, and unpredictability play a vital role, owing to the nonlinear interconnections between the functional and anatomical subsystems that emerge in the brain in the healthy state and during various diseases. Alcohol abuse has many social and economic consequences, such as memory weakness, impaired decision-making, and poor concentration. Alcoholism not only affects the brain but is also associated with emotional, behavioral, and cognitive impairments, damaging the white and gray brain matter. A recently developed signal analysis method, Multiscale Permutation Entropy (MPE), is proposed to estimate the complexity of EEG time series with long-range temporal correlations, recorded from alcoholic and control subjects and acquired from the University of California machine learning repository; the results are compared with multiscale sample entropy (MSE). Using MPE, a coarse-grained series is first generated, and the PE is computed for each coarse-grained time series for the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. The results computed for each electrode using MPE give higher significance values, as well as mean rank differences, than MSE. Likewise, the ROC and the area under the ROC also give higher separation for each electrode using MPE in comparison to MSE.
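
For reference, a compact implementation of MPE: coarse-grain the series with non-overlapping window averages (Costa-style), then compute normalized Bandt-Pompe permutation entropy at each scale; the embedding dimension and scale range below are assumptions.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=3, delay=1):
    """Normalized permutation entropy of a 1-D series (Bandt-Pompe)."""
    patterns = {}
    for i in range(len(x) - (m - 1) * delay):
        pat = tuple(np.argsort(x[i:i + m * delay:delay]))
        patterns[pat] = patterns.get(pat, 0) + 1
    p = np.array(list(patterns.values()), float)
    p /= p.sum()
    return -np.sum(p * np.log(p)) / np.log(factorial(m))

def coarse_grain(x, scale):
    """Non-overlapping window averages of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def multiscale_permutation_entropy(x, m=3, max_scale=10):
    return [permutation_entropy(coarse_grain(x, s), m)
            for s in range(1, max_scale + 1)]

# Example: one EEG channel would be passed per electrode; white noise shown here
rng = np.random.default_rng(1)
print(multiscale_permutation_entropy(rng.normal(size=4096)))
```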

Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), Mann-Whitney test (MWT), receiver operator curve (ROC), complexity measure

Procedia PDF Downloads 461
3879 Near Infrared Spectrometry to Determine the Quality of Milk, Experimental Design Setup and Chemometrics: Review

Authors: Meghana Shankara, Priyadarshini Natarajan

Abstract:

Infrared (IR) spectroscopy has revolutionized the way we look at the materials around us. Unraveling the patterns in the molecular spectra of materials to analyze their composition and properties has been one of the most interesting challenges in modern science. Applications of IR spectrometry are numerous in the fields of pharmaceuticals, health, food and nutrition, oils, agriculture, construction, polymers, beverages, fabrics, and much more, limited only by the curiosity of the people. Near infrared (NIR) spectrometry is applied robustly to the analysis of solid and liquid substances because it is a non-destructive analysis method. In this paper, we review the application of NIR spectrometry to milk quality analysis and present the measurement modes applied in NIRS measurement setups, Design of Experiment (DoE), and the classification/quantification algorithms used for milk composition prediction (Fat%, Protein%, Lactose%, Solids Not Fat (SNF%)), along with different approaches to adulterant identification. We also discuss the important NIR ranges for the chosen milk parameters. The performance metrics used in the comparison of the various chemometric approaches include Root Mean Square Error (RMSE), R², slope, offset, sensitivity, specificity, and accuracy.

Keywords: chemometrics, design of experiment, milk quality analysis, NIRS measurement modes

Procedia PDF Downloads 240
3878 Comparison of Monte Carlo Simulations and Experimental Results for the Measurement of Complex DNA Damage Induced by Ionizing Radiations of Different Quality

Authors: Ifigeneia V. Mavragani, Zacharenia Nikitaki, George Kalantzis, George Iliakis, Alexandros G. Georgakilas

Abstract:

Complex DNA damage, consisting of a combination of DNA lesions such as Double Strand Breaks (DSBs) and non-DSB base lesions occurring in a small volume, is considered one of the most important biological endpoints of ionizing radiation (IR) exposure. Strong theoretical (Monte Carlo simulations) and experimental evidence suggests an increase in the complexity of DNA damage, and therefore in repair resistance, with increasing linear energy transfer (LET). Experimental detection of complex (clustered) DNA damage is often hindered by technical deficiencies limiting its measurement, especially in cellular or tissue systems. Our groups have recently made significant improvements towards the identification of key parameters for the efficient detection of complex DSBs and non-DSBs in human cellular systems exposed to IR of varying quality (γ- and X-rays at 0.3-1 keV/μm, α-particles at 116 keV/μm, and ³⁶Ar ions at 270 keV/μm). The induction and processing of DSB and non-DSB oxidative clusters were measured using adaptations of immunofluorescence (γH2AX or 53BP1 foci staining as DSB probes, and the human repair enzymes OGG1 or APE1 as probes for oxidized purines and abasic sites, respectively). In the current study, Relative Biological Effectiveness (RBE) values for DSB and non-DSB induction have been measured in different human normal (FEP18-11-T1) and cancerous cell lines (MCF7, HepG2, A549, MO59K/J). The experimental results are compared to simulation data obtained using a validated microdosimetric fast Monte Carlo DNA Damage Simulation code (MCDS). Moreover, this simulation approach is applied to two realistic clinical cases, i.e., prostate cancer treatment using X-rays generated by a linear accelerator and a pediatric osteosarcoma case using a 200.6 MeV proton pencil beam. RBE values for complex DNA damage induction are calculated for the tumor areas. These results reveal a disparity between theory and experiment and underline the necessity of implementing highly precise and more efficient experimental and simulation approaches.

Keywords: complex DNA damage, DNA damage simulation, protons, radiotherapy

Procedia PDF Downloads 284
3877 Approaches to Ethical Hacking: A Conceptual Framework for Research

Authors: Lauren Provost

Abstract:

The digital world remains increasingly vulnerable, making the development of effective cybersecurity approaches even more critical to supporting the success of the digital economy and national security. Although approaches to cybersecurity have shifted and improved in the last decade with new models, especially with cloud computing and mobility, a record number of high-severity vulnerabilities was recorded by the National Institute of Standards and Technology (NIST) in its National Vulnerability Database (NVD) in 2020. This is due, in part, to the increasing complexity of cyber ecosystems. Security must be approached with a more comprehensive, multi-tool strategy that addresses the complexity of cyber ecosystems, including the human factor. Ethical hacking has emerged as such an approach: a more effective, multi-strategy, comprehensive approach to cybersecurity's most pressing needs, especially understanding the human factor. Research on ethical hacking, however, is limited in scope. The two main objectives of this work are to (1) provide highlights of case studies in ethical hacking and (2) provide a conceptual framework for research in ethical hacking that embraces and addresses both technical and non-technical security measures. Recommendations include an improved conceptual framework for research centered on ethical hacking that addresses the many factors and attributes of significant attacks threatening computer security, and a more robust, integrative, multi-layered framework embracing the complexity of cybersecurity ecosystems.

Keywords: ethical hacking, literature review, penetration testing, social engineering

Procedia PDF Downloads 183
3876 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation

Authors: Oğuzhan Urhan

Abstract:

In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform (WC-1BT) is proposed for the block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames together with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The WC-1BT based approach improves the performance of conventional C-1BT based ME by employing a 2-bit-depth constraint mask instead of a 1-bit-depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy than similar existing ME methods in the literature.
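
A minimal sketch of plain one-bit-transform block matching (binarize each frame against a local mean, then count non-matching points with XOR); the uniform filter stands in for the multi-band-pass kernel of the 1BT literature, and the constraint-mask weighting of C-1BT/WC-1BT is omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame):
    """Binarize: pixel >= local mean. A uniform 17x17 filter stands in for
    the multi-band-pass kernel used in the 1BT literature."""
    return frame >= uniform_filter(frame.astype(float), size=17)

def nnmp(cur_bt, ref_bt, x, y, dx, dy, bs=16):
    """Number of non-matching points: XOR of the two binary blocks."""
    c = cur_bt[y:y + bs, x:x + bs]
    r = ref_bt[y + dy:y + dy + bs, x + dx:x + dx + bs]
    return np.count_nonzero(c ^ r)

def search(cur_bt, ref_bt, x, y, rng=8, bs=16):
    """Full search over +/-rng; assumes the block lies away from frame borders."""
    best, best_cost = (0, 0), nnmp(cur_bt, ref_bt, x, y, 0, 0, bs)
    for dy in range(-rng, rng + 1):
        for dx in range(-rng, rng + 1):
            cost = nnmp(cur_bt, ref_bt, x, y, dx, dy, bs)
            if cost < best_cost:
                best, best_cost = (dx, dy), cost
    return best, best_cost
```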

Keywords: fast motion estimation, low-complexity motion estimation, video coding

Procedia PDF Downloads 292
3875 Evaluating Contextually Targeted Advertising with Attention Measurement

Authors: John Hawkins, Graham Burton

Abstract:

Contextual targeting is a common advertising strategy that places marketing messages in media locations expected to be aligned with the target audience. There are several major challenges in contextual targeting: the ideal categorisation scheme needs to be known, as well as the most appropriate subsections of that scheme for a given campaign or creative. In addition, campaign reach is typically limited when targeting becomes narrow, so a balance must be struck between requirements. Finally, refinement of the process is limited by the use of evaluation methods that are either rapid but non-specific (click-through rates) or reliable but slow and costly (conversions or brand recall studies). In this study, we evaluate the use of attention measurement as a technique for understanding the performance of targeting on specific contextual topics. We perform the analysis using a large-scale dataset of impressions categorised with the IAB v2.0 taxonomy. We evaluate multiple levels of the categorisation hierarchy, using categories at different positions within an initial creative-specific ranking. The results illustrate that measured attention time is an effective signal of the performance of a specific creative within a specific context. Performance is sustained across a ranking of categories from one period to another.

Keywords: contextual targeting, digital advertising, attention measurement, marketing performance

Procedia PDF Downloads 79
3874 Development of an Atmospheric Radioxenon Detection System for Nuclear Explosion Monitoring

Authors: V. Thomas, O. Delaune, W. Hennig, S. Hoover

Abstract:

Measurement of the radioactive isotopes of atmospheric xenon is used to detect, locate, and identify confined nuclear tests under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). In this context, the French Alternative Energies and Atomic Energy Commission (CEA) has developed a fixed device, the SPALAX process, to continuously measure the concentration of these fission products. During its atmospheric transport, the radioactive xenon undergoes significant dilution between the source point and the measurement station. Given the distances between the fixed stations located all over the globe, the typical activity concentrations measured are near 1 mBq m⁻³. To circumvent the constraints induced by atmospheric dilution, the development of a mobile detection system is in progress; this system will allow on-site measurements in order to confirm or refute a suspicious measurement detected by a fixed station. Furthermore, this system will use the beta/gamma coincidence measurement technique in order to drastically reduce the environmental background (which masks such activities). The detector prototype consists of a gas cell surrounded by two large silicon wafers, coupled with two square NaI(Tl) detectors. The gas cell has a sample volume of 30 cm³, and the silicon wafers are 500 µm thick with an active surface area of 3600 mm². In order to minimize leakage current, each wafer has been segmented into four independent silicon pixels. This cell is sandwiched between two low-background NaI(Tl) detectors (70 × 70 × 40 mm³ crystals). The expected Minimal Detectable Concentration (MDC) for each radioxenon isotope is on the order of 1-10 mBq m⁻³. Three 4-channel digital acquisition modules (Pixie-NET) are used to process all the signals. Time synchronization is ensured by a dedicated PTP network using the IEEE 1588 Precision Time Protocol. We would like to present this system from its simulation to the laboratory tests.
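
A minimal sketch of the offline beta/gamma coincidence step: pairing timestamps from the silicon pixels and the NaI(Tl) detectors that fall within a time window; the window width and timestamps are assumptions.

```python
import numpy as np

def coincidences(beta_times, gamma_times, window=1e-6):
    """Pair beta (silicon) and gamma (NaI) timestamps lying within `window`
    seconds. Both inputs are assumed sorted, as delivered by a
    time-synchronized data acquisition system."""
    pairs, j = [], 0
    for t in beta_times:
        while j < len(gamma_times) and gamma_times[j] < t - window:
            j += 1
        if j < len(gamma_times) and abs(gamma_times[j] - t) <= window:
            pairs.append((t, gamma_times[j]))
    return pairs

# Toy timestamps (seconds): two betas have a matching gamma, one does not
beta = np.array([1.0000000, 2.0000005, 3.0])
gamma = np.array([1.0000003, 2.0000011, 4.0])
print(coincidences(beta, gamma))
```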

Keywords: beta/gamma coincidence technique, low level measurement, radioxenon, silicon pixels

Procedia PDF Downloads 104
3873 Efficient Iterative V-BLAST Detection Technique in Wireless Communication System

Authors: Hwan-Jun Choi, Sung-Bok Choi, Hyoung-Kyu Song

Abstract:

Recently, among MIMO-OFDM detection techniques, many papers have suggested the V-BLAST scheme, which can achieve a high data rate. Signal detection for MIMO-OFDM systems is therefore an important issue. In this paper, an efficient iterative V-BLAST detection technique for wireless communication systems is proposed. The proposed scheme adjusts the number of candidate symbols and the iteration scheme based on the channel state. According to the simulation results, the proposed scheme achieves better BER performance than conventional schemes and BER performance similar to that of QRD-M with an iterative scheme. Moreover, the complexity of the proposed scheme is 50.6% lower than that of QRD-M detection with an iterative scheme. The proposed detection scheme can therefore be used efficiently in wireless communication.
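
For reference, a compact NumPy sketch of classic ordered zero-forcing V-BLAST detection (detect the strongest stream, slice, cancel, repeat); the proposed channel-adaptive candidate/iteration control is not reproduced here, and the channel and constellation are illustrative.

```python
import numpy as np

def vblast_zf(H, y, constellation):
    """Ordered ZF successive interference cancellation (classic V-BLAST)."""
    y = y.astype(complex)
    n_tx = H.shape[1]
    symbols = np.zeros(n_tx, complex)
    active = list(range(n_tx))
    for _ in range(n_tx):
        W = np.linalg.pinv(H[:, active])
        # Pick the stream with the smallest nulling-vector norm (highest SNR)
        k_local = int(np.argmin(np.sum(np.abs(W) ** 2, axis=1)))
        k = active[k_local]
        z = W[k_local] @ y
        s = constellation[np.argmin(np.abs(constellation - z))]  # slice
        symbols[k] = s
        y = y - H[:, k] * s                 # cancel the detected stream
        active.remove(k)
    return symbols

# 4x4 QPSK example with a hypothetical Rayleigh channel
rng = np.random.default_rng(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
tx = rng.choice(qpsk, 4)
y = H @ tx + 0.05 * (rng.normal(size=4) + 1j * rng.normal(size=4))
print(vblast_zf(H, y, qpsk), tx)
```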

Keywords: MIMO-OFDM, V-BLAST, QR-decomposition, QRD-M, DFE, iterative scheme, channel condition

Procedia PDF Downloads 502
3872 Interbank Networks and the Benefits of Using Multilayer Structures

Authors: Danielle Sandler dos Passos, Helder Coelho, Flávia Mori Sarti

Abstract:

Complexity science seeks the understanding of systems by adopting diverse theories from various areas. Network analysis has been gaining space and credibility, namely with biological, social, and economic systems. A significant part of the literature focuses only on monolayer representations of connections among agents, considering one level of their relationships and excluding other levels of interaction, which leads to simplistic results in network analysis. Therefore, this work aims to demonstrate the advantages of using multilayer networks for the representation and analysis of networks. To this end, we analyzed an interbank network composed of 42 banks, comparing the centrality measures of the agents (degree and PageRank) resulting from each method (monolayer vs. multilayer). The multilayer analysis proved to be more reliable and efficient for the study of such networks, and it highlighted JP Morgan and Deutsche Bank as the most important banks of the analyzed network.
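
A minimal NetworkX sketch of the comparison: centralities on a collapsed monolayer graph versus per-layer centralities in a multilayer representation; the banks and exposure layers below are hypothetical, not the 42-bank dataset.

```python
import networkx as nx

# Two hypothetical exposure layers among the same banks (e.g., loans, derivatives)
banks = ["JPM", "DB", "HSBC", "BNP", "UBS"]
loans = [("JPM", "DB"), ("JPM", "HSBC"), ("DB", "BNP"), ("HSBC", "UBS")]
derivatives = [("DB", "UBS"), ("JPM", "BNP"), ("DB", "HSBC")]

layers = {"loans": nx.Graph(loans), "derivatives": nx.Graph(derivatives)}

# Monolayer view: collapse all relationships into one graph
mono = nx.Graph(loans + derivatives)
print("monolayer PageRank:", nx.pagerank(mono))

# Multilayer view: keep per-layer centralities and compare across layers
for name, g in layers.items():
    g.add_nodes_from(banks)
    print(name, "degree:", dict(g.degree()), "PageRank:", nx.pagerank(g))
```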

Keywords: complexity, interbank networks, multilayer networks, network analysis

Procedia PDF Downloads 244
3871 Pseudo Modal Operating Deflection Shape Based Estimation Technique of Mode Shape Using Time History Modal Assurance Criterion

Authors: Doyoung Kim, Hyo Seon Park

Abstract:

Studies of system identification (SI) based on structural health monitoring (SHM) have been actively conducted for structural safety. Recently, SI techniques have developed rapidly within the output-only SI paradigm for estimating modal parameters. Output-only SI methods such as Frequency Domain Decomposition (FDD) and Stochastic Subspace Identification (SSI) rely on algorithms based on orthogonal decomposition, such as singular value decomposition (SVD). However, the SVD leads to a high level of computational complexity when estimating modal parameters. This paper proposes a technique to estimate the mode shape at lower computational cost. The technique extracts pseudo-modal Operating Deflection Shapes (ODS) through a bandpass filter and introduces a time-history Modal Assurance Criterion (MAC). Finally, the mode shape is estimated from the pseudo-modal ODS and the time-history MAC. Analytical simulations of vibration measurement were performed, and the resulting mode shapes and computation times of a representative SI method and the proposed method were compared.
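
A minimal sketch of the two ingredients, assuming SciPy: band-pass filtering a multi-channel record to obtain pseudo-modal ODS samples, and a time-history MAC that keeps the samples resembling a reference shape; the filter order and MAC threshold are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def pseudo_modal_ods(acc, fs, f_lo, f_hi):
    """Band-pass the multi-channel record around one natural frequency;
    each time sample of the filtered record is a pseudo-modal ODS."""
    b, a = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, acc, axis=0)         # shape (n_samples, n_sensors)

def mac(phi_i, phi_j):
    """Modal Assurance Criterion between two shape vectors."""
    num = np.abs(phi_i.conj() @ phi_j) ** 2
    return num / ((phi_i.conj() @ phi_i).real * (phi_j.conj() @ phi_j).real)

def time_history_mac(ods, ref_shape, threshold=0.9):
    """MAC of every time sample's ODS against a reference shape; averaging
    the samples with high MAC yields the mode shape estimate."""
    macs = np.array([mac(ods[t], ref_shape) for t in range(ods.shape[0])])
    keep = macs > threshold                     # assumed acceptance threshold
    return ods[keep].mean(axis=0), macs
```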

Keywords: modal assurance criterion, mode shape, operating deflection shape, system identification

Procedia PDF Downloads 381
3870 Labour Productivity Measurement and Control Standards for Hotels

Authors: Kristine Joy Simpao

Abstract:

Improving labour productivity is one of the most enthralling and challenging aspects of managing a hotel and restaurant business. The demand to secure sustained productivity has become an increasingly pivotal concern for managers seeking to survive and sustain the business. Besides making the business profitable, they are expected to make every resource productive and effective towards achieving company goals while maximizing the value of the organization. This paper examines what productivity means to the services industry, in particular the hotel industry. It is underpinned by an investigation of the extent to which the respondent hotels practice labour productivity management in the areas of materials management, human resource management, and leadership management, and by computing labour productivity ratios using simple hotel productivity ratios, in order to find suitable measurement and control standards for hotels, with SBMA, Olongapo City as the locale of the study. The findings show that the hotels' labour productivity ratings are not perfect, with some practices rated far below expectations, particularly in strategic and operational decisions for improving the performance and productivity of their human resources. The findings further show no significant difference in ratings among the respondent types in any area, indicating a shared perception of the weak implementation of some indicators of labour productivity practice. Furthermore, the computed labour productivity efficiency ratios show that the number of employees and labour productivity practices are inversely related. This study provides potential measurement and control standards for the enhancement of hotel labour productivity. These standards should be customized for standard hotels in the Subic Bay Freeport Zone to assist hotel owners in increasing labour productivity while meeting company goals and objectives effectively.

Keywords: labour productivity, hotel, measurement and control, standards, efficiency ratios, practices

Procedia PDF Downloads 290
3869 Nutrition Strategy Using Traditional Tibetan Medicine in the Preventive Measurement

Authors: Ngawang Tsering

Abstract:

Traditional Tibetan medicine focuses primarily on promoting health and keeping diseases away, and it is unique in prescribing specific diets and lifestyles. The prevalence of chronic diseases has been rising day by day, and they kill many people, partly owing to the lack of proper nutritional design in modern times. According to traditional Tibetan medicine, chronic diseases such as diabetes, cancer, cardiovascular diseases, respiratory diseases, and arthritis are heavily associated with an unwholesome diet and inappropriate lifestyles. Diet and lifestyle are the two main conditions of both disease and healthy life. The prevalence of chronic diseases is a challenge with massive economic impact and expensive health consequences. Though chronic diseases are a challenge, a solution lies in preventive measurement using proper nutrition design based on traditional Tibetan medicine. To date, it is hard to evaluate whether the traditional Tibetan medicine nutrition strategy could play a major role in preventive measurement, owing to the lack of current research evidence. However, compared with modern nutrition, it offers an exclusive and valuable concept, namely a holistic way and diet or nutrition recommendations based on different aspects. Traditional Tibetan medicine, one of the oldest existing ancient medical systems, known as Sowa Rigpa (the science of healing), highlights different aspects of dietetics and nutrition, namely the geographical and seasonal aspects, age, personality, emotional state, food combination, the individual's metabolism, and the potency and amount of food. This article offers a critical perspective on preventive measurement against chronic diseases through nutrition design using traditional Tibetan medicine, and it also draws attention to the need for a deeper understanding of traditional Tibetan medicine in the modern world.

Keywords: traditional Tibetan medicine, nutrition, chronic diseases, preventive measurement, holistic approach, integrative

Procedia PDF Downloads 128