Search results for: low complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1664

1574 Approaches to Ethical Hacking: A Conceptual Framework for Research

Authors: Lauren Provost

Abstract:

The digital world remains increasingly vulnerable, making the development of effective cybersecurity approaches even more critical in supporting the success of the digital economy and national security. Although approaches to cybersecurity have shifted and improved in the last decade with new models, especially with cloud computing and mobility, a record number of high-severity vulnerabilities was recorded in the National Institute of Standards and Technology (NIST) National Vulnerability Database (NVD) in 2020. This is due, in part, to the increasing complexity of cyber ecosystems. Security must be approached with a more comprehensive, multi-tool strategy that addresses the complexity of cyber ecosystems, including the human factor. Ethical hacking has emerged as such an approach: a more effective, multi-strategy, comprehensive approach to cybersecurity's most pressing needs, especially understanding the human factor. Research on ethical hacking, however, is limited in scope. The two main objectives of this work are (1) to provide highlights of case studies in ethical hacking, and (2) to provide a conceptual framework for research in ethical hacking that embraces and addresses both technical and nontechnical security measures. Recommendations include an improved conceptual framework for research centered on ethical hacking that addresses many factors and attributes of significant attacks that threaten computer security, and a more robust, integrative, multi-layered framework embracing the complexity of cybersecurity ecosystems.

Keywords: ethical hacking, literature review, penetration testing, social engineering

Procedia PDF Downloads 217
1573 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation

Authors: Oğuzhan Urhan

Abstract:

In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames with a Boolean exclusive-OR-based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of conventional C-1BT based ME by employing a 2-bit depth constraint mask instead of a 1-bit depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared to existing similar ME methods in the literature.
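
As a rough illustration of the matching criterion that such binary ME methods build on, the sketch below computes the number of non-matching points (NNMP) between one-bit-transformed blocks with an exclusive-OR; the local-mean binarization, block size, and search range are illustrative assumptions, and the weighted constraint mask of WC-1BT itself is not reproduced.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame, k=8):
    """Binarize a frame by comparing each pixel to a local k x k mean (a simple
    stand-in for the multi-band-pass filters used in 1BT literature)."""
    local_mean = uniform_filter(frame.astype(float), size=k)
    return (frame >= local_mean).astype(np.uint8)

def nnmp(block_cur, block_ref):
    """Number of Non-Matching Points: XOR-based, hardware-friendly matching cost."""
    return int(np.count_nonzero(np.bitwise_xor(block_cur, block_ref)))

def block_match(cur_bits, ref_bits, bx, by, bsize=16, search=7):
    """Full search around (bx, by) in the reference frame using the NNMP cost."""
    best, best_mv = None, (0, 0)
    cur = cur_bits[by:by + bsize, bx:bx + bsize]
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bsize > ref_bits.shape[0] or x + bsize > ref_bits.shape[1]:
                continue
            cost = nnmp(cur, ref_bits[y:y + bsize, x:x + bsize])
            if best is None or cost < best:
                best, best_mv = cost, (dx, dy)
    return best_mv, best

# Tiny check: shift a random frame down 2 rows and left 3 columns;
# the expected motion vector for an interior block is then (dx, dy) = (3, -2).
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64)).astype(np.uint8)
cur = np.roll(ref, (2, -3), axis=(0, 1))
print(block_match(one_bit_transform(cur), one_bit_transform(ref), bx=24, by=24))
```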

Keywords: fast motion estimation, low-complexity motion estimation, video coding

Procedia PDF Downloads 314
1572 Efficient Iterative V-BLAST Detection Technique in Wireless Communication System

Authors: Hwan-Jun Choi, Sung-Bok Choi, Hyoung-Kyu Song

Abstract:

Recently, among MIMO-OFDM detection techniques, many papers have suggested the V-BLAST scheme, which can achieve a high data rate. Signal detection in MIMO-OFDM systems is therefore an important issue. In this paper, an efficient iterative V-BLAST detection technique is proposed for wireless communication systems. The proposed scheme adjusts the number of candidate symbols and the iterative scheme based on the channel state. According to the simulation results, the proposed scheme has better BER performance than conventional schemes and BER performance similar to that of QRD-M with an iterative scheme. Moreover, the complexity of the proposed scheme is 50.6% lower than that of QRD-M detection with an iterative scheme. Therefore, the proposed detection scheme can be used efficiently in wireless communication.
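
For orientation, the following sketch shows conventional V-BLAST detection (zero-forcing nulling with ordered successive interference cancellation) on a hypothetical 4x4 MIMO channel; it is not the proposed channel-adaptive iterative scheme, and the QPSK constellation and noise level are illustrative assumptions.

```python
import numpy as np

def vblast_zf_osic(H, y, constellation):
    """Conventional V-BLAST: zero-forcing nulling with ordered successive
    interference cancellation (most reliable stream detected first)."""
    H = np.asarray(H, dtype=complex)
    y = np.asarray(y, dtype=complex)
    n_tx = H.shape[1]
    x_hat = np.zeros(n_tx, dtype=complex)
    active = list(range(n_tx))
    for _ in range(n_tx):
        W = np.linalg.pinv(H[:, active])          # ZF nulling matrix for remaining streams
        row_norms = np.sum(np.abs(W) ** 2, axis=1)  # post-detection noise amplification
        k_local = int(np.argmin(row_norms))        # pick the most reliable stream
        k = active[k_local]
        z = W[k_local] @ y                         # nulling
        s = constellation[np.argmin(np.abs(constellation - z))]  # slicing
        x_hat[k] = s
        y = y - H[:, k] * s                        # cancel detected symbol
        active.pop(k_local)
    return x_hat

# Example: 4x4 MIMO with QPSK (illustrative channel and noise)
rng = np.random.default_rng(0)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
x = qpsk[rng.integers(0, 4, 4)]
y = H @ x + 0.05 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
print(vblast_zf_osic(H, y, qpsk))
print(x)
```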

Keywords: MIMO-OFDM, V-BLAST, QR-decomposition, QRD-M, DFE, iterative scheme, channel condition

Procedia PDF Downloads 528
1571 Interbank Networks and the Benefits of Using Multilayer Structures

Authors: Danielle Sandler dos Passos, Helder Coelho, Flávia Mori Sarti

Abstract:

Complexity science seeks to understand systems by adopting diverse theories from various areas. Network analysis has been gaining space and credibility, namely in biological, social and economic systems. A significant part of the literature focuses only on monolayer representations of connections among agents, considering one level of their relationships and excluding other levels of interactions, which leads to simplistic results in network analysis. Therefore, this work aims to demonstrate the advantages of using multilayer networks for the representation and analysis of networks. For this, we analyzed an interbank network composed of 42 banks, comparing the centrality measures of the agents (degree and PageRank) resulting from each method (monolayer vs. multilayer). The multilayer analysis proved to be more reliable and efficient for the study of such networks and highlighted JP Morgan and Deutsche Bank as the most important banks of the analyzed network.
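
A minimal sketch of the comparison idea, using a hypothetical two-layer interbank network rather than the paper's 42-bank data: degree and PageRank are computed on the collapsed (monolayer) network and then layer by layer. A full multilayer centrality would also couple the layers, which this toy omits.

```python
import networkx as nx

# Two hypothetical exposure layers (e.g., short-term and long-term lending),
# not the paper's actual data.
layers = {
    "short_term": [("BankA", "BankB"), ("BankB", "BankC"), ("BankC", "BankA")],
    "long_term":  [("BankA", "BankC"), ("BankD", "BankA"), ("BankB", "BankD")],
}

# Monolayer view: collapse all edges into a single directed network.
mono = nx.DiGraph()
for edges in layers.values():
    mono.add_edges_from(edges)
print("monolayer degree:", dict(mono.degree()))
print("monolayer PageRank:", nx.pagerank(mono))

# Simple multilayer view: keep layers separate and compare node importance per layer.
for name, edges in layers.items():
    g = nx.DiGraph(edges)
    print(name, "PageRank:", nx.pagerank(g))
```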

Keywords: complexity, interbank networks, multilayer networks, network analysis

Procedia PDF Downloads 280
1570 Promoting Organizational Learning Facing the Complexity of Public Healthcare: How to Design a Voluntary, Learning-Oriented Benchmarking

Authors: Rachel M. Lørum, Henrik Eriksson, Frida Smith

Abstract:

Purpose: In recent years, the use of benchmarks for the improvement of healthcare has become increasingly common. There has been an increasing interest in why improvement initiatives so often fail to eliminate the problems they aspire to solve. Benchmarking comes with its fair share of challenges and problems, such as capturing the dynamics and complexities of the care environments, among others. In this study, we demonstrate how learning-oriented, voluntary benchmarks in the complex environment of public healthcare could be designed. Findings: Our four most important findings were the following: first, important organizational learning (OL) regarding the complexity of the service and implications on how to design a benchmark for learning and improvement occurred during the process. Second, participation by a wide range of professionals and stakeholders was crucial for capturing the complexity of people and organizations and increasing the quality of the template. Third, the continuous dialogue between all organizations involved was an important tool for ongoing organizational learning throughout the process. The last important finding was the impact of the facilitator’s role through supporting progress, coordination, and dialogue. Design: We chose participatory design as the research design. Data were derived from written materials such as e-mails, protocols, observational notes, and reflection notes collected during a period of 1.5 years. Originality: Our main contributions are the identification of important strategies, initiatives, and actors to involve when designing voluntary benchmarks for learning and improvement.

Keywords: organizational learning, quality improvement, learning-oriented benchmark, healthcare, patient safety

Procedia PDF Downloads 111
1569 Enhancement of Performance Utilizing Low Complexity Switched Beam Antenna

Authors: P. Chaipanya, R. Keawchai, W. Sombatsanongkhun, S. Jantaramporn

Abstract:

To meet the dramatically increasing demand for wireless communication, this work focuses on the switched beam antenna in a smart antenna system. Implementing switched beam antennas at mobile terminals such as notebooks or mobile handsets is a preferable choice to increase the performance of wireless communication systems. This paper proposes a low complexity switched beam antenna using a single antenna element, which is suitable for implementation at a mobile terminal. The main beam direction is switched by changing the positions of the short circuit on the radiating patch. There are four switching cases that provide four different main beam directions. Moreover, the performance in terms of signal to interference ratio when utilizing the proposed antenna is compared with that of an omni-directional antenna to confirm the performance improvement.

Keywords: switched beam, short circuit, single element, signal to interference ratio

Procedia PDF Downloads 170
1568 Defining a Holistic Approach for Model-Based System Engineering: Paradigm and Modeling Requirements

Authors: Hycham Aboutaleb, Bruno Monsuez

Abstract:

Current systems complexity has reached a degree that requires addressing conception and design issues while taking into account all the necessary aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework and an environment to handle system model complexity. For that, it is necessary to understand the expectations of the human user of the model and his limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and defines the refined functional as well as non-functional requirements that modeling tools need to meet to be useful in model-based system engineering.

Keywords: system modeling, modeling language, modeling requirements, framework

Procedia PDF Downloads 529
1567 Web-Based Paperless Campus: An Approach to Reduce the Cost and Complexity of Education Administration

Authors: Yekini N. Asafe, Haastrup A. Victor, Lawal N. Olawale, Okikiola F. Mercy

Abstract:

The recent increase in access to personal computers and networking systems has made it feasible to replace much of the cumbersome and costly paper-based administration in all organizations. Desktop computers, networking systems, high-capacity storage devices and telecommunications systems currently allow various formats of data to be transferred, processed, stored and disseminated for the purpose of decision making. Going paperless offers more benefits compared to a fully paper-based office. This paper proposes a model for the design and implementation of an e-administration system (paperless campus) for an institution of learning. If this model is designed and implemented, it will reduce the cost and complexity of educational administration and also eliminate the menaces and environmental hazards attributed to paper-based administration within schools and colleges.

Keywords: e-administration, educational administration, paperless campus, paper-based administration

Procedia PDF Downloads 374
1566 Hypergraph Models of Metabolism

Authors: Nicole Pearcy, Jonathan J. Crofts, Nadia Chuzhanova

Abstract:

In this paper, we employ a directed hypergraph model to investigate the extent to which environmental variability influences the set of available biochemical reactions within a living cell. Such an approach avoids the limitations of the usual complex network formalism by allowing for the multilateral relationships (i.e. connections involving more than two nodes) that naturally occur within many biological processes. More specifically, we extend the concept of network reciprocity to complex hyper-networks, thus enabling us to characterize a network in terms of the existence of mutual hyper-connections, which may be considered a proxy for metabolic network complexity. To demonstrate these ideas, we study 115 metabolic hyper-networks of bacteria, each of which can be classified into one of 6 increasingly varied habitats. In particular, we found that reciprocity increases significantly with increased environmental variability, supporting the view that organism adaptability leads to increased complexities in the resultant biochemical networks.
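
A toy illustration of counting mutual hyper-connections in a directed hypergraph is sketched below; the reciprocity definition used here (the fraction of hyperedges whose exact reversal is also present) is a deliberate simplification and not necessarily the measure developed in the paper.

```python
# Directed hyperedges as (substrates, products) pairs, e.g. toy reactions.
reactions = [
    (frozenset({"A", "B"}), frozenset({"C"})),   # A + B -> C
    (frozenset({"C"}), frozenset({"A", "B"})),   # C -> A + B  (reverse exists)
    (frozenset({"C", "D"}), frozenset({"E"})),   # C + D -> E  (no reverse)
]

def hyper_reciprocity(hyperedges):
    """Fraction of directed hyperedges whose exact reversal is also present.
    A deliberately simple proxy; richer definitions weight partial overlaps."""
    edge_set = set(hyperedges)
    mutual = sum(1 for tail, head in hyperedges if (head, tail) in edge_set)
    return mutual / len(hyperedges)

print(hyper_reciprocity(reactions))   # 2/3: the first two reactions form a mutual pair
```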

Keywords: complexity, hypergraphs, reciprocity, metabolism

Procedia PDF Downloads 296
1565 Design, Development by Functional Analysis in UML and Static Test of a Multimedia Voice and Video Communication Platform on IP for a Use Adapted to the Context of Local Businesses in Lubumbashi

Authors: Blaise Fyama, Elie Museng, Grace Mukoma

Abstract:

In this article we present a Java implementation of video telephony using the SIP (Session Initiation Protocol). After a functional analysis of the SIP protocol, we relied on the work of researchers at the University of Parma, Italy, to acquire adequate libraries for the development of our own communication tool. In order to optimize the code and improve the prototype, we used, in an incremental approach, test techniques based on static analysis, evaluating the complexity of the software through the application of metrics and McCabe's cyclomatic number. The objective is to promote the emergence of local start-ups producing IP video in a well understood local context. We have arrived at the creation of a video telephony tool whose code is optimized.
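
For reference, McCabe's cyclomatic number can be computed from the control-flow graph as V(G) = E - N + 2P; the sketch below applies this formula to a hypothetical graph, not to the SIP tool's actual code.

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic number V(G) = E - N + 2P for a control-flow graph
    with E edges, N nodes and P connected components."""
    return len(edges) - len(nodes) + 2 * components

# Hypothetical control-flow graph of a method with one if/else and one loop.
nodes = ["entry", "if", "then", "else", "loop", "exit"]
edges = [("entry", "if"), ("if", "then"), ("if", "else"),
         ("then", "loop"), ("else", "loop"),
         ("loop", "loop"), ("loop", "exit")]
print(cyclomatic_complexity(edges, nodes))   # 7 - 6 + 2 = 3 (two decisions + 1)
```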

Keywords: static analysis, McCabe cyclomatic complexity metric, SIP, UML

Procedia PDF Downloads 117
1564 Stochastic Control of Decentralized Singularly Perturbed Systems

Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan

Abstract:

Designing a controller for stochastic decentralized interconnected large-scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth are other constraints on design procedures for distributed large-scale systems. The quasi-steady-state model investigated in this paper is a reduced-order model of the original system obtained using singular perturbation techniques. This paper results in an optimal control synthesis to design an observer-based feedback controller by standard stochastic control theory techniques, using the Linear Quadratic Gaussian (LQG) approach and Kalman filter design with lower complexity and computation requirements. A numerical example is given at the end to demonstrate the efficiency of the proposed method.
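
A minimal sketch of a standard (centralized) LQG synthesis on an assumed reduced-order model is given below, combining an LQR gain with a Kalman filter gain via two Riccati equations; the decentralized, singularly perturbed design of the paper is not reproduced, and the matrices are illustrative.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical reduced-order (quasi-steady-state) model: x' = Ax + Bu + w, y = Cx + v
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
Q, R = np.eye(2), np.eye(1)              # LQR state and input weights
W, V = 0.1 * np.eye(2), 0.01 * np.eye(1)  # process and measurement noise covariances

# LQR state-feedback gain: u = -K x_hat
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Kalman filter gain (dual Riccati equation): x_hat' = A x_hat + B u + L (y - C x_hat)
S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(V)

print("LQR gain K =", K)
print("Kalman gain L =", L)
```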

Keywords: decentralized, optimal control, output, singular perturbation

Procedia PDF Downloads 367
1563 Bio-Inspired Information Complexity Management: From Ant Colony to Construction Firm

Authors: Hamza Saeed, Khurram Iqbal Ahmad Khan

Abstract:

Effective information management is crucial for any construction project and its success. The primary areas of information generation are either the construction site or the design office. Different types of information are required at different stages of construction, involving various stakeholders and creating complexity. There is a need for effective management of information flows to reduce the uncertainty that creates complexity. Nature provides a unique perspective in terms of dealing with complexity, in particular, information complexity. System dynamics methodology provides modeling and simulation tools and techniques that help address this complexity. Nature has been dealing with complex systems since its creation 4.5 billion years ago. It has perfected its system by evolution, resilience towards sudden changes, and extinction of unadaptable and outdated species that are no longer fit for the environment. Nature has been accommodating the changing factors and handling complexity forever. Humans have started to look at their natural counterparts for inspiration and solutions to their problems. This brings forth the possibility of using a biomimetics approach to improve the management practices used in the construction sector. Ants inhabit different habitats: Cataglyphis and Pogonomyrmex live in deserts, leafcutter ants reside in rainforests, and pharaoh ants are native to urban developments of tropical areas. Detailed studies have been done on fifty species out of the fourteen thousand discovered. They provide the opportunity to study the interactions in diverse environments that generate collective behavior. Animals evolve to better adapt to their environment. The collective behavior of ants emerges from feedback through interactions among individuals, based on a combination of three basic factors: the patchiness of resources in time and space; operating cost; and environmental stability and the threat of rupture. If resources appear in patches through time and space, the response is accelerating and non-linear, and if resources are scattered, the response follows a linear pattern. If the acquisition of energy through food is faster than the energy spent to get it, the default is to continue with an activity unless it is halted for some reason. If the energy spent is higher than the energy gained, the default changes to staying put unless activated. Finally, if the environment is stable and the threat of rupture is low, the activation and amplification rate is slow but steady; otherwise, it is fast and sporadic. To further study the effects and to eliminate environmental bias, the behavior of four different ant species was studied, namely red harvester ants (Pogonomyrmex barbatus), Argentine ants (Linepithema humile), turtle ants (Cephalotes goniodontus), and leafcutter ants (genus Atta). This study aims to improve the information system in the construction sector by providing a guideline inspired by nature with a systems-thinking approach, using system dynamics as a tool. Identified factors and their interdependencies were analyzed in the form of a causal loop diagram (CLD), and construction industry professionals were interviewed based on the developed CLD, which was validated with a significant response. These factors and interdependencies in the natural system correspond with those in man-made systems, providing a guideline for the effective use and flow of information.

Keywords: biomimetics, complex systems, construction management, information management, system dynamics

Procedia PDF Downloads 136
1562 The Effect of Visual Fluency and Cognitive Fluency on Access Rates of Web Pages

Authors: Xiaoying Guo, Xiangyun Wang

Abstract:

Access rate is a key indicator of the popularity of web pages. Having high access rates is very important for web pages, especially for news web pages, online shopping sites and search engines. In this paper, we analyzed the influences of visual fluency and cognitive fluency on the access rates of Chinese web pages. Firstly, we conducted an experiment of scoring the web pages. Twenty-five subjects were invited to view the top 50 web pages of China, and they were asked to give a score on a 5-point Likert scale for four aspects: complexity, comfortability, familiarity and usability. Secondly, the obtained results were analyzed by correlation analysis and factor analysis in R. By factor analysis, we analyzed the contributions of visual fluency and cognitive fluency to the access rates. The results showed that both visual fluency and cognitive fluency affect the access rate of web pages. Compared to cognitive fluency, visual fluency plays a more important role in users' access to web pages.
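
The study's analysis was carried out in R; the sketch below shows an analogous correlation and factor analysis in Python on synthetic 5-point ratings, with the four rated aspects and the two-factor split assumed purely for illustration.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
# Synthetic 5-point Likert ratings for 25 subjects x 4 aspects
# (complexity, comfortability, familiarity, usability) -- illustrative data only.
ratings = rng.integers(1, 6, size=(25, 4)).astype(float)

# Correlation analysis between the four rated aspects
print(np.round(np.corrcoef(ratings, rowvar=False), 2))

# Two-factor model, loosely mirroring a "visual fluency" / "cognitive fluency" split
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(ratings)
print(np.round(fa.components_, 2))   # factor loadings (2 factors x 4 aspects)
```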

Keywords: visual fluency, cognitive fluency, visual complexity, usability

Procedia PDF Downloads 376
1561 Economical Dependency Evolution and Complexity

Authors: Allé Dieng, Mamadou Bousso, Latif Dramani

Abstract:

The purpose of this work is to show the complexity behind economic interrelations in a country and to provide a linear dynamic model of the evolution of economic dependency in a country. The model is based on National Transfer Accounts, one of the most robust methodologies developed to measure the level of demographic dividend captured in a country. It is built upon three major factors: demography, economic dependency and migration. The established mathematical model has been simulated using NetLogo software. The innovation of this study lies in describing economic dependency as a complex system and simulating, using mathematical equations, the evolution of the two populations: the economically dependent and the non-economically dependent, as defined in the National Transfer Accounts methodology. It also allows us to see the interactions and behaviors of both populations. The model can track individual characteristics and look at the effect of birth and death rates on the evolution of these two populations. The developed model is useful for understanding how demographic and economic phenomena are related.
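
A minimal sketch of a two-population linear dynamic model of this kind is shown below, with birth, death, and transition rates chosen purely for illustration rather than estimated from National Transfer Accounts data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# d[D]/dt = b*(D+N) - d1*D - t_out*D + t_in*N     (economically dependent)
# d[N]/dt =           - d2*N + t_out*D - t_in*N   (non-dependent)
# Rates below are purely illustrative.
b, d1, d2, t_out, t_in = 0.02, 0.01, 0.008, 0.03, 0.01

def rhs(t, y):
    D, N = y
    dD = b * (D + N) - d1 * D - t_out * D + t_in * N   # births enter as dependents
    dN = -d2 * N + t_out * D - t_in * N                # dependents become workers and back
    return [dD, dN]

sol = solve_ivp(rhs, (0, 100), y0=[6.0, 4.0], t_eval=np.linspace(0, 100, 11))
print(sol.t)
print(np.round(sol.y, 2))
```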

Keywords: ABM, demographic dividend, National Transfer Accounts (NTA), ODE

Procedia PDF Downloads 203
1560 Empirical Exploration of Correlations between Software Design Measures: A Replication Study

Authors: Jehad Al Dallal

Abstract:

Software engineers apply different measures to quantify the quality of software design. These measures consider artifacts developed at low or high level software design phases. The results are used to point to design weaknesses and to indicate design points that have to be restructured. Understanding the relationship among the quality measures and among the design quality aspects considered by these measures is important to interpreting the impact of a measure for a quality aspect on other potentially related aspects. In addition, exploring the relationship between quality measures helps to explain the impact of different quality measures on external quality aspects, such as reliability and maintainability. In this paper, we report a replication study that empirically explores the correlation between six well known and commonly applied design quality measures. These measures consider several quality aspects, including complexity, cohesion, coupling, and inheritance. The results indicate that inheritance measures are weakly correlated to other measures, whereas complexity, coupling, and cohesion measures are mostly strongly correlated.  
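
As a pointer to the kind of analysis involved, the sketch below computes pairwise Spearman rank correlations on synthetic per-class measurements; the six measure names and the generated data are illustrative assumptions, not the study's dataset.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
# Hypothetical per-class values for six design measures; names illustrative.
measures = ["WMC", "CBO", "LCOM", "RFC", "DIT", "NOC"]   # complexity, coupling, cohesion, inheritance
data = rng.poisson(lam=[12, 8, 20, 25, 2, 1], size=(50, 6)).astype(float)

rho, pval = spearmanr(data)              # 6x6 rank-correlation and p-value matrices
for i in range(6):
    for j in range(i + 1, 6):
        print(f"{measures[i]:>4} vs {measures[j]:<4} rho={rho[i, j]:+.2f} p={pval[i, j]:.3f}")
```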

Keywords: quality attribute, quality measure, software design quality, Spearman correlation

Procedia PDF Downloads 296
1559 A Phenomenological Study of Sports for the Analysis of Soccer Game: On Embodiment of the Goal Type Ball Games of Team Sports

Authors: K. Kiniwa, S. Kitagawa, M. Kawamoto, H. Uchiyama

Abstract:

This study aims to identify phenomenologically the embodiment of soccer in order to analyze soccer games. In this paper, the authors focused on the embodiment of sports and the embodiment of the goal-type ball games of team sports. The authors revealed that the embodiment of sports is represented by the inverse proportional body. This structure (body scheme) of the intercorporeality of sports can be compared to the symbolic figure of Uroboros, a monster formed by connecting the tails of two snakes. The embodiment of the goal-type ball games of team sports is characterized by dependency on situation and by complexity. In doing so, the study revealed that soccer is a sensitive and emotional sport.

Keywords: intercorporeality, structure, body scheme, Uroboros, inverse proportional body, dependency on situation, complexity

Procedia PDF Downloads 301
1558 Critically Sampled Hybrid Trigonometry Generalized Discrete Fourier Transform for Multistandard Receiver Platform

Authors: Temidayo Otunniyi

Abstract:

This paper presents a low-computation channelization algorithm for the multi-standard platform using a polyphase implementation of a critically sampled hybrid trigonometry generalized discrete Fourier transform (HGDFT). The HGDFT channelization algorithm exploits the orthogonality of two trigonometry Fourier functions, together with the properties of the quadrature mirror filter bank (QMFB) and the exponentially modulated filter bank (EMFB), respectively. HGDFT shows improvement in its implementation in terms of high reconfigurability, lower filter length, parallelism, and moderate computational activity. Type I and type III polyphase structures are derived for real-valued HGDFT modulation. The design specifications are critically decimated and over-sampled for both single- and multi-standard receiver platforms. Evaluating the performance of oversampled single-standard receiver channels, the HGDFT algorithm achieved a 40% complexity reduction, compared to 34% and 38% reductions for the discrete Fourier transform (DFT) and tree quadrature mirror filter (TQMF) algorithms. The parallel generalized discrete Fourier transform (PGDFT) and recombined generalized discrete Fourier transform (RGDFT) had a 41% complexity reduction and HGDFT a 46% reduction in oversampled multi-standard mode. In the critically sampled multi-standard receiver channels, HGDFT had a complexity reduction of 70% while both PGDFT and RGDFT had a 34% reduction.
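
For background, the sketch below implements a generic critically sampled polyphase DFT filter-bank channelizer (window, fold, and transform per frame); it is not the HGDFT itself, and the prototype filter and channel count are assumptions.

```python
import numpy as np
from scipy.signal import firwin

def polyphase_channelizer(x, M, taps_per_phase=8):
    """Critically sampled M-channel DFT filter bank (generic polyphase channelizer,
    not the HGDFT).  out[k, n] is channel k at the decimated rate fs/M."""
    L = M * taps_per_phase
    h = firwin(L, 1.0 / M)                                 # prototype lowpass, cutoff at channel edge
    n_frames = (len(x) - L) // M + 1
    out = np.zeros((M, n_frames), dtype=complex)
    for n in range(n_frames):
        frame = x[n * M : n * M + L] * h[::-1]             # windowing = filtering the segment
        folded = frame.reshape(taps_per_phase, M).sum(axis=0)   # alias (fold) down to length M
        out[:, n] = np.fft.fft(folded)                     # DFT across the polyphase branches
    return out

# Quick check: a complex tone centred in channel 3 of an 8-channel bank
M, fs = 8, 8000.0
t = np.arange(4096) / fs
x = np.exp(2j * np.pi * (3 * fs / M) * t)
y = polyphase_channelizer(x, M)
print(np.argmax(np.mean(np.abs(y), axis=1)))               # expected: 3
```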

Keywords: software defined radio, channelization, critical sample rate, over-sample rate

Procedia PDF Downloads 143
1557 Enunciation on Complexities of Selected Tree Searching Algorithms

Authors: Parag Bhalchandra, S. D. Khamitkar

Abstract:

Searching trees is a most interesting application of artificial intelligence. Over time, many innovative methods have evolved to better search trees with respect to computational complexity. Tree searches are difficult to understand due to the exponential growth of possibilities when increasing the number of nodes or levels in the tree. A search is usually understood as traversing down the tree, to ever greater depth, in search of a solution or goal. However, this does not happen in reality, as explicit enumeration is not a very efficient method, and there are many algorithmic speedups that will find the optimal solution without the burden of evaluating all possible trees. Researchers have often wondered which algorithms will yield the best and fastest results. The intention of this paper is twofold: first, to review selected tree search algorithms and search strategies that can be applied to a problem space, and second, to stimulate the implementation of recent developments in the complexity behavior of search strategies. The algorithms discussed here apply in general to both brute-force and heuristic searches.
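
To make the exponential growth concrete, the small sketch below counts the nodes of a complete tree with branching factor b and depth d and contrasts the worst-case memory footprints usually quoted for breadth-first and depth-first search.

```python
def nodes_in_uniform_tree(b, d):
    """Total nodes in a complete tree with branching factor b and depth d: 1 + b + ... + b**d."""
    return sum(b ** level for level in range(d + 1))

for b, d in [(2, 10), (4, 10), (10, 10)]:
    total = nodes_in_uniform_tree(b, d)
    print(f"b={b:2d}, d={d}: exhaustive enumeration generates {total:,} nodes; "
          f"a BFS frontier can hold up to {b**d:,} nodes, a DFS stack only about {b * d}.")
```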

Keywords: tree search, asymptotic complexity, brute force, heuristic algorithms

Procedia PDF Downloads 302
1556 UWB Channel Estimation Using an Efficient Sub-Nyquist Sampling Scheme

Authors: Yaacoub Tina, Youssef Roua, Radoi Emanuel, Burel Gilles

Abstract:

Recently, low-complexity sub-Nyquist sampling schemes based on the Finite Rate of Innovation (FRI) theory have been introduced to sample parametric signals at minimum rates. The multichannel modulating waveforms (MCMW) scheme is one such efficient scheme, where the received signal is mixed with an appropriate set of arbitrary waveforms, integrated and sampled at rates far below the Nyquist rate. In this paper, the MCMW scheme is adapted to the special case of ultra wideband (UWB) channel estimation, characterized by dense multipath. First, an appropriate structure, which accounts for the bandpass spectrum feature of UWB signals, is defined. Then, a novel approach to decrease the number of processing channels and reduce the complexity of this sampling scheme is presented. Finally, the proposed concepts are validated by simulation results, obtained with real filters, in the framework of a coherent Rake receiver.
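
A toy sketch of the acquisition side of such a multichannel modulating-waveform scheme is given below: each channel mixes the received signal with a pseudo-random +/-1 waveform, integrates over the frame, and outputs a single sample. The pulse shape, delays, and channel count are illustrative, and the FRI parameter recovery step is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 10e9                          # fine simulation grid, standing in for "continuous time"
n_grid = 2000                      # 200 ns observation frame at the Nyquist-grid rate
t = np.arange(n_grid) / fs

# Toy UWB-like dense-multipath response: a few scaled, delayed short pulses
delays = np.array([20e-9, 55e-9, 90e-9, 140e-9])
gains = np.array([1.0, -0.6, 0.4, 0.25])
pulse = lambda tt: np.exp(-(tt / 1e-9) ** 2)               # roughly 1 ns Gaussian pulse
x = sum(g * pulse(t - d) for g, d in zip(gains, delays))

# MCMW-style acquisition: P parallel channels, each mixes the signal with a
# pseudo-random +/-1 piecewise-constant waveform, integrates, and dumps ONE sample.
P, segments = 8, 40
waveforms = rng.choice([-1.0, 1.0], size=(P, segments))
mix = np.repeat(waveforms, n_grid // segments, axis=1)     # shape (P, n_grid)
measurements = (mix * x).sum(axis=1) / fs                  # integrate-and-dump

print("Nyquist-grid samples per frame:", n_grid)
print("sub-Nyquist measurements per frame:", len(measurements))
```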

Keywords: coherent Rake receiver, finite rate of innovation, sub-Nyquist sampling, ultra wideband

Procedia PDF Downloads 255
1555 Fractal Analysis of Polyacrylamide-Graphene Oxide Composite Gels

Authors: Gülşen Akın Evingür, Önder Pekcan

Abstract:

Fractal analysis is a bridge between the microstructure and the macroscopic properties of gels. A fractal structure is usually invoked to define the complexity of crosslinked molecules, and the complexity in gel systems is described by the fractal dimension (Df). In this study, polyacrylamide-graphene oxide (GO) composite gels were prepared by free radical crosslinking copolymerization. The fractal structure of the composite gels was analyzed at various GO contents during gelation and investigated using the fluorescence technique. The analysis was applied to estimate the Df values of the composite gels. The fractal dimensions of the polymer composite gels were estimated from the power-law exponent values using scaling models. In addition, we aimed to present the geometrical distribution of GO during gelation. We observed that as gelation proceeded, the GO plates first organized themselves into a 3D percolation cluster with Df = 2.52, then went to diffusion-limited clusters with Df = 1.4, and then lined up into a Von Koch curve with random interval with Df = 1.14. Our goal here is to interpret the low conductivity and/or broad forbidden gap of GO-doped PAAm gels through the distribution of GO in the final form of the produced gel.
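
Since Df is obtained here from power-law exponents, the sketch below shows the generic step of extracting such an exponent as the slope of a log-log fit on synthetic data; it does not reproduce the paper's fluorescence scaling models.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic observable obeying I(q) ~ q**(-Df) with Df = 2.5, plus mild noise
q = np.logspace(-2, 0, 40)
Df_true = 2.5
intensity = q ** (-Df_true) * (1 + 0.05 * rng.standard_normal(q.size))

# The exponent is the slope of the log-log plot
slope, intercept = np.polyfit(np.log(q), np.log(intensity), 1)
print(f"estimated Df = {-slope:.2f} (true value {Df_true})")
```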

Keywords: composite gels, fluorescence, fractal, scaling

Procedia PDF Downloads 306
1554 An Approach for Multilayered Ecological Networks

Authors: N. F. F. Ebecken, G. C. Pereira

Abstract:

Although networks provide a powerful approach to the study of a wide variety of ecological systems, their formulation usually does not include various types of interactions, interactions that vary in space and time, and interconnected systems such as networks of networks. The emerging field of 'multilayer networks' provides a natural framework for extending ecological systems analysis to include these multiple layers of complexity, as it specifically allows for differentiation and modeling of intralayer and interlayer connectivity. The structure provides a set of concepts and tools that can be adapted and applied to ecology, facilitating research on high-dimensional, heterogeneous systems in nature. Here, ecological multilayer networks are formally defined based on a review of prior and related approaches, their application and potential are illustrated with existing data analyses, and limitations, challenges, and future applications are discussed. The integration of multilayer network theory into ecology offers a largely untapped potential to further address ecological complexity and, finally, to provide new theoretical and empirical insights into the architecture and dynamics of ecological systems.

Keywords: ecological networks, multilayered networks, sea ecology, Brazilian Coastal Area

Procedia PDF Downloads 154
1553 Hardware Implementation and Real-time Experimental Validation of a Direction of Arrival Estimation Algorithm

Authors: Nizar Tayem, AbuMuhammad Moinuddeen, Ahmed A. Hussain, Redha M. Radaydeh

Abstract:

This research paper introduces an approach for estimating the direction of arrival (DOA) of multiple noncoherent RF sources using a uniform linear array (ULA). The proposed method utilizes a Capon-like estimation algorithm and incorporates LU decomposition to enhance the accuracy of DOA estimation while significantly reducing computational complexity compared to existing methods like the Capon method. Notably, the proposed method does not require prior knowledge of the number of sources. Its effectiveness is validated through both software simulations and practical experimentation on a prototype testbed constructed using a software-defined radio (SDR) platform and GNU Radio software. The results obtained from MATLAB simulations and real-time experiments provide compelling evidence of the proposed method's efficacy.
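
For context, the sketch below evaluates the baseline Capon (MVDR) spatial spectrum for a ULA on simulated data; the LU-decomposition-based variant proposed in the paper is not reproduced, and the array size, source angles, and noise level are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def capon_spectrum(X, d=0.5, grid=np.linspace(-90, 90, 361)):
    """Baseline Capon (MVDR) spatial spectrum for a ULA with element spacing d
    in wavelengths; X is (num_elements x num_snapshots)."""
    R = X @ X.conj().T / X.shape[1]                         # sample covariance matrix
    R_inv = np.linalg.inv(R + 1e-6 * np.eye(R.shape[0]))    # lightly loaded inverse
    m = np.arange(X.shape[0])[:, None]
    A = np.exp(-2j * np.pi * d * m * np.sin(np.deg2rad(grid)))   # steering matrix
    p = 1.0 / np.real(np.einsum('ij,ji->i', A.conj().T @ R_inv, A))
    return grid, p

# Two uncorrelated sources at -20 and 35 degrees, 8-element half-wavelength ULA, 200 snapshots
rng = np.random.default_rng(5)
M, N, doas = 8, 200, np.array([-20.0, 35.0])
m = np.arange(M)[:, None]
A_src = np.exp(-2j * np.pi * 0.5 * m * np.sin(np.deg2rad(doas)))
S = (rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))) / np.sqrt(2)
X = A_src @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
grid, p = capon_spectrum(X)
peaks, _ = find_peaks(p)
print(np.sort(grid[peaks[np.argsort(p[peaks])[-2:]]]))      # two dominant peaks, near -20 and 35
```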

Keywords: DOA estimation, real-time validation, software defined radio, computational complexity, Capon's method, GNU radio

Procedia PDF Downloads 73
1552 Smart Production Planning: The Case of Aluminium Foundry

Authors: Samira Alvandi

Abstract:

In the context of the circular economy, production planning aims to eliminate waste and emissions and maximize resource efficiency. Historically, production planning has been challenged by arrays of uncertainty and complexity arising from the interdependence and variability of products, processes, and systems. Manufacturers worldwide are facing new challenges in tackling various environmental issues such as climate change, resource depletion, and land degradation. To manage the inherent complexity and uncertainty while maintaining profitability, the manufacturing sector needs a holistic framework that supports energy efficiency and carbon emission reduction schemes. The proposed framework addresses the current challenges and integrates simulation modeling with optimization to find the optimal machine-job allocation, maximizing throughput while minimizing total energy consumption and lead time. An aluminium refinery facility in western Sydney, Australia, is used as an exemplar to validate the proposed framework.
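
A toy version of the allocation decision is sketched below using the Hungarian algorithm to minimize total energy for a one-job-per-machine assignment; the energy matrix is hypothetical, and the paper's coupled simulation-optimization over throughput, energy, and lead time is not reproduced.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical energy consumption (kWh) of running job j on machine m -- illustrative only.
energy = np.array([
    [14.0, 11.5, 16.2, 12.8],   # machine 0
    [13.1, 12.0, 15.0, 11.9],   # machine 1
    [15.5, 10.8, 14.4, 13.0],   # machine 2
    [12.7, 11.1, 16.8, 12.2],   # machine 3
])

machines, jobs = linear_sum_assignment(energy)       # minimise total energy
for m, j in zip(machines, jobs):
    print(f"machine {m} -> job {j} ({energy[m, j]} kWh)")
print("total energy:", energy[machines, jobs].sum(), "kWh")
```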

Keywords: smart production planning, simulation-optimisation, energy aware capacity planning, energy intensive industries

Procedia PDF Downloads 76
1551 Show Products or Show Endorsers: Immersive Visual Experience in Fashion Advertisements on Instagram

Authors: H. Haryati, A. Nor Azura

Abstract:

Since the turn of the century, the advertising landscape has evolved significantly, from print media to digital media. In line with the shift to advanced science and technology that is dramatically shaking the framework of societies in the Fifth Industrial Revolution (IR5.0), technological endeavors have increased exponentially, making user interaction more engaging through online advertising that intentionally leads to buying behavior. Users are more accustomed to interactive content that responds to their actions. Thus, the immersive experience has transformed into a new engagement experience for centennials. The purpose of this paper is to investigate pleasure and arousal as the fundamental elements of consumer emotions and affective responses to marketing stimuli. A quasi-experiment procedure will be adopted in the research, involving 40 undergraduate students in Nilai, Malaysia. This study employs a 2 (celebrity endorser vs. social media influencer) x 2 (high vs. low visual complexity) factorial between-subjects design. Participants will be exposed to a printed version depicting a fashion product endorsed by a celebrity or a social media influencer, presented at high and low levels of visual complexity. A questionnaire distributed during the lab test session is used to capture their honest, real feedback and responses to the latest Instagram design and engagement. The research therefore aims to define the immersive experience on Instagram and the interaction between pleasure and arousal. An advertisement that evokes pleasure and arousal is likely to get more attention from the target audience. This is one of the few studies comparing endorsers in Instagram advertising. Also, this research extends the existing knowledge about immersive visual complexity in the context of social media advertising.

Keywords: immersive visual experience, instagram, pleasure, arousal

Procedia PDF Downloads 177
1550 Variations in Spatial Learning and Memory across Natural Populations of Zebrafish, Danio rerio

Authors: Tamal Roy, Anuradha Bhat

Abstract:

Cognitive abilities aid fishes in foraging, avoiding predators and locating mates. Factors like predation pressure and habitat complexity govern learning and memory in fishes. This study aims to compare spatial learning and memory across four natural populations of zebrafish. Zebrafish, a small cyprinid, inhabits a diverse range of freshwater habitats, and this makes it amenable to studies investigating the role of the native environment in spatial cognitive abilities. Four populations were collected across India from water bodies with contrasting ecological conditions. Habitat complexity of the water bodies was evaluated as a combination of channel substrate diversity and diversity of vegetation. Experiments were conducted on populations under controlled laboratory conditions. A square-shaped spatial testing arena (maze) was constructed for testing the performance of adult zebrafish. The square tank consisted of an inner square-shaped layer with the edges connected to the diagonal ends of the tank walls by connections, thereby forming four separate chambers. Each of the four chambers had a main door in the centre. Each chamber had three sections separated by two windows. A removable coloured window pane (red, yellow, green or blue) identified each main door. A food reward associated with an artificial plant was always placed inside the left-hand section of the red-door chamber. The position of the food reward and plant within the red-door chamber was fixed. A test fish would have to explore the maze by taking turns and locate the food inside the left-hand section of the red-door chamber. Fishes were sorted from each population stock and kept individually in separate containers for identification. At a time, a test fish was released into the arena and allowed 20 minutes to explore in order to find the food reward. In this way, individual fishes were trained through the maze to locate the food reward for eight consecutive days. The position of the red door, with the plant and the reward, was shuffled every day. Following training, an intermission of four days was given, during which the fishes were not subjected to trials. Post-intermission, the fishes were re-tested on the 13th day following the same protocol for their ability to remember the learnt task. Exploratory tendencies and latency of individuals to explore on the 1st day of training, performance time across trials, and the number of mistakes made each day were recorded. Additionally, the mechanism used by individuals to solve the maze each day was analyzed across populations. Fishes could be expected to use an algorithm (sequence of turns) or associative cues in locating the food reward. Individuals of the populations did not differ significantly in latencies and tendencies to explore. No relationship was found between exploration and learning across populations. High habitat-complexity populations had higher rates of learning and stronger memory, while low habitat-complexity populations had lower rates of learning and much reduced abilities to remember. High habitat-complexity populations used associative cues more than the algorithm for learning and remembering, while low habitat-complexity populations used both equally. The study, therefore, helped understand the role of natural ecology in explaining variations in spatial learning abilities across populations.

Keywords: algorithm, associative cue, habitat complexity, population, spatial learning

Procedia PDF Downloads 284
1549 Accountability of Artificial Intelligence: An Analysis Using Edgar Morin’s Complex Thought

Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan

Abstract:

Can artificial intelligence (AI) be held accountable for its detrimental impacts? This question gains heightened relevance given AI's pervasive reach across various domains, magnifying its power and potential. The expanding influence of AI raises fundamental ethical inquiries, primarily centering on biases, responsibility, and transparency. This encompasses discriminatory biases arising from algorithmic criteria or data, accidents attributed to autonomous vehicles or other systems, and the imperative of transparent decision-making. This article aims to stimulate reflection on AI accountability, denoting the necessity to elucidate the effects it generates. Accountability comprises two integral aspects: adherence to legal and ethical standards and the imperative to elucidate the underlying operational rationale. The objective is to initiate a reflection on the obstacles to this "accountability" in the face of the complexity of artificial intelligence systems and their effects. The article then proposes to mobilize Edgar Morin's complex thought to encompass and face the challenges of this complexity. The first contribution is to point out the challenges posed by the complexity of AI, with accountability fractured among a myriad of human and non-human actors, such as software and equipment, which ultimately contribute to the decisions taken and are multiplied in the case of AI. Accountability faces three challenges resulting from the complexity of the ethical issues combined with the complexity of AI. The challenge of the non-neutrality of algorithmic systems, as fully ethically non-neutral actors, is put forward by a revealing ethics approach that calls for assigning responsibilities to these systems. The challenge of the dilution of responsibility is induced by the multiplicity of, and distance between, the actors. Thus, a dilution of responsibility is induced by a split in decision-making between developers, who feel they fulfill their duty by strictly respecting the requests they receive, and management, which does not consider itself responsible for technology-related flaws. Accountability is confronted with the challenge of the transparency of complex and scalable algorithmic systems, non-human actors that self-learn via big data. A second contribution involves leveraging E. Morin's principles, providing a framework to grasp the multifaceted ethical dilemmas and subsequently paving the way for establishing accountability in AI. When addressing the ethical challenge of biases, the "hologrammatic" principle underscores the imperative of acknowledging the non-ethical neutrality of algorithmic systems, inherently imbued with the values and biases of their creators and society. The "dialogic" principle advocates for the responsible consideration of ethical dilemmas, encouraging the integration of complementary and contradictory elements in solutions from the very inception of the design phase. The principle of organizing recursiveness, akin to the "transparency" of the system, promotes a systemic analysis to account for the induced effects and guides the incorporation of modifications into the system to rectify its deviations and drifts. In conclusion, this contribution serves as an inception for contemplating the accountability of "artificial intelligence" systems despite the evident ethical implications and potential deviations. Edgar Morin's principles, providing a lens to contemplate this complexity, offer valuable perspectives to address these challenges concerning accountability.

Keywords: accountability, artificial intelligence, complexity, ethics, explainability, transparency, Edgar Morin

Procedia PDF Downloads 62
1548 From Mobility to Complexity: French Language Use among Algerian Doctoral Postgraduates in Scotland

Authors: Hadjer Chellia

Abstract:

The study explores the phenomenon of second language use in a migratory setting, using the case of Algerian international students in Scotland, United Kingdom. The linguistic history of Algeria reveals that the French language has a high status in Algerians’ verbal repertoires, and Algerian students of English consider it a language of prestige. With the mobility of some of these students towards Scotland, in the guise of internationalization of higher education, mobility and exchange programs, the transition was deemed to bring more complexity to their pre-migratory linguistic repertoires and resulted in their French language being endangered and threatened by a potential shift to English. The study employed semi-structured interviews with six ethnically related Ph.D. students, with the main aim of exploring their current experiences with regard to French language use and providing an account of the factors which assist in shifting to English as a second language instead. The six participants identified in the interviews were further invited to focus group sessions based on in-group interaction to discuss different topics using heritage languages. The latter was opted for as part of the methodology as a means to observe their real linguistic practice and to investigate the link between behaviors and previous perceptions. The findings detect a variety of social, individual and socio-psychological factors that would contribute to refining the concept of language shift among newly established émigré communities with short stays, vis-à-vis the linguistic outcomes of immigrants with long stays on a generational basis, which was, to some extent, the focus of previous research on language shift. The results further reveal a mismatch between students' perceptions and observed behaviors. The research is thus largely relevant to international students’ sociolinguistic experience of study abroad.

Keywords: complexity, mobility, potential shift, sociolinguistic experience

Procedia PDF Downloads 164
1547 Complex Decision Rules in the Form of Decision Trees

Authors: Avinash S. Jagtap, Sharad D. Gore, Rajendra G. Gurao

Abstract:

Decision rules become more and more complex as the number of conditions increases. As a consequence, the complexity of the decision rule also influences the time complexity of the computer implementation of such a rule. Consider, for example, a decision that depends on four conditions A, B, C and D. For simplicity, suppose each of these four conditions is binary. Even then, the decision rule will consist of 16 lines, where each line will be of the form: if A and B and C and D, then action 1; if A and B and C but not D, then action 2; and so on. While executing this decision rule, each of the four conditions will be checked every time until all four conditions in a line are satisfied. The minimum number of logical comparisons is 4, whereas the maximum number is 64. This paper proposes to represent a complex decision rule in the form of a decision tree. A decision tree divides the cases into branches every time a condition is checked, and every branching eliminates half of the cases that do not satisfy the related condition. As a result, every path through the decision tree involves only four logical comparisons and hence is significantly simpler than the corresponding complex decision rule. The conclusion of this paper is that every complex decision rule can be represented as a decision tree, and the decision tree is mathematically equivalent to but computationally much simpler than the original complex decision rule.
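
A small sketch of the contrast, with illustrative branches only (most of the 16 combinations are elided as comments):

```python
# Flat rule list: every evaluation may check all four conditions on every line.
def action_flat(a, b, c, d):
    rules = [
        (True,  True,  True,  True,  "action 1"),
        (True,  True,  True,  False, "action 2"),
        # ... 14 more lines, one per remaining combination of A, B, C, D ...
    ]
    for ra, rb, rc, rd, act in rules:
        if a == ra and b == rb and c == rc and d == rd:
            return act
    return "other action"

# Equivalent decision tree: exactly four comparisons on every path from root to leaf.
def action_tree(a, b, c, d):
    if a:
        if b:
            if c:
                return "action 1" if d else "action 2"
            return "action 3" if d else "action 4"
        # ... branches for (A, not B) ...
        return "other action"
    # ... branches for (not A) ...
    return "other action"

print(action_flat(True, True, True, False), action_tree(True, True, True, False))
```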

Keywords: strategic, tactical, operational, adaptive, innovative

Procedia PDF Downloads 284
1546 Low Density Parity Check Codes

Authors: Kassoul Ilyes

Abstract:

The field of error-correcting codes has been revolutionized by the introduction of iteratively decoded codes. Among these, LDPC codes are now a preferred solution thanks to their remarkable performance and low complexity. The binary version of LDPC codes showed even better performance, although its decoding introduced greater complexity. This thesis studies the performance of binary LDPC codes using simplified weighted decisions. Information is transported between a transmitter and a receiver by digital transmission systems, either by propagating over a radio channel or by using a transmission medium such as a transmission line. The purpose of the transmission system is then to carry the information from the transmitter to the receiver as reliably as possible. For a long time, these codes did not generate enough interest within the coding theory community; this neglect lasted until the introduction of turbo codes and the iterative principle. It was then proposed to adopt Pearl's belief propagation (BP) algorithm for decoding these codes. Subsequently, Luby introduced irregular LDPC codes, characterized by an irregular parity check matrix. Finally, we study simplifications of binary LDPC codes and propose a method to make the exact calculation of the APP (a posteriori probability) simpler. This method leads to a simpler implementation of the system.
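
As a toy illustration of iterative decoding on a code defined by a parity-check matrix, the sketch below runs hard-decision bit-flipping on a tiny H; a practical LDPC code uses a large sparse matrix and soft belief-propagation (or simplified min-sum) decoding, which this does not implement.

```python
import numpy as np

# Tiny parity-check matrix (toy example, far too small and dense for a real LDPC code)
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=int)

def bit_flip_decode(r, H, max_iter=10):
    """Hard-decision bit-flipping: repeatedly flip the bit involved in the most
    unsatisfied parity checks until the syndrome is zero."""
    c = r.copy()
    for _ in range(max_iter):
        syndrome = H @ c % 2
        if not syndrome.any():
            break                                   # all parity checks satisfied
        unsatisfied = syndrome @ H                  # per-bit count of failed checks
        c[np.argmax(unsatisfied)] ^= 1              # flip the most suspicious bit
    return c

codeword = np.zeros(7, dtype=int)                   # the all-zero word is always a codeword
received = codeword.copy()
received[3] ^= 1                                    # one channel error
print(bit_flip_decode(received, H))                 # recovers the all-zero codeword
```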

Keywords: LDPC, parity check matrix, 5G, BER, SNR

Procedia PDF Downloads 153
1545 Using the Technology Acceptance Model to Examine Seniors’ Attitudes toward Facebook

Authors: Chien-Jen Liu, Shu Ching Yang

Abstract:

Using the technology acceptance model (TAM), this study examined the external variables of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, evincing a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures that are proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.

Keywords: technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness

Procedia PDF Downloads 344