Search results for: computational complexity theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7596

7506 Subarray Based Multiuser Massive MIMO Design Adopting Large Transmit and Receive Arrays

Authors: Tetsiki Taniguchi, Yoshio Karasawa

Abstract:

This paper describes a subarray-based, low-complexity design method for multiuser massive multiple input multiple output (MIMO) systems. In our previous work, a large array was assumed only at the transmitter; this study considers the case in which both the transmitter and receiver sides are equipped with large array antennas. To this end, the receive arrays are also divided into several subarrays, and the previously proposed method is modified to synthesize a large array from subarrays at both ends. Computer simulations verify that, although the performance of the proposed method is somewhat degraded compared with the original approach, it achieves an improvement in complexity, namely a significant reduction of the computational load, down to a practical level.
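The zero-forcing idea named in the keywords can be sketched in a few lines. The 2x2 channel below is invented for illustration and is vastly smaller than a massive MIMO array; for a square channel the zero-forcing precoder is simply the channel inverse, so each user sees only its own stream:

```python
# Minimal zero-forcing sketch for a 2-user, 2-antenna downlink (square
# channel, invented values): the precoder W = H^{-1} makes the effective
# channel H @ W the identity, cancelling inter-user interference. In the
# subarray approach this kind of processing would be applied per subarray.

def zf_precoder_2x2(H):
    """Invert a 2x2 channel matrix H (zero-forcing precoder)."""
    (a, b), (c, d) = H
    det = a * d - b * c
    return [[d / det, -b / det],
            [-c / det, a / det]]

def effective_channel(H, W):
    """Compute H @ W, the channel seen after precoding."""
    return [[sum(H[i][k] * W[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

H = [[2.0, 1.0],
     [1.0, 3.0]]
W = zf_precoder_2x2(H)
E = effective_channel(H, W)   # ~ identity: no inter-user interference
```

With non-square (tall) channels the same role is played by the pseudo-inverse, typically computed via the singular value decomposition mentioned in the keywords.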

Keywords: large array, massive multiple input multiple output (MIMO), multiuser, singular value decomposition, subarray, zero forcing

Procedia PDF Downloads 374
7505 Low Density Parity Check Codes

Authors: Kassoul Ilyes

Abstract:

The field of error-correcting codes has been revolutionized by the introduction of iteratively decoded codes. Among these, LDPC codes are now a preferred solution thanks to their remarkable performance and low complexity. Information is carried between a transmitter and a receiver by digital transmission systems, either by propagating over a radio channel or through a transmission medium such as a transmission line; the purpose of the transmission system is to deliver the information from the transmitter to the receiver as reliably as possible. For a long time, these codes did not generate much interest within the coding-theory community. This neglect lasted until the introduction of Turbo codes and the iterative decoding principle, after which it was proposed to adopt Pearl's Belief Propagation (BP) algorithm for decoding these codes. Subsequently, Luby introduced irregular LDPC codes, characterized by their parity-check matrix. The binary version of LDPC codes showed even better performance, although its decoding introduced greater complexity. This thesis therefore studies the performance of binary LDPC codes using simplified weighted decisions, and proposes a method that makes the exact calculation of the APP simpler, which in turn simplifies the implementation of the system.
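The core objects above can be made concrete with a toy example (our own illustration, not the thesis's construction): an LDPC code is defined by a sparse parity-check matrix H, and a word c is a valid codeword iff H·c = 0 (mod 2). A single round of hard-decision bit flipping is shown as the simplest iterative decoder in the same spirit as belief propagation:

```python
# Toy LDPC sketch (invented matrix, not from the thesis): syndrome check
# plus hard-decision bit-flipping, the simplest relative of iterative
# belief-propagation decoding.

H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]

def syndrome(H, c):
    """H @ c mod 2; all-zero iff c is a codeword."""
    return [sum(h * x for h, x in zip(row, c)) % 2 for row in H]

def flip_decode(H, received, max_iter=10):
    c = list(received)
    for _ in range(max_iter):
        s = syndrome(H, c)
        if not any(s):
            return c
        # count unsatisfied checks touching each bit, flip the worst bit
        votes = [sum(s[i] for i in range(len(H)) if H[i][j])
                 for j in range(len(c))]
        c[votes.index(max(votes))] ^= 1
    return c

received = [1, 0, 0, 0, 0, 0]     # all-zero codeword with one channel error
decoded = flip_decode(H, received)
```

Real BP decoders propagate probabilities (weighted decisions) rather than hard bits; the simplifications studied in the thesis target exactly that probabilistic (APP) computation.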

Keywords: LDPC, parity check matrix, 5G, BER, SNR

Procedia PDF Downloads 128
7504 Theoretical and Computational Investigation of PCBM and PC71BM Derivatives Using the DFT Method

Authors: Zair Mohammed El Amine, Chemouri Hafida, Derbal Habak Hassina

Abstract:

Organic photovoltaic cells (OPVs) are electronic devices that convert sunlight into electricity. The number of studies on OPVs is growing, and this trend is expected to continue; computational studies are still needed to verify and explain the capabilities of these cells, specifically of the fullerene derivative PCBM, building on successful experimental results. In this paper, we present a theoretical and computational investigation of PCBM and PC71BM derivatives using the DFT method, employing both time-independent and time-dependent density functional theory. The HOMO and LUMO energies, the HOMO-LUMO gap, ionization potentials, and electron affinities are determined and found to be in agreement with experiment. DFT calculations using the B3LYP and M062X functionals with the 6-31G(d,p) and 6-311G(d) basis sets show that the most efficient acceptors belong to the group of PC71BM derivatives, in substantial agreement with experiment. The geometries of the structures are optimized with Gaussian 09.
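The frontier-orbital quantities mentioned above follow directly from the orbital energies a DFT run produces. The sketch below uses invented energies (not the paper's results) and Koopmans-style estimates: IP ≈ -E(HOMO), EA ≈ -E(LUMO), gap = E(LUMO) - E(HOMO):

```python
# Hedged sketch: frontier-orbital properties from DFT orbital energies (eV).
# The energy values are invented placeholders, not results from the paper.

def frontier_properties(occupied, virtual):
    homo = max(occupied)   # highest occupied molecular orbital energy
    lumo = min(virtual)    # lowest unoccupied molecular orbital energy
    return {
        "gap": lumo - homo,   # HOMO-LUMO gap
        "IP": -homo,          # Koopmans-style ionization potential
        "EA": -lumo,          # Koopmans-style electron affinity
    }

props = frontier_properties(occupied=[-7.1, -6.2, -5.9],
                            virtual=[-3.7, -1.2])
```

For an acceptor like PC71BM, a deeper LUMO (larger EA) is what makes electron capture from the donor favourable, which is why these quantities are the ones compared against experiment.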

Keywords: PCBM, P3HT, organic cell solar, DFT, TD-DFT

Procedia PDF Downloads 46
7503 Computational Experiment on Evolution of E-Business Service Ecosystem

Authors: Xue Xiao, Sun Hao, Liu Donghua

Abstract:

E-commerce is experiencing rapid development and evolution, but traditional research methods struggle to capture the relationship between micro-level factors and macro-level evolution in this development process; they can neither accurately assess existing strategies nor predict future evolution trends. To address these problems, this paper introduces the concept of an e-commerce service ecosystem, based on the characteristics of e-commerce and on business ecosystem theory. It describes the e-commerce environment as a complex adaptive system from an ecological perspective, constructs an e-commerce service ecosystem model using agent-based modeling in Java on the RePast simulation platform, and conducts computational experiments, in an attempt to provide a suitable and effective method for research on e-commerce evolution. Two experiments show that the system model built in this paper can reproduce the evolution process of the e-commerce service ecosystem and the relationship between micro-level factors and macro-level emergence. The combination of agent-based modeling and computational experiment therefore provides a proper means to study the evolution of the e-commerce ecosystem.
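The micro-to-macro mechanism described above can be illustrated with a deliberately tiny agent-based loop (our own invention in Python, not the paper's RePast/Java model): agents follow a simple local rule, and a macro-level pattern emerges that no single agent encodes:

```python
# Minimal agent-based sketch (invented rules and payoffs): service-provider
# agents imitate a better-performing peer each step, and the macro share of
# the dominant strategy emerges from these micro interactions.
import random

random.seed(0)   # fixed seed so the run is reproducible

class Provider:
    def __init__(self, value_added):
        self.strategy = 1 if value_added else 0   # 0 = basic, 1 = value-added
    def payoff(self):
        # value-added service earns more on average, plus noise
        return self.strategy + random.random()

def step(agents):
    for a in agents:
        rival = random.choice(agents)
        if rival.payoff() > a.payoff():
            a.strategy = rival.strategy   # imitate the better performer

agents = [Provider(random.random() < 0.2) for _ in range(50)]
for _ in range(30):
    step(agents)

share = sum(a.strategy for a in agents) / len(agents)   # macro emergence
```

A real model of an e-commerce service ecosystem would give agents richer states (services, prices, partnerships) and a network topology, but the experiment logic, iterate micro rules and observe macro indicators, is the same.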

Keywords: e-commerce service ecosystem, complex system, agent-based modeling, computational experiment

Procedia PDF Downloads 315
7502 Static vs. Stream Mining Trajectories Similarity Measures

Authors: Musaab Riyadh, Norwati Mustapha, Dina Riyadh

Abstract:

Trajectory similarity can be defined as the cost of transforming one trajectory into another under a given similarity method. It is at the core of numerous mining tasks such as clustering, classification, and indexing. Various approaches have been suggested to measure similarity, based on the geometric and dynamic properties of trajectories, on the overlap between trajectory segments, or on the area confined between entire trajectories. In this article, these approaches are evaluated in terms of computational cost, memory usage, accuracy, and the amount of data that must be available in advance, in order to determine their suitability for stream mining applications. The evaluation shows that, because the data are generated at high speed, stream mining applications favour similarity methods that have low computational cost and memory footprint, require only a single scan over the data, and are free of heavy mathematical machinery.
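The trade-off the evaluation describes can be seen in two toy measures (our own simplified versions, not the authors' exact definitions): a lock-step distance needs one pass and constant memory, which suits streams, while a dynamic time warping (DTW) style measure needs both full trajectories and a quadratic table:

```python
# Two contrasting trajectory measures (toy versions for illustration).
import math

def lockstep_euclidean(t1, t2):
    """Single pass, O(1) memory: average point-to-point distance."""
    return sum(math.dist(p, q) for p, q in zip(t1, t2)) / min(len(t1), len(t2))

def dtw(t1, t2):
    """Dynamic time warping: needs full trajectories and an O(n*m) table."""
    n, m = len(t1), len(t2)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(t1[i - 1], t2[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

a = [(0, 0), (1, 0), (2, 0)]
b = [(0, 1), (1, 1), (2, 1)]
d_lock = lockstep_euclidean(a, b)   # every matched pair is 1 apart
d_dtw = dtw(a, b)
```

Only the first kind of measure satisfies the single-scan, low-memory requirements the article identifies for stream mining.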

Keywords: global distance measure, local distance measure, semantic trajectory, spatial dimension, stream data mining

Procedia PDF Downloads 370
7501 Explaining Irregularity in Music by Entropy and Information Content

Authors: Lorena Mihelac, Janez Povh

Abstract:

In 2017, we conducted a study using 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of harmony, we considered unigrams (individual chords in the harmonic progression) and bigrams (pairs of adjacent chords). We found that 53 of the 160 musical excerpts were rated by participants as very complex, although the entropy of their harmonic progressions (unigrams and bigrams) was low. We explained this by particularities of the chord progression that affect the listener's feeling of complexity and acceptability. We evaluated the same data again with new participants in 2018, and a third time with the same participants in 2019. These three evaluations showed that the same 53 musical excerpts found difficult and complex in the 2017 study again elicited a strong feeling of complexity. We proposed that the content of these excerpts, defined as "irregular," does not meet the listener's expectancy or basic perceptual principles, creating a stronger feeling of difficulty and complexity. Since the "irregularities" in these 53 excerpts seem to be perceived by participants without their being aware of it, while still affecting pleasantness and the feeling of complexity, we defined them as "subliminal irregularities" and the 53 excerpts as "irregular." In our recent study (2019) of the same data, we proposed a new measure of harmonic complexity, "regularity," based on the irregularities in the harmonic progression and other plausible particularities of the musical structure found in previous studies. In that study, we also proposed a list of ten particularities which we assumed to impact the participants' perception of complexity in harmony.
These ten particularities are tested in this paper by extending the analysis of our 53 irregular musical excerpts from harmony to melody. To examine melody, we used the computational model Information Dynamics of Music (IDyOM) and two information-theoretic measures: entropy, the uncertainty of the prediction before the next event is heard, and information content, the unexpectedness of an event in a sequence. To describe the melodic features of these musical examples, we used four viewpoints: pitch, interval, duration, and scale degree. The results show that the texture of the melody (e.g., multiple voices, homorhythmic structure) and its structure (e.g., large interval leaps, syncopated rhythm, implied harmony in compound melodies) affect the participants' perception of complexity. High information-content values were found in compound melodies, in which implied harmonies seem to have suggested additional harmonies, affecting the participants' perception of the chord progression by creating a sense of an ambiguous musical structure.
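The two information-theoretic measures can be illustrated on an invented chord progression (the symbols and counts below are ours, not the study's data): entropy is the expected uncertainty before the next chord, while information content is the surprisal -log2 p of the chord that actually occurs, so rare events score high:

```python
# Toy illustration of entropy and information content over chord symbols.
# The progression is invented; IDyOM estimates p from learned models rather
# than raw frequencies, so this is only the simplest possible stand-in.
import math
from collections import Counter

def entropy(sequence):
    """Shannon entropy (bits) of the empirical symbol distribution."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_content(event, sequence):
    """Surprisal -log2 p(event) under the empirical distribution."""
    p = Counter(sequence)[event] / len(sequence)
    return -math.log2(p)

progression = ["I", "IV", "V", "I", "IV", "V", "I", "vi"]
h = entropy(progression)                           # low for regular progressions
ic_rare = information_content("vi", progression)   # rare chord -> high IC
ic_common = information_content("I", progression)  # frequent chord -> low IC
```

The study's finding, excerpts that feel complex despite low entropy, corresponds to cases where these averaged statistics miss the structural particularities listed above.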

Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM

Procedia PDF Downloads 104
7500 Application of a Hybrid Modified Blade Element Momentum Theory/Computational Fluid Dynamics Approach for Wind Turbine Aerodynamic Performance Prediction

Authors: Samah Laalej, Abdelfattah Bouatem

Abstract:

In the field of wind turbine blades, it is complicated to evaluate aerodynamic performance through experimental measurements, as they require a lot of time and resources. Therefore, in this paper, a hybrid BEM-CFD numerical technique is developed to predict the power and the aerodynamic forces acting on the blades. A computational fluid dynamics (CFD) simulation was conducted in Ansys, using the k-ω turbulence model, to calculate the drag and lift forces. An enhanced BEM code was then created to predict the power output of the wind turbine using the aerodynamic properties extracted from the CFD step. The numerical approach was compared against and validated with experimental data: the power curves calculated by this hybrid method were in good agreement with experimental measurements over the whole velocity range.
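The division of labour in the hybrid approach can be sketched as follows (all numbers invented): CFD supplies the aerofoil lift and drag coefficients, and a BEM-style loop turns each blade element's tangential force contribution into rotor torque and power:

```python
# Schematic BEM power summation (invented geometry and coefficients; a real
# BEM code also iterates on induction factors and inflow angle phi).
import math

def element_power(rho, V_rel, chord, dr, r, omega, cl, cd, phi):
    """Power contribution of one blade element, per blade."""
    q = 0.5 * rho * V_rel ** 2 * chord * dr               # dynamic pressure * area
    f_tangential = q * (cl * math.sin(phi) - cd * math.cos(phi))
    return f_tangential * omega * r                       # torque * angular speed

# three elements along one blade; cl, cd here play the role of "from CFD"
elements = [
    dict(r=2.0, chord=0.30, dr=1.0, cl=1.1, cd=0.02, phi=0.40, V_rel=20.0),
    dict(r=3.0, chord=0.25, dr=1.0, cl=1.0, cd=0.02, phi=0.30, V_rel=25.0),
    dict(r=4.0, chord=0.20, dr=1.0, cl=0.9, cd=0.03, phi=0.22, V_rel=30.0),
]
rho, omega, n_blades = 1.225, 3.0, 3
power = n_blades * sum(element_power(rho, omega=omega, **e) for e in elements)
```

The point of the hybrid method is precisely that cl and cd come from CFD (here they are placeholders), so the cheap BEM sweep can be repeated across the whole velocity range to build the power curve.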

Keywords: blade element momentum, aerodynamic forces, wind turbine blades, computational fluid dynamics approach

Procedia PDF Downloads 23
7499 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information

Authors: A. Preetha Priyadharshini, S. B. M. Priya

Abstract:

In this paper, we consider the multiuser multiple-input single-output (MU-MISO) downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep the probability of rate outage for each user below a given threshold. Such rate outage constraints present significant analytical challenges, and many probabilistic methods have been used to solve the transmit optimization problem under imperfect CSI. Here, two convex restriction methods, one based on decomposition and a large deviation inequality and the other on a Bernstein-type inequality, are used to handle the optimization problem. These methods are used to achieve improved outage quality at lower complexity, and they provide safe, tractable approximations of the original rate outage constraints. Based on implementations of these methods, performance has been evaluated in terms of feasible rate and average transmit power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.

Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information

Procedia PDF Downloads 773
7498 Studying Relationship between Local Geometry of Decision Boundary with Network Complexity for Robustness Analysis with Adversarial Perturbations

Authors: Tushar K. Routh

Abstract:

If inputs are engineered in certain ways, they can degrade deep neural network (DNN) performance by inducing misclassifications, a phenomenon known as adversarial attacks, which calls the networks' vulnerability into question. Recent studies have explored the relationship between the vulnerability of such networks and their complexity. In this paper, the distinctive influence of additional convolutional layers on the decision boundaries of several DNN architectures was investigated. To engineer inputs from widely known image datasets such as MNIST, Fashion-MNIST, and CIFAR-10, we applied the One Step Spectral Attack (OSSA) and Fast Gradient Method (FGM) techniques. The effects of adding layers on the robustness of the architectures were then analyzed. As explanatory factors, the separation width from linear class partitions and the local geometry (curvature) near the decision boundary were examined. The results reveal that model complexity plays a significant role in adjusting the relative distances from the margins, as well as the local features of decision boundaries, both of which impact robustness.
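The gradient-based attack family used above can be sketched on a toy linear "network" (ours, not the paper's DNNs): the input is nudged in the direction of the loss gradient, x' = x + ε·sign(∇x L), which is the one-step sign variant (FGSM) of the fast gradient method:

```python
# FGSM sketch on a linear classifier score s = w . x (invented weights).
# For the loss L = -y * (w . x), the input gradient is grad_x L = -y * w.

def fgsm(x, w, y, eps):
    """One-step sign attack: x + eps * sign(grad_x loss)."""
    grad = [-y * wi for wi in w]
    sign = lambda g: (g > 0) - (g < 0)
    return [xi + eps * sign(g) for xi, g in zip(x, grad)]

w = [0.5, -1.0, 2.0]
x = [1.0, 1.0, 1.0]                 # correctly classified: w . x = 1.5 > 0
score_before = sum(wi * xi for wi, xi in zip(w, x))
x_adv = fgsm(x, w, y=+1, eps=0.6)
score_after = sum(wi * xi for wi, xi in zip(w, x_adv))   # pushed past the boundary
```

For a linear model the distance to the boundary is exactly the margin; the paper's point is that in deep networks this distance, and the boundary's local curvature, shift as layers are added.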

Keywords: DNN robustness, decision boundary, local curvature, network complexity

Procedia PDF Downloads 43
7497 Uncovering the Complex Structure of Building Design Process Based on Royal Institute of British Architects Plan of Work

Authors: Fawaz A. Binsarra, Halim Boussabaine

Abstract:

The notion of complexity science has been attracting the interest of researchers and professionals, due to the need to understand more efficiently the dynamics and interaction structure of complex systems. Complexity analysis has been used as an approach to investigate systems that contain a large number of components which interact with each other to accomplish specific outcomes and from which specific behaviour emerges. The design process is such a complex activity: it involves a large number of interacting components, grouped here into design tasks, the design team, and the components of the design process itself. These three main aspects of the building design process each consist of several components that interact with one another as a dynamic system with complex information flow. The goal of this paper is to uncover the complex structure of information interactions in the building design process. Investigating the information interactions of the Royal Institute of British Architects (RIBA) Plan of Work 2013 as a case study, and modelling them with network analysis software, will uncover the structure and complexity of the building design process and thereby significantly enhance the efficiency of its outcomes.
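The network-analysis step can be illustrated with plain Python (the stage names follow the RIBA 2013 plan, but the edges are invented, not the paper's data): information interactions become directed edges, and even a simple structural measure such as degree exposes where interactions concentrate:

```python
# Toy information-interaction network over RIBA-style stages (edges invented).
edges = [
    ("Strategic Definition", "Preparation and Brief"),
    ("Preparation and Brief", "Concept Design"),
    ("Concept Design", "Developed Design"),
    ("Developed Design", "Technical Design"),
    ("Concept Design", "Technical Design"),   # feed-forward information
    ("Developed Design", "Concept Design"),   # design iteration loop
]

def degree(edges):
    """Total (in + out) degree of every node in a directed edge list."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg

deg = degree(edges)
busiest = max(deg, key=deg.get)   # stage with the most information interactions
```

Dedicated network-analysis software adds richer measures (betweenness, clustering, motifs) on the same graph representation, which is what the paper relies on to quantify design-process complexity.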

Keywords: complexity, process, building design, RIBA, design complexity, network, network analysis

Procedia PDF Downloads 488
7496 Inverse Matrix in the Theory of Dynamical Systems

Authors: Renata Masarova, Bohuslava Juhasova, Martin Juhas, Zuzana Sutova

Abstract:

In dynamic system theory, a mathematical model is often used to describe a system's properties. In order to find the transfer matrix of a dynamic system, we need to calculate an inverse matrix. The paper combines the classical theory with the procedures used in the theory of automatic control for calculating the inverse matrix. The final part of the paper models the given problem in Matlab.
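Where the inverse matrix enters can be shown concretely (a standard textbook relation, with an invented 2x2 example): for a state-space model x' = Ax + Bu, y = Cx, the transfer matrix is G(s) = C(sI - A)^(-1)B, so evaluating G requires inverting sI - A:

```python
# Transfer-function evaluation via matrix inversion, kept exact with
# fractions. A is the companion matrix of s^2 + 3s + 2 (invented example),
# so G(s) = 1 / (s^2 + 3s + 2).
from fractions import Fraction as F

A = [[F(0), F(1)], [F(-2), F(-3)]]
B = [[F(0)], [F(1)]]
C = [[F(1), F(0)]]

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transfer(s):
    """G(s) = C (sI - A)^{-1} B for the single-output system above."""
    sI_A = [[s - A[0][0], -A[0][1]], [-A[1][0], s - A[1][1]]]
    return matmul(matmul(C, inv2(sI_A)), B)[0][0]

g1 = transfer(F(1))   # G(1) = 1 / (1 + 3 + 2) = 1/6
```

Matlab's symbolic and control toolboxes perform the same computation in general; the exact rational arithmetic here just makes the inversion step transparent.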

Keywords: dynamic system, transfer matrix, inverse matrix, modeling

Procedia PDF Downloads 482
7495 Infilling Strategies for Surrogate Model Based Multi-disciplinary Analysis and Applications to Velocity Prediction Programs

Authors: Malo Pocheau-Lesteven, Olivier Le Maître

Abstract:

Engineering and optimisation of complex systems is often achieved through multi-disciplinary analysis, in which each subsystem is modelled and interacts with the other subsystems to model the complete system. Coherence between the outputs of the different subsystems is achieved through compatibility constraints, which enforce the coupling between them. Due to the complexity of some subsystems and the computational cost of evaluating their models, it is often necessary to build surrogate models of these subsystems to allow their repeated evaluation at relatively low computational cost. In this paper, Gaussian processes are used, as their probabilistic nature can be leveraged to evaluate the likelihood of satisfying the compatibility constraints. The paper presents infilling strategies that build accurate surrogate models of the subsystems in the regions where they are likely to meet the compatibility constraints. It is shown that these infilling strategies can reduce the computational cost of building surrogate models for a given level of accuracy. An application of these methods to the velocity prediction programs used in offshore racing naval architecture further demonstrates their applicability in a real engineering context. Some examples of the application of uncertainty quantification to the field of naval architecture are also presented.
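The probabilistic ingredient can be sketched in isolation (a generic feasibility-probability criterion under invented numbers, not the authors' exact infill rule): a Gaussian-process surrogate returns a mean and standard deviation at each candidate, so the probability that a compatibility constraint g(x) = 0 holds within a tolerance ε is a difference of normal CDFs, and infilling samples where that probability is highest:

```python
# Feasibility-probability infill sketch. The (mean, sigma) pairs stand in
# for a trained Gaussian-process surrogate of the constraint g(x).
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_feasible(mu, sigma, eps=0.1):
    """P(-eps <= g(x) <= eps) when g(x) ~ N(mu, sigma^2)."""
    return normal_cdf((eps - mu) / sigma) - normal_cdf((-eps - mu) / sigma)

# candidate points with surrogate predictions (invented values)
candidates = {"x1": (0.05, 0.2), "x2": (0.8, 0.1), "x3": (0.0, 1.5)}
scores = {name: prob_feasible(mu, s) for name, (mu, s) in candidates.items()}
next_point = max(scores, key=scores.get)   # infill where feasibility is likely
```

Evaluating the true subsystem model at `next_point` and retraining the surrogate there concentrates accuracy near the compatibility manifold, which is the cost saving the paper quantifies.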

Keywords: infilling strategy, Gaussian process, multi-disciplinary analysis, velocity prediction program

Procedia PDF Downloads 125
7494 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons argue for the use of decision theory in medicine: 1. The growth of medical knowledge and its complexity makes it difficult to use treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, this variability raises doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probability of treatment success and to differing assessments of the value of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making; for this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The difficulty lies in what is meant by "best option", that is, in knowing which criteria should guide the choice; the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. Three types of situation arise: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences whose probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilised to meet the needs of physicians.
Decision theory can make decisions more transparent: first, by systematically clarifying the data relevant to the problem, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus to assist patient and doctor in their choices.
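The "risky situation" case described above, several possible consequences with known probabilities, has a standard operational tool: expected utility. The treatments, probabilities, and utilities below are invented purely to show the mechanics:

```python
# Expected-utility sketch for a risky decision (all numbers invented).
# Each outcome is a (probability, utility) pair; the probabilities of an
# option must sum to 1, and the best option maximises expected utility.

def expected_utility(option):
    return sum(p * u for p, u in option["outcomes"])

treatments = [
    {"name": "surgery",    "outcomes": [(0.7, 1.0), (0.3, 0.0)]},
    {"name": "medication", "outcomes": [(0.9, 0.6), (0.1, 0.2)]},
]
best = max(treatments, key=expected_utility)
```

The utilities are exactly where patient preferences enter: a patient who weights a failed surgery more heavily would assign different values and could rationally reverse the choice, which is the transparency the text argues for.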

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 556
7493 The Estimation of Human Vital Signs Complexity

Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius

Abstract:

Non-stationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with dedicated analysis methods is suitable for observing physiological processes. We demonstrate the possibility of interpreting changes in the functional state of the human body with a deep physiological model, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. Evaluation of the interactions of cardiac signals reveals functional changes peculiar to each individual at the onset of a hemodynamic restoration procedure. We therefore suggest that assessment of the alterations in the functional state of the body after patients undergo surgery can be complemented by data obtained from the suggested approach of evaluating the interactions of functional variables.

Keywords: cardiac diseases, complex systems theory, ECG analysis, matrix analysis

Procedia PDF Downloads 314
7492 Determination of Complexity Level in Merged Irregular Transposition Cipher

Authors: Okike Benjamin, Garba Ejd

Abstract:

Today, it has been observed that the security of information along the information superhighway is often compromised by those who are not authorized to have access to it. In order to ensure its security, such information should be encrypted to conceal its real meaning. There are many encryption techniques on the market; however, some of them are easily decrypted by adversaries. The researchers have developed an encryption technique that may be more difficult to decrypt: the message to be encrypted is split into parts, each part is encrypted separately, and the positions of the parts are swapped before the message is transmitted along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
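The split-encrypt-swap idea can be rendered in a few lines (our own simplification for illustration, not the authors' exact scheme): the message is cut into parts, each part is transposed with a key, and the parts are reordered before transmission:

```python
# Toy split-and-swap transposition sketch (simplified; not the paper's cipher).

def transpose(part, key):
    """Reorder the characters of one part according to an index key."""
    return "".join(part[i] for i in key)

def encrypt(message, n_parts, key, order):
    size = -(-len(message) // n_parts)                # ceiling division
    parts = [message[i:i + size] for i in range(0, len(message), size)]
    parts = [transpose(p.ljust(size, "_"), key) for p in parts]   # pad + transpose
    return "".join(parts[i] for i in order)           # swap part positions

cipher = encrypt("ATTACKATDAWN", n_parts=3, key=[3, 1, 0, 2], order=[2, 0, 1])
```

Decryption reverses both permutations; the complexity analysis in the paper concerns how the attacker's search space grows with the number of splits, since each part carries its own transposition.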

Keywords: transposition cipher, merged irregular cipher, encryption, complexity level

Procedia PDF Downloads 317
7491 The Capability of Organizational Leadership: Development of Conceptual Framework

Authors: Kurmet Kivipõld, Maaja Vadi

Abstract:

This paper develops a conceptual framework for organizational leadership capability. Organizational leadership is understood here as a collective multi-level phenomenon that has been embedded into organizational processes as a capability at the level of the entire organization. The paper analyses and systematises the theoretical approaches to multi-level leadership in the existing literature. This analysis marks the foundation of collective leadership at the organizational level, which forms the basis for the development of the conceptual framework of organizational leadership capability. The developed framework is formed from the synthesis of three groups of base theories: traditional leadership theories, the resource-based view from strategic management, and complexity theory from system theories. These conceptual sources present the main characteristics that determine the nature of organizational leadership capability and are the basis for its measurement.

Keywords: leadership, organizational capability, organizational leadership, resource-based view, system theory

Procedia PDF Downloads 321
7490 Chinese Fantasy Novel: New Word Teaching for Non-Native Learners

Authors: Bok Check Meng, Goh Ying Soon

Abstract:

Giving additional learning materials, such as Chinese fantasy novels, to non-native learners can be strenuous. Instructors have to understand the underpinning cognitive theories for new-word instruction. This paper discusses these underpinning theories and provides relevant literature reviews. Five major areas of cognition-related theory are covered: motivational learning theory, the affective theory of learning, cognitive psychology theory, vocabulary acquisition theory, and Bloom's theory of cognitive levels. A theoretical framework has been constructed on this basis, which should help ensure that non-native learners gain positive outcomes from the instruction process. Instructors interested in teaching new words from Chinese fantasy novels to support additional learning may gain insights from this article.

Keywords: Chinese fantasy novel, new word teaching, non-native learners, cognitive theory, bloom

Procedia PDF Downloads 702
7489 Contextualizing Theory Z of Motivation Among Indian Universities of Higher Education

Authors: Janani V., Tanika Singh, Bala Subramanian R., Santosh Kumar Sharma

Abstract:

Higher education across the globe is undergoing a sea change, which has produced widely varied management practices in Indian universities of higher education; consequently, there is no universal rule regarding HR policies and practices in these universities. As a result, faculty retention is very low, which is a serious concern for educational leaders such as vice-chancellors and directors working in the higher education sector. This phenomenon can be understood in the light of various management theories, among which Theory Z, proposed by William Ouchi, is a prominent one. Against this backdrop, the present article strives to contextualize Theory Z in Indian higher education. For this purpose, a qualitative methodology has been adopted and propositions have been generated accordingly. We believe that this article will motivate other researchers to empirically test the generated propositions and thereby contribute to the existing literature.

Keywords: education, management, motivation, Theory X, Theory Y, Theory Z, faculty members, universities, India

Procedia PDF Downloads 51
7488 Logic of the Prospect Theory: The Decision Making Process of the First Gulf War and the Crimean Annexation

Authors: Zhengyang Ma, Zhiyao Li, Jiayi Zhang

Abstract:

This article examines prospect theory's arguments about decision-making through two case studies, the First Gulf War and Russia's annexation of Crimea. It uses comparative case analysis and process tracing to investigate the theory's fundamental arguments. Through evidence derived from existing primary and secondary sources, the paper argues that both former U.S. President Bush and Russian President Putin viewed their situations as a domain of loss and made risky decisions to prevent further deterioration, which attests to the arguments of prospect theory. After the two case studies, the article also discusses how prospect theory could be used to analyze the decision-making process that led to the current Russia-Ukraine War.

Keywords: the prospect theory, international relations, the first gulf war, the crimea crisis

Procedia PDF Downloads 85
7487 Disintegration of Deuterons by Photons Reaction Model for GEANT4 with Dibaryon Formalism

Authors: Jae Won Shin, Chang Ho Hyun

Abstract:

A model of the disintegration of deuterons by photons (dγ → np) is developed for GEANT4 in this work. For the description of the two-nucleon interactions, we employ an effective field theory, the so-called pionless theory with dibaryon fields (dEFT). Introducing a dibaryon field allows the effective-range contribution to the propagator to be taken into account to infinite order, which consequently makes the convergence of the theory better than that of the pionless effective field theory without dibaryon fields. In spite of its simplicity, the theory has proven very effective and useful in applications to various two-nucleon systems and processes at low energies. We apply the new GEANT4 model (G4dEFT) to the calculation of total and differential cross sections of dγ → np and obtain good agreement with experimental data over a wide range of incoming photon energies.

Keywords: dγ → np, dibaryon fields, effective field theory, GEANT4

Procedia PDF Downloads 346
7486 Modal Density Influence on Modal Complexity Quantification in Dynamic Systems

Authors: Fabrizio Iezzi, Claudio Valente

Abstract:

The viscous damping in dynamic systems can be proportional or non-proportional. In the first case the mode shapes are real, whereas in the second case they are complex. From an engineering point of view, the complexity of the mode shapes is important in order to quantify the non-proportional damping. Different indices exist to provide estimates of modal complexity; these indices are zero or non-zero depending on whether the mode shapes are real or complex. A modal density problem arises in experimental identification when the dynamic system has closely spaced modal frequencies. Depending on the degree of this closeness, the identified mode shapes can contain fictitious imaginary quantities that distort the values of the modal complexity indices. The result is a failure to identify correctly whether the mode shapes are real or complex, and hence whether the damping is proportional or non-proportional. This paper aims to show the influence of modal density on the values of these indices for both proportional and non-proportional damping. Theoretical and pseudo-experimental solutions are compared to analyze the problem on an appropriate mechanical system.
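One widely used index of the kind discussed above is the modal phase collinearity (MPC); the definition below is a common textbook form (the paper compares several such indices, and the mode shapes here are invented). MPC equals 1 for a perfectly real (proportionally damped) mode shape and drops toward 0 as the shape becomes complex:

```python
# Modal phase collinearity sketch (one common definition; invented shapes).
# For mode shape vector v = re + j*im:
#   MPC = ((Sxx - Syy)^2 + 4*Sxy^2) / (Sxx + Syy)^2
# with Sxx = re.re, Syy = im.im, Sxy = re.im.

def mpc(mode):
    re = [z.real for z in mode]
    im = [z.imag for z in mode]
    sxx = sum(x * x for x in re)
    syy = sum(y * y for y in im)
    sxy = sum(x * y for x, y in zip(re, im))
    return ((sxx - syy) ** 2 + 4 * sxy ** 2) / (sxx + syy) ** 2

real_mode = [1 + 0j, -2 + 0j, 1.5 + 0j]      # proportional damping: MPC = 1
complex_mode = [1 + 0j, 0 + 1j, -1 + 0j]     # strongly complex shape: MPC low
```

The paper's point can be read directly off such an index: fictitious imaginary parts introduced by close modes depress MPC even when the underlying damping is proportional.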

Keywords: complex mode shapes, dynamic systems identification, modal density, non-proportional damping

Procedia PDF Downloads 359
7485 Green Energy, Fiscal Incentives and Conflicting Signals: Analysing the Challenges Faced in Promoting on Farm Waste to Energy Projects

Authors: Hafez Abdo, Rob Ackrill

Abstract:

Renewable energy (RE) promotion in the UK relies on multiple policy instruments, which are required to overcome the path dependency pressures favouring fossil fuels. These instruments include targeted funding schemes and economy-wide instruments embedded in the tax code. The resulting complexity of incentives raises important questions around the coherence and effectiveness of these instruments for RE generation. This complexity is exacerbated by UK RE policy being nested within EU policy in a multi-level governance (MLG) setting. To gain analytical traction on such complexity, this study will analyse policies promoting the on-farm generation of energy for heat and power, from farm and food waste, via anaerobic digestion. Utilising both primary and secondary data, it seeks to address a particular lacuna in the academic literature. Via a localised, in-depth investigation into the complexity of policy instruments promoting RE, this study will help our theoretical understanding of the challenges that MLG and path dependency pressures present to policymakers of multi-dimensional policies.

Keywords: anaerobic digestion, energy, green, policy, renewable, tax, UK

Procedia PDF Downloads 343
7484 Architecture of a Preliminary Course on Computational Thinking

Authors: Mintu Philip, Renumol V. G.

Abstract:

An introductory programming course is a major challenge in computing education. Many introductory programming courses fail because students concentrate mainly on writing programs in a particular programming language rather than on problem solving. Computational thinking is a general approach to solving problems. This paper proposes a new preliminary course that aims to develop computational thinking skills in students, which may help them become good programmers. The proposed course is designed around the four basic components of computational thinking: abstract thinking, logical thinking, modelling thinking and constructive thinking. In this course, students engage in hands-on problem-solving activities using a new problem-solving model proposed in this paper.

Keywords: computational thinking, computing education, abstraction, constructive thinking, modelling thinking

Procedia PDF Downloads 417
7483 Determination of Complexity Level in Okike's Merged Irregular Transposition Cipher

Authors: Okike Benjami, Garba Ejd

Abstract:

Today, it has been observed that the security of information travelling along the information superhighway is often compromised by those who are not authorized to access it. To ensure its security, such information should be encrypted by some means to conceal its real meaning. Many encryption techniques are available on the market; however, some of them are decrypted by adversaries with ease. This research develops an encryption technique that may be more difficult to decrypt. This is achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed Okike’s Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
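The split-encrypt-swap idea can be illustrated with a toy sketch. The simple columnar transposition used for each part and the fixed swap order are illustrative assumptions, not the actual cipher described in the paper:

```python
def split_parts(msg, n_parts):
    """Split a message into n_parts roughly equal pieces."""
    k, r = divmod(len(msg), n_parts)
    parts, i = [], 0
    for p in range(n_parts):
        size = k + (1 if p < r else 0)
        parts.append(msg[i:i + size])
        i += size
    return parts

def transpose(part, width):
    """Simple columnar transposition of one part: write the text in
    rows of the given width, then read it out column by column."""
    rows = [part[i:i + width] for i in range(0, len(part), width)]
    return "".join("".join(r[c] for r in rows if c < len(r))
                   for c in range(width))

def encrypt(msg, n_parts, width, order):
    """Encrypt each part separately, then swap the part positions
    according to the given permutation before transmission."""
    parts = [transpose(p, width) for p in split_parts(msg, n_parts)]
    return "".join(parts[i] for i in order)

print(encrypt("ATTACKATDAWN", 2, 3, [1, 0]))
```

The swap permutation acts as an extra key component: an adversary must recover both the per-part transposition and the part ordering, and the number of possible orderings grows factorially with the number of splits.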

Keywords: transposition cipher, merged irregular cipher, encryption, complexity level

Procedia PDF Downloads 263
7482 An Improved Approach to Solve Two-Level Hierarchical Time Minimization Transportation Problem

Authors: Kalpana Dahiya

Abstract:

This paper discusses a two-level hierarchical time minimization transportation problem, which is an important class of transportation problems arising in industries. This problem has been studied by various researchers, and a number of polynomial time iterative algorithms are available to find its solution. All the existing algorithms, though efficient, have some shortcomings. The current study proposes an alternate solution algorithm for the problem that is more efficient in terms of computational time than the existing algorithms. The results justifying the underlying theory of the proposed algorithm are given. Further, a detailed comparison of the computational behaviour of all the algorithms for randomly generated instances of this problem of different sizes validates the efficiency of the proposed algorithm.
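The first level of such a hierarchy, minimizing the bottleneck shipping time, can be sketched as a feasibility search over candidate time thresholds. This is an illustrative formulation using `scipy.optimize.linprog`, not the paper's algorithm; the penalty encoding of forbidden routes is an assumption:

```python
import numpy as np
from scipy.optimize import linprog

def min_bottleneck_time(T, supply, demand):
    """Smallest time t such that a feasible shipping plan exists
    using only routes (i, j) with T[i, j] <= t (balanced problem)."""
    m, n = T.shape
    # Equality constraints: each supply exhausted, each demand met.
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0   # row sums = supply
    for j in range(n):
        A_eq[m + j, j::n] = 1.0            # column sums = demand
    b_eq = np.concatenate([supply, demand])
    for t in sorted(set(T.flatten())):
        # Penalize flow on routes slower than t; a zero objective
        # means a plan exists that avoids them entirely.
        c = np.where(T <= t, 0.0, 1.0).flatten()
        res = linprog(c, A_eq=A_eq, b_eq=b_eq, method="highs")
        if res.success and res.fun < 1e-9:
            return t
```

The second level of the hierarchy would then minimize a secondary objective among all plans achieving this bottleneck time, which is where the algorithms compared in the paper differ.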

Keywords: global optimization, hierarchical optimization, transportation problem, concave minimization

Procedia PDF Downloads 119
7481 A Study of Chinese-Specific Terms in the Government Work Report (2017-2019) from the Perspective of Relevance Theory

Authors: Shi Jiaxin

Abstract:

The Government Work Report is an essential form of document of the government of the People’s Republic of China. It covers all aspects of Chinese society and reflects China’s development strategy and trends. The report contains countless Chinese-specific terms; only by understanding these terms can we understand its content, and only by translating them accurately can people from all across the world learn about the Government Work Report and understand China. Relevance Theory is a popular theory in cognitive pragmatics, and Relevance Translation Theory, which is closely related to it, has major guiding significance for the translation of Chinese-specific terms. By studying Relevance Theory and researching the translation techniques, strategies and applications involved in translating Chinese-specific terms from this perspective, we can understand the meaning and connotation of these terms, solve various problems in the process of C-E translation, and strengthen our translation ability.

Keywords: government work report, Chinese-specific terms, relevance theory, translation

Procedia PDF Downloads 127
7480 Efficient Signal Detection Using QRD-M Based on Channel Condition in MIMO-OFDM System

Authors: Jae-Jeong Kim, Ki-Ro Kim, Hyoung-Kyu Song

Abstract:

In this paper, an efficient signal detector that switches the M parameter of the QRD-M detection scheme is proposed for the MIMO-OFDM system. The proposed scheme calculates a threshold from the 1-norm condition number of the channel matrix and then switches the M parameter of the QRD-M detection according to the channel information. If the channel condition is bad, the parameter M is set to a high value to increase the accuracy of detection; if the channel condition is good, M is set to a low value to reduce the complexity of detection. Therefore, the proposed detection scheme offers a better trade-off between BER performance and complexity than the conventional detection scheme. The simulation results show that the complexity of the proposed scheme is lower than that of the conventional QRD-M detection with similar BER performance.
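The switching rule can be sketched as follows. The threshold and the two candidate M values are illustrative assumptions; the paper derives its own threshold:

```python
import numpy as np

def choose_m(H, threshold=10.0, m_low=2, m_high=8):
    """Pick the QRD-M survivor count M from the 1-norm condition
    number of the channel matrix H: a well-conditioned channel can
    keep few candidate symbol vectors per detection layer, while an
    ill-conditioned one keeps more to protect BER performance."""
    kappa = np.linalg.cond(H, p=1)  # 1-norm condition number
    return m_high if kappa > threshold else m_low

# Well-conditioned 4x4 channel -> small M (low complexity):
H_good = np.eye(4)
# Nearly rank-deficient channel -> large M (better accuracy):
H_bad = np.diag([1.0, 1.0, 1.0, 1e-3])
print(choose_m(H_good), choose_m(H_bad))
```

Because M directly scales the number of candidates retained at each layer of the QRD-M tree search, lowering it on good channels is what yields the complexity reduction reported in the abstract.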

Keywords: MIMO-OFDM, QRD-M, channel condition, BER

Procedia PDF Downloads 332
7479 A Computational Study of the Effect of Intake Design on Volumetric Efficiency for Best Performance in Motorsport

Authors: Dominic Wentworth-Linton, Shian Gao

Abstract:

This project was aimed at investigating the effect of velocity stacks on the intakes of internal combustion engines for motorsport applications. Intake systems in motorsport are predominantly fuel-injected, with a plate mounted for the stacks. Using Computational Fluid Dynamics software, the relationship between stack length and the power and torque delivery across the engine’s rev range was investigated, and the results were used to choose the best option for the intended motorsport discipline. The results are expected to vary with engine geometry and manufacturer characteristics. The study also bridges computational data and physical testing: the CFD results give flow, pressure and velocity readings, while the behaviour of the engine is inferred from the nature of each test. The conclusions of the data analysis were then tested on a dynamometer to confirm the effect of stack length on power and torque delivery, which helped determine the most suitable stack for the Vauxhall engine for rallying in the Caribbean.

Keywords: CFD simulation, internal combustion engine, intake system, dynamometer test

Procedia PDF Downloads 258
7478 Value from Environmental and Cultural Perspectives or Two Sides of the Same Coin

Authors: Vilem Paril, Dominika Tothova

Abstract:

This paper discusses the value theory in cultural heritage and the value theory in environmental economics. Two economic views of value theory are compared within the fields of cultural heritage maintenance and the environment. The main aims are to identify the common features of these two differently structured theories, hidden under the layer of differently defined terms, as well as their genuinely differing features; to clear the confusion that stems from different terminology, since these terms in fact capture the same aspects of reality; and to show the possible inspiration the two perspectives can offer one another. A further aim is to present the two value systems within one value framework. First, important moments in the value theory from the economic perspective are presented, leading to the marginal revolution of (not only) the Austrian School. Then the theories of value within cultural heritage and environmental economics are explored. Finally, the individual approaches are compared and their potential mutual inspiration is searched for.

Keywords: cultural heritage, environmental economics, existence value, value theory

Procedia PDF Downloads 290
7477 Molecular Electron Density Theory Study on the Mechanism and Selectivity of the 1,3 Dipolar Cycloaddition Reaction of N-Methyl-C-(2-Furyl) Nitrone with Activated Alkenes

Authors: Moulay Driss Mellaoui, Abdallah Imjjad, Rachid Boutiddar, Haydar Mohammad-Salim, Nivedita Acharjee, Hassan Bourzi, Souad El Issami, Khalid Abbiche, Hanane Zejli

Abstract:

We have investigated the underlying molecular processes involved in the [3+2] cycloaddition (32CA) reactions between N-methyl-C-(2-furyl) nitrone and three acetylene derivatives: 4b, 5b, and 6b. For this investigation, we utilized molecular electron density theory (MEDT) and density functional theory (DFT) methods at the B3LYP-D3/6-31G(d) computational level. These 32CA reactions, which exhibit a zwitterionic (zw-type) nature, proceed through a one-step mechanism with activation enthalpies ranging from 8.80 to 14.37 kcal mol⁻¹ in acetonitrile and ethanol solvents. When the nitrone reacts with phenyl methyl propiolate (4b), two regioisomeric pathways lead to the formation of two products: P1,5-4b and P1,4-4b. On the other hand, when the nitrone reacts with dimethyl acetylene dicarboxylate (5b) and acetylene dicarboxylic acid (but-2-ynedioic acid) (6b), it results in the formation of a single product. Through topological analysis, we can categorize the nitrone as a zwitterionic three-atom component (TAC). Furthermore, the analysis of conceptual density functional theory (CDFT) indices classifies the 32CA reactions of the nitrone with 4b, 5b, and 6b as forward electron density flux (FEDF) reactions. The study of bond evolution theory (BET) reveals that the formation of the new C-C and C-O covalent bonds does not initiate in the transition states, as the intermediate stages of these reactions display pseudoradical centers on the atoms already involved in bonding.

Keywords: 4-isoxazoline, DFT/B3LYP-D3, regioselectivity, cycloaddition reaction, MEDT, ELF

Procedia PDF Downloads 133