Search results for: computing paradigm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1755

1395 The Return of the Witches: A Class That Motivates the Analysis of Gender Bias in Engineering

Authors: Veronica Botero, Karen Ortiz

Abstract:

The Faculty of Mines of the National University of Colombia, Medellín Campus, has 136 years of history and represents one of the most important study centers in the country in the field of engineering and scientific research, as well as a reference at the global, national, and Latin American levels in this matter. Despite being a faculty with so many years of history, one that has trained a large number of graduates under the traditional mechanistic and androcentric paradigm (which reproduces the logic of the traditional scientific method and the rigid separation between subject and object of research, among other binarisms), it has also been the place where professors and students have become aware of the need to transform this paradigm in engineering and to focus on the sustainability of diversity and the well-being of the natural and social systems that inhabit the territories, and it has opened possibilities for the implementation of classes that address feminist pedagogical theories and practices. The class 'The Return of the Witches' is an initiative that constitutes an important training exercise, providing students with the study of feminisms, the importance of closing gender gaps, and critical readings of the traditional paradigm of engineering. The objective of this article is to present a systematization of the experience of designing, implementing, and developing this elective class: the tensions that arose when a subject of this style was created and proposed in the Department of Geosciences and Environment of the Faculty of Mines in 2022; the reactions of the groups of students who have taken it, their perceptions and opinions about ecofeminism as a proposal for critical analysis and practice in relation to the environment; and, above all, how their readings of the world have changed after having studied this subject for a semester. The pedagogical journey and the feminist methodologies that have been designed and adjusted over two years of work are explained through the shared, situated knowledge of the students and the two teachers who teach the course. The teachers themselves pose challenges to the dominant ideology in engineering: one is trained in the human sciences and feminist studies, and the other, although trained in civil engineering and geosciences, is a woman of diverse sexual orientation and the first professor to have assumed the position of dean in the 135 years of history of the Faculty. The transformations in the life experience of the students are revealing, since they affirm that the training process is forceful and powerful in outlining a much more qualified and critical professional profile that contributes to closing gender gaps in the country. This class is therefore a challenge within a Faculty of Engineering that still presents a dominant gender ideology that has not been questioned or challenged before.

Keywords: feminisms, gender equality, gender bias, engineering for life manifesto

Procedia PDF Downloads 41
1394 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will drastically change due to access to real-time data, real-time forecasting, and tracking of physical items in combination with Internet of Things developments to further automate farm operations. Dairy farms have embraced technological innovations and procured vast amounts of permanent data streams during the past decade; however, the integration of this information to improve the whole farm-based management and decision-making does not exist. It is now imperative to develop a system that can collect, integrate, manage, and analyse on-farm and off-farm data in real-time for practical and relevant environmental and economic actions. The developed systems, based on machine learning and artificial intelligence, need to be connected for useful output, a better understanding of the whole farming issue, and environmental impact. Evolutionary computing can be very effective in finding the optimal combination of sets of some objects and, finally, in strategy determination. The system of the future should be able to manage the dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming as well as improving and maintaining good animal welfare and the quality of dairy products. This review aims to provide an insight into the state-of-the-art of big data applications and evolutionary computing in relation to smart dairy farming and identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Keywords: big data, evolutionary computing, cloud, precision technologies

Procedia PDF Downloads 165
1393 Digital Homeostasis: Tangible Computing as a Multi-Sensory Installation

Authors: Andrea Macruz

Abstract:

This paper explores computation as a process for design by examining how computers can become more than an operative strategy in a designer's toolkit. It documents this, building upon concepts of neuroscience and Antonio Damasio's Homeostasis Theory, which is the control of bodily states through feedback intended to keep conditions favorable for life. To do this, it follows a methodology through algorithmic drawing and discusses the outcomes of three multi-sensory design installations, which culminated from a course in an academic setting. It explains both the studio process that took place to create the installations and the computational process that was developed, related to the fields of algorithmic design and tangible computing. It discusses how designers can use computational range to achieve homeostasis related to sensory data in a multi-sensory installation. The outcomes show clearly how people and computers interact with different sensory modalities and affordances. They propose using computers as meta-physical stabilizers rather than tools.

Keywords: algorithmic drawing, Antonio Damasio, emotion, homeostasis, multi-sensory installation, neuroscience

Procedia PDF Downloads 82
1392 Design of an Off-Campus Interactive Cloud-Based Learning Model

Authors: Osamah Al Qadoori

Abstract:

The use of cloud computing in the educational sector is growing rapidly in the UAE. In a cloud-learning environment, students can remotely join the online classroom whenever and wherever they are; at the same time, cloud-based learning greatly decreases infrastructure and maintenance costs. Nowadays, cloud-based teaching and learning environments are in higher demand and receive more attention in many schools (K-12), institutes, colleges, and universities in the UAE. Yet many students do not use the available online educational resources effectively. The challenging question is to what extent the educational resources installed in the cloud environment are valuable and constructive. In this paper, the researcher seeks to design an expert-agent prototype in which the large volume of information accommodated in the cloud environment goes through expert filtration before being utilized by clients (students). To achieve this goal, the focus of the present research is on two different directions: educational human expertise and automated educational expert systems.

Keywords: cloud computing, cloud-learning environment, online classroom, educational human expertise, automated educational expert systems

Procedia PDF Downloads 522
1391 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in the Cloud. The goal is to compute without decrypting the encrypted data, thereby meeting the aspiration for a computational encryption model that can enhance the security of big data with respect to privacy or confidentiality, availability, and integrity of the data and of the user. The cryptographic model applied to the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory, derivable from abstract algebra, which can easily be integrated and leveraged in the Cloud computing interface, together with detailed theoretical mathematical concepts underlying fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on a cryptographic security algorithm.
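
As a purely illustrative aside (not the scheme studied in the paper), the homomorphic idea itself can be shown in a few lines of Python with textbook RSA, which is multiplicatively homomorphic; a fully homomorphic scheme extends this property to both addition and multiplication on ciphertexts. The parameters below are deliberately tiny and insecure.

# Toy sketch: Enc(m1) * Enc(m2) mod n decrypts to m1 * m2, so the product is
# computed without ever decrypting the operands.
p, q = 61, 53                  # toy primes, far too small for real use
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, gcd(e, phi) = 1
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

m1, m2 = 7, 11
c_prod = (encrypt(m1) * encrypt(m2)) % n    # multiply ciphertexts only
assert decrypt(c_prod) == m1 * m2           # 77, recovered from the encrypted product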

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 454
1390 Variants of Mathematical Induction as Strong Proof Techniques in Theory of Computing

Authors: Ahmed Tarek, Ahmed Alveed

Abstract:

In the theory of computing, there is a wide variety of direct and indirect proof techniques. However, mathematical induction (MI) stands out as one of the most powerful proof techniques for proving hypotheses, theorems, and new results. There are variations of mathematical-induction-based proof techniques, broadly classified into three categories: structural induction, weak induction, and strong induction. In this expository paper, several different variants of the mathematical induction techniques are explored, and specific scenarios are discussed in which a particular induction technique is more advantageous than other induction strategies. The essential differences among the variants of mathematical induction are also examined. The points of separation among mathematical induction, recursion, and logical deduction are precisely analyzed, and the relationship between variations of recurrence relations and mathematical induction is explored. In this context, the applications of recurrence relations and mathematical induction are considered together in a single framework for codewords over a given alphabet.
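
A small worked example (ours, not taken from the paper) shows how a recurrence for codewords and strong induction fit in one framework. Let $a_n$ be the number of codewords of length $n$ over the alphabet $\{0,1\}$ containing no two consecutive 1s. Conditioning on the last symbol, a valid codeword either ends in $0$ (preceded by any valid codeword of length $n-1$) or ends in $01$ (preceded by any valid codeword of length $n-2$), giving
\[
a_1 = 2, \qquad a_2 = 3, \qquad a_n = a_{n-1} + a_{n-2} \quad (n \ge 3).
\]
A strong-induction argument then proves $a_n = F_{n+2}$, where $F_k$ is the $k$-th Fibonacci number with $F_1 = F_2 = 1$: the base cases $a_1 = 2 = F_3$ and $a_2 = 3 = F_4$ hold by inspection, and assuming $a_k = F_{k+2}$ for all $k < n$ yields $a_n = a_{n-1} + a_{n-2} = F_{n+1} + F_n = F_{n+2}$. Weak induction on the previous case alone would not suffice here, because the recurrence reaches back two steps; the hypothesis must cover both $a_{n-1}$ and $a_{n-2}$.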

Keywords: alphabet, codeword, deduction, mathematical induction, recurrence relation, strong induction, structural induction, weak induction

Procedia PDF Downloads 142
1389 Investigating the Impact of Task Demand and Duration on Passage of Time Judgements and Duration Estimates

Authors: Jesika A. Walker, Mohammed Aswad, Guy Lacroix, Denis Cousineau

Abstract:

There is a fundamental disconnect between the experience of time passing and the chronometric units by which time is quantified. Specifically, there appears to be no relationship between passage of time judgments (PoTJs) and verbal duration estimates at short durations (e.g., < 2000 milliseconds). When a duration is longer than several minutes, however, evidence suggests that a slower feeling of time passing is predictive of overestimation. Might the length of a task moderate the relation between PoTJs and duration estimates? Similarly, the estimation paradigm (prospective vs. retrospective) and the mental effort demanded of a task (task demand) have both been found to influence duration estimates. However, only a handful of experiments have investigated these effects for tasks of long durations, and the results have been mixed. Thus, might the length of a task also moderate the effects of the estimation paradigm and task demand on duration estimates? To investigate these questions, 273 participants performed either an easy or difficult visual and memory search task for either eight or 58 minutes, under prospective or retrospective instructions. Afterward, participants provided a duration estimate in minutes, followed by a PoTJ on a Likert scale (1 = very slow, 7 = very fast). A 2 (prospective vs. retrospective) × 2 (eight minutes vs. 58 minutes) × 2 (high vs. low difficulty) between-subjects ANOVA revealed a two-way interaction between task demand and task duration on PoTJs, p = .02. Specifically, time felt faster in the more challenging task, but only in the eight-minute condition, p < .01. Duration estimates were transformed into RATIOs (estimate/actual duration) to standardize estimates across durations. An ANOVA revealed a two-way interaction between estimation paradigm and task duration, p = .03. Specifically, participants overestimated the task more if they were given prospective instructions, but only in the eight-minute task. Surprisingly, there was no effect of task difficulty on duration estimates. Thus, the demands of a task may influence ‘feeling of time’ and ‘estimation time’ differently, contributing to the existing theory that these two forms of time judgement rely on separate underlying cognitive mechanisms. Finally, a significant main effect of task duration was found for both PoTJs and duration estimates (ps < .001). Participants underestimated the 58-minute task (m = 42.5 minutes) and overestimated the eight-minute task (m = 10.7 minutes). Yet, they reported the 58-minute task as passing significantly slower on a Likert scale (m = 2.5) compared to the eight-minute task (m = 4.1). In fact, a significant correlation was found between PoTJ and duration estimation (r = .27, p < .001). This experiment thus provides evidence for a compensatory effect at longer durations, in which people underestimate a ‘slow feeling’ condition and overestimate a ‘fast feeling’ condition. The results are discussed in relation to heuristics that might alter the relationship between these two variables when conditions range from several minutes up to almost an hour.

Keywords: duration estimates, long durations, passage of time judgements, task demands

Procedia PDF Downloads 108
1388 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)-Graphics Processing Unit (GPU) Heterogeneous Computing

Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou

Abstract:

The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, embedded systems-on-a-chip (SoCs) have contained coarse-granularity multi-core CPUs (central processing units) and mobile GPUs (graphics processing units) that can be used as general-purpose accelerators. The motivation is that algorithms with various parallel characteristics can be efficiently mapped to the heterogeneous architecture coupling these three processors. The CPU and GPU offload part of the computationally intensive tasks from the FPGA to reduce resource consumption and lower the overall cost of the system. However, in common present-day scenarios, applications utilize only one type of accelerator because development approaches supporting the collaboration of heterogeneous processors face challenges. Therefore, a systematic approach is needed that offers write-once-run-anywhere portability and high execution performance of the modules mapped to various architectures, and that facilitates the exploration of the design space. In this paper, a servant-execution-flow model is proposed to abstract the cooperation of the heterogeneous processors; it supports task partition, communication, and synchronization. At its first run, the intermediate language, represented by a data-flow diagram, can generate the executable code of the target processor or can be converted into high-level programming languages. The instantiation parameters efficiently control the relationship between the modules and the computational units, including the mapping across two hierarchical processing units and the adjustment of data-level parallelism. An embedded system for a three-dimensional waveform oscilloscope is selected as a case study. The performance of algorithms such as contrast stretching is analyzed with implementations on various combinations of these processors. The experimental results show that the heterogeneous computing system, using less than 35% of the resources, achieves performance similar to the pure-FPGA implementation and approximately the same energy efficiency.

Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation

Procedia PDF Downloads 84
1387 The Bespoke ‘Hybrid Virtual Fracture Clinic’ during the COVID-19 Pandemic: A Paradigm Shift?

Authors: Anirudh Sharma

Abstract:

Introduction: The Covid-19 pandemic necessitated a change in the manner in which outpatient fracture clinics are conducted, due to the need to reduce footfall in hospitals. While studies regarding virtual fracture clinics have shown these to be useful and effective, they focus exclusively on remote consultations. However, our service was bespoke to the patient, offering either a face-to-face or a telephone consultation depending on patient need: a ‘hybrid virtual clinic’ (HVC). We report patient satisfaction and outcomes with this novel service. Methods: Patients booked onto our fracture clinics during the first 2 weeks of national lockdown were retrospectively contacted to assess the mode of consultations (virtual, face-to-face, or hybrid), patient experience, and outcome. Patient experience was assessed using the net promoter (NPS), customer effort (CES) and customer satisfaction scores (CSS), and their likelihood of using the HVC in the absence of a pandemic. Patient outcomes were assessed using the components of the EQ5D score. Results: Of 269 possible patients, 140 patients responded to the questionnaire. Of these, 66.4% had ‘hybrid’ consultations, 27.1% had only virtual consultations, and 6.4% had only face-to-face consultations. The mean overall NPS, CES, and CSS (on a scale of 1-10) were 7.27, 7.25, and 7.37, respectively. The mean likelihood of patients using the HVC in the absence of a pandemic was 6.5/10. Patients who had ‘hybrid’ consultations showed better effort scores and greater overall satisfaction than those with virtual consultations only and also reported superior EQ5D outcomes (mean 79.27 vs. 72.7). Patients who did not require surgery reported increased satisfaction (mean 7.51 vs. 7.08) and were more likely to use the HVC in the absence of a pandemic. Conclusion: Our study indicates that a bespoke HVC has good overall patient satisfaction and outcomes and is a better format of fracture clinic service than virtual consultations alone. It may be the preferred mode for fracture clinics in similar situations in the future. Further analysis of the impact of the HVC on resources and clinician experience is needed in order to appreciate this new paradigm shift.

Keywords: hybrid virtual clinic, coronavirus, COVID-19, fracture clinic, remote consultation

Procedia PDF Downloads 114
1386 Spherical Harmonic Based Monostatic Anisotropic Point Scatterer Model for RADAR Applications

Authors: Eric Huang, Coleman DeLude, Justin Romberg, Saibal Mukhopadhyay, Madhavan Swaminathan

Abstract:

High-performance computing (HPC) based emulators can be used to model the scattering from multiple stationary and moving targets for RADAR applications. These emulators rely on the RADAR Cross Section (RCS) of the targets being available in complex scenarios. Representing the RCS using tables generated from electromagnetic (EM) simulations is oftentimes cumbersome, leading to large storage requirements. This paper proposes a spherical-harmonic-based anisotropic scatterer model to represent the RCS of complex targets. The problem of finding the locations and reflection profiles of all scatterers can be formulated as a linear least-squares problem with a special sparsity constraint. This paper solves this problem using a modified Orthogonal Matching Pursuit algorithm. The results show that the spherical-harmonic-based scatterer model can effectively represent the RCS data of complex targets.
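
For readers unfamiliar with the solver, the following NumPy sketch shows generic Orthogonal Matching Pursuit on a synthetic sparse least-squares problem; each column of the dictionary A stands in for one candidate scatterer location and reflection profile. It is a minimal illustration only and does not reproduce the authors' modified algorithm, the spherical-harmonic dictionary, or real RCS data.

import numpy as np

def omp(A, y, k):
    # Greedy OMP: pick k columns of A that best explain y, refitting by least squares each step.
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0.0                       # never re-select a chosen column
        support.append(int(np.argmax(correlations)))
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 200))                        # 200 candidate scatterers, 60 measurements
x_true = np.zeros(200)
x_true[[5, 42, 137]] = [1.5, -2.0, 0.8]                   # a 3-sparse reflection profile
x_hat = omp(A, A @ x_true, k=3)
print(np.flatnonzero(np.abs(x_hat) > 1e-6))               # expected support: [5 42 137]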

Keywords: RADAR, RCS, high performance computing, point scatterer model

Procedia PDF Downloads 169
1385 Using M-Learning to Support Learning of the Concept of the Derivative

Authors: Elena F. Ruiz, Marina Vicario, Chadwick Carreto, Rubén Peredo

Abstract:

One of the main obstacles in Mexico’s engineering programs is math comprehension, especially of the concept of the derivative. For this reason, we present a case study that relates mobile computing and classroom learning at the “Escuela Superior de Cómputo”, based on the educational model of the Instituto Politécnico Nacional (competence-based work and problem solving), in which we propose apps and activities to teach the concept of the derivative. M-Learning is emphasized as one of its lines of work, as the objective is the use of mobile devices running an app that uses components such as sensors, screen, camera, and processing power in classroom work. In this paper, we employed Augmented Reality (ARRoC), based on the good results this technology has had in the field of learning. This proposal was developed using a qualitative research methodology supported by quantitative research. The methodological instruments used in this proposal are observation, questionnaires, interviews, and evaluations. We obtained positive results, with a 40% increase using M-Learning compared to the 20% increase using traditional means.

Keywords: augmented reality, classroom learning, educational research, mobile computing

Procedia PDF Downloads 345
1384 A Paradigm Shift in Patent Protection-Protecting Methods of Doing Business: Implications for Economic Development in Africa

Authors: Odirachukwu S. Mwim, Tana Pistorius

Abstract:

Since the early 1990s, political and economic pressures have mounted on policy and law makers to increase patent protection by raising the protection standards. The perception of the relation between patent protection and development, particularly economic development, has evolved significantly in the past few years. Debate on patent protection in the international arena has been significantly influenced by the perception that there is a strong link between patent protection and economic development. The level of patent protection determines the extent of development that can be achieved. Recently there has been a paradigm shift, with a lot of emphasis on extending patent protection to methods of doing business, generally referred to as Business Method Patenting (BMP). The general perception among international organizations and the private sector also indicates that there is a strong correlation between BMP protection and economic growth. There are two diametrically opposing views as regards the relation between Intellectual Property (IP) protection and development and innovation. One school of thought promotes the view that IP protection improves economic development through stimulation of innovation and creativity. The other school advances the view that IP protection is unnecessary for stimulation of innovation and creativity and is in fact a hindrance to open access to resources and information required for innovative and creative modalities. Therefore, different theories and policies attach different levels of protection to BMP, which have specific implications for economic growth. This study examines the impact of BMP protection on development by focusing on the challenges confronting economic growth in African communities as a result of the new paradigm in patent law. (Africa is used as a single unit in this study, but this should not be construed as African homogeneity. Rather, the views advanced in this study are used to address the common challenges facing many communities in Africa.) The study reviews, from the points of view of legal philosophers, policy makers, and the decisions of competent courts, the relevant literature, patent legislation (particularly the International Treaty), policies, and legal judgments. Findings from this study suggest that, over and above the various criticisms levelled against the extreme liberal approach to the recognition of business methods as patentable subject matter, there are other specific implications associated with such an approach. The most critical implication of extending patent protection to business methods is the locking-up of knowledge, which may hamper human development in general and economic development in particular. Locking up knowledge necessary for economic advancement and competitiveness may have a negative effect on economic growth by promoting economic exclusion, particularly in African communities. This study suggests that knowledge of BMP within the African context, and the extent of protection linked to it, is crucial in achieving sustainable economic growth in Africa. It also suggests that a balance be struck between the two diametrically opposing views.

Keywords: Africa, business method patenting, economic growth, intellectual property, patent protection

Procedia PDF Downloads 102
1383 Enhanced Dynamic Car Detection Based on an Optimized HOG Descriptor

Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric

Abstract:

Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in a reasonably short time. This purpose requires, first of all, a powerful dynamic car-detector model. This paper therefore presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach aims to compute block-based HOG features from foreground blobs using a configurable search window and pathway, in order to overcome the shortcoming of the HOG descriptor in terms of computing time and to improve its performance in dynamic applications. Indeed, we show in this paper that the block-based HOG descriptor combined with motion parameters is a very suitable car detector that reaches, in record time, a satisfactory recognition rate in dynamic outdoor areas and outperforms several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
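
As a point of reference for the descriptor itself (not the authors' optimized block-wise variant, search window, or motion fusion), a plain HOG feature vector for a single candidate region can be computed with scikit-image as follows; all parameter values are illustrative.

import numpy as np
from skimage.feature import hog

patch = np.random.rand(64, 128)            # stand-in for a grayscale crop of a foreground blob
features = hog(
    patch,
    orientations=9,                        # gradient-orientation bins per cell
    pixels_per_cell=(8, 8),
    cells_per_block=(2, 2),
    block_norm="L2-Hys",
)
print(features.shape)                      # one fixed-length descriptor per candidate region
# A detector would concatenate such descriptors with motion features (e.g. blob velocity)
# and feed them to a classifier such as a linear SVM.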

Keywords: car-detector, HOG, motion, computing time

Procedia PDF Downloads 302
1382 Some Conjectures and Programs about Computing the Detour Index of Molecular Graphs of Nanotubes

Authors: Shokofeh Ebrtahimi

Abstract:

Let G be the chemical graph of a molecule. The matrix D = [d_ij] is called the detour matrix of G, where d_ij is the length of the longest path between atoms i and j. The sum of all entries above the main diagonal of D is called the detour index of G. Chemical graph theory is the topology branch of mathematical chemistry which applies graph theory to the mathematical modelling of chemical phenomena [1]. The pioneers of chemical graph theory are Alexandru Balaban, Ante Graovac, Ivan Gutman, Haruo Hosoya, Milan Randić, and Nenad Trinajstić. In this paper, a new program for computing the detour index of molecular graphs of nanotubes by heptagons is presented. Some conjectures about the detour index of molecular graphs of nanotubes are included.
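
To make the definition concrete, the brute-force Python sketch below computes the detour index of a tiny illustrative graph (the 4-cycle, not one of the nanotube graphs studied in the paper). Finding longest simple paths is NP-hard in general, so this naive search is only feasible for very small graphs.

from itertools import combinations

def longest_path_length(adj, s, t):
    # Length (in edges) of the longest simple path from s to t, by exhaustive search.
    best = 0
    def dfs(v, visited, length):
        nonlocal best
        if v == t:
            best = max(best, length)
            return
        for w in adj[v]:
            if w not in visited:
                dfs(w, visited | {w}, length + 1)
    dfs(s, {s}, 0)
    return best

def detour_index(adj):
    # Sum of the entries above the main diagonal of the detour matrix D.
    return sum(longest_path_length(adj, i, j) for i, j in combinations(sorted(adj), 2))

# 4-cycle C4: adjacent vertices have detour 3 (the long way round), opposite vertices 2,
# so the detour index is 4*3 + 2*2 = 16.
C4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(detour_index(C4))    # 16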

Keywords: chemical graph, detour matrix, detour index, carbon nanotube

Procedia PDF Downloads 262
1381 Teaching Computer Programming to Diverse Students: A Comparative, Mixed-Methods, Classroom Research Study

Authors: Almudena Konrad, Tomás Galguera

Abstract:

Lack of motivation and interest is a serious obstacle to students’ learning computing skills. A need exists for a knowledge base on effective pedagogy and curricula to teach computer programming. This paper presents results from research evaluating a six-year project designed to teach complex concepts in computer programming collaboratively, while supporting students to continue developing their computational thinking and related coding skills individually. Utilizing a quasi-experimental, mixed-methods design, the pedagogical approaches and methods were assessed in two contrasting groups of students with different socioeconomic status, gender, and age composition. Analyses of quantitative data from Likert-scale surveys and an evaluation rubric, combined with qualitative data from reflective writing exercises and semi-structured interviews, yielded convincing evidence of the project’s success at both teaching and inspiring students.

Keywords: computational thinking, computing education, computer programming curriculum, logic, teaching methods

Procedia PDF Downloads 295
1380 The Effect of Initial Sample Size and Increment in Simulation Samples on a Sequential Selection Approach

Authors: Mohammad H. Almomani

Abstract:

In this paper, we examine the effect of the initial sample size and of the increment in simulation samples on the performance of a sequential approach used to select the top m designs when the number of alternative designs is very large. The sequential approach consists of two stages. In the first stage, ordinal optimization is used to select a subset that overlaps with the set of the actual best k% of designs with high probability. Then, in the second stage, optimal computing budget allocation is used to select the top m designs from the selected subset. We apply the selection approach to a generic example under several parameter settings, with different choices of initial sample size and of the increment in simulation samples, to explore the impacts on the performance of this approach. The results show that the choice of initial sample size and the increment in simulation samples do affect the performance of the selection approach.
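
A simplified sketch of the two-stage idea on a toy problem is given below: stage one screens every design with a small initial sample size n0 and keeps an apparently good subset, and stage two spends additional replications on that subset in increments of delta before returning the top m by sample mean. Equal allocation is used in stage two as a stand-in for the actual optimal computing budget allocation rule, and all parameter values are illustrative rather than those studied in the paper.

import numpy as np

rng = np.random.default_rng(1)
true_means = rng.uniform(0, 10, size=1000)            # 1,000 alternative designs (toy problem)

def simulate(i, n):
    return true_means[i] + rng.standard_normal(n)      # n noisy replications of design i

def two_stage_select(m=5, n0=5, subset_size=50, delta=10, extra_rounds=5):
    k = len(true_means)
    counts = np.full(k, float(n0))
    sums = np.array([simulate(i, n0).sum() for i in range(k)])   # stage 1: screening with n0
    subset = np.argsort(sums / counts)[-subset_size:]            # keep the apparent best designs
    for _ in range(extra_rounds):                                # stage 2: increments of delta
        for i in subset:
            sums[i] += simulate(i, delta).sum()
            counts[i] += delta
    means = sums[subset] / counts[subset]
    return subset[np.argsort(means)[-m:]]

print(sorted(two_stage_select()))
print(sorted(np.argsort(true_means)[-5:]))             # the actual best 5, for comparison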

Keywords: large-scale problems, optimal computing budget allocation, ordinal optimization, simulation optimization

Procedia PDF Downloads 327
1379 Analysing the Renewable Energy Integration Paradigm in the Post-COVID-19 Era: An Examination of the Upcoming Energy Law of China

Authors: Lan Wu

Abstract:

The declared transformation towards a ‘new electricity system dominated by renewable energy’ by China requires a cleaner electricity consumption mix with high shares of renewable-energy-sourced electricity (RES-E). Unfortunately, integration of RES-E into Chinese electricity markets remains a problem pending more robust legal support, evidenced by the curtailment of wind and solar power as a consequence of integration constraints. The upcoming energy law of the PRC (energy law) is expected to provide such long-awaited support and to coordinate the existing diverse sector-specific laws to deal with the weak implementation that is dampening the delivery of their desired regulatory effects. However, in the shadow of the COVID-19 crisis, it remains uncertain how this new energy law brings synergies to RES-E integration, mindful of the significant impacts of the pandemic. Through the theoretical lens of the interplay between China’s electricity reform and legislative development, the present paper investigates whether there is a paradigm shift in energy law regarding renewable energy integration compared with the existing sector-specific energy laws. It examines the 2020 draft for comments on the energy law and analyses its relationship with sector-specific energy laws focusing on RES-E integration. The comparison is drawn upon five key aspects of the RES-E integration issue, including the status of renewables, marketisation, incentive schemes, consumption mechanisms, and access to power grids and dispatching. The analysis shows that it is reasonable to expect a more open and well-organized electricity market enabling absorption of high shares of RES-E. The present paper concludes that a period of prosperous development of RES-E in the post-COVID-19 era can be anticipated with the legal support of the upcoming energy law. It contributes to understanding the signals China is sending regarding the transition towards a cleaner energy future.

Keywords: energy law, energy transition, electricity market reform, renewable energy integration

Procedia PDF Downloads 175
1378 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping

Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa

Abstract:

The artificial neural network is one of the interesting techniques that have been advantageously used to deal with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to modulate the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping. The latter is based on increasing the number of neuron units in the last layer. Accordingly, to show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the neuron units in the last layer allows finding the optimal network parameters that fit the mapping data. Moreover, it decreases the training time during the computation process, which avoids the use of computers with high memory usage.
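
For orientation only, the sketch below fits a one-dimensional continuous mapping (x to sin x) with a small off-the-shelf multilayer perceptron from scikit-learn; it illustrates the generic input-output fitting task, not the authors' output-coding scheme of adding neuron units to the last layer, and the architecture and sample sizes are arbitrary.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x_train = rng.uniform(-np.pi, np.pi, size=(500, 1))       # samples of the 1-D input
y_train = np.sin(x_train).ravel()                          # target continuous function

model = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                     max_iter=5000, random_state=0)
model.fit(x_train, y_train)

x_test = np.linspace(-np.pi, np.pi, 9).reshape(-1, 1)
print(np.round(model.predict(x_test), 2))                  # should track sin(x) closely
print(np.round(np.sin(x_test).ravel(), 2))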

Keywords: neural network computing, continuous functions generating the input-output mapping, decreasing the training time, machines with big memories

Procedia PDF Downloads 259
1377 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the huge amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists for the first time to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates their indispensability in meeting the scientific and engineering challenges of the twenty-first century, and how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful instances. This article also indicates solutions to optimization problems and the benefits of Big Data for computational biology. The article illustrates the current state of the art and the future generation of biology with HPC computing and Big Data.

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 344
1378 POD and Wavelets Application for Aerodynamic Design Optimization

Authors: Bonchan Koo, Junhee Han, Dohyung Lee

Abstract:

The research attempts to evaluate the accuracy and efficiency of a design optimization procedure which combines a wavelets-based solution algorithm and a proper orthogonal decomposition (POD) database management technique. The aerodynamic design procedure calls for high-fidelity computational fluid dynamics (CFD) simulations and the consideration of a large number of flow conditions and design constraints. Even with significant advancement in computing power, the current level of the integrated design process requires substantial computing time and resources. POD reduces the degrees of freedom of the full system by conducting singular value decomposition on various field simulations. For additional efficiency improvement of the procedure, an adaptive wavelet technique is also employed during the POD training period. The proposed design procedure was applied to the optimization of wing aerodynamic performance. Throughout the research, it was confirmed that the POD/wavelets design procedure could significantly reduce the total design turnaround time and is also able to capture all detailed complex flow features, as in full-order analysis.
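
The core POD step mentioned above reduces, in practice, to a singular value decomposition of a snapshot matrix. The NumPy sketch below illustrates that step on synthetic one-dimensional 'snapshots'; the CFD solutions and the adaptive-wavelet compression used during POD training in the paper are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# Columns are snapshots of a synthetic field at 30 'flow conditions' (shifted sine waves plus noise).
snapshots = np.stack([np.sin(2 * np.pi * (x - 0.05 * k)) + 0.01 * rng.standard_normal(x.size)
                      for k in range(30)], axis=1)

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)   # POD modes = left singular vectors
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1                # keep 99.9% of the variance ('energy')
basis = U[:, :r]                                           # reduced-order basis

recon = basis @ (basis.T @ snapshots)                      # project snapshots onto the basis
print(r, np.linalg.norm(snapshots - recon) / np.linalg.norm(snapshots))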

Keywords: POD (Proper Orthogonal Decomposition), wavelets, CFD, design optimization, ROM (Reduced Order Model)

Procedia PDF Downloads 448
1375 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research

Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez

Abstract:

Action research is a qualitative research methodology, which leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have meant a change of social paradigm and the emergence of the so-called information and knowledge society. According to this, governments, large corporations, small entrepreneurs and, in general, organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations related to exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization, by applying the principles of the action research methodology; it is the result of a systematic review of the scientific literature. This design consists of seven fundamental stages. These are distributed across the three stages described in the action research methodology: 1) observe, 2) analyze, and 3) take actions. Finally, this paper aims to offer an alternative tool to traditional information security management methodologies, with a view to being applied specifically in the planning stage of IT-based process virtualization in order to foresee risks and to establish security controls before formulating IT solutions in any type of organization.

Keywords: action research, information security, information technology, methodological design, process virtualization, risk management

Procedia PDF Downloads 143
1374 R Data Science for Technology Management

Authors: Sunghae Jun

Abstract:

Technology management (TM) is an important issue for a company improving its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions in the management of technology are based on the results of TA. TA analyzes the developed results of a target technology using statistics or the Delphi method. TA based on Delphi depends on the experts’ domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data such as patents or papers instead of the experts’ knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and the management of technology. They have applied diverse computing tools and many analytical methods case by case. It is not easy to select the suitable software and statistical method for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework for TA. From the results of a case study, we also show how our methodology is applied in a real field. This research contributes to R&D planning and technology valuation in TM areas.

Keywords: technology management, R system, R data science, statistics, machine learning

Procedia PDF Downloads 437
1373 Re-thinking Trust in Refugee Resettlement: A Contextual Perspective and Proposal for Reciprocal Integration

Authors: Mahfoudha Sid'Elemine

Abstract:

The refugee resettlement process profoundly shapes the trajectories of individuals in their new host countries, exerting lasting effects on their long-term integration. Prevailing literature underscores the pivotal role of trust in facilitating successful refugee resettlement. However, this research challenges the notion of trust as universally paramount, contending that its significance is contingent upon variables such as the nature of resettlement programs and the diverse backgrounds and perspectives of refugees. Rather than advocating for a blanket approach to trust-building, this research contends that for certain resettlement programs, trust may prove counterproductive amidst resource constraints and tight service timelines. Moreover, trust may not uniformly emerge as a primary requisite for all refugees, presenting formidable challenges in its establishment. Focusing specifically on resettlement in the United States, this study illustrates how the temporal constraints of resettlement services, coupled with refugees' varied cultural experiences, can impede the cultivation of trust between aid workers and refugees. As an alternative paradigm, this research proposes an approach centered on fostering opportunities for reciprocal engagement, positioning refugees as active contributors within their newfound communities. Embracing reciprocity as the cornerstone of burgeoning relationships promises to fortify refugees' ties with the broader community, bolster their autonomy, and facilitate sustained integration over time. The research draws upon qualitative analyses of in-depth interviews conducted with a subset of resettled refugees, as well as aid workers and volunteers involved in refugee resettlement endeavors within Hampton Roads, Virginia, over the past decade. Through this nuanced examination, the study offers insights into the complexities of trust dynamics in refugee resettlement contexts and advocates for a paradigm shift towards reciprocal integration strategies.

Keywords: resettlement programs, trust dynamics, reciprocity, long-term integration

Procedia PDF Downloads 6
1372 The Impact of Organizational Culture on Internet Marketing Adoption

Authors: Hafiz Mushtaq Ahmad, Syed Faizan Ali Shah, Bushra Hussain, Muneeb Iqbal

Abstract:

Purpose: The purpose of this study is to investigate the impact of organizational culture on internet marketing adoption. Moreover, the study intends to explore the role of organizational culture in internet marketing adoption that helps businesses achieve organizational growth and an augmented market share. Background: With the enormous expansion of technology, organizations now need a technology-based marketing paradigm in order to capture a larger group of customers. Organizational culture plays a dominant and prominent role in internet marketing adoption. Changes in the world economy have transformed organizational competition and are generating new technology standards and strategies. With all the technological advances, e-marketing has become one of the essential parts of marketing strategies. Organizations require advanced internet marketing strategies in order to compete in a global market. Methodology: The population of this study consists of telecom-sector organizations of Pakistan. The sample consists of 200 telecom-sector employees. Data were gathered through a questionnaire instrument. The research strategy of this study is a survey, the study uses a deductive approach, and the sampling technique is convenience sampling. Tentative Results: The study reveals that organizational culture plays a vital role in internet marketing adoption. The results show that there is a strong association between organizational culture and internet marketing adoption. The results further show that a flexible organizational culture helps organizations easily adopt internet marketing. Conclusion: The study discloses that a flexible organizational culture helps organizations easily adopt e-marketing. The study guides decision-makers and owners of organizations to recognize the importance of an internet marketing strategy and helps them increase market share by using e-marketing. The study offers a solution to managers to develop a flexible organizational culture that helps in internet marketing adoption.

Keywords: internet technology, internet marketing, marketing paradigm, organizational culture

Procedia PDF Downloads 212
1371 Short-Term Effects of an Open Monitoring Meditation on Cognitive Control and Information Processing

Authors: Sarah Ullrich, Juliane Rolle, Christian Beste, Nicole Wolff

Abstract:

Inhibition and cognitive flexibility are essential parts of executive functions in our daily lives, as they enable the avoidance of unwanted responses or selectively switch between mental processes to generate appropriate behavior. There is growing interest in improving inhibition and response selection through brief mindfulness-based meditations. Arguably, open-monitoring meditation (OMM) improves inhibitory and flexibility performance by optimizing cognitive control and information processing. Yet, the underlying neurophysiological processes have been poorly studied. Using the Simon-Go/Nogo paradigm, the present work examined the effect of a single 15-minute smartphone app-based OMM on inhibitory performance and response selection in meditation novices. We used both behavioral and neurophysiological measures (event-related potentials, ERPs) to investigate which subprocesses of response selection and inhibition are altered after OMM. The study was conducted in a randomized crossover design with N = 32 healthy adults. We thereby investigated Go and Nogo trials in the paradigm. The results show that as little as 15 minutes of OMM can improve response selection and inhibition at behavioral and neurophysiological levels. More specifically, OMM reduces the rate of false alarms, especially during Nogo trials regardless of congruency. It appears that OMM optimizes conflict processing and response inhibition compared to no meditation, also reflected in the ERP N2 and P3 time windows. The results may be explained by the meta control model, which argues in terms of a specific processing mode with increased flexibility and inclusive decision-making under OMM. Importantly, however, the effects of OMM were only evident when there was prior experience with the task. It is likely that OMM provides more cognitive resources, as the amplitudes of these ERPs decreased. OMM novices seem to induce finer adjustments during conflict processing after familiarization with the task.

Keywords: EEG, inhibition, meditation, Simon Nogo

Procedia PDF Downloads 183
1370 Platform-as-a-Service Sticky Policies for Privacy Classification in the Cloud

Authors: Maha Shamseddine, Amjad Nusayr, Wassim Itani

Abstract:

In this paper, we present a Platform-as-a-Service (PaaS) model for controlling the privacy enforcement mechanisms applied on user data when stored and processed in Cloud data centers. The proposed architecture consists of establishing user configurable ‘sticky’ policies on the Graphical User Interface (GUI) data-bound components during the application development phase to specify the details of privacy enforcement on the contents of these components. Various privacy classification classes on the data components are formally defined to give the user full control on the degree and scope of privacy enforcement including the type of execution containers to process the data in the Cloud. This not only enhances the privacy-awareness of the developed Cloud services, but also results in major savings in performance and energy efficiency due to the fact that the privacy mechanisms are solely applied on sensitive data units and not on all the user content. The proposed design is implemented in a real PaaS cloud computing environment on the Microsoft Azure platform.
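
Purely as an illustration of the idea (the class names, privacy classes, and fields below are invented for this sketch and do not reproduce the paper's PaaS design or any Microsoft Azure API), a 'sticky' policy bound to a GUI data component could be represented roughly as follows.

from dataclasses import dataclass
from enum import Enum

class PrivacyClass(Enum):
    PUBLIC = "public"                  # no enforcement needed
    CONFIDENTIAL = "confidential"      # encrypt at rest
    SENSITIVE = "sensitive"            # encrypt at rest and process in an isolated container

@dataclass
class StickyPolicy:
    privacy_class: PrivacyClass
    encrypt_at_rest: bool
    isolated_container: bool

@dataclass
class BoundField:
    # A GUI data-bound component with its attached ('sticky') policy.
    name: str
    value: str
    policy: StickyPolicy

# Only the sensitive component carries costly enforcement; the rest stays cheap to process.
form = [
    BoundField("display_name", "Alice", StickyPolicy(PrivacyClass.PUBLIC, False, False)),
    BoundField("payment_card", "4111-....", StickyPolicy(PrivacyClass.SENSITIVE, True, True)),
]
for field in form:
    print(field.name, "->", field.policy.privacy_class.value)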

Keywords: privacy enforcement, platform-as-a-service privacy awareness, cloud computing privacy

Procedia PDF Downloads 201
1369 Objectives of the Standardization of Technical Terminology Nowadays in Albanian

Authors: Gani Pllana

Abstract:

Under the conditions of the rapid development of technics and technology in recent years, the cooperation of the scientific-technical language with the standard Albanian language is continuing with a higher intensity than before. We notice a vigorous enrichment of the vocabulary of technical terminology, due to the birth and formation of new fields and subfields of technics and technology, such as computing, mechatronics, and telemetry, with a multitude of concepts, many of which, on the one hand, are marked with names from the languages they come from, mainly English, but, on the other hand, meet their needs through the lexical composition of the mother tongue (common words being raised to terms) and through the activation of other layers, such as compound-word terms. Thus, for example, in the field of computing, we notice the inclusion of ordinary vocabulary for reproductive reasons, like mi, dritare, flamur, adresë, skedar (Engl.: mouse, window, flag, address, file), and along with them compound-word terms serving to differentiate related concepts, like adresë e hiperlidhjes, adresë e uebit, adresë relative, adresë virtuale (Engl.: hyperlink address, web address, relative address, virtual address), etc.

Keywords: common words, Albanian language, technical terminology, standardization

Procedia PDF Downloads 272
1368 Improving System Performance through User's Resource Access Patterns

Authors: K. C. Wong

Abstract:

This paper demonstrates a number of examples in the hope of shedding some light on the possibility of designing future operating systems in a more adaptation-based manner. A modern operating system, we conceive, should possess the capability of 'learning' in such a way that it can dynamically adjust its services and behavior according to the current status of the environment in which it operates. In other words, a modern operating system should play a more proactive role while providing system services to users. As such, a modern operating system is expected to create a computing environment in which its users are provided with system services that better match their dynamically changing needs. The examples demonstrated in this paper show that a user's resource access patterns 'learned' and determined during a session can be utilized to improve system performance and hence to provide users with a better and more effective computing environment. The paper also discusses how to use the frequency, the continuity, and the duration of resource accesses in a session to quantitatively measure and determine a user's resource access patterns for the examples shown in the paper.
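
As a rough illustration of how those three quantities might be computed, the sketch below derives per-resource frequency, continuity, and total duration from a toy per-session access log; the log format, event names, and the continuity threshold are assumptions made for this example, not details taken from the paper.

from collections import defaultdict

# Hypothetical log records: (timestamp_in_seconds, resource, event), event in {"open", "close"}.
session_log = [
    (0.0, "/home/u/report.tex", "open"), (40.0, "/home/u/report.tex", "close"),
    (41.0, "/home/u/report.tex", "open"), (90.0, "/home/u/report.tex", "close"),
    (95.0, "/usr/bin/gcc", "open"), (96.0, "/usr/bin/gcc", "close"),
]

def access_patterns(log, gap_threshold=5.0):
    # Frequency = number of opens; continuity = re-opens within gap_threshold seconds;
    # duration = total open time.  The threshold is an assumption for this sketch.
    stats = defaultdict(lambda: {"frequency": 0, "continuity": 0, "duration": 0.0})
    open_time, last_close = {}, {}
    for t, resource, event in log:
        if event == "open":
            stats[resource]["frequency"] += 1
            if resource in last_close and t - last_close[resource] <= gap_threshold:
                stats[resource]["continuity"] += 1
            open_time[resource] = t
        else:
            stats[resource]["duration"] += t - open_time.pop(resource)
            last_close[resource] = t
    return dict(stats)

for resource, s in access_patterns(session_log).items():
    print(resource, s)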

Keywords: adaptation-based systems, operating systems, resource access patterns, system performance

Procedia PDF Downloads 115
1367 The Emancipatory Methodological Approach to the Organizational Problems Management

Authors: Slavica P. Petrovic

Abstract:

One of the key dimensions of management problems in organizations concerns the relations between stakeholders. The relevant research subject is management problems characterized by conflict and coercion, in which participants do not agree on ends and means, and in which different groups, i.e., individuals, strive to use the power they have to impose on others their favoured strategy and decisions. Creatively managing the coercive problems in organizations, in which the sources of power can be identified, implies the emancipatory paradigm and the use of a corresponding systems methodology. The main research aim is to critically reassess the theoretical foundations and the methodological and methodical development of Critical Systems Heuristics (CSH), as a valid representative of the emancipatory paradigm, in order to determine the conditions, ways, and achievements of its application in managing the coercive problems in organizations. The basic hypothesis is that CSH, as the emancipatory methodology, given its own theoretical foundations and methodological-methodical development, can be employed in a scientifically based and practically useful manner in creatively addressing the coercive problems. The scientific instrumentarium corresponding to this research aim is critical systems thinking, with its three key commitments to: a) critical awareness of the strengths and weaknesses of each research instrument (theory, methodology, method, technique, model) for structuring the problem situations in organizations, b) improvement of the management of coercive problems in organizations, and c) pluralism, i.e., respect for the different perceptions and interpretations of problem situations and enabling the combined use of research instruments. The relevant research result is that CSH, considering its theoretical foundations and methodological and methodical development, makes it possible to reveal the normative content of proposed or existing designs of organizational systems. Accordingly, it can be concluded that, through the use of critically heuristic categories and dialectical debate between those involved in and those affected by the designs, but who are not included in designing organizational systems, CSH endeavours, in its application, to support the process of improving the position of all stakeholders.

Keywords: coercion and conflict in organizations, creative management, critical systems heuristics, the emancipatory systems methodology

Procedia PDF Downloads 420
1366 A Genetic Algorithm for the Load Balance of Parallel Computational Fluid Dynamics Computation with Multi-Block Structured Mesh

Authors: Chunye Gong, Ming Tie, Jie Liu, Weimin Bao, Xinbiao Gan, Shengguo Li, Bo Yang, Xuguang Chen, Tiaojie Xiao, Yang Sun

Abstract:

Large-scale CFD simulation relies on high-performance parallel computing, and load balance plays the key role affecting parallel efficiency. This paper focuses on the load-balancing problem of parallel CFD simulation with a structured mesh. A mathematical model for this load-balancing problem is presented. The genetic algorithm, fitness computation, and two-level code are designed. An optimal selector, a robust operator, and a local optimization operator are designed. The properties of the presented genetic algorithm are discussed in depth. The effects of the optimal selector, the robust operator, and the local optimization operator are demonstrated by experiments. The experimental results on different test sets, DLR-F4, and aircraft design applications show that the presented load-balancing algorithm is robust, converges quickly, and is useful in real engineering problems.
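
To give a feel for the overall scheme (without reproducing the paper's fitness model, two-level encoding, or the optimal selector, robust operator, and local optimization operator), the minimal genetic algorithm below assigns synthetic mesh-block workloads to processes so as to minimize the maximum per-process load.

import random

random.seed(0)
block_work = [random.randint(10, 100) for _ in range(40)]   # synthetic per-block workloads
n_procs = 8

def fitness(assignment):
    # Higher is better: negative of the heaviest per-process load (the balance bottleneck).
    loads = [0] * n_procs
    for block, proc in enumerate(assignment):
        loads[proc] += block_work[block]
    return -max(loads)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(a, rate=0.05):
    return [random.randrange(n_procs) if random.random() < rate else g for g in a]

def genetic_load_balance(pop_size=60, generations=200):
    pop = [[random.randrange(n_procs) for _ in block_work] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]                          # simple elitist selection
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    best = max(pop, key=fitness)
    return best, -fitness(best)

assignment, max_load = genetic_load_balance()
print("max per-process load:", max_load, "ideal lower bound:", sum(block_work) / n_procs)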

Keywords: genetic algorithm, load-balancing algorithm, optimal variation, local optimization

Procedia PDF Downloads 146