Search results for: Analytic Network Process (ANP)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18937

15427 Music Piracy Revisited: Agent-Based Modelling and Simulation of Illegal Consumption Behavior

Authors: U. S. Putro, L. Mayangsari, M. Siallagan, N. P. Tjahyani

Abstract:

The National Collective Management Institute (LKMN) in Indonesia reported that legal music products amounted to about 77,552,008 units while illegal music products amounted to about 220,688,225 units in 1996, and the situation has worsened every year since. Consequently, Indonesia was named one of the countries with the highest piracy levels in 2005. This study models people's decisions toward unlawful behavior, music content piracy in particular, using agent-based modeling and simulation (ABMS). The actors in the model are classified as legal consumers, illegal consumers, and neutral consumers. The decision toward piracy among the actors is a manifestation of the social norm, whose attributes are social pressure, peer pressure, social approval, and perceived prevalence of piracy. These attributes fluctuate depending on the majority behavior in the surrounding social network. Two main interventions are undertaken in the model, campaign and peer influence, which lead to four scenarios in the simulation: a positively-framed descriptive norm message, a negatively-framed descriptive norm message, a positively-framed injunctive norm message with benefits, and a negatively-framed injunctive norm message with costs. Using NetLogo, the model is simulated in 30 runs with 10,000 iterations per run. The initial number of agents was set to 100 with a 95:5 proportion of illegal to legal consumption; this proportion is based on data stating that 95% of music industry sales are pirated. The study finds that the negatively-framed descriptive norm message has an adverse, reversed effect on music piracy, and that selecting a context-based campaign is the key to reducing the intention toward music piracy as unlawful behavior by increasing compliance awareness.
The Indonesian context reveals that the majority of people have actively engaged in music piracy as unlawful behavior, so people regard this illegal act as common. Therefore, merely publicizing how widespread the problem is could instead encourage illegal consumption. The positively-framed descriptive norm message scenario works best to reduce music piracy because it focuses on supporting positive behavior and conveys the right perception of the phenomenon. Music piracy is not merely an economic but rather a social phenomenon, since the underlying motivation of the actors has shifted toward community sharing. An indication of a misconception of value co-creation in the context of music piracy in Indonesia is also discussed. Theoretically, this study contributes the insight that understanding how social norms configure decision-making behavior is essential to break down the phenomenon of unlawful behavior in the music industry. In practice, it proposes that a reward-based and context-based strategy is the most relevant for stakeholders in the music industry. Furthermore, the findings may generalize well beyond the music piracy context. As an emerging body of work that systematically reconstructs how law and social affect shape the decision-making process, it will be interesting to see how the model performs in other decision-behavior situations.
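The social-norm mechanism described above can be sketched as a minimal agent-based update rule. This is our own illustrative simplification, not the authors' NetLogo model: the ring-lattice network, the 0.5 threshold, and the `campaign_bias` parameter are all assumptions, and the run is shortened to 50 iterations.

```python
import random

def step(states, neighbours, campaign_bias=0.0):
    """Synchronous update: an agent consumes illegally (state 1) only if the
    perceived prevalence among its neighbours, reduced by the campaign effect,
    still exceeds one half (a hypothetical social-norm threshold)."""
    new_states = []
    for i in range(len(states)):
        illegal = sum(states[j] for j in neighbours[i])
        prevalence = illegal / max(len(neighbours[i]), 1)
        new_states.append(1 if prevalence - campaign_bias > 0.5 else 0)
    return new_states

random.seed(0)
n = 100
states = [1] * 95 + [0] * 5          # 95:5 illegal:legal, the paper's initial split
random.shuffle(states)
# ring-lattice social network: each agent observes its 4 nearest neighbours
neighbours = [[(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n] for i in range(n)]

for _ in range(50):
    states = step(states, neighbours, campaign_bias=0.3)
illegal_after = sum(states)
```

Under this toy rule, a positive campaign bias lets legality spread through the network from the initial 5% of legal consumers.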

Keywords: music piracy, social norm, behavioral decision-making, agent-based model, value co-creation

Procedia PDF Downloads 178
15426 The Role of Emotions in the Consumer: Theoretical Review and Analysis of Components

Authors: Mikel Alonso López

Abstract:

The early eighties saw the rise of a new research trend in several prestigious journals: articles that related emotions to the decision-making processes of the consumer and stopped treating them as external elements. This raises questions such as: what are emotions? Are there different types of emotions? What components do they have? Which theories exist about them? In this study, we review the main theories and components of emotion, analysing the cognitive factor and the generally recognizable emotional states, with a focus on the classic debate over whether the cognitive process precedes the affective process or vice versa.

Keywords: emotion, consumer behaviour, feelings, decision making

Procedia PDF Downloads 330
15425 Reduction of the Number of Traffic Accidents by Function of Driver's Anger Detection

Authors: Masahiro Miyaji

Abstract:

When a driver is caught in traffic congestion or has just experienced a traffic incident, the driver may fall into a state of anger. A state of anger carries a decisive risk of more severe traffic accidents. A preventive safety function that takes the driver's psychosomatic state, anger in particular, into account may be one solution to avoid such risks. Identifying a driver's anger state is therefore important for creating countermeasures that prevent traffic accidents. As a first step, this research identified root causes of traffic incidents by means of an Internet survey. Statistical analysis of the survey showed that the dominant psychosomatic states immediately before traffic incidents were haste, distraction, drowsiness, and anger. We then replicated the anger state of a driver using a driving simulator on a bench-test basis. Six types of facial expressions, including anger, were introduced as discriminating features, and a Kohonen neural network was adopted to classify the anger state, yielding a methodology that detects a driver's anger with high accuracy. We then present a driving support safety function that adapts to the driver's anger state in cooperation with an autonomous driving unit to reduce the number of traffic accidents. Finally, we evaluated the resulting reduction rate of anger-related traffic accidents. To validate the estimates, we referred to the accident reduction rates reported for Advanced Safety Vehicles (ASV) and Intelligent Transportation Systems (ITS).
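A Kohonen network (self-organizing map) of the kind named above can be sketched in a few lines. This is a generic 1-D SOM on synthetic two-dimensional feature vectors, not the authors' facial-expression classifier; the grid size, learning-rate schedule, and the two synthetic clusters standing in for "anger" and "neutral" are all assumptions.

```python
import math
import random

def train_som(data, grid=4, epochs=200, seed=0):
    """Train a 1-D Kohonen self-organizing map with `grid` nodes."""
    rng = random.Random(seed)
    dim = len(data[0])
    weights = [[rng.random() for _ in range(dim)] for _ in range(grid)]
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)                       # decaying learning rate
        radius = max(1.0, (grid / 2) * (1 - t / epochs))  # shrinking neighbourhood
        x = rng.choice(data)
        # best-matching unit (BMU): node with minimal squared distance to x
        bmu = min(range(grid),
                  key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))
        for i in range(grid):
            d = abs(i - bmu)                              # distance on the node grid
            if d <= radius:
                h = math.exp(-d * d / (2 * radius * radius))
                weights[i] = [w + lr * h * (v - w) for w, v in zip(weights[i], x)]
    return weights

def classify(weights, x):
    """Map a feature vector to its winning SOM node."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

# two synthetic clusters standing in for "anger" vs "neutral" feature vectors
rng = random.Random(1)
anger = [[0.9 + rng.uniform(-0.05, 0.05), 0.1 + rng.uniform(-0.05, 0.05)]
         for _ in range(20)]
neutral = [[0.1 + rng.uniform(-0.05, 0.05), 0.9 + rng.uniform(-0.05, 0.05)]
           for _ in range(20)]
som = train_som(anger + neutral)
```

After training, well-separated states map to different winning nodes, which is the basis for labelling a node grid with expression classes.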

Keywords: Kohonen neural network, driver’s anger state, reduction of traffic accidents, driver’s state adaptive driving support safety

Procedia PDF Downloads 347
15424 Cooling-Rate Induced Fiber Birefringence Variation in Regenerated High Birefringent Fiber

Authors: Man-Hong Lai, Dinusha S. Gunawardena, Kok-Sing Lim, Harith Ahmad

Abstract:

In this paper, we report birefringence manipulation in a regenerated high-birefringent fiber Bragg grating (RPMG) using a CO2 laser annealing method. The results indicate that the birefringence of the RPMG remains unchanged after CO2 laser annealing followed by a slow cooling process but is reduced after a fast cooling process (~5.6×10⁻⁵). A series of annealing procedures with different cooling rates shows that the slower the cooling rate, the higher the birefringence of the RPMG. Changes in the volume, thermal expansion coefficient (TEC), and glass transition temperature (Tg) of the stress-applying part of the RPMG during the cooling process are responsible for the birefringence change. These findings are important for RPMG sensors in hot and dynamic temperature environments, because the measuring accuracy, range, and sensitivity of an RPMG sensor are strongly affected by its birefringence value. This work also opens up a new application of CO2 lasers for fiber annealing and birefringence modification.

Keywords: birefringence, CO2 laser annealing, regenerated gratings, thermal stress

Procedia PDF Downloads 451
15423 Real-Time Compressive Strength Monitoring for NPP Concrete Construction Using an Embedded Piezoelectric Self-Sensing Technique

Authors: Junkyeong Kim, Seunghee Park, Ju-Won Kim, Myung-Sug Cho

Abstract:

Recently, demand for the construction of nuclear power plants (NPP) using high-strength concrete (HSC) has increased. However, HSC can be susceptible to brittle fracture if the curing process is inadequate. To prevent unexpected collapse during and after the construction of HSC structures, it is essential to confirm the strength development of HSC during curing, yet traditional strength-measuring methods are neither effective nor practical. In this study, a novel method is proposed to estimate the strength development of HSC from electromechanical impedance (EMI) measurements using an embedded piezoelectric sensor. The EMI of an NPP concrete specimen was tracked to monitor strength development, and the cross-correlation coefficient was applied in sequence to examine the trend of the impedance variations more quantitatively. The results confirm that the proposed technique can successfully monitor strength development during the curing of HSC structures.
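The cross-correlation idea above can be sketched numerically: as curing progresses, each measured impedance signature correlates more strongly with a stable reference signature. The synthetic signatures and the 1/day "convergence" schedule below are our own illustration, not the study's EMI data.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

freqs = [f / 10 for f in range(100)]
reference = [math.sin(f) for f in freqs]      # stand-in for the cured EMI signature
ccs = []
for day in range(1, 8):
    # a perturbation that shrinks as "curing" advances (purely illustrative)
    noise = 1.0 / day
    signature = [r + noise * math.cos(3 * f) for r, f in zip(reference, freqs)]
    ccs.append(pearson(signature, reference))
```

In such a scheme, a correlation coefficient approaching 1 over successive measurements indicates that the impedance signature, and hence the strength development, is stabilising.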

Keywords: concrete curing, embedded piezoelectric sensor, high strength concrete, nuclear power plant, self-sensing impedance

Procedia PDF Downloads 497
15422 Aligning Informatics Study Programs with Occupational and Qualifications Standards

Authors: Patrizia Poscic, Sanja Candrlic, Danijela Jaksic

Abstract:

The Department of Informatics at the University of Rijeka participated in the Stand4Info project, co-financed by the European Union, whose main aim was to align study programs with occupational and qualifications standards in the field of informatics. A brief overview of our research methodology, goals, and deliverables is given. Our main research and project objectives were: a) development of occupational standards, qualifications standards, and study programs based on the Croatian Qualifications Framework (CROQF); b) higher education quality improvement in the field of information and communication sciences; c) increasing the employability of students of information and communication technology (ICT) and science; and d) continuous improvement of teachers' competencies in accordance with the principles of CROQF. CROQF is a reform instrument in the Republic of Croatia for regulating the system of qualifications at all levels through qualifications standards based on learning outcomes and the needs of the labor market, individuals, and society. The central elements of CROQF are learning outcomes: competences acquired by the individual through the learning process and demonstrated afterward. The place of each acquired qualification is set by the level of the learning outcomes belonging to it. Placing qualifications at their respective levels allows different qualifications to be compared and linked, and links Croatian qualification levels to the levels of the European Qualifications Framework and of the Qualifications Framework of the European Higher Education Area. This research produced three proposals of occupational standards at the undergraduate level (System Analyst, Developer, ICT Operations Manager) and two at the graduate (master) level (System Architect, Business Architect). For each occupational standard, employers provided a list of key tasks and the competencies needed to perform them.
A set of competencies required for each particular job in the workplace was defined, and each set was described in more detail through its individual competencies. Based on the sets of competencies from the occupational standards, sets of learning outcomes were defined, and the competencies from the occupational standards were linked to learning outcomes. For each learning outcome, as well as for each set of learning outcomes, it was necessary to specify the verification method and the material and human resources. A further task of the project was to suggest revisions and improvements to the existing study programs. It was necessary to analyze the existing programs and determine how well they fulfill the defined learning outcomes. This way, one could see: a) which learning outcomes from the qualifications standards are covered by existing courses, b) which learning outcomes have yet to be covered, c) whether they are covered by mandatory or elective courses, and d) whether some courses are unnecessary or redundant. Overall, the main research results are: a) completed proposals of qualifications and occupational standards in the field of ICT, b) revised curricula of undergraduate and master study programs in ICT, c) a sustainable partnership and stakeholder network, d) a knowledge network informing the public and stakeholders (teachers, students, and employers) about the importance of establishing CROQF, and e) teachers educated in innovative teaching methods.
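The coverage analysis described above (points a-d) is essentially a set computation over learning outcomes and courses. The outcome codes and course names below are invented for illustration; only the logic mirrors the checks in the text.

```python
# Hypothetical qualification-standard outcomes and course catalogue.
required = {"LO1", "LO2", "LO3", "LO4", "LO5"}
courses = {
    "Databases":            {"outcomes": {"LO1", "LO2"}, "mandatory": True},
    "Software Engineering": {"outcomes": {"LO3"},        "mandatory": True},
    "Elective Seminar":     {"outcomes": {"LO3", "LO4"}, "mandatory": False},
}

covered = set().union(*(c["outcomes"] for c in courses.values()))
covered_mandatory = set().union(
    *(c["outcomes"] for c in courses.values() if c["mandatory"]))

missing = required - covered                               # (b) not covered at all
elective_only = (required & covered) - covered_mandatory   # (c) elective-only coverage
redundant = set()                                          # (d) fully covered elsewhere
for name, course in courses.items():
    others = set().union(*(c["outcomes"] for n, c in courses.items() if n != name))
    if course["outcomes"] <= others:
        redundant.add(name)
```

Running such a check over a real curriculum immediately lists uncovered outcomes, outcomes reachable only through electives, and candidate courses for revision.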

Keywords: study program, qualification standard, occupational standard, higher education, informatics and computer science

Procedia PDF Downloads 130
15421 English Language Acquisition and Flipped Classroom

Authors: Yuqing Sun

Abstract:

Nowadays, English is taught in many countries as a second language, and one of the major ways to learn it is through class teaching. In the field of second language acquisition, many factors affect the acquisition process, such as the target language itself, the learner's personality, cognitive factors, language transfer, and outward factors (teaching method, classroom, environment, teaching policy, social environment, and so on). The flipped classroom, a newly developed classroom model, has been widely used in language teaching and has, to some extent, been accepted by teachers and students for its effect. It distinguishes itself from the traditional classroom through its focus on the learner, the great importance it attaches to the personal learning process, and its application of technology. The class becomes discussion-targeted, and the class order is somewhat inverted, since instruction is delivered outside of class while class time is reserved for knowledge internalization. This paper concentrates on the influence of the flipped classroom, as a classroom-related factor, on the process of English acquisition through case studies of English teaching classes in China, and analyses the mechanism of the flipped classroom itself in order to propose feasible advice for promoting the effectiveness of English acquisition.

Keywords: second language acquisition, English, flipped classroom, case

Procedia PDF Downloads 385
15420 The Change in Management Accounting from an Institutional Perspective: A Case Study for a Romanian Company

Authors: Gabriel Jinga, Madalina Dumitru

Abstract:

The objective of this paper is to present the process of change in management accounting in Romania, a former communist country in Eastern Europe. To explain this process, we used contingency and institutional theories. We focused on the following directions: the presentation of the scientific context and motivation of this research, and the case study. We present the state of the art of the change process in management accounting from international and national perspectives, and describe the evolution of management accounting in Romania in the context of economic and political changes. An important moment was the fall of communism in 1989, a starting point for a new economic environment and for new management accounting. Accordingly, we developed a case study which presents this evolution. The conclusion of our research is that the changes in the management accounting system of the analysed company occurred at the same time as the institutionalization of certain elements (e.g., degree of competition, training and competencies in management accounting). The management accounting system was shaped by the contingencies specific to this company (e.g., environment, industry, strategy).

Keywords: management accounting, change, Romania, contingency, institutional theory

Procedia PDF Downloads 496
15419 Membrane Bioreactor versus Activated Sludge Process for Aerobic Wastewater Treatment and Recycling

Authors: Sarra Kitanou

Abstract:

Membrane bioreactor (MBR) systems are among the most widely used wastewater treatment processes for various municipal and industrial waste streams. The process is based on complex interactions between biological processes, the filtration process, and the rheological properties of the liquid to be treated. This complexity makes system operation and optimization difficult to understand, and traditional methods based on experimental analysis are costly and time-consuming. The present study compared a pilot-scale external membrane bioreactor with ceramic membranes to a conventional activated sludge process (ASP) plant; both systems received domestic wastewater as influent. The MBR produced an effluent of much better quality than the ASP in terms of total suspended solids (TSS), organic matter, measured as biological oxygen demand (BOD) and chemical oxygen demand (COD), total phosphorus, and total nitrogen. Other effluent quality parameters also show substantial differences between the ASP and the MBR. The study concludes that, in the case of domestic wastewater, MBR treatment yields excellent effluent quality, so replacing the ASP with MBRs may be justified on the basis of their improved removal of solids, nutrients, and micropollutants. Furthermore, the high quality of the treated water allows it to be reused for irrigation.
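Comparisons like ASP versus MBR are usually reported as percentage removal of each parameter between influent and effluent. The influent and effluent concentrations below are invented stand-ins, not the study's measurements; only the removal formula is standard.

```python
def removal(influent_mg_l, effluent_mg_l):
    """Percentage removal of a pollutant parameter."""
    return 100 * (influent_mg_l - effluent_mg_l) / influent_mg_l

influent = {"COD": 600.0, "BOD": 300.0, "TSS": 250.0}   # mg/L, hypothetical
effluent = {
    "ASP": {"COD": 80.0, "BOD": 25.0, "TSS": 30.0},     # hypothetical ASP effluent
    "MBR": {"COD": 25.0, "BOD": 5.0, "TSS": 1.0},       # membrane retains solids
}

efficiencies = {
    proc: {p: removal(influent[p], c) for p, c in vals.items()}
    for proc, vals in effluent.items()
}
for proc, effs in efficiencies.items():
    print(proc, {p: f"{e:.1f}%" for p, e in effs.items()})
```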

Keywords: aerobic wastewater treatment, conventional activated sludge process, membrane bioreactor, reuse for irrigation

Procedia PDF Downloads 65
15418 Degradation of Emerging Pharmaceuticals by Gamma Irradiation Process

Authors: W. Jahouach-Rabai, J. Aribi, Z. Azzouz-Berriche, R. Lahsni, F. Hosni

Abstract:

Gamma irradiation applied to removing pharmaceutical contaminants from wastewater is an effective advanced oxidation process (AOP) and is considered an alternative to conventional water treatment technologies. With this purpose, the degradation efficiency of several detected contaminants under gamma irradiation was evaluated. Radiolysis of organic pollutants in aqueous solutions produces powerful reactive species, essentially the hydroxyl radical (·OH), able to destroy recalcitrant pollutants in water. The pharmaceuticals considered in this study are aqueous solutions of paracetamol, ibuprofen, and diclofenac at concentrations of 0.1-1 mmol/L, treated with irradiation doses from 3 to 15 kGy. The catalytic oxidation of these compounds under gamma irradiation was investigated using hydrogen peroxide (H₂O₂) as a convenient oxidant. The main parameters influencing the irradiation process, namely the irradiation dose, initial concentration, and oxidant (H₂O₂) volume, were optimized with the aim of achieving high degradation efficiency for the considered pharmaceuticals. Significant changes attributable to these parameters appeared in the degradation efficiency, chemical oxygen demand (COD) removal, and concentration of radio-induced radicals, confirming their synergistic effect toward total mineralization. Pseudo-first-order reaction kinetics could be used to depict the degradation of these compounds. A detailed analytical study was carried out to quantify the radio-induced radicals, using electron paramagnetic resonance (EPR) spectroscopy and high-performance liquid chromatography (HPLC). All results show that this process is effective for degrading many pharmaceutical products in aqueous solutions, owing to the strong oxidative properties of the generated radicals, mainly the hydroxyl radical.
Furthermore, the addition of an optimal amount of H₂O₂ efficiently improved the oxidative degradation and contributed to the high performance of this process at very low doses (0.5 and 1 kGy).
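The pseudo-first-order dose kinetics mentioned above take the form C(D) = C0·exp(-k·D), where D is the absorbed dose (kGy) and k the dose constant. The k value in this sketch is illustrative, not a constant measured in the study.

```python
import math

def residual_fraction(dose_kgy, k=0.6):
    """Fraction of pollutant remaining after dose D under C = C0*exp(-k*D)."""
    return math.exp(-k * dose_kgy)

def degradation_efficiency(dose_kgy, k=0.6):
    """Percentage degraded at dose D."""
    return 100 * (1 - residual_fraction(dose_kgy, k))

def dose_constant(d1, c1, d2, c2):
    """Recover k from two (dose, concentration) points: k = ln(C1/C2)/(D2-D1)."""
    return math.log(c1 / c2) / (d2 - d1)

for d in (0.5, 1, 3, 15):
    print(f"{d:>4} kGy -> {degradation_efficiency(d):.1f}% degraded")
```

Fitting ln(C) against dose on measured data and reading off the slope gives the dose constant that summarises how fast a given pharmaceutical is mineralised.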

Keywords: AOP, COD, hydroxyl radical, EPR, gamma irradiation, HPLC, pharmaceuticals

Procedia PDF Downloads 158
15417 The Acceptable Roles of Artificial Intelligence in the Judicial Reasoning Process

Authors: Sonia Anand Knowlton

Abstract:

There are some cases where we as a society feel deeply uncomfortable with the use of Artificial Intelligence (AI) tools in the judicial decision-making process, and justifiably so. A perfect example is COMPAS, an algorithmic model that predicts recidivism rates of offenders to assist in the determination of their bail conditions. COMPAS turned out to be extremely racist: it massively overpredicted recidivism rates of Black offenders and underpredicted recidivism rates of white offenders. At the same time, there are certain uses of AI in the judicial decision-making process that many would feel more comfortable with and even support. Take, for example, a “super-breathalyzer,” an (albeit imaginary) tool that uses AI to deliver highly detailed information about the subject of the breathalyzer test to the legal decision-makers analyzing their drunk-driving case. This article evaluates the point at which a judge’s use of AI tools begins to undermine the public’s trust in the administration of justice. It argues that the answer to this question depends on whether the AI tool is in a role in which it must perform a moral evaluation of a human being.

Keywords: artificial intelligence, judicial reasoning, morality, technology, algorithm

Procedia PDF Downloads 56
15416 A Location-Based Authentication and Key Management Scheme for Border Surveillance Wireless Sensor Networks

Authors: Walid Abdallah, Noureddine Boudriga

Abstract:

Wireless sensor networks have shown their effectiveness in many critical applications, especially in the military domain. Border surveillance is one such application, where a set of wireless sensors is deployed along a country's border line to detect illegal intrusion attempts into the national territory and report them to a control center so that the necessary measures can be taken. Given its nature, this wireless sensor network can be the target of many security attacks aimed at compromising its normal operation. In this application in particular, the deployment and location of sensor nodes are of great importance for detecting and tracking intruders. This paper proposes a location-based authentication and key distribution mechanism to secure wireless sensor networks intended for border surveillance, where key establishment is performed using elliptic curve cryptography and an identity-based public key scheme. In this scheme, the public key of each sensor node is authenticated by keys that depend on its position in the monitored area. Before establishing a pairwise key, two nodes must each verify the claimed location of the other using a message authentication code (MAC) computed over the corresponding public key with keys derived from encrypted beacon messages broadcast by anchor nodes. We show that the proposed public key authentication and key distribution scheme is more resilient to node capture and node replication attacks than currently available schemes. Moreover, key distribution between nodes in our scheme generates less communication overhead and hence improves network performance.
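The core idea, binding a public key's authenticity to a claimed position, can be sketched with a location-derived MAC key. This is our own conceptual simplification, not the paper's exact protocol: the SHA-256 key derivation, the grid-cell encoding, and the shared anchor secret are all assumptions, and the placeholder bytes stand in for a real ECC public key.

```python
import hashlib
import hmac

def location_key(master_secret: bytes, cell: tuple) -> bytes:
    """Key an anchor would derive for a given grid cell (hypothetical KDF)."""
    return hashlib.sha256(master_secret + repr(cell).encode()).digest()

def authenticate(public_key: bytes, master_secret: bytes, cell: tuple) -> bytes:
    """MAC over the public key, keyed by the node's deployment cell."""
    return hmac.new(location_key(master_secret, cell), public_key,
                    hashlib.sha256).digest()

def verify(public_key: bytes, tag: bytes, master_secret: bytes,
           claimed_cell: tuple) -> bool:
    """A key only verifies at the position the node is supposed to occupy."""
    expected = authenticate(public_key, master_secret, claimed_cell)
    return hmac.compare_digest(tag, expected)

secret = b"anchor-master-secret"      # shared by trusted anchors (assumption)
pk = b"node-42-ecc-public-key-bytes"  # placeholder for an ECC public key
tag = authenticate(pk, secret, (3, 7))
```

Under this sketch, a captured node replicated into a different cell fails verification because its tag was bound to the original position.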

Keywords: wireless sensor networks, border surveillance, security, key distribution, location-based

Procedia PDF Downloads 644
15415 [Keynote Talk]: Caught in the Tractorbeam of Larger Influences: The Filtration of Innovation in Education Technology Design

Authors: Justin D. Olmanson, Fitsum Abebe, Valerie Jones, Eric Kyle, Xianquan Liu, Katherine Robbins, Guieswende Rouamba

Abstract:

The history of education technology, and of designing, adapting, and adopting technologies for use in educational spaces, is nuanced, complex, and dynamic. Yet, despite a range of continually emerging technologies, the design and development process often yields results that appear quite similar in terms of affordances and interactions. Through this study we (1) verify the extent to which designs have been constrained, (2) consider what might account for it, and (3) offer a way forward in terms of how we might identify and strategically sidestep these influences, thereby increasing the diversity of our designs with a given technology or within a particular learning domain. We begin our inquiry from the perspective that a host of co-influencing elements, fields, and metanarratives converge on the education technology design process to exert a tangible, often homogenizing effect on the resulting designs. We identify several elements that influence design in often implicit or unquestioned ways (e.g., curriculum, learning theory, economics, learning context, pedagogy), describe our methodology for identifying the elemental positionality embedded in a design, direct our analysis to a particular subset of technologies in the field of literacy, and unpack our findings. Our early analysis suggests that the majority of education technologies designed for or used in US public schools are heavily influenced by a handful of mainstream theories and metanarratives. These findings have implications for how we approach the education technology design process, and we use them to suggest alternative methods for designing and developing with emerging technologies. Our analytical process and reconceptualized design process hold the potential to diversify the ways emerging and established technologies are incorporated into our designs.

Keywords: curriculum, design, innovation, meta narratives

Procedia PDF Downloads 496
15414 Architectural Design Studio (ADS) as an Operational Synthesis in Architectural Education

Authors: Francisco A. Ribeiro Da Costa

Abstract:

Those responsible for teaching architecture consider various ways to participate in learning, deploying various pedagogical tools to streamline the creative process. The Architectural Design Studio (ADS) should become a holistic, systemic process responding to the complexity of our world. This essay corresponds to a deep reflection developed by the author on the teaching of architecture. The outcomes achieved are the corollary of experimentation, discussion, and application of pedagogical methods that allowed the creativity applied by students to be consolidated. The purpose is to show the conjectures that have proved effective in creating an intellectual environment that nurtures the subject of the Architectural Design Studio (ADS) as an operational synthesis in the final stage of the degree. These assumptions, which are part of the proposed model, draw on theories and teaching methodologies that respect a learning process based on students' learning styles (Kolb), ensuring their latent specificities and shaping the structure of the ADS course. In addition, assessment methods are proposed that consider the Architectural Design Studio as an operational synthesis in the teaching of architecture.

Keywords: teaching-learning, architectural design studio, architecture, education

Procedia PDF Downloads 374
15413 Denoising Convolutional Neural Network Assisted Electrocardiogram Signal Watermarking for Secure Transmission in E-Healthcare Applications

Authors: Jyoti Rani, Ashima Anand, Shivendra Shivani

Abstract:

In recent years, physiological signals obtained in telemedicine have been stored independently of patient information. In addition, people have increasingly turned to mobile devices for information on health-related topics. Such storage can raise major authentication and security issues, degrading the reliability of diagnostics. This study introduces a reversible watermarking approach that ensures security by utilizing the electrocardiogram (ECG) signal as a carrier for embedding patient information. In the proposed work, Pan-Tompkins++ is employed to convert the 1-D ECG signal into a 2-D signal. The frequency subbands of the signal are extracted using the redundant discrete wavelet transform (RDWT), and one of the subbands is then subjected to multiresolution singular value decomposition (MSVD) for masking. Finally, the encrypted watermark is embedded within the signal. The experimental results show that the watermarked signal is indistinguishable from the original, ensuring the preservation of all diagnostic information. In addition, a denoising convolutional neural network (DnCNN) is used to denoise the retrieved watermark for improved accuracy. The proposed ECG signal-based watermarking method is supported by experimental results and evaluations of its effectiveness, and the robustness tests demonstrate that the watermark withstands the most prevalent watermarking attacks.
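The embed/extract round trip at the heart of such a pipeline can be illustrated with singular values. Note the simplifications: the paper uses RDWT plus MSVD on a Pan-Tompkins++ 2-D ECG representation, whereas this sketch applies a plain SVD to a synthetic block with well-separated singular values, and the bit pattern, block size, and embedding strength are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 8x8 "subband" block whose singular values are well separated,
# so their ordering survives the small perturbation below.
vals = np.array([100.0, 80.0, 60.0, 40.0, 20.0, 10.0, 5.0, 2.0])
Q1, _ = np.linalg.qr(rng.normal(size=(8, 8)))
Q2, _ = np.linalg.qr(rng.normal(size=(8, 8)))
block = Q1 @ np.diag(vals) @ Q2

bits = np.array([1, 0, 1, 1, 0, 1, 0, 0])   # hypothetical watermark bits
alpha = 0.05                                 # embedding strength

# Embed: nudge each singular value up (bit 1) or down (bit 0).
U, s, Vt = np.linalg.svd(block)
s_marked = s * (1 + alpha * (2 * bits - 1))
watermarked = U @ np.diag(s_marked) @ Vt

# Extract (non-blind: the original singular values are needed).
s_rec = np.linalg.svd(watermarked, compute_uv=False)
recovered = (s_rec / s > 1).astype(int)
```

The relative distortion introduced equals alpha in the Frobenius norm, which is why small perturbations of singular values keep the carrier visually and diagnostically intact.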

Keywords: ECG, VMD, watermarking, PanTompkins++, RDWT, DnCNN, MSVD, chaotic encryption, attacks

Procedia PDF Downloads 79
15412 Deep Learning-Based Object Detection on Low Quality Images: A Case Study of Real-Time Traffic Monitoring

Authors: Jean-Francois Rajotte, Martin Sotir, Frank Gouineau

Abstract:

The installation and management of traffic monitoring devices can be costly from both a financial and resource point of view. It is therefore important to take advantage of in-place infrastructures to extract the most information. Here we show how low-quality urban road traffic images from cameras already available in many cities (such as Montreal, Vancouver, and Toronto) can be used to estimate traffic flow. To this end, we use a pre-trained neural network, developed for object detection, to count vehicles within images. We then compare the results with human annotations gathered through crowdsourcing campaigns. We use this comparison to assess performance and calibrate the neural network annotations. As a use case, we consider six months of continuous monitoring over hundreds of cameras installed in the city of Montreal. We compare the results with city-provided manual traffic counting performed in similar conditions at the same location. The good performance of our system allows us to consider applications which can monitor the traffic conditions in near real-time, making the counting usable for traffic-related services. Furthermore, the resulting annotations pave the way for building a historical vehicle counting dataset to be used for analysing the impact of road traffic on many city-related issues, such as urban planning, security, and pollution.
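The calibration step described above, aligning raw detector counts with crowdsourced human counts, can be sketched as a simple least-squares fit. The (detector, human) count pairs below are synthetic stand-ins, not the Montreal data.

```python
# (neural-net count, human-annotated count) pairs from a hypothetical
# crowdsourcing campaign; the detector undercounts in this synthetic data.
pairs = [(3, 4), (7, 9), (10, 13), (5, 6), (12, 16), (8, 10)]

# Ordinary least squares for y = slope * x + intercept.
n = len(pairs)
sx = sum(x for x, _ in pairs)
sy = sum(y for _, y in pairs)
sxx = sum(x * x for x, _ in pairs)
sxy = sum(x * y for x, y in pairs)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def calibrated(raw_count):
    """Correct a raw detector count using the fitted linear map."""
    return slope * raw_count + intercept

print(f"count = {slope:.2f} * raw + {intercept:.2f}")
```

Once fitted per camera (or per viewing condition), the correction can be applied to new detections in near real time.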

Keywords: traffic monitoring, deep learning, image annotation, vehicles, roads, artificial intelligence, real-time systems

Procedia PDF Downloads 180
15411 Polyethylene Terephthalate (PET) Fabrics Decoloring for PET Textile Recycle

Authors: Chung-Yang Chuang, Hui-Min Wang, Min-Yan Dong, Chang-Jung Chang

Abstract:

PET fiber is the most widely used fiber worldwide. This man-made fiber is prepared from petroleum chemicals, which raises environmental pollution and resource depletion issues, such as the use of non-renewable sources, greenhouse gas emissions, and the discharge of wastewater. Textiles made from recycled PET are therefore the future trend: recycled-PET fiber, compared with petroleum-derived PET, entails lower carbon emissions and resource depletion. However, fabric decoloring is the key barrier to textile recycling, as the dyes present in the fabrics may cause PET chain degradation and appearance drawbacks during the recycling process. In this research, a water-based decoloring agent was used to remove the disperse dye from PET fabrics in order to obtain colorless PET fabrics after the decoloring process. The decoloring rate of the PET fabrics was up to 99.0%. This research provides a solution to the degradation of appearance and physical properties of fabric-recycled PET materials caused by residual dye. It may make it possible to convert waste PET textiles into new high-quality PET fiber and close the loop of PET textile recycling.
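A decoloring rate like the 99.0% above is commonly computed from the colour strength (Kubelka-Munk K/S) of the fabric before and after treatment. The abstract does not state which metric was used, so this is a general sketch with invented reflectance values.

```python
def kubelka_munk(reflectance):
    """Kubelka-Munk colour strength K/S = (1 - R)^2 / (2R), 0 < R <= 1."""
    return (1 - reflectance) ** 2 / (2 * reflectance)

def decoloring_rate(ks_before, ks_after):
    """Percentage drop in colour strength after the decoloring treatment."""
    return 100 * (ks_before - ks_after) / ks_before

ks_dyed = kubelka_munk(0.05)      # deeply dyed fabric reflects little light
ks_stripped = kubelka_munk(0.90)  # near-white fabric after decoloring
print(f"{decoloring_rate(ks_dyed, ks_stripped):.1f}% decolored")
```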

Keywords: PET, decoloring, disperse dye, textile recycle

Procedia PDF Downloads 122
15410 Leadership, a Tool to Support Innovations and Inventive Education at Universities

Authors: Peter Balco, Miriam Filipova

Abstract:

University education generally concentrates on acquiring theoretical as well as professional knowledge. The right mix of these two kinds of knowledge is key to creating innovative and inventive solutions. Although the professional community understands their importance, they are often promoted with difficulty and misunderstanding. The reason many non-traditional, innovative approaches fail is the neglect of leadership in the process of their implementation, i.e., in decision-making. In our paper, we focus on the role of leadership in the educational process and on how this knowledge can support decision-making and the selection of a suitable, optimal solution for practice.

Keywords: leadership, soft skills, innovation, invention, knowledge

Procedia PDF Downloads 175
15409 Iterative Design Process for Development and Virtual Commissioning of Plant Control Software

Authors: Thorsten Prante, Robert Schöch, Ruth Fleisch, Vaheh Khachatouri, Alexander Walch

Abstract:

The development of industrial plant control software is a complex and often very expensive task. One of the core problems is that a lot of the implementation and adaptation work can only be done after the plant hardware has been installed. In this paper, we present our approach to virtually developing and validating plant-level control software of production plants. This way, plant control software can be virtually commissioned before actual ramp-up of a plant, reducing actual commissioning costs and time. Technically, this is achieved by linking the actual plant-wide process control software (often called plant server) and an elaborate virtual plant model together to form an emulation system. Method-wise, we are suggesting a four-step iterative process with well-defined increments and time frame. Our work is based on practical experiences from planning to commissioning and start-up of several cut-to-size plants.
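The core emulation idea, exercising the plant-level control logic against a virtual plant model instead of real hardware, can be illustrated with a toy stand-in; both classes below are hypothetical simplifications, not the authors' software.

```python
# Minimal sketch of virtual commissioning: the controller drives a simulated
# plant through the same interface it would later use for real hardware.

class VirtualSaw:
    """Toy virtual plant model: a cut-to-size saw with a job queue."""
    def __init__(self):
        self.queue = []
        self.done = []

    def load(self, job):
        self.queue.append(job)

    def step(self):
        if self.queue:
            self.done.append(self.queue.pop(0))

class PlantServer:
    """Toy plant-wide controller; unaware whether the saw is real or virtual."""
    def __init__(self, saw):
        self.saw = saw

    def run(self, jobs):
        for job in jobs:
            self.saw.load(job)
        while self.saw.queue:   # same loop would drive the physical plant
            self.saw.step()
        return self.saw.done

server = PlantServer(VirtualSaw())
finished = server.run(["panel-A", "panel-B"])
```

Because the controller only sees the saw's interface, swapping `VirtualSaw` for a hardware driver after virtual validation requires no control-logic changes, which is what allows commissioning work to start before the plant is built.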

Keywords: iterative system design, virtual plant engineering, plant control software, simulation and emulation, virtual commissioning

Procedia PDF Downloads 472
15408 Alpha: A Groundbreaking Avatar Merging User Dialogue with OpenAI's GPT-3.5 for Enhanced Reflective Thinking

Authors: Jonas Colin

Abstract:

Standing at the vanguard of AI development, Alpha represents an unprecedented synthesis of logical rigor and human abstraction, meticulously crafted to mirror the user's unique persona and personality, a feat previously unattainable in AI development. Alpha, an avant-garde artefact in the realm of artificial intelligence, epitomizes a paradigmatic shift in personalized digital interaction, amalgamating user-specific dialogic patterns with the sophisticated algorithmic prowess of OpenAI's GPT-3.5 to engender a platform for enhanced metacognitive engagement and individualized user experience. Underpinned by a sophisticated algorithmic framework, Alpha integrates vast datasets through a complex interplay of neural network models and symbolic AI, facilitating a dynamic, adaptive learning process. This integration enables the system to construct a detailed user profile, encompassing linguistic preferences, emotional tendencies, and cognitive styles, tailoring interactions to align with individual characteristics and conversational contexts. Furthermore, Alpha incorporates advanced metacognitive elements, enabling real-time reflection and adaptation in communication strategies. This self-reflective capability ensures continuous refinement of its interaction model, positioning Alpha not just as a technological marvel but as a harbinger of a new era in human-computer interaction, where machines engage with us on a deeply personal and cognitive level, transforming our interaction with the digital world.

Keywords: chatbot, GPT-3.5, metacognition, symbiosis

Procedia PDF Downloads 48
15407 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction, and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Compensation of Target Motion Parameter (TMP) effects should therefore be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration) and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization, is proposed. The method is carried out in two major steps: in the first step, a discrete search over the whole acceleration-velocity lattice within a specific interval is used to find a coarse minimum point of the entropy function. In the second step, a 1-D search over velocity is performed in the neighborhood of that minimum along several constant-acceleration lines, in order to refine the accuracy of the minimum found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
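The two-step search can be sketched as follows. Since the actual HRRP entropy computation is not given in the abstract, a toy surrogate cost with a single minimum at an assumed true velocity/acceleration pair stands in for it; all numeric values are illustrative.

```python
# Illustrative sketch of the two-step entropy-minimization search: a coarse
# grid over the acceleration-velocity lattice, then a fine 1-D velocity
# search along a few constant-acceleration lines.

def entropy_surrogate(v, a, v_true=12.3, a_true=0.45):
    """Stand-in for HRRP entropy; minimized at the true motion parameters."""
    return (v - v_true) ** 2 + 4.0 * (a - a_true) ** 2

def coarse_grid_search(cost, v_range, a_range, steps=20):
    """Step 1: discrete search over the whole acceleration-velocity lattice."""
    best = None
    for i in range(steps + 1):
        v = v_range[0] + (v_range[1] - v_range[0]) * i / steps
        for j in range(steps + 1):
            a = a_range[0] + (a_range[1] - a_range[0]) * j / steps
            c = cost(v, a)
            if best is None or c < best[0]:
                best = (c, v, a)
    return best[1], best[2]

def refine_velocity(cost, v0, a0, half_width=1.0, steps=200):
    """Step 2: fine 1-D velocity search along constant-acceleration lines."""
    best = None
    for da in (-0.05, 0.0, 0.05):      # a few constant-acceleration lines
        a = a0 + da
        for i in range(steps + 1):
            v = v0 - half_width + 2 * half_width * i / steps
            c = cost(v, a)
            if best is None or c < best[0]:
                best = (c, v, a)
    return best[1], best[2]

v_c, a_c = coarse_grid_search(entropy_surrogate, (0.0, 20.0), (0.0, 1.0))
v_f, a_f = refine_velocity(entropy_surrogate, v_c, a_c)
```

The coarse pass keeps the lattice small, and the 1-D refinement recovers the accuracy lost to the coarse grid spacing, mirroring the cost/accuracy trade-off the abstract describes.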

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 138
15406 Reconstruction of Age-Related Generations of Siberian Larch to Quantify the Climatogenic Dynamics of Woody Vegetation Close to the Upper Limit of Its Growth

Authors: A. P. Mikhailovich, V. V. Fomin, E. M. Agapitov, V. E. Rogachev, E. A. Kostousova, E. S. Perekhodova

Abstract:

Woody vegetation near the upper limit of its habitat is a sensitive indicator of the biota's reaction to regional climate changes. Quantitative assessment of temporal and spatial changes in the distribution of trees and plant biocenoses calls for the development of new modeling approaches based on selected data from ground-level measurements and ultra-resolution aerial photography. Statistical models were developed for the study area, located in the Polar Urals. These models allow obtaining probabilistic estimates for placing Siberian larch trees into one of three age intervals, namely 1-10, 11-40, and over 40 years, based on a Weibull distribution of the maximum horizontal crown projection. The authors developed a distribution map for larch trees with crown diameters exceeding twenty centimeters by deciphering aerial photographs taken by a UAV from an altitude of fifty meters. The total number of larches was 88608, distributed across the above intervals as 16980, 51740, and 19889 trees. The results demonstrate that two processes can be observed over recent decades: first, intensive forestation of previously barren or lightly wooded fragments of the study area located within patches of wood, woodlands, and sparse stands, and second, expansion into mountain tundra. The current expansion of the Siberian larch in the region replaced the depopulation process that occurred in the course of the Little Ice Age, from the late 13ᵗʰ to the end of the 20ᵗʰ century. Using data from field measurements of Siberian larch specimen biometric parameters (including height, diameter at the root collar and at 1.3 meters, and the maximum projection of the crown in two orthogonal directions) and data on tree ages obtained at nine circular test sites, the authors developed an artificial neural network model with two layers of three and two neurons, respectively. 
The model allows quantitative assessment of a specimen's age based on its height and maximum crown projection. Tree height and crown diameter can be quantitatively assessed using data from aerial photographs and lidar scans, so the resulting model can be used to assess the age of all Siberian larch trees in the study area. After validation, the proposed approach can be applied to assessing the age of other tree species growing near the upper tree line in other mountainous regions. This research was collaboratively funded by the Russian Ministry for Science and Education (project No. FEUG-2023-0002) and the Russian Science Foundation (project No. 24-24-00235) in the field of data modeling on the basis of artificial neural networks.
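The described two-layer (three- and two-neuron) network amounts to a small forward pass from (height, crown projection) to age. A sketch under that assumption follows; all weights are illustrative placeholders, not the fitted values from the study.

```python
import math

# Hypothetical forward pass of a 3-then-2 neuron tanh network mapping tree
# height (m) and maximum crown projection (m) to an age estimate (years).

def mlp_age(height_m, crown_m,
            w1=((0.8, 0.5), (-0.3, 0.9), (0.4, -0.2)), b1=(0.1, 0.0, -0.1),
            w2=((1.2, -0.4, 0.7), (0.3, 0.8, -0.5)), b2=(0.0, 0.1),
            w_out=(25.0, 15.0), b_out=5.0):
    # First hidden layer: three tanh neurons over the two inputs.
    h1 = [math.tanh(w[0] * height_m + w[1] * crown_m + b)
          for w, b in zip(w1, b1)]
    # Second hidden layer: two tanh neurons over the first layer.
    h2 = [math.tanh(sum(wi * hi for wi, hi in zip(w, h1)) + b)
          for w, b in zip(w2, b2)]
    # Linear output, clamped so an age estimate is never negative.
    return max(0.0, sum(wi * hi for wi, hi in zip(w_out, h2)) + b_out)

estimate = mlp_age(5.0, 2.0)
```

In practice the weights would be fitted to the nine test sites' (height, crown, age) triples; the point of the sketch is only the input-to-age architecture.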

Keywords: treeline, dynamic, climate, modeling

Procedia PDF Downloads 45
15405 Designing Floor Planning in 2D and 3D with an Efficient Topological Structure

Authors: V. Nagammai

Abstract:

Very-large-scale integration (VLSI) is the process of creating an integrated circuit (IC) by combining thousands of transistors into a single chip. Advances in technology increase the complexity of IC manufacturing, which may vary the power consumption and increase the size and latency period. Topology defines the pattern of connections within a network. In this project, the NoC topology is generated using the Atlas tool, which improves performance and makes the determination of constraints effective. Routing is performed by the XY routing algorithm with wormhole flow control. In NoC topology generation, the values of power, area, and latency are predetermined. In previous work, placement, routing, and shortest path evaluation were performed using an algorithm called floor planning with cluster reconstruction and path allocation (FCRPA), using four 3x3 switches, six 4x4 switches, and two 5x5 switches. The use of the 4x4 and 5x5 switches increases the power consumption and area of the block. To avoid this problem, this paper uses one 8x8 switch and four 3x3 switches. This paper uses IPRCA, which consists of three steps: placement, clustering, and shortest path evaluation. Placement is performed using min-cut placement, clustering using an algorithm called cluster generation, and shortest path evaluation using Dijkstra's algorithm. The power consumption of each block is determined. The experimental results show that area, power, and wire length are improved simultaneously.
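The shortest-path evaluation step can be sketched with a standard Dijkstra implementation; the small weighted switch graph below is illustrative, not the paper's actual network.

```python
import heapq

# Minimal Dijkstra sketch for shortest-path evaluation over a weighted
# switch network, represented as an adjacency list {node: [(neighbor, cost)]}.

def dijkstra(graph, src):
    """Return the cheapest cost from src to every reachable node."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

graph = {
    "s0": [("s1", 2), ("s2", 5)],
    "s1": [("s2", 1), ("s3", 4)],
    "s2": [("s3", 1)],
}
# Cheapest route s0 -> s1 -> s2 -> s3 has total cost 2 + 1 + 1 = 4,
# beating the direct s0 -> s2 edge of cost 5.
shortest = dijkstra(graph, "s0")
```

Edge weights here stand in for whatever per-hop cost the floor planner optimizes (wire length, latency, or power).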

Keywords: application specific noc, b* tree representation, floor planning, t tree representation

Procedia PDF Downloads 383
15404 Synthesis of Electrospun Polydimethylsiloxane (PDMS)/Polyvinylidene Fluoride (PVDF) Nanofibrous Membranes for CO₂ Capture

Authors: Wen-Wen Wang, Qian Ye, Yi-Feng Lin

Abstract:

Carbon dioxide emissions are expected to increase continuously, resulting in climate change and global warming. As a result, CO₂ capture has attracted a large amount of research attention. Among the various CO₂ capture methods, membrane technology has proven highly efficient at capturing CO₂ because it can be scaled up and offers low energy consumption and small area requirements for gas separation. In this study, a membrane contactor combining chemical absorption with a membrane process is used for post-combustion CO₂ capture. In a membrane contactor system, highly porous, water-repellent nanofibrous membranes serve as the gas-liquid interface for CO₂ absorption. In this work, we successfully prepared porous polyvinylidene fluoride (PVDF) membranes by a simple electrospinning process and used the as-prepared water-repellent PVDF porous membranes for the CO₂ capture application. However, the pristine PVDF nanofibrous membranes were wetted by the amine absorbents, decreasing the CO₂ absorption flux, so hydrophobic polydimethylsiloxane (PDMS) was added to the PVDF nanofibrous membranes to improve their solvent resistance. To further increase the hydrophobicity and CO₂ absorption flux, more hydrophobic surfaces of the PDMS/PVDF nanofibrous membranes were obtained by grafting fluoroalkylsilane (FAS) onto the membrane surface, and the highest CO₂ absorption flux was reached after four rounds of FAS modification. The PDMS/PVDF nanofibrous membranes with 60 wt% PDMS addition withstand long, continuous operation in CO₂ absorption and regeneration experiments. 
This demonstrates that the as-prepared PDMS/PVDF nanofibrous membranes could potentially be used for large-scale CO₂ absorption during the post-combustion process in power plants.

Keywords: CO₂ capture, electrospinning process, membrane contactor, nanofibrous membranes, PDMS/PVDF

Procedia PDF Downloads 263
15403 The Social Process of Alternative Dispute Resolution and Collective Conciliation: Unveiling the Theoretical Framework

Authors: Adejoke Yemisi Ige

Abstract:

This study presents a conceptual analysis of, and investigation into, the development of a systematic framework for better understanding the social process of Alternative Dispute Resolution (ADR) and collective conciliation. The critical examination presented here is significant because it draws on insights from the ADR, negotiation, and collective bargaining literature and applies them to the advancement of a methodical outline of how the strategies and behaviours of the key actors and other stakeholders during dispute resolution influence outcomes, which is novel. The study is qualitative and essentially inductive in nature. One finding confirms the need to consider ADR and collective conciliation within the context of their characteristic conditions, which centre on the need for some agreement to be reached. Another finding shows the extent to which information-sharing and the parties' willingness to negotiate and make concessions help both parties attain resolution. This paper recommends that, in order to overcome deadlock and attain acceptable outcomes at the end of ADR and collective conciliation, the importance of information exchange and of sustaining the trade union-management relationship cannot be overstated. It is essential that trade union and management representatives achieve their expectations in order to build the confidence and assurance of their respective constituents. In conclusion, the analysis presented here points towards a set of factors that together can be called the social process of collective conciliation, while acknowledging that its application to collective conciliation is new.

Keywords: alternative dispute resolution, collective conciliation, social process, theoretical framework, unveiling

Procedia PDF Downloads 138
15402 Introduction to Political Psychoanalysis of a Group in the Middle East

Authors: Seyedfateh Moradi, Abas Ali Rahbar

Abstract:

The present study investigates the psychoanalysis of groups in the Middle East. The study uses a descriptive-analytic method, and library resources were used to collect the data. Additionally, the researcher's observations of people's everyday behavior played an important role in the production and analysis of the study. Group psychoanalysis in the Middle East can draw on people's daily behaviors, proverbs, poetry, mythology, and so on; some general characteristics of people in the Middle East include xenophobia, revivalism, fatalism, nostalgia, and wills. Members of a group have often failed to achieve libidinal wills, and this is very important in unifying groups and reproducing violence. Therefore, when libidinal wills become irrationally fixed, they play an important role in forming fundamentalist and racist groups, a situation that is dominant among many groups in the Middle East. Adversities, from early childhood onwards, have always influenced the political behavior of group members and manifest themselves as counter-projections; consequently, they affect the foreign policy of governments. On the other hand, two kinds of subjects are identifiable in the Middle East: first, the classical subject, tied to nostalgia and mythology, and second, the modern subject, which is self-alienated. As a result, both subjects seek identity and public self-expression through forming groups. The collective unconscious in the Middle East therefore expresses itself in extreme boundaries and leads to the formation of groups characterized by violence. Psychoanalysis reveals important aspects of many developments in the Middle East; in general, the analyses of Freud, Carl Jung, and Reich concerning groups can be applied to the present-day Middle East.

Keywords: political, psychoanalysis, group, Middle East

Procedia PDF Downloads 286
15401 Metareasoning Image Optimization Q-Learning

Authors: Mahasa Zahirnia

Abstract:

The purpose of this paper is to explore new and effective ways of optimizing satellite images using artificial intelligence, implementing reinforcement learning to enhance the quality of the data captured within the image. Using Bellman's reinforcement learning equations, the associated state diagrams, and multi-stage image processing, we were able to enhance image quality and detect and define objects. Reinforcement learning is a differentiator in the area of artificial intelligence, and Q-learning relies on trial and error to achieve its goals. The reward system embedded in Q-learning allows the agent to self-evaluate its performance and decide on the best possible course of action based on the current and future environment. Results show that, within a simulated environment built on commercially available images, the detection rate was 40-90%. Reinforcement learning through the Q-learning algorithm is not just a desired but a required design criterion for image optimization and enhancement. The proposed methods are a cost-effective way of resolving the uncertainty in the data, because reinforcement learning finds ideal policies to manage the process using a smaller sample of images.
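The trial-and-error, reward-driven update at the heart of Q-learning can be sketched on a toy task. The 1-D "sweep toward a target" environment below is purely illustrative, not the satellite-image environment of the paper.

```python
import random

# Minimal tabular Q-learning sketch: the Bellman update
#   Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
# learned by epsilon-greedy trial and error on a 5-state chain.

def q_learning(n_states=5, n_actions=2, episodes=300,
               alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    # Optimistic initialization encourages early exploration of both actions.
    q = [[1.0] * n_actions for _ in range(n_states)]
    goal = n_states - 1
    for _ in range(episodes):
        s = 0
        while s != goal:
            if rng.random() < epsilon:
                a = rng.randrange(n_actions)                      # explore
            else:
                a = max(range(n_actions), key=lambda x: q[s][x])  # exploit
            s2 = min(s + 1, goal) if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == goal else 0.0                        # reward at goal
            future = 0.0 if s2 == goal else max(q[s2])            # terminal: no future
            q[s][a] += alpha * (r + gamma * future - q[s][a])
            s = s2
    return q

q = q_learning()
```

After training, the action that moves toward the rewarded state carries the higher Q-value, which is exactly the self-evaluation mechanism the abstract attributes to the reward system.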

Keywords: Q-learning, image optimization, reinforcement learning, Markov decision process

Procedia PDF Downloads 197
15400 Client Hacked Server

Authors: Bagul Abhijeet

Abstract:

Background: The client-server model is the backbone of today's internet communication, in which a normal user has no control over a particular website or server. Using the same processing model, however, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario for hacking a simple website or server, consisting of unauthorized access to the server database. The application autonomously takes direct access to a simple website or server and retrieves all the essential information maintained by the administrator. In this system, the IP address of the server is given as input in order to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of the server database, while a virus helps to escape server security by crashing the whole server. Objective: To control malicious attacks, protect government websites, and uncover illegal hacker activity. Results: After implementing different hacking as well as non-hacking techniques, this system hacks simple websites with normal security credentials. It provides access to the server database and allows the attacker to perform database operations from the client machine. The figure above shows the experimental results of this application on different servers, which are satisfactory. Conclusion: In this paper, we have presented a view of how to hack a server, including both hacking and non-hacking methods. These algorithms and methods provide an efficient way to hack a server database; breaking network security in this way allows new and better security frameworks to be introduced. The term "hacking" should be considered not only for its illegal activities but also as a means of strengthening our global network.

Keywords: hacking, vulnerabilities, dummy request, virus, server monitoring

Procedia PDF Downloads 239
15399 Network Based Speed Synchronization Control for Multi-Motor via Consensus Theory

Authors: Liqin Zhang, Liang Yan

Abstract:

This paper addresses the speed synchronization control problem for a network-based multi-motor system from the perspective of cluster consensus theory. Each motor is considered a single agent connected through a fixed, undirected network. The paper presents an improved control protocol in three respects. First, for the purpose of improving both tracking and synchronization performance, a distributed leader-following method is presented. The improved control protocol takes the importance of each motor's speed into consideration, and the motors are divided into groups according to speed weights. Specifically, by optimizing the control parameters, the synchronization error and tracking error can be regulated and decoupled to some extent. The simulation results demonstrate the effectiveness and superiority of the proposed strategy. Second, in practical engineering, simplified models such as single-integrator and double-integrator dynamics are unrealistic, and previous algorithms require the leader's acceleration to be available to all followers when the leader's velocity varies, which is difficult to realize. The method therefore uses an observer-based variable structure algorithm for consensus tracking, which dispenses with the leader's acceleration. The presented scheme optimizes synchronization performance and provides satisfactory robustness. Third, existing algorithms can obtain a stable synchronous system, but the obtained system may encounter disturbances that destroy the synchronization. To address this challenging technological problem, a state-dependent-switching approach is introduced. In the presence of unmeasured angular speeds and unknown failures, the paper investigates a distributed fault-tolerant consensus tracking algorithm for a group of non-identical motors. 
The failures are modeled by nonlinear functions, and a sliding mode observer is designed to estimate the angular speeds and nonlinear failures. The convergence and stability of the given multi-motor system are proved. Simulation results show that all followers asymptotically converge to a consistent state even when one follower fails to follow the virtual leader during a sufficiently large disturbance, which illustrates the good synchronization control accuracy.
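The basic leader-following consensus mechanism over a fixed, undirected network can be illustrated with a toy simulation. Single-integrator dynamics are assumed here purely for illustration (the abstract itself argues such simplified models are unrealistic for real motors), and only the pinned motor sees the leader.

```python
# Toy leader-following speed consensus: each "motor" adjusts its speed toward
# its network neighbours, and pinned motors also track the leader's speed.

def simulate_consensus(speeds, adjacency, leader_speed, pinned,
                       gain=0.3, steps=400):
    speeds = list(speeds)
    n = len(speeds)
    for _ in range(steps):
        new = []
        for i in range(n):
            # Disagreement with neighbours over the fixed undirected network.
            u = sum(adjacency[i][j] * (speeds[j] - speeds[i]) for j in range(n))
            if i in pinned:                  # only pinned motors see the leader
                u += leader_speed - speeds[i]
            new.append(speeds[i] + gain * u)
        speeds = new
    return speeds

adjacency = [[0, 1, 0, 1],
             [1, 0, 1, 0],
             [0, 1, 0, 1],
             [1, 0, 1, 0]]                   # undirected ring of 4 motors
final = simulate_consensus([0.0, 5.0, 10.0, 2.0], adjacency, 8.0, pinned={0})
```

Even though only motor 0 is pinned to the leader, the network coupling drags every speed to the leader's value, which is the consensus-tracking behaviour the abstract builds on.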

Keywords: consensus control, distributed follow, fault-tolerant control, multi-motor system, speed synchronization

Procedia PDF Downloads 111
15398 An Exploration of the Dimensions of Place-Making: A South African Case Study

Authors: W. J. Strydom, K. Puren

Abstract:

Place-making is viewed here as an empowering process in which people represent, improve, and maintain their spatial (natural or built) environment. Place-making is thus multi-dimensional and includes a spatial dimension (the visual properties of the end product/plan), a procedural dimension (the negotiation/discussion of ideas with all relevant stakeholders regarding the end product/plan), and a psychological dimension (the inclusion of intrinsic values and meanings related to a place in the end product/plan). These represent the three dimensions of place-making. The purpose of this paper is to explore these dimensions in a case study of a local community in Ikageng, Potchefstroom, North-West Province, South Africa. The case study represents an inclusive process that strives to empower a local community forcefully relocated under Apartheid legislation in South Africa, and focuses on the inclusion of participants in decision-making about their daily environment. By means of focus group discussions and a collaborative design workshop, data were generated that ultimately link back to the theoretical dimensions of place-making. This paper contributes to the field of spatial planning through its exploration of the dimensions of place-making and the relevance of this process for spatial planning, especially in a South African setting.

Keywords: community engagement, place-making, planning theory, spatial planning

Procedia PDF Downloads 379