Search results for: software transactional memory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5944

5614 An Architectural Approach for the Dynamic Adaptation of Services-Based Software

Authors: Mohhamed Yassine Baroudi, Abdelkrim Benammar, Fethi Tarik Bendimerad

Abstract:

This paper proposes a software architecture for dynamic service adaptation. The services are built from reusable software components. The goal of the adaptation is to optimize each service as a function of its execution context. As a first step, the context takes into account only the user's needs, but other elements will be added. A particular feature of our proposal is that profiles are used to describe not only the context's elements but also the components themselves. An adapter analyzes the compatibility between all these profiles and detects the points where they are incompatible. The same adapter then searches for and applies the possible adaptation solutions: component customization, insertion, extraction, or replacement.
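
To make the adapter's role concrete, here is a minimal sketch of profile matching and adaptation selection; the Profile fields and the decision rules are hypothetical illustrations, not the architecture's actual implementation:

```python
# Hypothetical sketch: profile matching for dynamic service adaptation.
# The profile fields and adaptation actions are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Profile:
    """Describes either a context element or a component."""
    provides: set = field(default_factory=set)
    requires: set = field(default_factory=set)

def incompatibilities(component: Profile, context: Profile) -> set:
    """Return the required features the component does not provide."""
    return context.requires - component.provides

def adapt(component: Profile, context: Profile) -> str:
    """Pick one of the adaptation actions named in the abstract."""
    missing = incompatibilities(component, context)
    if not missing:
        return "keep"            # profiles already compatible
    if missing <= component.requires:
        return "customize"       # tweak the existing component
    return "replace"             # insert/extract/replace a component

ctx = Profile(requires={"offline", "low-bandwidth"})
comp = Profile(provides={"offline"}, requires={"low-bandwidth"})
print(adapt(comp, ctx))          # -> "customize"
```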

Keywords: adaptive service, software component, service, dynamic adaptation

Procedia PDF Downloads 298
5613 Use of Hierarchical Temporal Memory Algorithm in Heart Attack Detection

Authors: Tesnim Charrad, Kaouther Nouira, Ahmed Ferchichi

Abstract:

In order to reduce the number of deaths due to heart problems, we propose the use of the Hierarchical Temporal Memory (HTM) algorithm, a real-time anomaly detection algorithm. HTM is a cortical learning algorithm modeled on the neocortex and used for anomaly detection; in other words, it is based on a conceptual theory of how the human brain may work. It is powerful in predicting unusual patterns, anomaly detection, and classification. In this paper, HTM has been implemented and tested on ECG datasets in order to detect cardiac anomalies. Experiments showed good performance in terms of specificity, sensitivity and execution time.
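
As a purely illustrative stand-in (this is not HTM, which learns sparse distributed representations online), a streaming anomaly scorer over ECG samples conveys the same real-time detection setting:

```python
# Illustrative streaming anomaly scorer for an ECG signal. NOT the HTM
# algorithm: it only mimics its online, real-time style by flagging
# samples that deviate strongly from recent history.
from collections import deque
import math

def stream_anomalies(samples, window=250, threshold=4.0):
    """Yield (index, value) for samples more than `threshold` standard
    deviations away from the mean of the trailing window."""
    history = deque(maxlen=window)
    for i, x in enumerate(samples):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((v - mean) ** 2 for v in history) / window
            std = math.sqrt(var) or 1e-9
            if abs(x - mean) / std > threshold:
                yield i, x
        history.append(x)

ecg = [0.0] * 300 + [5.0]            # flat signal with one spike
print(list(stream_anomalies(ecg)))   # -> [(300, 5.0)]
```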

Keywords: cardiac anomalies, ECG, HTM, real time anomaly detection

Procedia PDF Downloads 228
5612 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software

Authors: Anjushi Verma, Tirthankar Gayen

Abstract:

Although reliability is an important attribute of quality, especially for mission-critical systems, no versatile model yet exists for the reliability assessment of component-based software. The existing black box models make various assumptions that may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessment using these models is very high. Although there are some models based on the operational profile, it sometimes becomes extremely difficult to obtain the exact operational profile for a given operation. This paper discusses the drawbacks, deficiencies and limitations of black box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.

Keywords: black box, faults, failure, software reliability

Procedia PDF Downloads 443
5611 “Voiceless Memory” and Holodomor (Great Famine): The Power of Oral History to Challenge Official Historical Discourse

Authors: Tetiana Boriak

Abstract:

This study tests the correlation between official sources preserved in the archives and "unofficial" oral history regarding the Great Famine of 1932–1933 in Ukraine. The research shows poor preservation of the sources, which were deliberately destroyed by the totalitarian regime. It involves an analysis of five stages in the development of Holodomor oral history. It is oral history that reveals the mechanism of the mass killing. The research proves that using only one type of historical source leads to one particular reading of the history of the Holodomor, while using both types provides in-depth insight into the history of the famine.

Keywords: the Holodomor (the Great Famine), oral history, historical source, historical memory, totalitarianism

Procedia PDF Downloads 108
5610 Delivering User Context-Sensitive Service in M-Commerce: An Empirical Assessment of the Impact of Urgency on Mobile Service Design for Transactional Apps

Authors: Daniela Stephanie Kuenstle

Abstract:

Complex industries such as banking or insurance experience slow growth in mobile sales. While today's mobile applications are sophisticated and enable location-based and personalized services, consumers prefer online or even face-to-face services to complete complex transactions. A possible reason for this reluctance is that the service provided within transactional mobile applications (apps) does not adequately correspond to users' needs. Therefore, this paper examines the impact of the user context on mobile service (m-service) in m-commerce. Motivated by the potential that context-sensitive m-services hold for the future, the impact of temporal variations, as a dimension of user context, on m-service design is examined. In particular, the research question asks: does consumer urgency function as a determinant of m-service composition in transactional apps by moderating the relation between m-service type and m-service success? Thus, the aim is to explore the moderating influence of urgency on m-service types, which include Technology Mediated Service and Technology Generated Service. While mobile applications generally comprise features of both service types, this paper discusses whether unexpected urgency changes customer preferences for m-service types and how this consequently impacts overall m-service success, represented by purchase intention, loyalty intention and service quality. An online experiment with a random sample of N=1311 participants was conducted. Participants were divided into four treatment groups varying in m-service type and urgency level. They were exposed to two different urgency scenarios (high/low) and two different app versions conveying either technology mediated or technology generated service. Subsequently, participants completed a questionnaire to measure the effectiveness of the manipulation as well as the dependent variables. The research model was tested for direct and moderating effects of m-service type and urgency on m-service success. Three two-way analyses of variance confirmed the significance of the main effects but demonstrated no significant moderation of urgency on m-service types; the analysis of the gathered data did not confirm a moderating effect of urgency between m-service type and service success. Yet the findings suggest an additive effects model, with the highest purchase and loyalty intention for Technology Generated Service under high urgency, while Technology Mediated Service under low urgency demonstrates the strongest effect on service quality. The results also indicate an antagonistic relation between service quality and purchase intention depending on the level of urgency. Although confirmation of the significance of this finding is required, it suggests that only service convenience, as one dimension of mobile service quality, delivers conditional value under high urgency. This suggests a curvilinear pattern of service quality in e-commerce. Overall, the paper illustrates the complex interplay of technology, user variables, and service design. With this, it contributes to a finer-grained understanding of the relation between m-service design and situation dependency. Moreover, the importance of delivering situational value with apps depending on user context is emphasized. Finally, the present study raises the demand to continue researching the impact of situational variables on m-service design in order to develop more sophisticated m-services.
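
As a hedged illustration of the analysis design described above, a 2x2 between-subjects ANOVA can be run with statsmodels; the file name and column names are hypothetical placeholders, not the study's actual dataset:

```python
# Sketch: two-way ANOVA testing main effects of m-service type and urgency
# plus their interaction. Column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("experiment.csv")  # columns: purchase_intention, service_type, urgency

model = smf.ols("purchase_intention ~ C(service_type) * C(urgency)", data=df).fit()
print(anova_lm(model, typ=2))       # main effects + service_type:urgency interaction
```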

Keywords: mobile consumer behavior, mobile service design, mobile service success, self-service technology, situation dependency, user-context sensitivity

Procedia PDF Downloads 268
5609 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

This paper describes a high-performance parallel implementation of an exact diagonalization solver for quantum-electron models on a shared-memory computing system. The proposed algorithm includes a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum-electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
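
For readers unfamiliar with the format, a compressed row storage (CRS) build is shown below in plain Python, together with an illustrative guess at the "signs-only" packing (one sign bit per nonzero when the magnitudes are implied by the model parameters); the packing shown is an assumption, not the authors' exact layout:

```python
import numpy as np

def to_crs(dense):
    """Build the classic CRS arrays: values, column indices, row pointers."""
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))
    return np.array(values), np.array(col_idx), np.array(row_ptr)

H = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
values, cols, ptr = to_crs(H)

# "Signs-only" idea (illustrative): when every nonzero is +/-t for a known
# amplitude t, one packed bit per entry can replace the float value array.
signs = np.packbits(values < 0)   # 1 bit per nonzero instead of 8 bytes
```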

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 402
5608 Iterative Design Process for Development and Virtual Commissioning of Plant Control Software

Authors: Thorsten Prante, Robert Schöch, Ruth Fleisch, Vaheh Khachatouri, Alexander Walch

Abstract:

The development of industrial plant control software is a complex and often very expensive task. One of the core problems is that much of the implementation and adaptation work can only be done after the plant hardware has been installed. In this paper, we present our approach to virtually developing and validating the plant-level control software of production plants. In this way, plant control software can be virtually commissioned before the actual ramp-up of a plant, reducing commissioning costs and time. Technically, this is achieved by linking the actual plant-wide process control software (often called the plant server) and an elaborate virtual plant model together to form an emulation system. Method-wise, we suggest a four-step iterative process with well-defined increments and time frames. Our work is based on practical experience from the planning through to the commissioning and start-up of several cut-to-size plants.

Keywords: iterative system design, virtual plant engineering, plant control software, simulation and emulation, virtual commissioning

Procedia PDF Downloads 488
5607 Blueprinting of Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems

Authors: Bassam Istanbouli

Abstract:

With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good-quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise, changes will be extremely expensive and slow and will have many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation, logistics and, especially, an information system Enterprise Resource Planning (ERP) perspective. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be kept minimal, especially at the time of launching an organization's global software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP software for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of business processes in a software implementation project. If the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle (SDLC) method and the Accelerated SAP (ASAP) implementation method. Both methods start with the customer requirements, then blueprint the business processes, and finally map those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing those processes will result in normalized software.

Keywords: blueprint, ERP, modular, normalized

Procedia PDF Downloads 139
5606 Development of Academic Software for Medial Axis Determination of Porous Media from High-Resolution X-Ray Microtomography Data

Authors: S. Jurado, E. Pazmino

Abstract:

Determination of the medial axis of a porous media sample is a non-trivial problem of interest for several disciplines, e.g., hydrology, fluid dynamics, contaminant transport, filtration, oil extraction, etc. However, the computational tools available to researchers are limited and restricted. The primary aim of this work was to develop a series of algorithms to extract porosity, medial axis structure, and pore-throat size distributions from porous media domains. A complementary objective was to provide the algorithms as free computational software available to the academic community of researchers and students interested in 3D data processing. The burn algorithm was tested on porous media data obtained from High-Resolution X-Ray Microtomography (HRXMT) and on idealized computer-generated domains. The real data and idealized domains were discretized into voxel domains of 550³ elements and binarized to denote solid and void regions in order to determine porosity. The algorithm then identifies the layer of void voxels next to the solid boundaries. An iterative process removes, or 'burns', void voxels layer by layer until all the void space is characterized. Multiple strategies were tested to optimize the execution time and use of computer memory, i.e., segmentation of the overall domain into subdomains, vectorization of operations, and extraction of single burn layer data during the iterative process. The medial axis was determined by identifying regions where burnt layers collide. The final medial axis structure was refined to avoid concave-grain effects and utilized to determine the pore-throat size distribution. Graphical user interface software was developed to encompass all these algorithms, including the generation of idealized porous media domains. The software accepts HRXMT data as input, calculates porosity, medial axis, and pore-throat size distribution, and provides output in tabular and graphical formats. Preliminary tests of the software developed during this study achieved medial axis, pore-throat size distribution and porosity determination of 100³, 320³ and 550³ voxel porous media domains in 2, 22, and 45 minutes, respectively, on a personal computer (Intel i7 processor, 16 GB RAM). These results indicate that the software is a practical and accessible tool for postprocessing HRXMT data for the academic community.
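
The iterative burn loop lends itself to a compact sketch; the NumPy/SciPy version below peels one layer of void voxels per iteration, with collision detection and concave-grain refinement omitted for brevity:

```python
import numpy as np
from scipy import ndimage

def burn(void):
    """Iteratively 'burn' void voxels layer by layer.
    `void` is a boolean 3D array (True = pore space). Returns an array
    holding the burn iteration at which each void voxel was removed."""
    burn_step = np.zeros(void.shape, dtype=np.int32)
    remaining = void.copy()
    step = 0
    while remaining.any():
        step += 1
        # the layer of void voxels touching a non-void (solid or burnt) voxel
        layer = remaining & ~ndimage.binary_erosion(remaining)
        burn_step[layer] = step
        remaining &= ~layer
    return burn_step
```

Voxels whose burn step is a local maximum approximate the medial axis, and the burn step itself approximates the local pore radius in voxel units.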

Keywords: medial axis, pore-throat distribution, porosity, porous media

Procedia PDF Downloads 115
5605 Requirements Gathering for Improved Software Usability and the Potential for Usage-Centred Design

Authors: Kholod J. Alotaibi, Andrew M. Gravell

Abstract:

Usability is an important software quality that is often neglected at the design stage. Although methods exist to incorporate elements of usability engineering, there is a need for more balanced, usability-focused methods that can enhance the experience of software usability for users. In this regard, the potential of Usage-Centred Design (UgCD) is explored with respect to requirements gathering and is shown to lead to high software usability, among other benefits. It achieves this through its focus on usage: defining essential use cases, conducting task modeling, encouraging user collaboration, refining requirements, and so on. The requirements gathering process in UgCD is described in detail.

Keywords: requirements gathering, usability, usage-centred design, computer science

Procedia PDF Downloads 358
5604 Models Development of Graphical Human Interface Using Fuzzy Logic

Authors: Érick Aragão Ribeiro, George André Pereira Thé, José Marques Soares

Abstract:

Graphical human interfaces, also known as supervision software, are increasingly present in industrial processes supported by Supervisory Control and Data Acquisition (SCADA) systems, so the need for qualified developers is evident. In order to make engineering students able to produce high-quality supervision software, methods for its development must be created. In this paper, we propose a model, based on the international standards ISO/IEC 25010 and ISO/IEC 25040, for the development of graphical human interfaces. When compared with other methods through experiments, the model presented here leads to improved quality indexes and therefore helps guide the decisions of programmers. Results show the efficiency of the model and its contribution to student learning. Students assessed the training they received and considered it satisfactory.
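
As an illustration of how fuzzy logic can grade an ISO/IEC 25010-style metric, the sketch below fuzzifies a normalized score into linguistic labels; the membership shapes and breakpoints are assumptions, not the proposed model's actual rules:

```python
# Hedged sketch: triangular fuzzy membership for a normalized quality
# metric in [0, 1]. Labels and breakpoints are illustrative assumptions.

def triangular(x, a, b, c):
    """Classic triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(score):
    return {
        "poor":       triangular(score, -0.001, 0.0, 0.5),
        "acceptable": triangular(score, 0.0, 0.5, 1.0),
        "good":       triangular(score, 0.5, 1.0, 1.001),
    }

print(fuzzify(0.8))   # mostly "good" (0.6), partly "acceptable" (0.4)
```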

Keywords: software development models, software quality, supervision software, fuzzy logic

Procedia PDF Downloads 373
5603 A Comparative Study of Language Learning Strategy Use of Iranian Kurdish Bilinguals and Persian Monolinguals in an EFL Context

Authors: Reza Khani, Ziba Hosseini

Abstract:

This study investigates the differences between Iranian Kurdish–Persian bilinguals and Persian monolinguals regarding language learning strategy (LLS) use. The participants of the study were 120 monolingual Persian and 120 bilingual Kurdish students studying English as a foreign language (EFL). Data were collected using the Strategy Inventory for Language Learning (SILL). The results show that bilinguals reported higher use of language learning strategies in all categories of the SILL except memory strategies.

Keywords: language learning, memory, monolingual, comparative study

Procedia PDF Downloads 403
5602 Helping Older Users Stay Connected

Authors: Q. Raza

Abstract:

Getting old is inevitable, and tasks that were once simple become a daily struggle. This paper is a study of how older users interact with web applications, based upon a series of experiments. The experiments involved 12 participants and were split into two parts. The first set gives the users a feel for current social networks, and the second set takes the participants' feedback into consideration; the results of the two are compared. This paper goes into detail on psychological aspects such as social exclusion, metacognitive memory and therapeutic memories, and how these relate to users becoming isolated from society; social networking can be the roof on a foundation of successful computer interaction. The purpose of this paper is to carry out a study and to propose new ideas to help users be able to use social networking sites easily and efficiently.

Keywords: cognitive psychology, special memory, social networking, human computer interaction

Procedia PDF Downloads 445
5601 vADL.net: A Software Architecture Tool with Support for All Architectural Concepts

Authors: Adel Smeda, Badr Najep

Abstract:

Software architecture is a method of describing the architecture of a software system at a high level of abstraction. It represents a common abstraction of a system that stakeholders can use as a basis for mutual understanding, negotiation, consensus, and communication. It also manifests the earliest design decisions about a system, and these early bindings carry weight far out of proportion to their individual gravity with respect to the system's remaining development, its deployment, and its maintenance life; it is therefore the earliest point at which design decisions governing the system to be built can be analyzed. In this paper, we present a tool to model the architecture of software systems. It represents the first means by which system defects can be detected and provides a clear representation of a system's components and interactions at a high level of abstraction. It can be distinguished from other tools by its support for all software architecture elements. The tool is built using VB.net 2010. We used this tool to describe two well-known systems, i.e., Capitalize and Client/Server, and the descriptions we obtained support all architectural elements of the two systems.

Keywords: software architecture, architecture description languages, modeling

Procedia PDF Downloads 466
5600 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear. Long periods of quiet and slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the previously existing 3 phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolution vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but an accelerated growth of the computational complexity of living organisms. The following postulates may summarize the proposed hypothesis: biological evolution, as natural life origin and development, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and the biosphere's intelligence. The intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerated evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches the critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. The hypothesis logically resolves many puzzling problems of the current evolution theory, such as speciation (as a result of purposeful global-memory design), the evolution development vector (as a need for growing global intelligence), punctuated equilibrium (occurring when the two conditions a) and b) above are met), the Cambrian explosion, and mass extinctions (occurring when more intelligent species should replace outdated creatures).

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 164
5599 Intensive Use of Software in Teaching and Learning Calculus

Authors: Nodelman V.

Abstract:

Despite serious difficulties in the assimilation of the conceptual system of Calculus, software is used in the educational process only occasionally, and even then mainly for illustration purposes. There are a few reasons for this: the non-trivial nature of the studied material; lack of skills in working with software; fear of losing time while working with software; the variety of the software itself and of the corresponding interfaces, syntax, and working methods; the need to find suitable models and become familiar with them; and the incomplete compatibility of the models found with the content and teaching methods of the studied material. This paper proposes active use of the developed non-commercial software VusuMatica, which removes these restrictions through broad support for the studied mathematical material (and not only Calculus), so there is no need to select the right software; an emphasis on the unity of mathematics and its intra-subject and interdisciplinary relations; a user-friendly interface; the absence of special syntax for defining mathematical objects; ease of building and manipulating models of the studied material; and unlimited flexibility of models thanks to the ability to redefine objects, which allows exploring objects' characteristics and considering examples and counterexamples of the concepts under study. The construction of models is based on an original approach to the analysis of the structure of the studied concepts. Thanks to the ease of construction, students are able not only to use ready-made models but also to create them on their own and explore the studied material with their help. The presentation includes examples of using VusuMatica in studying the concepts of the limit and continuity of a function, its derivative, and its integral.
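
As a concrete instance of the material such models target, the limit of a function with a removable discontinuity, which software like this would let students probe with numeric epsilon-delta pairs (the particular function is an arbitrary illustration):

```latex
% Worked example of the concept a limit model would visualize:
% f(x) = (x^2 - 1)/(x - 1) has a removable gap at x = 1.
\[
\lim_{x \to 1} \frac{x^{2}-1}{x-1} = \lim_{x \to 1} (x+1) = 2,
\qquad
\forall \varepsilon > 0 \;\exists \delta > 0 :
0 < |x-1| < \delta \Rightarrow
\left| \frac{x^{2}-1}{x-1} - 2 \right| < \varepsilon .
\]
```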

Keywords: counterexamples, limitations and requirements, software, teaching and learning calculus, user-friendly interface and syntax

Procedia PDF Downloads 81
5598 A Case Study of Open Source Development Practices within a Large Company Setting

Authors: Alma Orucevic-Alagic, Martin Höst

Abstract:

Open source communities have demonstrated that complex and enterprise grade software can be produced, supported, and maintained by self-organizing groups of developers using primarily electronic form of communication. Due to the inherent nature of open source development, a specific set of open source software development practices has evolved. While there is an ongoing research on the topic of applicability of open source development practices within a company setting, still little is known about their benefits and challenges. The objective of this research is to understand if and to what degree open source development practices observed within a mature open source community are aligned with development practices within a large software and hardware company setting. For the purpose of this case study a set of open source development practices that are present in a mature open source community has been identified. Then, development practices of a large, international, hardware and software company based in Sweden were assessed and compared to the identified open source community practices. It is shown that there are many similarities between a mature open source community and a large company setting in regard to software development practices. We also identify practices that exist in open source communities and that are not standard within a company setting, but whose implementation can result in an improved software development efficiency within the company setting.

Keywords: development practices, open source software, innersource, closed open source

Procedia PDF Downloads 558
5597 Forensic Analysis of Signal Messenger on Android

Authors: Ward Bakker, Shadi Alhakimi

Abstract:

The number of people moving towards more privacy-focused instant messaging applications has grown significantly. Signal is one of these instant messaging applications, which makes it interesting for digital investigators. In this research, we evaluate the artifacts that are generated by the Signal messenger for Android. This evaluation was done by using the features that Signal provides to create artifacts, after which we made an image of the internal storage and of the process memory. This image was analysed manually. The manual analysis revealed the content that Signal stores in different locations during its operation. From our research, we were able to identify the artifacts and interpret how they were used. We also examined the source code of Signal. Using the knowledge obtained from the source code, we developed a tool that decrypts some of the artifacts using the key stored in the Android Keystore. In general, we found that most artifacts are encrypted and encoded, even after decrypting some of them. During data visualization, we also found that Signal does not use relationships between the data. In this research, two interesting groups of artifacts were identified: those related to the database and those stored in the process memory dump. In the database, we found plaintext private and group chats, and in the memory dump, we were able to retrieve the plaintext access code to the application. Nevertheless, we conclude that Signal contains a wealth of artifacts that could be very valuable to a digital forensic investigation.
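
The decryption step can be approximated as follows, assuming an AES-GCM-encrypted artifact and a key already recovered from the device; the 12-byte nonce-prefix layout is an illustrative assumption, not Signal's documented on-disk format:

```python
# Hedged sketch: decrypting an AES-GCM-encrypted artifact with a key
# exported from the Android Keystore. The nonce-prefix layout is an
# illustrative assumption, not Signal's actual format.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def decrypt_artifact(path: str, key: bytes) -> bytes:
    """key: a 16/24/32-byte AES key recovered during the investigation."""
    with open(path, "rb") as f:
        blob = f.read()
    nonce, ciphertext = blob[:12], blob[12:]   # assumed 12-byte nonce prefix
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```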

Keywords: forensic, signal, Android, digital

Procedia PDF Downloads 82
5596 Employing Visual Culture to Enhance Initial Adult Maltese Language Acquisition

Authors: Jacqueline Żammit

Abstract:

Recent research indicates that the utilization of right-brain strategies holds significant implications for the acquisition of language skills. Nevertheless, the utilization of visual culture as a means to stimulate these strategies and amplify language retention among adults engaging in second language (L2) learning remains a relatively unexplored area. This investigation delves into the impact of visual culture on activating right-brain processes during the initial stages of language acquisition, particularly in the context of teaching Maltese as a second language (ML2) to adult learners. By employing a qualitative research approach, this study convenes a focus group comprising twenty-seven educators to delve into a range of visual culture techniques integrated within language instruction. The collected data is subjected to thematic analysis using NVivo software. The findings underscore a variety of impactful visual culture techniques, encompassing activities such as drawing, sketching, interactive matching games, orthographic mapping, memory palace strategies, wordless picture books, picture-centered learning methodologies, infographics, Face Memory Game, Spot the Difference, Word Search Puzzles, the Hidden Object Game, educational videos, the Shadow Matching technique, Find the Differences exercises, and color-coded methodologies. These identified techniques hold potential for application within ML2 classes for adult learners. Consequently, this study not only provides insights into optimizing language learning through specific visual culture strategies but also furnishes practical recommendations for enhancing language competencies and skills.

Keywords: visual culture, right-brain strategies, second language acquisition, Maltese as a second language, visual aids, language-based activities

Procedia PDF Downloads 61
5595 A Gamification Teaching Method for Software Measurement Process

Authors: Lennon Furtado, Sandro Oliveira

Abstract:

The importance of an effective measurement program lies in the ability to control and predict what can be measured. Thus, a measurement program can provide a basis for decision-making that supports the interests of an organization. However, an effective measurement program can be applied only with a team of software engineers well trained in the measurement area. Yet the literature indicates that there are few computer science courses that include the teaching of the software measurement process in their programs, and even these generally present only basic theoretical concepts of said process and little or no measurement in practice, which results in the students' lack of motivation to learn the measurement process. In this context, according to some experts in software process improvement, one of the most used approaches to maintaining motivation and commitment to a software process improvement program is the use of gamification. Therefore, this paper aims to present a proposal for teaching the measurement process through gamification, which seeks to improve student motivation and performance in the assimilation of tasks related to software measurement by incorporating elements of games into the practice of the measurement process, making it more attractive for learning. As a way of validating the proposal, a comparison will be made between two distinct groups of 20 students in a Software Quality class: a control group, consisting of students who will not use the gamification proposal to learn the software measurement process, and an experiment group, consisting of students who will. This paper will then analyze the objective and subjective results for each group. As the objective result, the student grades reached at the end of the course will be analyzed, and as the subjective result, a post-course questionnaire with the opinion of each student about the teaching method will be analyzed. Finally, this paper aims to prove or refute the following hypothesis: the gamification proposal for teaching the software measurement process appropriately motivates students, attributing to them the competence necessary for the practical application of the measurement process.

Keywords: education, gamification, software measurement process, software engineering

Procedia PDF Downloads 314
5594 A Robust Software for Advanced Analysis of Space Steel Frames

Authors: Viet-Hung Truong, Seung-Eock Kim

Abstract:

This paper presents a robust software package for the practical advanced analysis of space steel framed structures. The pre- and post-processors of the presented software package are coded in C++, while the solver is written in FORTRAN. A user-friendly graphical interface is developed to facilitate the modeling process and the interpretation of results. The solver employs stability functions to capture second-order effects and minimize modeling and computational time. Both plastic-hinge and fiber-hinge beam-column elements are available in the presented software. The generalized displacement control method is adopted to solve the nonlinear equilibrium equations.

Keywords: advanced analysis, beam-column, fiber-hinge, plastic hinge, steel frame

Procedia PDF Downloads 307
5593 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM was developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, owing to its strong classification ability, Bi-LSTM is considered an alternative to the maximum likelihood (ML) algorithm, which is used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with LSTM network-aided DL-based OFDM-AIM (DeepAIM) and the classic OFDM-AIM that uses ML-based signal detection, via bit error rate (BER) performance and computational time criteria. Simulation results show that Bi-DeepAIM obtains better BER performance than DeepAIM and lower computation time in signal detection than ML-AIM.
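
A minimal PyTorch sketch of a Bi-LSTM detection head of the kind described; the feature layout (per-subcarrier I/Q pairs), layer sizes, and class count are placeholders rather than the paper's architecture:

```python
# Hedged sketch of a Bi-LSTM signal detector (PyTorch). Dimensions and the
# output class count are illustrative placeholders.
import torch
import torch.nn as nn

class BiLSTMDetector(nn.Module):
    def __init__(self, n_features=2, hidden=64, n_classes=16):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)  # fwd + bwd states

    def forward(self, x):               # x: (batch, subcarriers, 2) I/Q pairs
        out, _ = self.lstm(x)
        return self.head(out)           # per-subcarrier class logits

y = BiLSTMDetector()(torch.randn(8, 128, 2))
print(y.shape)                          # torch.Size([8, 128, 16])
```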

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 72
5592 Effects of Partial Sleep Deprivation on Prefrontal Cognitive Functions in Adolescents

Authors: Nurcihan Kiris

Abstract:

Restricted sleep is common in young adults and adolescents, yet the results of the few objective studies of the effect of sleep deprivation on cognitive performance remain unclear. In particular, the effect of sleep deprivation on cognitive functions associated with the frontal lobe, such as attention, executive functions, and working memory, is not well known. The aim of this study is to investigate experimentally the effect of partial sleep deprivation in adolescents on frontal lobe cognitive tasks, including working memory, strategic thinking, simple attention, continuous attention, executive functions, and cognitive flexibility. Subjects were recruited from volunteer students of Cukurova University. Eighteen adolescents underwent four consecutive nights of monitored sleep restriction (6–6.5 hr/night) and four nights of sleep extension (10–10.5 hr/night), in counterbalanced order and separated by a washout period. Following each sleep period, cognitive performance was assessed at a fixed morning time using a computerized neuropsychological battery based on frontal lobe function tasks, a timed test providing both accuracy and reaction time outcome measures. Only spatial working memory performance was found to be statistically lower in the restricted sleep condition than in the extended sleep condition. There was no significant difference in the performance of cognitive tasks evaluating simple attention, continuous attention, executive functions, and cognitive flexibility. It is thought that the spatial working memory and strategic thinking skills of adolescents in particular may be susceptible to sleep deprivation. On the other hand, adolescents are predicted to perform optimally under ideal sleep conditions, especially in circumstances requiring the short-term storage of visual information, the processing of stored information, and strategic thinking. The findings of this study may also point to possible negative functional effects of partial sleep deprivation on the processing of academic, social, and emotional inputs in adolescents. Acknowledgment: This research was supported by the Cukurova University Scientific Research Projects Unit.

Keywords: attention, cognitive functions, sleep deprivation, working memory

Procedia PDF Downloads 155
5591 Improving Software Technology to Support Release Process in Global Software Development Environment: An Experience Report

Authors: Hualter Barbosa, Bruno Bonifacio

Abstract:

The processes of globalization and new business have transformed the dynamics of software development. To meet the new demands, the software industry has adopted new methodologies that can shorten development cycles to ensure greater competitiveness. In this scenario, Global Software Development (GSD) has become a strategic element for the success of new products. However, reliability, timeliness, and perceived value can be influenced substantially by the automation of steps in the development process activities. In this sense, the development of new technologies can help developers and managers improve the quality of development. This paper presents a report on improving one of the release process activities of Sidia's mobile product area using software technology. The objective is to present the improvement of the CLCATCH tool, developed based on experimental studies and a qualitative analysis of the points of improvement for the release process in Android update projects for Samsung mobile devices. The results show improvement for the new version and approach of the tool, with points that can facilitate new features of the proposed technology.

Keywords: Android updated, empirical studies, GSD, process improvement

Procedia PDF Downloads 142
5590 The Desire for Significance & Memorability in Popular Culture: A Cognitive Psychological Study of Contemporary Literature, Art, and Media

Authors: Israel B. Bitton

Abstract:

"Memory" is associated with various phenomena, from the physical to the mental, the personal to the collective, and the historical to the cultural. As part of a broader exploration of memory studies in philosophy and science (slated for academic publication in October 2021), this study employs analytical methods of cognitive psychology and philosophy of memory to theorize that A) the primary human drive is a will to significance, in that every human action and expression can be rooted in a most primal desire to be cosmically significant (however that is individually perceived); and B) the will to significance manifests as a will to memorability, an innate desire to be remembered by others after death. In support of these broad claims, a review of various popular culture "touchpoints" (historic and contemporary records spanning literature, film and television, traditional news media, and social media) is presented to demonstrate how this very theory is repeatedly and commonly expressed, and has been for a long time, by many popular public figures as well as everyday people. Though developed before COVID, the crisis only increased the theory's relevance: so many people were forced to die alone, leaving them and their loved ones to face even greater existential angst than what ordinarily accompanies death, since the usual expectations for one's "final moments" were shattered. To underscore this issue of, and response to, what can be considered a sociocultural "memory gap", this study concludes with a summary of several projects launched by journalists at the height of the pandemic to document the memorable human stories behind COVID's tragic warp-speed death toll. Analyzed through the lens of Viktor E. Frankl's psychoanalytical perspective on existential meaning, these projects show how countless individuals were robbed of the last wills and testaments to their self-significance and memorability typically afforded to the dying and the aggrieved. The resulting insight ought to inform how government and public health officials determine what is truly "non-essential" to human health, physical and mental, at times of crisis.

Keywords: cognitive psychology, COVID, neuroscience, philosophy of memory

Procedia PDF Downloads 184
5589 The Development of Portable Application Software for Cardiovascular Fitness Norms of NDUM Cadet Students

Authors: Mohar Kassim, Hardy Azmir, Rahmat Sholihin Mokhtar

Abstract:

The purpose of this study is to build portable application software to determine the level of cardiovascular fitness of cadet students at the National Defence University of Malaysia (NDUM). Fitness in the context of this study refers to physical fitness, specifically a cardiovascular endurance test battery in the form of a 2.4 km run test for UPNM cadet students. This run test will be conducted to measure, test, and evaluate the performance of UPNM cadet students. All run test results can be recorded electronically inside the portable software, which will later be able to show the level of cardiovascular fitness of every cadet student according to age and gender. The software can also calculate the body mass index (BMI). The normative survey method will be used in this study through the analysis of the 2.4 km run test results, with the run test scores classified on interval and ratio scales. Based on the findings of this study, portable application software will be produced. The software will be able to directly assist the Military Training Academy (ALK), the Malaysian Armed Forces (ATM), and other relevant agencies in determining the level of cardiovascular fitness among their staff. The test can be done electronically and in portable mode. The next step to be taken is to have this application patented.
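
The two core computations of such software are simple to sketch; the run-time cut-offs below are invented placeholders, not the actual NDUM/ATM norms:

```python
# Hedged sketch: BMI and a 2.4 km run rating lookup. The cut-off times are
# invented placeholders, not the actual NDUM/ATM fitness norms.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def run_rating(minutes: float) -> str:
    cutoffs = [(11.0, "excellent"), (13.0, "good"), (15.0, "pass")]
    for limit, rating in cutoffs:
        if minutes <= limit:
            return rating
    return "fail"

print(round(bmi(70, 1.75), 1), run_rating(12.5))   # -> 22.9 good
```

In a full implementation, the cut-off table would be indexed by age and gender, matching the norms the abstract describes.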

Keywords: development, software, application, portable, fitness norms, cardiovascular endurance

Procedia PDF Downloads 549
5588 Oleic Acid Enhances Hippocampal Synaptic Efficacy

Authors: Rema Vazhappilly, Tapas Das

Abstract:

Oleic acid is a cis unsaturated fatty acid and is known to be a partially essential fatty acid due to its limited endogenous synthesis during pregnancy and lactation. Previous studies have demonstrated the role of oleic acid in neuronal differentiation and brain phospholipid synthesis. This evidence indicates a major role for oleic acid in learning and memory. Interestingly, oleic acid has been shown to enhance hippocampal long-term potentiation (LTP), the physiological correlate of long-term synaptic plasticity. However, the effect of oleic acid on short-term synaptic plasticity has not been investigated. Short-term potentiation (STP) is the physiological correlate of short-term synaptic plasticity, which is the key molecular mechanism underlying short-term memory and neuronal information processing. STP in the hippocampal CA1 region is known to require the activation of N-methyl-D-aspartate receptors (NMDARs). NMDAR-dependent hippocampal STP as a potential mechanism for short-term memory has been a subject of intense interest for the past few years. Therefore, in the present study the effect of oleic acid on NMDAR-dependent hippocampal STP was determined in mouse hippocampal slices (in vitro) using a multi-electrode array system. STP was induced by weak tetanic stimulation (one train of 100 Hz stimulation for 0.1 s) of the Schaffer collaterals of the CA1 region of the hippocampus in slices treated with different concentrations of oleic acid in the presence or absence of the NMDAR antagonist D-AP5 (30 µM). Oleic acid at 20 µM (mean increase in fEPSP amplitude = ~135% vs. control = 100%; P<0.001) and 30 µM (mean increase in fEPSP amplitude = ~280% vs. control = 100%; P<0.001) significantly enhanced the STP following weak tetanic stimulation. A lower oleic acid concentration of 10 µM did not modify the hippocampal STP induced by weak tetanic stimulation. The hippocampal STP induced by weak tetanic stimulation was completely blocked by the NMDAR antagonist D-AP5 (30 µM) in both oleic acid-treated and control hippocampal slices. This led to the conclusion that the hippocampal STP elicited by weak tetanic stimulation and enhanced by oleic acid was NMDAR-dependent. Together, these findings suggest that oleic acid may enhance short-term memory and neuronal information processing through the modulation of NMDAR-dependent hippocampal short-term synaptic plasticity. In conclusion, this study suggests a possible role for oleic acid in preventing short-term memory loss and impaired neuronal function throughout development.

Keywords: oleic acid, short-term potentiation, memory, field excitatory post synaptic potentials, NMDA receptor

Procedia PDF Downloads 335
5587 Stakeholder Management for Successful Software Projects

Authors: Kassem Saleh

Abstract:

An alarming number of software projects fail to deliver the required functionalities within the allocated budget and timeframe and with the required quality. Some of the main reasons for this problem include bad stakeholder management, poor communications, and informal change management; informal processes to identify, engage, and control stakeholders underlie all of these. Recently, to emphasize its importance, the Project Management Institute (PMI) updated the Project Management Body of Knowledge (PMBoK) to explicitly include the stakeholder management knowledge area. This knowledge area consists of four processes: identify stakeholders, plan stakeholder management, manage stakeholder engagement, and control stakeholder engagement. The use of appropriate techniques for stakeholder management in software projects will definitely lead to higher-quality and more successful software. In this paper, we describe some of the proven techniques that can be used during the execution of the four stakeholder management processes. The development of collaboration tools for automating these processes is recommended, and such tools need to be integrated into available software project management tools.
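
One proven technique for the identify/plan processes is the power/interest grid; a sketch of how a collaboration tool might encode it follows (the thresholds and labels follow common textbook convention, not a PMBoK-mandated scale):

```python
# Hedged sketch: classifying stakeholders on a power/interest grid, a
# common technique for the PMBoK "plan stakeholder management" process.

def engagement_strategy(power: float, interest: float) -> str:
    """Scores in [0, 1]; quadrant labels follow the usual textbook grid."""
    if power >= 0.5 and interest >= 0.5:
        return "manage closely"
    if power >= 0.5:
        return "keep satisfied"
    if interest >= 0.5:
        return "keep informed"
    return "monitor"

stakeholders = {"sponsor": (0.9, 0.8), "end user": (0.2, 0.9)}
for name, (p, i) in stakeholders.items():
    print(name, "->", engagement_strategy(p, i))
```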

Keywords: project management, stakeholder management, software development, project management body of knowledge

Procedia PDF Downloads 311
5586 On Crack Tip Stress Field in Pseudo-Elastic Shape Memory Alloys

Authors: Gulcan Ozerim, Gunay Anlas

Abstract:

In shape memory alloys, upon loading, stress increases around the crack tip and a martensitic phase transformation occurs in the early stages. In many studies the stress distribution in the vicinity of the crack tip is represented using linear elastic fracture mechanics (LEFM), although the pseudo-elastic behavior results in a nonlinear stress-strain relation. In this study, the HRR singularity (Hutchinson, Rice and Rosengren), which uses Rice's path-independent J-integral, is used to formulate the stress distribution around the crack tip. In the HRR approach, the Ramberg-Osgood model for the stress-strain relation of power-law hardening materials is used to represent the elastic-plastic behavior. Although it is recoverable, the inelastic portion of the deformation in the martensitic transformation (up to the end of the transformation) resembles plastic deformation. To determine the constants of the Ramberg-Osgood equation, the material's response is simulated in ABAQUS using a UMAT based on the ZM (Zaki-Moumni) thermo-mechanically coupled model, and the stress-strain curve of the material is plotted. An edge-cracked shape memory alloy (Nitinol) plate is loaded quasi-statically under mode I and modeled using ABAQUS; the opening stress values ahead of the crack tip are calculated. The stresses are also evaluated using the asymptotic equations of both LEFM and HRR. The results show that in the transformation zone around the crack tip, the stress values are much better represented when the HRR singularity is used, although the J-integral does not show path-independent behavior. For the nodes very close to the crack tip, the HRR singularity is not valid due to the non-proportional loading effect and the high stress values that go beyond the transformation finish stress.
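
For reference, the Ramberg-Osgood law and the HRR asymptotic crack-tip field referred to above take their standard forms:

```latex
% Ramberg-Osgood hardening law fitted to the simulated stress-strain curve:
\[
\frac{\varepsilon}{\varepsilon_0}
  = \frac{\sigma}{\sigma_0}
  + \alpha \left( \frac{\sigma}{\sigma_0} \right)^{n}
\]
% HRR asymptotic stress field ahead of the crack tip, where I_n is the
% integration constant and \tilde{\sigma}_{ij} the angular functions:
\[
\sigma_{ij}
  = \sigma_0 \left( \frac{J}{\alpha \, \sigma_0 \, \varepsilon_0 \, I_n \, r}
    \right)^{\frac{1}{n+1}} \tilde{\sigma}_{ij}(\theta, n)
\]
```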

Keywords: crack, HRR singularity, shape memory alloys, stress distribution

Procedia PDF Downloads 325
5585 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays

Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal

Abstract:

Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have become widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since a design's circuit is stored in the configuration memory of an SRAM-based FPGA, such FPGAs are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on the electronics used in space are much higher than on Earth. Thus, developing fault tolerance techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is highly required for compromising between the overhead introduced by fault tolerance techniques and system robustness. We study applications in which the exact final output value is not always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose an Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating configuration memory vulnerability to SEUs. We assess the vulnerability of the configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, in our proposed method a threshold margin is considered depending on the user's application. Given this threshold margin, a failure occurs only when the difference between the erroneous output value and the expected output value is more than the margin. The ACMVF is subsequently calculated as the ratio of failures to the total number of SEU injections. A test bench for emulating SEUs and calculating the ACMVF is implemented on a Zynq-7000 FPGA platform. This system makes use of the Single Event Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented in the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is tolerated, the counted number of failures is reduced by 41% to 59% compared with the number counted by conventional vulnerability factor calculation. This means that the estimation accuracy of configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
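
The ACMVF computation reduces to a thresholded failure count over the injection campaign; a sketch following the definition in the abstract:

```python
# Sketch of the ACMVF computation described above: an injected SEU counts
# as a failure only if the output deviates from the golden value by more
# than the user-chosen relative margin.

def acmvf(golden, observed, margin=0.10):
    """Fraction of SEU injections whose relative output deviation
    exceeds `margin` (e.g. 0.10 for the 10%-deviation case)."""
    failures = sum(
        1 for g, o in zip(golden, observed)
        if abs(o - g) > margin * abs(g)
    )
    return failures / len(observed)

golden   = [100, 100, 100, 100]
observed = [100, 103, 120,  40]     # outputs after four SEU injections
print(acmvf(golden, observed))      # -> 0.5, only the last two count
```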

Keywords: fault tolerance, FPGA, single event upset, approximate computing

Procedia PDF Downloads 198