Search results for: cognitive complexity metric
2289 Effectiveness of Gamified Virtual Physiotherapy for Patients with Shoulder Problems
Authors: A. Barratt, M. H. Granat, S. Buttress, B. Roy
Abstract:
Introduction: Physiotherapy is an essential part of the treatment of patients with shoulder problems. Treatment usually centres on addressing specific physiotherapy goals, ultimately resulting in improved pain and function. This study investigates whether computerised physiotherapy using gamification principles is as effective as standard physiotherapy. Methods: Physiotherapy exergames were created using a combination of commercially available hardware, the Microsoft Kinect, and bespoke software. The exergames were validated by mapping them to physiotherapy goals, which included strength, range of movement, control, speed, and activation of the kinetic chain. A multicentre, randomised, prospective controlled trial investigated the use of exergames in patients with Shoulder Impingement Syndrome who had undergone Arthroscopic Subacromial Decompression surgery. The intervention group was provided with the automated sensor-based technology, allowing them to perform exergames and track their rehabilitation progress. The control group was treated with standard physiotherapy protocols. Outcomes from different domains were used to compare the groups. An important metric was the assessment of shoulder range of movement pre- and post-operatively. The range of movement data included abduction, forward flexion and external rotation, which were measured by the software pre-operatively and at 6 and 12 weeks post-operatively. Results: Both groups showed significant improvement from pre-operative to 12 weeks in elevation in the forward flexion and abduction planes. Abduction improved in both the interventional group (p < 0.015) and the control group (p < 0.003). Forward flexion improved in the interventional group (p < 0.0201) and the control group (p < 0.004). There was, however, no significant difference between the groups at 12 weeks for abduction (p < 0.118067), forward flexion (p < 0.189755) or external rotation (p < 0.346967). Conclusion: Exergames may be used as an alternative to standard physiotherapy regimes; however, further analysis is required focusing on patient engagement. Keywords: shoulder, physiotherapy, exergames, gamification
Procedia PDF Downloads 198
2288 Preparation of Metal Containing Epoxy Polymer and Investigation of Their Properties as Fluorescent Probe
Authors: Ertuğ Yıldırım, Dile Kara, Salih Zeki Yıldız
Abstract:
Metal containing polymers (MCPs) are macromolecules usually containing metal-ligand coordination units and form a multidisciplinary research field mainly based at the interface between coordination chemistry and polymer science. The progress of this area has also been reinforced by the growth of several other closely related disciplines, including macromolecular engineering, crystal engineering, organic synthesis, supramolecular chemistry, and colloidal and material science. Schiff base ligands are very effective in constructing supramolecular architectures such as coordination polymers and double helical and triple helical complexes. In addition, Schiff base derivatives incorporating a fluorescent moiety are appealing tools for optical sensing of metal ions. MCPs are well-known systems in which the combination of local parameters is possible by means of fluorometric techniques. Generally, polymers without incorporated fluorescent groups are unspecific, and it is not useful to analyze their fluorescent properties. Therefore, it is necessary to prepare a new type of epoxy polymer with fluorescent groups in terms of metal sensing properties and other photochemical applications. In the present study, metal containing polymers were prepared via polyfunctional monomeric Schiff base metal chelate complexes in the presence of difunctional monomers such as diglycidyl ether of bisphenol A (DGEBA). The synthesized complexes and polymers were characterized by FTIR, UV-VIS and mass spectroscopies. The preparation of the epoxy polymers was carried out at 185 °C. The prepared composites, having sharp and narrow excitation/emission properties, are expected to be applicable in various systems such as heat-resistant polymers and photovoltaic devices. The prepared composite is also ideal for various applications, being easily prepared, safe, and maintaining good fluorescence properties. Keywords: Schiff base ligands, crystal engineering, fluorescence properties, Metal Containing Polymers (MCPs)
Procedia PDF Downloads 350
2287 Refactoring Object Oriented Software through Community Detection Using Evolutionary Computation
Authors: R. Nagarani
Abstract:
An intrinsic property of software in a real-world environment is its need to evolve, which is usually accompanied by an increase in software complexity and a deterioration of software quality, making software maintenance a tough problem. Refactoring is regarded as an effective way to address this problem. Many refactoring approaches at the method and class level have been proposed, but software refactoring at the package level has received far less attention. This work presents a novel approach to refactoring the package structures of object-oriented software using genetic-algorithm-based community detection. It uses software networks to represent classes and their dependencies. It uses a constrained community detection algorithm to obtain the optimized community structures in software networks, which also correspond to the optimized package structures. It finally provides a list of classes as refactoring candidates by comparing the optimized package structures with the real package structures. Keywords: community detection, complex network, genetic algorithm, package, refactoring
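As an illustration of the general idea (not the paper's implementation), the following minimal sketch runs a small genetic algorithm over community assignments of a toy class-dependency graph, using modularity as the fitness function; the class names, dependency edges, and GA parameters are invented for the example.

```python
# Genetic-algorithm-based community detection on a class-dependency graph.
# All names and parameters below are illustrative assumptions.
import random
import networkx as nx
from networkx.algorithms.community import modularity

# Toy class-dependency network: nodes are classes, edges are dependencies.
G = nx.Graph([("Order", "Invoice"), ("Order", "Customer"), ("Invoice", "Customer"),
              ("Parser", "Lexer"), ("Parser", "Ast"), ("Lexer", "Ast"),
              ("Customer", "Parser")])          # one cross-package dependency
nodes = list(G.nodes)
K = 3                                           # maximum number of packages
POP, GENS = 40, 200

def fitness(chrom):
    # chrom[i] is the community label of nodes[i]; group nodes by label.
    groups = {}
    for node, label in zip(nodes, chrom):
        groups.setdefault(label, set()).add(node)
    return modularity(G, list(groups.values()))

def mutate(chrom, rate=0.1):
    return [random.randrange(K) if random.random() < rate else g for g in chrom]

def crossover(a, b):
    cut = random.randrange(1, len(nodes))
    return a[:cut] + b[cut:]

population = [[random.randrange(K) for _ in nodes] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]             # truncation selection
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
packages = {}
for node, label in zip(nodes, best):
    packages.setdefault(label, []).append(node)
print("suggested packages:", list(packages.values()))
print("modularity:", round(fitness(best), 3))
```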
Procedia PDF Downloads 421
2286 An Experimental Study of Scalar Implicature Processing in Chinese
Authors: Liu Si, Wang Chunmei, Liu Huangmei
Abstract:
A prominent component of the semantic versus pragmatic debate, scalar implicature (SI) has been gaining great attention ever since it was proposed by Horn. The constant debate is between the structural and the pragmatic approach. The former claims that generation of SI is costless, automatic, and dependent mostly on the structural properties of sentences, whereas the latter advocates both that such generation is largely dependent upon context and that the process is costly. Many experiments, among which Katsos’s text comprehension experiments are influential, have been designed and conducted in order to verify these views, but the results are not conclusive. Besides, most of the experiments were conducted with English language materials. Katsos conducted one off-line and three on-line text comprehension experiments, in which the previous shortcomings were addressed to a certain extent and the conclusion was in favor of the pragmatic approach. We intend to test the results of Katsos’s experiment on Chinese scalar implicature. Four experiments in both off-line and on-line conditions, examining the generation and response time of SI in Chinese "yixie" (some) and "quanbu (dou)" (all), will be conducted in order to find out whether the structural or the pragmatic approach can be sustained. The study mainly aims to answer the following questions: (1) Can SI be generated in the upper- and lower-bound contexts as Katsos confirmed when Chinese language materials are used in the experiment? (2) Can SI first be generated and then cancelled, as the default view claims, or can it not be generated in a neutral context when Chinese language materials are used in the experiment? (3) Is SI generation costless or costly in terms of processing resources? (4) In line with the SI generation process, what conclusion can be made about the cognitive processing model of language meaning? Is it a parallel model or a linear model? Or is it a dynamic and hierarchical model? According to previous theoretical debates and experimental conflicts, it could be presumed that SI in Chinese might be generated in upper-bound contexts. Besides, the response time might be faster in the upper-bound than in the lower-bound context. SI generation in a neutral context might be the slowest. Finally, a conclusion would be made that the processing model of SI cannot be verified by either a purely structural or a purely pragmatic approach. It is, rather, a dynamic and complex processing mechanism, in which the interaction of language forms, ad hoc context, mental context, background knowledge, speakers’ interaction, etc. are involved. Keywords: cognitive linguistics, pragmatics, scalar implicature, experimental study, Chinese language
Procedia PDF Downloads 364
2285 The Impact of Task Type and Group Size on Dialogue Argumentation between Students
Authors: Nadia Soledad Peralta
Abstract:
Within the framework of socio-cognitive interaction, argumentation is understood as a psychological process that supports and induces reasoning and learning. Most authors emphasize the great potential of argumentation for negotiating contradictions and complex decisions, so argumentation is a target for researchers who highlight the importance of social and cognitive processes in learning. In the context of social interaction among university students, different types of arguments are analyzed according to group size (dyads and triads) and the type of task (reading of frequency tables, causal explanation of physical phenomena, decisions regarding moral dilemma situations, and causal explanation of social phenomena). Eighty-nine first-year social sciences students of the National University of Rosario participated. Two groups were formed from the results of a pre-test that ensured the heterogeneity of points of view between participants. Group 1 consisted of 56 participants (performance in dyads, total: 28), and group 2 was formed of 33 participants (performance in triads, total: 11). A quasi-experimental design was used in which the effects of the two variables (group size and type of task) on argumentation were analyzed. Three types of argumentation are described: authentic dialogical argumentative resolutions, individualistic argumentative resolutions, and non-argumentative resolutions. The results indicate that individualistic arguments prevail in dyads. That is, although people express their own arguments, there is no authentic argumentative interaction. Accordingly, there are few reciprocal evaluations and counter-arguments in dyads. By contrast, authentically dialogical argumentation prevails in triads, showing constant feedback between participants’ points of view. It was observed that, in general, the type of task generates specific types of argumentative interactions. However, it is possible to emphasize that authentically dialogical arguments predominate in logical tasks, whereas individualistic or pseudo-dialogical ones are more frequent in opinion tasks. Nevertheless, these relationships between task type and argumentative mode are best clarified by an interactive analysis based on group size. Finally, it is important to stress the value of dialogical argumentation in educational domains. The argumentative function not only allows metacognitive reflection on one’s own point of view but also allows people to benefit from exchanging points of view in interactive contexts. Keywords: socio-cognitive interaction, argumentation, university students, group size
Procedia PDF Downloads 85
2284 Maintenance Performance Measurement Derived Optimization: A Case Study
Authors: James M. Wakiru, Liliane Pintelon, Peter Muchiri, Stanley Mburu
Abstract:
Maintenance performance measurement (MPM) represents an integrated aspect that considers both operational and maintenance-related aspects while evaluating the effectiveness and efficiency of maintenance, to ensure assets are working as they should. Three salient issues need to be addressed for an asset-intensive organization to employ an MPM-based framework to optimize maintenance. Firstly, the organization should establish the important performance metric(s), in this case the maintenance objective(s), on which it will focus. The second issue entails aligning the maintenance objective(s) with maintenance optimization. This is achieved by deriving maintenance performance indicators that subsequently form an objective function for the optimization program. Lastly, the objective function is employed in an optimization program to derive maintenance decision support. In this study, we develop a framework that initially identifies the crucial maintenance performance measures and employs them to derive maintenance decision support. The proposed framework is demonstrated in a case study of a geothermal drilling rig, where the objective function is evaluated utilizing a simulation-based model whose parameters are derived from empirical maintenance data. Availability, reliability and maintenance inventory are identified as essential objectives requiring further attention. A simulation model is developed mimicking drilling rig operations and maintenance, where the sub-systems are modelled as undergoing imperfect maintenance, corrective (CM) and preventive (PM), with the total cost as the primary performance measurement. Moreover, three maintenance spare inventory policies are considered: classical (retaining stocks for a contractual period), vendor-managed inventory with consignment stock, and a periodic-monitoring order-up-to-stock (s, S) policy. Optimization results indicate that the adoption of the (s, S) inventory policy, an increased PM interval and reduced reliance on CM actions offer improved availability and total cost reduction. Keywords: maintenance, vendor-managed, decision support, performance, optimization
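For readers unfamiliar with the (s, S) policy mentioned above, the following minimal sketch simulates a periodic-review spare-parts inventory driven by random corrective demand and scheduled preventive demand; it is not the case-study model, and all rates, costs, and review periods are illustrative assumptions.

```python
# Periodic-review (s, S) spare-parts policy under CM/PM spare demand.
# Parameters are illustrative assumptions, not the drilling-rig data.
import random

random.seed(1)
s, S = 2, 6                                  # reorder point and order-up-to level
holding_cost, order_cost, stockout_cost = 1.0, 50.0, 200.0
review_period, horizon = 7, 365              # days
on_hand, total_cost, stockouts = S, 0.0, 0

for day in range(1, horizon + 1):
    # Spare demand: random CM failures plus scheduled PM every 30 days.
    demand = (1 if random.random() < 0.10 else 0) + (1 if day % 30 == 0 else 0)
    if demand > on_hand:
        stockouts += demand - on_hand
        total_cost += (demand - on_hand) * stockout_cost
        on_hand = 0
    else:
        on_hand -= demand
    total_cost += on_hand * holding_cost
    if day % review_period == 0 and on_hand <= s:   # periodic review: order up to S
        total_cost += order_cost
        on_hand = S                                  # assume negligible lead time

print(f"total cost: {total_cost:.0f}, stockout units: {stockouts}")
```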
Procedia PDF Downloads 127
2283 Assessment of the Validity of Sentiment Analysis as a Tool to Analyze the Emotional Content of Text
Authors: Trisha Malhotra
Abstract:
Sentiment analysis is a recent field of study that computationally assesses the emotional nature of a body of text. To assess its test validity, sentiment analysis was carried out on the emotional corpus of text from a personal 15-day mood diary. Self-reported mood scores tracked the daily mood evaluation scores given by the software more or less accurately. On further assessment, it was found that while sentiment analysis was good at assessing ‘global’ mood, it was not able to ‘locally’ identify and differentially score synonyms of various emotional words. It is further critiqued for treating the intensity of an emotion as universal across cultures. Finally, the software is shown not to account for emotional complexity in sentences because it treats emotions as strictly positive or negative. Hence, it is posited that a better output could be two (positive and negative) affect scores for the same body of text. Keywords: analysis, data, diary, emotions, mood, sentiment
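The two-score output proposed above can be illustrated with NLTK's VADER scorer, which already reports separate positive and negative components; VADER is used here only as an off-the-shelf stand-in, not the software evaluated in the study, and the diary entry is invented.

```python
# Report positive and negative affect separately instead of one signed score.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

diary_entry = ("Slept badly and felt anxious before the exam, "
               "but dinner with friends was genuinely fun.")
scores = sia.polarity_scores(diary_entry)

print(f"positive affect: {scores['pos']:.2f}")
print(f"negative affect: {scores['neg']:.2f}")
```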
Procedia PDF Downloads 270
2282 Design and Implementation of Testable Reversible Sequential Circuits with Optimized Power
Authors: B. Manikandan, A. Vijayaprabhu
Abstract:
Conservative reversible gates are used to design reversible sequential circuits. The sequential circuits are flip-flops and latches. The conservative logic gates are the Feynman, Toffoli, and Fredkin gates. We present the design of two-vector testable sequential circuits based on conservative logic gates. All sequential circuits based on conservative logic gates can be tested for classical unidirectional stuck-at faults using only two test vectors: all 1s and all 0s. The designs of two-vector testable latches, master-slave flip-flops and double edge triggered (DET) flip-flops are presented. We also show the application of the proposed approach toward 100% fault coverage for single missing/additional cell defects in the quantum-dot cellular automata (QCA) layout of the Fredkin gate. The conservative logic gate designs are compared in terms of complexity, speed, and area. Keywords: DET, QCA, reversible logic gates, POS, SOP, latches, flip flops
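The role of the all-0s and all-1s vectors can be seen at the behavioural level: a fault-free conservative gate preserves the number of 1s, so any single stuck-at output fault breaks that property for at least one of the two vectors. The sketch below checks this for the Fredkin gate; the stuck-at model is an illustrative simplification, not the QCA-level defect model used in the paper.

```python
# Fredkin gate: reversibility, 1s-conservation, and the two-vector test.
from itertools import product

def fredkin(c, a, b):
    # Fredkin (controlled-swap): swap a and b when the control c is 1.
    return (c, a, b) if c == 0 else (c, b, a)

def faulty(gate, line, value):
    # Force one output line to a stuck-at value (0 or 1).
    def wrapped(*inputs):
        out = list(gate(*inputs))
        out[line] = value
        return tuple(out)
    return wrapped

# Reversibility / conservativeness check on the fault-free gate.
outputs = {fredkin(*v) for v in product((0, 1), repeat=3)}
assert len(outputs) == 8                                  # bijective (reversible)
assert all(sum(fredkin(*v)) == sum(v) for v in product((0, 1), repeat=3))

# Two-vector test: all 0s and all 1s detect every single stuck-at output fault.
tests = [(0, 0, 0), (1, 1, 1)]
for line in range(3):
    for value in (0, 1):
        g = faulty(fredkin, line, value)
        detected = any(g(*t) != fredkin(*t) for t in tests)
        print(f"output {line} stuck-at-{value}: detected={detected}")
```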
Procedia PDF Downloads 307
2281 Discovering Causal Structure from Observations: The Relationships between Technophile Attitude, Users Value and Use Intention of Mobility Management Travel App
Authors: Aliasghar Mehdizadeh Dastjerdi, Francisco Camara Pereira
Abstract:
The increasing complexity of and demand for transport services strain transportation systems, especially in urban areas with limited possibilities for building new infrastructure. The solution to this challenge requires changes in travel behavior. One of the proposed means to induce such change is multimodal travel apps. This paper describes a study of the intention to use a real-time multimodal travel app aimed at motivating travel behavior change in the Greater Copenhagen Region (Denmark) toward promoting sustainable transport options. The proposed app is a multi-faceted smartphone app including both travel information and persuasive strategies such as health and environmental feedback, tailored travel options, self-monitoring, tunneling users toward green behavior, social networking, nudging and gamification elements. The prospect for mobility management travel apps to stimulate sustainable mobility rests not only on the original and proper employment of the behavior change strategies, but also on explicitly anchoring them in established theoretical constructs from behavioral theories. The theoretical foundation is important because it positively and significantly influences the effectiveness of the system. However, there is a gap in current knowledge regarding the study of mobility-management travel apps grounded in behavioral theories, which should be explored further. This study addresses this gap through a social cognitive theory-based examination. However, compared to the conventional approach in technology adoption research, this study adopts a reverse approach in which the associations between theoretical constructs are explored by the Max-Min Hill-Climbing (MMHC) algorithm as a hybrid causal discovery method. A technology-use preference survey was designed to collect data. The survey elicited different groups of variables, including (1) three groups of users’ motives for using the app, namely gain motives (e.g., saving travel time and cost), hedonic motives (e.g., enjoyment) and normative motives (e.g., less travel-related CO2 production), (2) technology-related self-concepts (i.e. technophile attitude) and (3) use intention of the travel app. The questionnaire items formed the input for causal discovery, which was used to learn the causal structure of the data. Causal discovery from observational data is a critical challenge and has applications in different research fields. The estimated causal structure shows that the two constructs of gain motives and technophilia have a causal effect on adoption intention. Likewise, there is a causal relationship from technophilia to both gain and hedonic motives. In line with the findings of prior studies, this highlights the importance of the functional value of the travel app as well as technology self-concept as two important variables for adoption intention. Furthermore, the results indicate the effect of technophile attitude on developing gain and hedonic motives. The causal structure shows hierarchical associations between the three groups of user motives. These can be explained by the “frustration-regression” principle of Alderfer's ERG (Existence, Relatedness and Growth) theory of needs, meaning that when a higher-level need remains unfulfilled, a person may regress to lower-level needs that appear easier to satisfy.
To conclude, this study shows the capability of causal discovery methods to learn the causal structure of a theoretical model and, accordingly, to interpret established associations. Keywords: travel app, behavior change, persuasive technology, travel information, causality
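As a rough illustration of score-based structure learning on survey-style variables (technophilia, gain/hedonic/normative motives, use intention), the sketch below runs a hill-climbing search, which is also the score-based phase that MMHC relies on. The pgmpy API (HillClimbSearch, BicScore) and the synthetic data generator are assumptions, not the study's pipeline or dataset.

```python
# Score-based causal structure search over synthetic Likert-style variables.
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

rng = np.random.default_rng(0)
n = 500
technophilia = rng.integers(0, 3, n)                       # 3-level scores
gain = np.clip(technophilia + rng.integers(-1, 2, n), 0, 2)
hedonic = np.clip(technophilia + rng.integers(-1, 2, n), 0, 2)
normative = rng.integers(0, 3, n)
intention = np.clip((gain + technophilia) // 2 + rng.integers(-1, 2, n), 0, 2)

data = pd.DataFrame({"technophilia": technophilia, "gain": gain,
                     "hedonic": hedonic, "normative": normative,
                     "intention": intention})

best_dag = HillClimbSearch(data).estimate(scoring_method=BicScore(data))
print("learned edges:", sorted(best_dag.edges()))
```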
Procedia PDF Downloads 144
2280 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos
Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog
Abstract:
Social distancing errors are common among both vaccinated and unvaccinated members of the Filipino community. This study aims to identify the cognitive factors involved and relate how they affect our daily lives. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the most impact on social distancing errors. Keywords: vaccinated, unvaccinated, social distancing, Filipinos
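The two statistical treatments named above can be sketched as follows: an independent-samples t-test comparing vaccinated and unvaccinated groups, and a multiple linear regression of distancing errors on the five cognitive factors. The data below are synthetic placeholders, not the study's measurements.

```python
# t-test and multiple linear regression on synthetic survey data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(42)
n = 120
df = pd.DataFrame({
    "vaccinated": rng.integers(0, 2, n),
    "memory": rng.normal(5, 1, n),
    "attention": rng.normal(5, 1, n),
    "anxiety": rng.normal(5, 1, n),
    "decision_making": rng.normal(5, 1, n),
    "stress": rng.normal(5, 1, n),
})
# Synthetic outcome in which stress and (low) attention drive the errors.
df["distancing_errors"] = (2 + 0.8 * df["stress"] - 0.6 * df["attention"]
                           + rng.normal(0, 1, n))

# Independent-samples t-test between vaccinated and unvaccinated groups.
vac = df[df.vaccinated == 1]["distancing_errors"]
unvac = df[df.vaccinated == 0]["distancing_errors"]
res = stats.ttest_ind(vac, unvac)
print(f"t = {res.statistic:.2f}, p = {res.pvalue:.3f}")

# Multiple linear regression of errors on the five cognitive factors.
X = sm.add_constant(df[["memory", "attention", "anxiety",
                        "decision_making", "stress"]])
model = sm.OLS(df["distancing_errors"], X).fit()
print(model.params.round(3))
print(model.pvalues.round(3))
```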
Procedia PDF Downloads 204
2279 Application of Blockchain on Manufacturing Process Control and Pricing Policy
Authors: Chieh Lee
Abstract:
Today, supply chain managers face extensive disruptions in raw material pricing, transportation blockages, and quality issues due to product complexity. While digitalization might help managers mitigate disruption risk and increase supply chain resilience by sharing information between sellers and buyers through the supply chain, entities are reluctant to build such a system. The main reason is that it is not clear what information should be shared and who has access to the stored information. In this research, we propose a smart contract built with blockchain technology. This contract helps both buyer and seller to identify the type of information, the access to the information, and how to trace the information. It helps managers control their orders through the supply chain and address any disruption as they see fit. Furthermore, with the same smart contract, the supplier can track the production process of an order and increase production efficiency by eliminating waste. Keywords: blockchain, production process, smart contract, supply chain resilience
Procedia PDF Downloads 81
2278 Comparison of Nucleic Acid Extraction Platforms on Tissue Samples
Authors: Siti Rafeah Md Rafei, Karen Wang Yanping, Park Mi Kyoung
Abstract:
Tissue samples are a precious supply for molecular studies or disease identification diagnosed using molecular assays, namely real-time PCR (qPCR). It is critical to establish the most favorable nucleic acid extraction method, one that yields PCR-amplifiable genomic DNA. Furthermore, automated nucleic acid extraction is an appealing alternative to labor-intensive manual methods. Operational complexity, defined as the number of steps required to obtain an extracted sample, is one of the criteria in the comparison. Here we compare One BioMed’s automated X8 platform with commercially available manually operated kits from QIAGEN (Mini Kit) and Roche. We extracted DNA from rat fresh-frozen tissue (from different types of organs) in the matrices. After tissue pre-treatment, the sample is added to the One BioMed X8 pre-filled cartridge and the QIAGEN QIAmp column, respectively. We found that the results after subjecting the eluates to real-time PCR using the BIORAD CFX are comparable. Keywords: DNA extraction, frozen tissue, PCR, qPCR, rat
Procedia PDF Downloads 163
2277 An Embedded High Speed Adder for Arithmetic Computations
Authors: Kala Bharathan, R. Seshasayanan
Abstract:
In this paper, a 1-bit Embedded Logic Full Adder (EFA) circuit at the transistor level is proposed, which reduces logic complexity and gives low power and high speed. The design is further extended to 64 bits. To evaluate the performance of the EFA, 16-, 32- and 64-bit Linear and Square-root Carry Select Adder/Subtractor (CSLAS) structures are also proposed. Realistic testing of the proposed circuits is done on an 8 × 8 Modified Booth multiplier, and a comparison in terms of power and delay is made. The EFA is implemented for different multiplier architectures for performance parameter comparison. The overall delay for CSLAS is reduced to 78% when compared to the conventional one. The circuit implementations are done in TSMC 28nm CMOS technology using the Cadence Virtuoso tool. The EFA has power savings of up to 14% when compared to the conventional adder. The present implementation was found to offer significant improvement in terms of power and speed in comparison to other full adder circuits. Keywords: embedded logic, full adder, PDP, XOR gate
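At the behavioural level, the 1-bit full adder cell computes sum = a XOR b XOR cin and carry = majority(a, b, cin); the sketch below models that logic and chains it into a ripple adder. It covers logic only; the transistor-level EFA/CSLAS circuits and their power and delay figures are outside its scope.

```python
# Behavioural model of a 1-bit full adder extended to a ripple chain.
def full_adder(a, b, cin):
    s = a ^ b ^ cin                       # XOR-based sum
    cout = (a & b) | (cin & (a ^ b))      # carry-out (majority function)
    return s, cout

def ripple_add(x, y, width=64):
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry

assert full_adder(1, 1, 1) == (1, 1)
print(ripple_add(0xFFFFFFFF, 1))          # (4294967296, 0) within a 64-bit adder
```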
Procedia PDF Downloads 449
2276 Bandwidth Control Using Reconfigurable Antenna Elements
Authors: Sudhina H. K, Ravi M. Yadahalli, N. M. Shetti
Abstract:
Reconfigurable antennas represent a recent innovation in antenna design, a shift from classical fixed-form, fixed-function antennas to modifiable structures that can be adapted to fit the requirements of a time-varying system. The ability to control the operating band of an antenna system can have many useful applications. Systems that operate in an acquire-and-track configuration would benefit from active bandwidth control. In such systems, a wide-band search mode is first employed to find a desired signal, then a narrow-band track mode is used to follow only that signal. Utilizing active antenna bandwidth control, a single antenna would function for both the wide-band and narrow-band configurations, providing the rejection of unwanted signals within the antenna hardware. This ability to move a portion of the RF filtering out of the receiver and onto the antenna itself will also aid in reducing the complexity of the often expensive RF processing subsystems. Keywords: designing methods, MEMS, stack, reconfigurable elements
Procedia PDF Downloads 273
2275 Static vs. Stream Mining Trajectories Similarity Measures
Authors: Musaab Riyadh, Norwati Mustapha, Dina Riyadh
Abstract:
Trajectory similarity can be defined as the cost of transforming one trajectory into another based on a certain similarity method. It is the core of numerous mining tasks such as clustering, classification, and indexing. Various approaches have been suggested to measure similarity based on the geometric and dynamic properties of trajectories, the overlapping between trajectory segments, and the confined area between entire trajectories. In this article, an evaluation of these approaches has been done based on computational cost, memory usage, accuracy, and the amount of data which is needed in advance, to determine their suitability for stream mining applications. The evaluation results show that stream mining applications favour similarity methods which have low computational cost and memory usage, require a single scan over the data, and are free of mathematical complexity, due to the high-speed generation of data. Keywords: global distance measure, local distance measure, semantic trajectory, spatial dimension, stream data mining
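As a concrete example of a commonly used (but stream-unfriendly) measure, the sketch below computes dynamic time warping (DTW) between two 2-D trajectories; its O(len(A)·len(B)) cost and need for the full trajectories up front illustrate the evaluation criteria discussed above. The sample trajectories are invented.

```python
# Dynamic time warping between two 2-D trajectories.
import math

def dtw(traj_a, traj_b):
    n, m = len(traj_a), len(traj_b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(traj_a[i - 1], traj_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

a = [(0, 0), (1, 1), (2, 2), (3, 2)]
b = [(0, 0), (1, 1), (2, 2), (3, 3)]
print(f"DTW distance: {dtw(a, b):.2f}")
```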
Procedia PDF Downloads 396
2274 Towards a Computational Model of Consciousness: Global Abstraction Workspace
Authors: Halim Djerroud, Arab Ali Cherif
Abstract:
We assume that conscious functions are implemented automatically; in other words, consciousness, as well as the non-conscious aspects of human thought, planning, and perception, is produced by biologically adaptive algorithms. We propose that the mechanisms of consciousness can be produced using adaptive algorithms similar to those executed by these biological mechanisms. In this paper, we propose a computational model of consciousness, the “Global Abstraction Workspace”, which is an internal environment model conceived as a multi-agent system. This system is able to evolve and generate new data and processes as well as actions in the environment. Keywords: artificial consciousness, cognitive architecture, global abstraction workspace, multi-agent system
Procedia PDF Downloads 341
2273 Business Strategy, Crisis and Digitalization
Authors: Flora Xu, Marta Fernandez Olmos
Abstract:
This article offers a critical assessment and comprehensive understanding of business strategy in the post-COVID-19 scenario. The study aims to elucidate how companies are responding to the unique challenges posed by the pandemic and how these measures are shaping the future of the business environment. The pandemic has exposed the fragility and flexibility of the global supply chain, and procurement and production strategies should be reconsidered. Companies should increase the diversity of suppliers and the flexibility of the supply chain, and some companies are considering relocating production to the local market. This can increase local employment and reduce international transportation disruptions and customs issues. By shortening the distance between production and market, companies can respond more quickly to changes in demand and unforeseen events. The demand for remote work and online solutions will increase the adoption of digital technology and accelerate the digital transformation of many organizations. Marketing and communication strategies need to adapt to a constantly changing environment. The business resilience strategy was emphasized as a key component of the response to COVID-19. Companies are seeking to strengthen their risk management capabilities and develop business continuity plans to cope with future unexpected disruptions. The pandemic has reconfigured human resource practices and changed the way companies manage their employees. Remote work has become the norm, and companies focus on managing workers' health and well-being, as well as flexible work policies, to ensure operations and support for employees during crises. This change in human resources practice has a lasting impact on how companies apply talent and labor management in the post-COVID-19 world. The pandemic has prompted a significant review of business strategies as companies adapt to constantly changing environments and seek to ensure their sustainability and profitability in times of crisis. This strategic reassessment has led to product diversification, the exploration of international markets and adaptation to the changing market. Companies have responded to the unprecedented challenges brought by COVID-19; the pandemic has promoted innovation efforts in key areas and sharpened the focus of today's business strategy on sustainability and corporate social responsibility. Formulating and implementing business strategies in uncertain times poses important challenges, including making quick and agile decisions in turbulent environments, managing risk, and adapting to constantly changing market conditions. COVID-19 highlights the importance of strategic planning and informed decision-making in a business environment characterized by uncertainty and complexity. In short, the pandemic has reconfigured the way companies handle business strategies and emphasized the necessity of preparing for future challenges in a business world marked by uncertainty and complexity. Keywords: business strategy, crisis, digitalization, uncertainty
Procedia PDF Downloads 20
2272 A Genetic Based Algorithm to Generate Random Simple Polygons Using a New Polygon Merge Algorithm
Authors: Ali Nourollah, Mohsen Movahedinejad
Abstract:
In this paper, a new algorithm to generate random simple polygons from a given set of points in a two-dimensional plane is designed. The proposed algorithm uses a genetic algorithm to generate polygons with few vertices. A new merge algorithm is presented which converts any two polygons into a simple polygon. This algorithm first changes the two polygons into a polygonal chain and then converts the polygonal chain into a simple polygon. The process of converting a polygonal chain into a simple polygon is based on the removal of intersecting edges. The merge algorithm has a time complexity of O((r+s)*l), where r and s are the sizes of the merging polygons and l is the number of intersecting edges removed from the polygonal chain. It will be shown that 1 < l < r+s. The experimental results show that the proposed algorithm is able to generate a great number of different simple polygons and has better performance in comparison to celebrated algorithms such as space partitioning and steady growth. Keywords: divide and conquer, genetic algorithm, merge polygons, random simple polygon generation
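The core step, removing intersecting edges from a closed polygonal chain, can be illustrated with a 2-opt-style untangling pass: whenever two edges properly cross, the sub-chain between them is reversed, which eliminates the crossing. This is one standard way to remove intersections and not necessarily the paper's exact merge procedure; the toy self-intersecting polygon is invented.

```python
# Remove edge crossings from a closed polygonal chain by 2-opt reversals.
def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def segments_intersect(p1, p2, q1, q2):
    # Proper intersection test via orientation signs (ignores collinear cases).
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def untangle(poly):
    n = len(poly)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue                     # edges share the closing vertex
                if segments_intersect(poly[i], poly[(i + 1) % n],
                                      poly[j], poly[(j + 1) % n]):
                    # Reverse the sub-chain between the two crossing edges.
                    poly[i + 1:j + 1] = reversed(poly[i + 1:j + 1])
                    changed = True
    return poly

tangled = [(0, 0), (4, 4), (4, 0), (0, 4)]       # a "bow-tie" self-intersection
print(untangle(tangled))                         # [(0, 0), (4, 0), (4, 4), (0, 4)]
```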
Procedia PDF Downloads 535
2271 Study on Optimization of Air Infiltration at Entrance of a Commercial Complex in Zhejiang Province
Authors: Yujie Zhao, Jiantao Weng
Abstract:
In the past decade, with the rapid development of China's economy, the purchasing power and physical demands of residents have improved, which has resulted in the vast emergence of public buildings like large shopping malls. However, architects usually focus on the internal functions and streamlines of these buildings, ignoring the impact of the environment on the subjective feelings of building users. In Zhejiang province alone, the infiltration of cold air in winter frequently occurs at the entrances of sizeable commercial complex buildings in operation, which affects the environmental comfort of the building lobby and internal public spaces. At present, to reduce these adverse effects, active equipment is usually added, such as air curtains to block air exchange or additional heating air conditioners. From the perspective of energy consumption, the infiltration of cold air at the entrance increases the heat consumption of indoor heating equipment, which indirectly causes considerable economic losses over the whole winter heating stage. Therefore, it is of considerable significance to explore entrance forms suitable for improving the environmental comfort of commercial buildings and saving energy. In this paper, a commercial complex with an apparent cold air infiltration problem in Hangzhou is selected as the research object to establish a model. The environmental parameters of the building entrance, including temperature, wind speed, and infiltration air volume, are obtained by Computational Fluid Dynamics (CFD) simulation, from which the heat consumption caused by natural air infiltration in winter, and its potential economic loss, is estimated as the objective metric. This study finally obtains the optimization direction for the building entrance form of the commercial complex by comparing the simulation results with those of other local commercial complex projects with different entrance forms. Keywords: air infiltration, commercial complex, heat consumption, CFD simulation
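The step from simulated infiltration airflow to heat consumption follows the sensible-heat relation Q = ρ·c_p·V̇·ΔT. The sketch below shows the arithmetic; the airflow rate, temperatures, heating hours, and tariff are illustrative assumptions, not the Hangzhou case-study values.

```python
# Seasonal heat consumption and cost estimate from infiltration airflow.
rho = 1.2          # air density, kg/m^3
cp = 1005.0        # specific heat of air, J/(kg*K)
flow = 1.5         # infiltration airflow at the entrance, m^3/s (assumed)
t_in, t_out = 20.0, 5.0        # indoor / mean winter outdoor temperature, deg C
heating_hours = 90 * 12        # 90 heating days x 12 operating hours (assumed)
tariff = 0.12                  # energy price, USD per kWh (assumed)

heat_loss_kw = rho * cp * flow * (t_in - t_out) / 1000.0   # Q = rho*cp*V*dT
energy_kwh = heat_loss_kw * heating_hours
print(f"infiltration heat loss: {heat_loss_kw:.1f} kW")
print(f"seasonal heating energy: {energy_kwh:,.0f} kWh "
      f"(~${energy_kwh * tariff:,.0f})")
```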
Procedia PDF Downloads 136
2270 RFID Based Student Attendance System
Authors: Aniket Tiwari, Ameya London
Abstract:
A web-based student attendance management system is required to relieve faculty and lecturers of a time-consuming process. For this purpose, a GSM/GPRS (Global System for Mobile Communication/General Packet Radio Service) based student attendance management system using RFID (Radio Frequency Identification) is a much more convenient method of taking attendance. Each student is provided with an RFID tag. When a student comes near the reader, it senses the respective student and updates the attendance. The whole process is controlled using a microcontroller. The main advantage of this system is that it reduces complexity in comparison to a student attendance system using RF technology. The system requires only one microcontroller for operation and works in real time. This paper reviews some of these monitoring systems and proposes a GPRS-based student attendance system. The system can be easily accessed by lecturers via the web and, most importantly, the reports can be generated in real time, thus providing valuable information about students’ commitment to attending classes. Keywords: RFID reader, RFID tags, student, attendance
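The controller's core loop can be sketched as follows: a tag ID arrives from the reader, the roster is checked, and a timestamped record is stored for later web reporting. The tag IDs, roster, and read_tag() stub are invented placeholders; real hardware would stream IDs over UART/GPRS instead.

```python
# Attendance-logging loop with a simulated RFID reader.
import time

roster = {"04A1B2C3": "A. Tiwari", "04D4E5F6": "A. London"}   # tag -> student
attendance = {}                                               # tag -> timestamps

_simulated_reads = iter(["04A1B2C3", "04D4E5F6", "04A1B2C3"])

def read_tag():
    # Stand-in for polling the RFID reader; returns None when no tag is present.
    return next(_simulated_reads, None)

while (tag := read_tag()) is not None:
    if tag in roster:
        attendance.setdefault(tag, []).append(time.strftime("%H:%M:%S"))
        print(f"marked present: {roster[tag]}")
    else:
        print(f"unknown tag: {tag}")
```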
Procedia PDF Downloads 512
2269 Diffusion Adaptation Strategies for Distributed Estimation Based on the Family of Affine Projection Algorithms
Authors: Mohammad Shams Esfand Abadi, Mohammad Ranjbar, Reza Ebrahimpour
Abstract:
This work presents a solution to the distributed estimation problem in a diffusion network based on the adapt-then-combine (ATC) and combine-then-adapt (CTA) selective partial update normalized least mean squares (SPU-NLMS) algorithms. We also extend this approach to the dynamic selection affine projection algorithm (DS-APA), and ATC-DS-APA and CTA-DS-APA are established. The purpose of the ATC-SPU-NLMS and CTA-SPU-NLMS algorithms is to reduce the computational complexity by updating only selected blocks of weight coefficients at every iteration. In CTA-DS-APA and ATC-DS-APA, the number of input vectors is selected dynamically. Diffusion cooperation strategies have been shown to provide good performance based on these algorithms. The good performance of the introduced algorithms is illustrated with various experimental results. Keywords: selective partial update, affine projection, dynamic selection, diffusion, adaptive distributed networks
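For orientation, the sketch below implements the baseline adapt-then-combine diffusion scheme with a full-update NLMS step; the selective-partial-update and dynamic-selection variants described above modify which coefficients or input vectors enter this update. The network topology, step size, and noise level are illustrative assumptions.

```python
# Adapt-then-combine (ATC) diffusion NLMS over a ring network.
import numpy as np

rng = np.random.default_rng(0)
M, N, T = 8, 6, 2000                  # filter length, nodes, iterations
w_true = rng.standard_normal(M)

# Ring network with self-loops; uniform combination weights A[l, k].
A = np.zeros((N, N))
for k in range(N):
    for l in (k - 1, k, (k + 1) % N):
        A[l, k] = 1.0
A /= A.sum(axis=0, keepdims=True)

W = np.zeros((N, M))                  # per-node estimates
mu, eps = 0.5, 1e-6
for _ in range(T):
    X = rng.standard_normal((N, M))
    d = X @ w_true + 0.05 * rng.standard_normal(N)
    # Adapt: local NLMS update at every node.
    err = d - np.sum(X * W, axis=1)
    Psi = W + mu * (err / (np.sum(X * X, axis=1) + eps))[:, None] * X
    # Combine: each node averages its neighbours' intermediate estimates.
    W = A.T @ Psi

print("mean squared deviation:", np.mean((W - w_true) ** 2))
```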
Procedia PDF Downloads 709
2268 Tools for Analysis and Optimization of Standalone Green Microgrids
Authors: William Anderson, Kyle Kobold, Oleg Yakimenko
Abstract:
Green microgrids, which use mostly renewable energy (RE) for generation, are complex systems with inherent nonlinear dynamics. Among a variety of optimization tools, there are only a few that adequately consider this complexity. This paper evaluates the applicability of two somewhat similar optimization tools tailored for standalone RE microgrids and also assesses a machine learning tool for performance prediction that can enhance the reliability of any chosen optimization tool. It shows that one of these microgrid optimization tools has certain advantages over the other and presents a detailed routine for preparing input data to simulate RE microgrid behavior. The paper also shows how neural-network-based predictive modeling can be used to validate and forecast solar power generation based on weather time series data, which improves the overall quality of standalone RE microgrid analysis. Keywords: microgrid, renewable energy, complex systems, optimization, predictive modeling, neural networks
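In the spirit of the predictive-modeling step described above, the sketch below trains a small neural network to map weather features to PV output; it uses scikit-learn's MLPRegressor on synthetic irradiance/temperature/cloud data, whereas the actual workflow would use measured weather time series for the microgrid site.

```python
# Neural-network regression of PV output on synthetic weather features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 2000
irradiance = rng.uniform(0, 1000, n)          # W/m^2
temperature = rng.uniform(-5, 35, n)          # deg C
cloud_cover = rng.uniform(0, 1, n)            # fraction
# Synthetic PV output with temperature derating and cloud losses.
pv_kw = (0.2 * irradiance * (1 - 0.004 * np.maximum(temperature - 25, 0))
         * (1 - 0.7 * cloud_cover) + rng.normal(0, 5, n))

X = np.column_stack([irradiance, temperature, cloud_cover])
X_train, X_test, y_train, y_test = train_test_split(X, pv_kw, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.3f}")
```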
Procedia PDF Downloads 283
2267 Discrimination during a Resume Audit: The Impact of Job Context in Hiring
Authors: Alexandra Roy
Abstract:
Building on the literature on cognitive matching and social categorization, and using the correspondence testing method, we test the interaction effect of person characteristics (gender combined with physical attractiveness) and job context (client contact, industry status, coworker contact). As expected, while the findings show a strong impact of gender combined with beauty on hiring chances, job context characteristics also have a significant overall effect on this hiring outcome. Moreover, the rate of positive responses varies according to some of the recruiter’s characteristics. The results are robust to various sensitivity checks. Implications of the results, limitations of the study, and directions for future research are discussed. Keywords: correspondence testing, discrimination, hiring, physical attractiveness
Procedia PDF Downloads 209
2266 Coupling Large Language Models with Disaster Knowledge Graphs for Intelligent Construction
Authors: Zhengrong Wu, Haibo Yang
Abstract:
In the context of escalating global climate change and environmental degradation, the complexity and frequency of natural disasters are continually increasing. Confronted with an abundance of information regarding natural disasters, traditional knowledge graph construction methods, which rely heavily on grammatical rules and prior knowledge, demonstrate suboptimal performance in processing complex, multi-source disaster information. This study, drawing upon past natural disaster reports, disaster-related literature in both English and Chinese, and data from various disaster monitoring stations, constructs question-answer templates based on large language models. Utilizing the P-Tuning method, the ChatGLM2-6B model is fine-tuned, leading to the development of a disaster knowledge graph based on large language models. This serves as a knowledge base supporting disaster emergency response. Keywords: large language model, knowledge graph, disaster, deep learning
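The final assembly step can be sketched as follows: once the fine-tuned model has answered the question-answer templates, the resulting (head, relation, tail) triples are loaded into a graph store and queried during emergency response. The triples below are invented placeholders; prompt construction and P-Tuning of ChatGLM2-6B are not shown.

```python
# Assemble extracted triples into a queryable knowledge graph.
import networkx as nx

# Triples as they might be parsed from the model's template answers.
triples = [
    ("Typhoon In-Fa", "affected_region", "Zhejiang"),
    ("Typhoon In-Fa", "max_wind_speed", "38 m/s"),
    ("Zhejiang", "monitored_by", "Station WZ-012"),
]

kg = nx.MultiDiGraph()
for head, relation, tail in triples:
    kg.add_edge(head, tail, relation=relation)

# Simple emergency-response query: everything known about one entity.
entity = "Typhoon In-Fa"
for _, tail, data in kg.out_edges(entity, data=True):
    print(f"{entity} --{data['relation']}--> {tail}")
```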
Procedia PDF Downloads 58
2265 A FE-Based Scheme for Computing Wave Interaction with Nonlinear Damage and Generation of Harmonics in Layered Composite Structures
Authors: R. K. Apalowo, D. Chronopoulos
Abstract:
A Finite Element (FE) based scheme is presented for quantifying guided wave interaction with Localised Nonlinear Structural Damage (LNSD) within structures of arbitrary layering and geometric complexity. The through-thickness mode shape of the structure is obtained through a wave and finite element method. This is applied in a time-domain FE simulation in order to generate time-harmonic excitation for a specific wave mode. Interaction of the wave with LNSD within the system is computed through an element activation and deactivation iteration. The scheme is validated against experimental measurements and a WFE-FE methodology for calculating wave interaction with damage. Case studies of guided wave interaction with a crack and a delamination are presented to verify the robustness of the proposed method in classifying and identifying damage. Keywords: layered structures, nonlinear ultrasound, wave interaction with nonlinear damage, wave finite element, finite element
Procedia PDF Downloads 164
2264 The Impact of Shariah Non-Compliance Risk on Islamic Financial Institutions
Authors: Ibtissam Mharzi Alaoui, Camélia Sehaqui
Abstract:
The success of a bank depends upon its effective risk management. With the growing complexity and diversity of financial products and services, as well as the accelerating pace of globalization over the past decade, risk management is becoming increasingly difficult; thus, all measurement and monitoring functions must be much more vigorous, relevant and adequate. Shariah non-compliance risk is a specific aspect of Islamic finance which, ipso facto, deserves particular attention. It affects the validity of all Islamic financial contracts and is likely to result in considerable losses for Islamic financial institutions (IFIs) overall. The purpose of this paper is to review the theoretical literature on Shariah non-compliance risk in order to give a clearer understanding of its sources, causes and consequences. Our intention through this work is to bring added value to the Islamic finance industry all over the world. The findings provide a useful reference for Islamic banks in structuring (or restructuring) their own systems of Shariah risk management and internal control. Keywords: Shariah non-compliance, risk management, financial products, Islamic finance
Procedia PDF Downloads 93
2263 Maintaining the Tension between the Classic Seduction Theory and the Role of Unconscious Fantasies
Authors: Galit Harel
Abstract:
This article describes the long-term psychoanalytic psychotherapy of a young woman who had experienced trauma during her childhood. The details of the trauma were unknown, as all memory of the trauma had been repressed. Past trauma is analyzable through a prism of transference, dreaming and dreams, mental states, and thinking processes that offer an opportunity to explore and analyze the influence of both reality and fantasy on the patient. The presented case describes a therapeutic process that strives to discover hidden meanings through the unconscious system and illustrates the movement from unconscious to conscious during exploration of the patient’s personal trauma in treatment. The author discusses the importance of classical and contemporary psychoanalytic models of childhood sexual trauma through the discovery of manifest and latent content, unconscious fantasies, and actual events of trauma. It is suggested that the complexity of trauma is clarified by the tension between these models and by the inclusion of aspects of both of them for a complete understanding. Keywords: dreams, psychoanalytic psychotherapy, thinking processes, transference, trauma
Procedia PDF Downloads 93
2262 Near-Miss Deep Learning Approach for Neuro-Fuzzy Risk Assessment in Pipelines
Authors: Alexander Guzman Urbina, Atsushi Aoyama
Abstract:
The sustainability of traditional technologies employed in energy and chemical infrastructure poses a big challenge for our society. In making decisions related to the safety of industrial infrastructure, the values of accidental risk are becoming relevant points of discussion. However, the challenge is the reliability of the models employed to obtain the risk data. Such models usually involve a large number of variables and large amounts of uncertainty. The most efficient techniques to overcome those problems are built using Artificial Intelligence (AI), and more specifically using hybrid systems such as neuro-fuzzy algorithms. Therefore, this paper aims to introduce a hybrid algorithm for risk assessment trained using near-miss accident data. As mentioned above, the sustainability of traditional technologies related to energy and chemical infrastructure constitutes one of the major challenges that today’s societies and firms are facing. Besides that, the adaptation of those technologies to the effects of climate change in sensitive environments represents a critical concern for safety and risk management. Regarding this issue, it is argued that the social consequences of catastrophic risks are increasing rapidly, due mainly to the concentration of people and energy infrastructure in hazard-prone areas, aggravated by a lack of knowledge about the risks. In addition to the social consequences described above, and considering the industrial sector as critical infrastructure due to its large impact on the economy in case of a failure, industrial safety has become a critical issue for current society. Regarding the safety concern, pipeline operators and regulators have been performing risk assessments in attempts to accurately evaluate probabilities of failure of the infrastructure and the consequences associated with those failures. However, estimating accidental risks in critical infrastructure involves substantial effort and cost due to the number of variables involved, the complexity, and the lack of information. Therefore, this paper aims to introduce a well-trained algorithm for risk assessment using deep learning, which could be capable of dealing efficiently with the complexity and uncertainty. The advantage of deep learning using near-miss accident data is that it could be employed in risk assessment as an efficient engineering tool to treat the uncertainty of risk values in complex environments. The basic idea of using a near-miss deep learning approach for neuro-fuzzy risk assessment in pipelines is to improve the validity of the risk values by learning from near-miss accidents and imitating the human expertise in scoring risks and setting tolerance levels. In summary, the method of deep learning for neuro-fuzzy risk assessment involves a regression analysis called the group method of data handling (GMDH), which consists of determining the optimal configuration of the risk assessment model and its parameters employing polynomial theory. Keywords: deep learning, risk assessment, neuro fuzzy, pipelines
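The GMDH idea referenced above can be sketched in one layer: fit an Ivakhnenko quadratic polynomial for every pair of inputs on a training split, rank candidates by validation error, and (in a full implementation) feed the survivors into the next layer. The synthetic pipeline-risk features and target below are illustrative, not the paper's near-miss data.

```python
# One GMDH layer: pairwise quadratic models selected by validation error.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(3)
n = 400
X = rng.uniform(0, 1, (n, 4))          # e.g. corrosion, pressure, age, near-miss rate
y = 2 * X[:, 0] * X[:, 3] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)
train, valid = slice(0, 300), slice(300, None)

def design(xi, xj):
    # Ivakhnenko polynomial terms: 1, xi, xj, xi*xj, xi^2, xj^2
    return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi ** 2, xj ** 2])

candidates = []
for i, j in combinations(range(X.shape[1]), 2):
    A_train = design(X[train, i], X[train, j])
    coef, *_ = np.linalg.lstsq(A_train, y[train], rcond=None)
    pred_valid = design(X[valid, i], X[valid, j]) @ coef
    mse = np.mean((pred_valid - y[valid]) ** 2)
    candidates.append((mse, (i, j), coef))

candidates.sort(key=lambda c: c[0])
for mse, pair, _ in candidates[:3]:    # survivors for the next GMDH layer
    print(f"inputs {pair}: validation MSE = {mse:.4f}")
```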
Procedia PDF Downloads 293
2261 Technological Affordances: Guidelines for E-Learning Design
Authors: Clement Chimezie Aladi, Itamar Shabtai
Abstract:
A review of the literature of the last few years reveals that little attention has been paid to technological affordances in e-learning designs. However, affordances are key to engaging students and enabling teachers to actualize learning goals. E-learning systems (software and artifacts) need to be designed in such a way that their features facilitate the perception of affordances with minimal cognitive effort. This study aimed to fill this gap in the literature and encourage further research in this area. It provides guidelines for facilitating the perception of affordances in e-learning design and advances Technology Affordance and Constraints Theory by incorporating the affordance-based design process, the principles of multimedia learning, e-learning design philosophy, and emotional and cognitive affordances. Keywords: e-learning, technology affordances, affordance-based design, e-learning design
Procedia PDF Downloads 64
2260 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design
Authors: Pegah Eshraghi, Zahra Sadat Zomorodian, Mohammad Tahsildoost
Abstract:
Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings by evaluating zones separately and decomposing the building, eliminating the complexity of geometry at the early design stage. To produce this framework, machine learning algorithms such as Support Vector Regression (SVR) and Artificial Neural Networks (ANN) are used to predict energy consumption and thermal comfort metrics in a school as a case study. The database consists of more than 55,000 samples in three climates of Iran. Cross-validation and unseen data have been used for validation. For a specific label, cooling energy, the prediction accuracy is at least 84% and 89% for SVR and ANN, respectively. The results show that the SVR performed much better than the ANN. Keywords: early stage of design, energy, thermal comfort, validation, machine learning
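The SVR-with-cross-validation step can be sketched as below on synthetic zone-level features; the feature names and data generator are assumptions standing in for the real database of more than 55,000 samples across three Iranian climates.

```python
# Cross-validated SVR for zone-level cooling-energy prediction (synthetic data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(11)
n = 1000
X = np.column_stack([
    rng.uniform(0.1, 0.9, n),      # window-to-wall ratio
    rng.uniform(0.5, 3.0, n),      # wall U-value, W/(m^2*K)
    rng.uniform(20, 200, n),       # zone floor area, m^2
    rng.integers(0, 3, n),         # climate index (0, 1, 2)
])
cooling_kwh_m2 = (40 * X[:, 0] + 8 * X[:, 1] + 0.05 * X[:, 2]
                  + 15 * X[:, 3] + rng.normal(0, 3, n))

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5))
scores = cross_val_score(model, X, cooling_kwh_m2, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```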
Procedia PDF Downloads 101