Search results for: combining flow
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 989

389 Generating Synthetic Chest X-ray Images for Improved COVID-19 Detection Using Generative Adversarial Networks

Authors: Muneeb Ullah, Daishihan, Xiadong Young

Abstract:

Deep learning plays a crucial role in identifying COVID-19 and preventing its spread. To improve the accuracy of COVID-19 diagnoses, it is important to have access to a sufficient number of training images of CXRs (chest X-rays) depicting the disease. However, there is currently a shortage of such images. To address this issue, this paper introduces COVID-19 GAN, a model that uses generative adversarial networks (GANs) to generate realistic CXR images of COVID-19, which can be used to train identification models. Initially, a generator model is created that uses digressive channels to generate CXR scan images for COVID-19. To differentiate between real and fake disease images, an efficient discriminator is developed by combining the dense connectivity strategy and instance normalization, exploiting their feature extraction capabilities on hazy CXR areas. Lastly, the deep regret gradient penalty technique is utilized to ensure stable training of the model. With the use of 4,062 CXR images, the COVID-19 GAN model successfully produces 8,124 COVID-19 CXR images that outperform DCGAN and WGAN in terms of the Fréchet inception distance. Experimental findings suggest that the COVID-19 GAN-generated CXR images possess noticeable haziness, offering a promising approach to address the limited training data available for COVID-19 model training. When the dataset was expanded, CNN-based classification models outperformed other models, yielding higher accuracy rates than those achieved with the initial dataset and other augmentation techniques. Among these models, ImageNet exhibited the best recognition accuracy of 99.70% on the testing set. These findings suggest that the proposed augmentation method addresses overfitting issues in disease identification and can enhance identification accuracy effectively.
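As an aside on the evaluation metric above: the Fréchet inception distance compares the Gaussian statistics of real and generated feature distributions. A minimal sketch of what it computes, reduced to one-dimensional Gaussians (the function name and toy inputs are illustrative, not from the paper):

```python
import math

def fid_1d(mu1, var1, mu2, var2):
    # General FID: ||mu1 - mu2||^2 + Tr(S1 + S2 - 2*(S1*S2)^(1/2));
    # for scalar (1-D) Gaussians the trace term collapses to this form.
    return (mu1 - mu2) ** 2 + var1 + var2 - 2.0 * math.sqrt(var1 * var2)
```

Identical distributions give a distance of zero; the lower the score, the closer the generated images are to the real ones, which is the sense in which COVID-19 GAN is reported to outperform DCGAN and WGAN.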

Keywords: classification, deep learning, medical images, CXR, GAN.

Procedia PDF Downloads 60
388 Organizational Innovativeness: Motivation in Employee’s Innovative Work Behaviors

Authors: P. T. Ngan

Abstract:

Purpose: The study aims to answer the question of what motivational conditions have the greatest influence on employees’ innovative work behaviors, by investigating the case of SATAMANKULMA/Anya Productions Ky in Kuopio, Finland. Design/methodology: The main methodology was a qualitative single case study; analysis was conducted with an adapted thematic content analysis procedure on empirical material collected through interviews, observation, and document review. Findings: The paper highlights the significance of combining relevant, synergistic extrinsic and intrinsic motivations in the organizational motivation system. The findings show that intrinsic drives are essential for the initiation phases of innovative work behaviors, while extrinsic drives are more important for the implementation phases. The study also offers the IDEA motivation model (interpersonal relationships and networks, development opportunities, economic constituent, and application supports) as a tool to optimize business performance. Practical limitations/implications: The research was conducted only from the perspective of SATAMANKULMA/Anya Productions Ky, with five interviews, a few observations, and several reviewed documents. Further research is required to include other stakeholders, such as customers and partner companies. The study also does not offer statistical validity of the findings; an extensive case study or a qualitative multiple case study is suggested to compare the findings and to establish whether the IDEA model is relevant in other types of firms. Originality/value: Neither the innovation field nor the human resource management field provides a detailed overview of the specific motivational conditions that might be used to stimulate innovative work behaviors of individual employees. This paper fills that void.

Keywords: employee innovative work behaviors, extrinsic motivation, intrinsic motivation, organizational innovativeness

Procedia PDF Downloads 248
387 VeriFy: A Solution to Implement Autonomy Safely and According to the Rules

Authors: Michael Naderhirn, Marco Pavone

Abstract:

Problem statement, motivation, and aim of work: So far, control algorithms have been developed by control engineers so that the controller fits a specification, verified by testing. When it comes to certifying an autonomous car in highly complex scenarios, the challenge is much greater, since such a controller must mathematically guarantee that it implements the rules of the road while also guaranteeing properties like safety and real-time executability. What if this demanding problem could be solved by combining formal verification and system theory? The aim of this work is to present a workflow that solves it. Summary of the presented results / main outcomes: We show the use of an English-like language to transform the rules of the road into a system specification for an autonomous car. The language-based specifications are used to define system functions and interfaces. Based on these, a formal model is developed that correctly captures the specifications. In parallel, a mathematical model describing the system dynamics is used to calculate the system's reachability set, which in turn determines the system input boundaries. A motion planning algorithm is then applied inside the system boundaries to find an optimized trajectory, in combination with the formal specification model, while satisfying the specifications. The result is a control strategy that can be applied in real time, independent of the scenario, with a mathematical guarantee of satisfying a predefined specification. We demonstrate the applicability of the method in simulated driving scenarios and discuss a potential certification.
Originality, significance, and benefit: To the authors’ best knowledge, this is the first automated workflow that combines a specification in an English-like language with a mathematical model, in a formally verified way, to synthesize a controller for potential real-time applications such as autonomous driving.
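The reachability computation described above can be illustrated with a toy sketch: for a one-dimensional discrete-time linear system with a bounded input, interval arithmetic propagates the set of reachable states forward step by step (the dynamics and bounds here are illustrative assumptions, not the authors' vehicle model):

```python
def reach_step(x_lo, x_hi, a, b, u_lo, u_hi):
    # One discrete step of x+ = a*x + b*u with interval-valued state and input;
    # for a linear map, extremes occur at the corners of the (x, u) box.
    cands = [a * x + b * u for x in (x_lo, x_hi) for u in (u_lo, u_hi)]
    return min(cands), max(cands)

def reach_set(x_lo, x_hi, a, b, u_lo, u_hi, steps):
    # Collect the reachable interval at every step, starting from the initial box.
    box = (x_lo, x_hi)
    boxes = [box]
    for _ in range(steps):
        box = reach_step(*box, a, b, u_lo, u_hi)
        boxes.append(box)
    return boxes
```

In the workflow above, the analogue of these intervals is checked against the formally specified boundaries before the motion planner searches inside them.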

Keywords: formal system verification, reachability, real time controller, hybrid system

Procedia PDF Downloads 222
386 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. These obstacles could make the widespread diffusion of eco-design and LCA methods in the manufacturing sectors impossible. This article addresses the research question: how can the LCA method be adapted to generalize it massively and improve its performance? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. We proceeded in three steps. First, we analyzed the literature to identify existing automation methods. Given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain existing methods and combining them with new ideas and improvements. Second, our automated construction is presented (reconciliation and implementation of data). Finally, an LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method can also complete the LCA process on its own within minutes; the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution to meet LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and meet regulatory requirements. Moreover, this approach shows potential for a wide range of applications.
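To make the automation idea concrete, here is a minimal sketch of automated inventory reconciliation: a bill of materials is mapped against an impact-factor table, and components that cannot be matched are flagged for manual review (the factor values and material names are hypothetical, not from the authors' tool):

```python
# Hypothetical per-kg impact factors (kg CO2-eq) -- illustrative numbers only.
FACTORS = {"copper": 4.1, "PVC": 2.6, "steel": 1.9}

def lci_impact(bill_of_materials):
    """Reconcile a bill of materials against the factor table and return
    the total impact plus the components that could not be mapped."""
    total, unmapped = 0.0, []
    for material, mass_kg in bill_of_materials:
        if material in FACTORS:
            total += FACTORS[material] * mass_kg
        else:
            unmapped.append(material)  # left for manual data mapping
    return total, unmapped
```

Automating exactly this kind of mapping is what lets the calculations and the report be generated in minutes rather than through manual LCA modeling.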

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 63
385 Semantic Search Engine Based on Query Expansion with Google Ranking and Similarity Measures

Authors: Ahmad Shahin, Fadi Chakik, Walid Moudani

Abstract:

Our study elaborates a potential solution for a search engine that uses semantic technology to retrieve information and display it meaningfully. Semantic search engines are not widely used on the web, as the majority are still in beta or under construction. Current semantic search applications face many problems; the major one is analyzing and calculating the meaning of a query in order to retrieve relevant information. Another is the ontology-based index and its updates. Ranking results according to concept meaning and its relation to the query is a further challenge. In this paper, we offer a light meta-engine (QESM) that uses Google search, and therefore Google’s index, adapting its returned results through multi-query expansion. The mission was to find a reliable ranking algorithm that involves semantics and uses concepts and meanings to rank results. First, the engine finds synonyms of each query term entered by the user, based on a lexical database. Then, query expansion is applied to generate different semantically analogous sentences, produced by combining the found synonyms with the original query terms. Our model suggests the use of semantic similarity measures between two sentences. Practically, we used this method to calculate the semantic similarity between each query and the description of each page’s content generated by Google. The generated sentences are sent to the Google engine one by one, and the results are re-ranked together with the adapted ranking method (QESM). Finally, our system places Google pages with higher similarities at the top of the results. We conducted experiments with 6 different queries and observed that most QESM rankings differed from Google’s original ordering of pages. With our experimental queries, QESM frequently achieved better accuracy than Google; in the worst cases, it behaved like Google.
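A minimal sketch of the expansion step: each query term is substituted with its synonyms from a lexical database to produce semantically analogous queries, which can then be scored against page descriptions with a similarity measure (the toy lexicon and the Jaccard measure below are illustrative stand-ins for the paper's lexical database and semantic similarity measure):

```python
from itertools import product

# Toy synonym lexicon -- a stand-in for a real lexical database like WordNet.
SYNONYMS = {"car": ["automobile", "vehicle"], "fast": ["quick", "rapid"]}

def expand(query):
    """Generate semantically analogous queries by substituting synonyms."""
    options = [[term] + SYNONYMS.get(term, []) for term in query.split()]
    return [" ".join(words) for words in product(*options)]

def jaccard(a, b):
    """Token-overlap similarity between two sentences (0..1)."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb)
```

Each expanded variant would be sent to the underlying engine, and the returned pages re-ranked by their similarity to the original query.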

Keywords: semantic search engine, Google indexing, query expansion, similarity measures

Procedia PDF Downloads 407
384 Multi-Sensor Image Fusion for Visible and Infrared Thermal Images

Authors: Amit Kumar Happy

Abstract:

This paper is motivated by the importance of multi-sensor image fusion, with a specific focus on infrared (IR) and visual image (VI) fusion for various applications, including military reconnaissance. Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, image fusion algorithms based on the multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of MST levels and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are used to assess the method’s validity. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approaches, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps provide accurate fused results, they also make the algorithms hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capability. The methods presented in this paper therefore offer good results with minimal time complexity.
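A minimal sketch of a coefficient fusion rule of the kind compared in the paper: after the multi-scale transform, detail coefficients are commonly fused by maximum absolute value, while approximation coefficients are averaged (the 1-D lists stand in for transform sub-bands; the actual MST and consistency verification are not shown):

```python
def fuse_maxabs(coeffs_a, coeffs_b):
    """Detail-band fusion: keep the coefficient with the larger magnitude,
    since large coefficients carry salient edges and textures."""
    return [a if abs(a) >= abs(b) else b for a, b in zip(coeffs_a, coeffs_b)]

def fuse_average(coeffs_a, coeffs_b):
    """Approximation-band fusion: low-frequency content is usually averaged."""
    return [(a + b) / 2.0 for a, b in zip(coeffs_a, coeffs_b)]
```

The fused sub-bands are then inverse-transformed to produce the composite image.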

Keywords: image fusion, IR thermal imager, multi-sensor, multi-scale transform

Procedia PDF Downloads 89
383 Non-Cognitive Skills Associated with Learning in a Serious Gaming Environment: A Pretest-Posttest Experimental Design

Authors: Tanja Kreitenweis

Abstract:

Lifelong learning is increasingly seen as essential for coping with the rapidly changing work environment. To this end, serious games can provide convenient and straightforward access to complex knowledge for all age groups. However, learning achievements depend largely on a learner’s non-cognitive skill disposition (e.g., motivation, self-belief, playfulness, and openness). Combining the fields of serious games and non-cognitive skills, this research focuses in particular on the use of a business simulation that conveys change management insights. Business simulations are a subset of serious games and are perceived as a non-traditional learning method. The objectives of this work are: (1) developing a scale that measures learners’ knowledge and skill levels before and after a business simulation is played, (2) investigating the influence of non-cognitive skills on learning in this business simulation environment, and (3) exploring the moderating role of team preference in this type of learning setting. First, expert interviews were conducted to develop an appropriate measure for assessing learners’ skills and knowledge. A pretest-posttest experimental design with German management students was then implemented to address the remaining objectives. Using the newly developed, reliable measure, it was found that students’ skills and knowledge state was higher after the simulation had been played than before. A hierarchical regression analysis revealed two positive predictors of this outcome: motivation and self-esteem. Unexpectedly, playfulness had a negative impact. Team preference strengthened the links between grit and playfulness, respectively, and learners’ skills and knowledge state after completing the business simulation. Overall, the data underline the potential of business simulations to improve learners’ skills and knowledge, and identify motivational factors as predictors of benefitting most from the applied simulation. Recommendations are provided for how pedagogues can use these findings.
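The pretest-posttest logic can be sketched in a few lines: learning is quantified as the gain from pretest to posttest, and the association between a non-cognitive predictor and that gain can be summarized by a correlation (toy data; the study itself used a hierarchical regression, which this sketch does not reproduce):

```python
def gains(pre, post):
    # Per-learner gain score: posttest minus pretest.
    return [b - a for a, b in zip(pre, post)]

def pearson(x, y):
    # Pearson correlation between a predictor (e.g. motivation) and gains.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)
```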

Keywords: business simulations, change management, (experiential) learning, non-cognitive skills, serious games

Procedia PDF Downloads 89
382 The Relationship between Representational Conflicts, Generalization, and Encoding Requirements in an Instance Memory Network

Authors: Mathew Wakefield, Matthew Mitchell, Lisa Wise, Christopher McCarthy

Abstract:

The properties of memory representations in artificial neural networks have cognitive implications. Distributed representations that encode instances as a pattern of activity across layers of nodes afford memory compression and enforce the selection of a single point in instance space. These encoding schemes also appear to distort the representational space, while trading off the ability to validate that input information lies within the bounds of past experience. In contrast, a localist representation, which encodes some meaningful information into individual nodes in a network layer, affords less memory compression while retaining the integrity of the representational space. This allows the validity of an input to be determined. The validity (or familiarity) of input, along with the capacity of localist representation for multiple instance selections, affords a memory sampling approach that dynamically balances the bias-variance trade-off. When the input is familiar, bias can be kept high by referring only to the most similar instances in memory. When the input is less familiar, variance can be increased by referring to more instances that capture a broader range of features. Using this approach in a localist instance memory network, an experiment demonstrates a relationship between representational conflict, generalization performance, and memorization demand. Relatively small sampling ranges produce the best performance on a classic machine learning dataset of visual objects. Combining memory validity with conflict detection produces a reliable confidence judgement that can separate responses with high and low error rates. Confidence can also be used to signal the need for supervisory input. Using this judgement, the need for supervised learning as well as memory encoding can be substantially reduced with only a trivial detriment to classification performance.
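A minimal sketch of the dynamic sampling idea, assuming a one-dimensional instance space (the names, radius, and neighbour counts are illustrative): familiar inputs consult only the closest stored instance (high bias), while unfamiliar inputs widen the sample over more instances (higher variance):

```python
def familiarity(query, memory, radius):
    """Fraction of stored instances within `radius` of the query (1-D toy)."""
    return sum(abs(m - query) <= radius for m, _ in memory) / len(memory)

def classify(query, memory, radius, k_small=1, k_large=3):
    """Familiar inputs use few neighbours (high bias); unfamiliar inputs
    sample more instances (higher variance), then take a majority vote."""
    k = k_small if familiarity(query, memory, radius) >= 0.5 else k_large
    nearest = sorted(memory, key=lambda m: abs(m[0] - query))[:k]
    labels = [lbl for _, lbl in nearest]
    return max(set(labels), key=labels.count)
```

A low familiarity score is also the natural trigger for the confidence judgement described above: it can signal that supervisory input, or encoding of the new instance, is needed.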

Keywords: artificial neural networks, representation, memory, conflict monitoring, confidence

Procedia PDF Downloads 105
381 The Effect of Composite Hybridization on the Back Face Deformation of Armor Plates

Authors: Attef Kouadria, Yehya Bouteghrine, Amar Manaa, Tarek Mouats, Djalel Eddine Tria, Hamid Abdelhafid Ghouti

Abstract:

Personal protection systems have been used in various forms for centuries. Lightweight composite structures are in great demand due to their high ratios of mechanical properties to weight in comparison with heavy and cumbersome steel plates. In this regard, lighter ceramic plates with a backing plate made of high-strength polymeric fibers, mostly aramids, are widely used for protection against ballistic threats. This study aims to improve the ballistic performance of ceramic/composite plates subjected to ballistic impact by reducing the back face deformation (BFD) measured after each test. A new hybridization technique was developed in this investigation to increase the energy absorption capability of the backing plates. The hybridization consists of combining different types of aramid fabrics, with different linear densities of aramid fibers (dtex) and different areal densities, with an epoxy resin to form the backing plate. Several composite architectures were therefore prepared and tested. To better understand the effect of the hybridization, a series of tensile, compression, and shear tests was conducted to determine the mechanical properties of homogeneous composite materials prepared from the different fabrics. It was found that hybridization allows the backing plate to combine the mechanical properties of the fabrics used. Aramid fabrics with higher dtex were found to increase the mechanical strength of the backing plate, while those with lower dtex were found to enhance the lateral wave dispersion ratio due to their lower areal density. The back face deformation was therefore significantly reduced in comparison with a homogeneous composite plate.
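One quantity that makes the hybridization trade-off explicit is the areal-density budget of the stack: how much of the backing plate's mass each fabric type contributes. A toy sketch (the ply values and dtex labels are hypothetical, not the authors' materials):

```python
def laminate_areal_density(layers):
    """Total areal density (g/m^2) of a hybrid stack: sum over plies.
    Each layer is (areal_density, fabric_label)."""
    return sum(areal for areal, _ in layers)

def hybrid_ratio(layers):
    """Mass fraction contributed by each fabric type in the hybrid,
    i.e. how the strength/dispersion trade-off is balanced."""
    total = laminate_areal_density(layers)
    out = {}
    for areal, fabric in layers:
        out[fabric] = out.get(fabric, 0.0) + areal / total
    return out
```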

Keywords: aramid fabric, ballistic impact, back face deformation, body armor, composite, mechanical testing

Procedia PDF Downloads 124
380 Combining the Fictitious Stress Method and Displacement Discontinuity Method in Solving Crack Problems in Anisotropic Material

Authors: Bahattin Kimençe, Uğur Kimençe

Abstract:

In this study, the influence functions of displacement discontinuities in an anisotropic elastic medium are obtained in order to produce the boundary element equations. A Displacement Discontinuity Method (DDM) formulation is presented with the aim of modeling two-dimensional elastic fracture problems. This formulation is found by analytical integration of the fundamental solution along a straight-line crack. To this end, Kelvin's fundamental solutions for anisotropic media on an infinite plane are used to form dipoles from singular loads, and various combinations of these dipoles are used to obtain the influence functions of displacement discontinuity. The study introduces a technique for coupling the Fictitious Stress Method (FSM) and the DDM; this technique is applied to several examples to demonstrate the effectiveness of the proposed coupling. Displacement discontinuity equations are obtained using dipole solutions calculated from known singular force solutions in an anisotropic medium. The DDM obtained from the solutions of these equations is combined with the FSM and compared on various examples. One or more crack problems with various geometries, in rectangular plates in finite and infinite regions under tensile stress, were examined with the coupled FSM and DDM in an anisotropic medium, and the effectiveness of the coupled method was demonstrated. Since crack problems can be modeled more easily with the DDM, its use has increased recently. In obtaining the displacement discontinuity equations, Papkovich functions were used, following Crouch, and harmonic functions were chosen to satisfy various boundary conditions. A comparison is made between two indirect boundary element formulations, the DDM and an extension of the FSM, for solving problems involving cracks.
Several numerical examples are presented, and the outcomes are contrasted with existing analytical or reference results.
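In both indirect formulations (FSM and DDM), the discretized problem reduces to a dense linear system: influence coefficients relate the unknown element strengths (fictitious stresses or displacement discontinuities) to the prescribed boundary values. A toy two-element sketch solved by Cramer's rule (the coefficients are illustrative, not computed from the anisotropic kernels):

```python
def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve the 2x2 influence-coefficient system A d = b by Cramer's rule:
    a_ij is the traction at element i induced by unit strength at element j,
    and b holds the prescribed boundary tractions."""
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

Coupling the two methods amounts to assembling such a system in which some unknowns are fictitious stresses and others are displacement discontinuities, each contributing its own influence functions.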

Keywords: displacement discontinuity method, fictitious stress method, crack problems, anisotropic material

Procedia PDF Downloads 61
379 Sentiment Analysis of Social Media Responses: A Comparative Study of the National Democratic Alliance (NDA) and the Indian National Developmental Inclusive Alliance (INDIA) during the Indian General Elections 2024

Authors: Pankaj Dhiman, Simranjeet Kaur

Abstract:

This research paper presents a comprehensive sentiment analysis of social media responses to videos on Facebook, YouTube, Twitter, and Instagram during the 2024 Indian general elections. The study focuses on the sentiment patterns of voters towards the National Democratic Alliance (NDA) and the Indian National Developmental Inclusive Alliance (INDIA) on these platforms, and aims to understand the impact of social media on voter sentiment and its correlation with the election outcome. The study employed a mixed-methods approach, combining quantitative and qualitative methods. A total of 200 posts from the final phase of the 2024 general election were analysed; the sentiment analysis was conducted using natural language processing (NLP) techniques, including sentiment dictionaries and machine learning algorithms. The results show that the NDA received significantly more positive sentiment responses across all platforms, with a positive sentiment score of 47% compared to INDIA's 38.98%. The analysis also revealed that Twitter and YouTube were the most influential platforms in shaping voter sentiment, with 60% of the total sentiment score coming from these two platforms. The findings suggest that social media sentiment analysis can be a valuable tool for understanding voter sentiment and predicting election outcomes, indicate that social media can be a key factor in determining the outcome of elections, and underscore the need for political parties to develop effective social media strategies to engage with voters and shape public opinion.
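A minimal sketch of the dictionary-based part of such a pipeline: tokens are matched against positive and negative lexicons and an aggregate percentage score is computed per alliance (the toy lexicon is illustrative; the study also used machine learning classifiers, which this sketch does not cover):

```python
# Toy sentiment lexicons -- stand-ins for a full sentiment dictionary.
POSITIVE = {"good", "great", "win", "support"}
NEGATIVE = {"bad", "poor", "lose", "corrupt"}

def sentiment_score(posts):
    """Percentage of opinionated tokens that are positive, across all posts;
    comparable to the 47% vs 38.98% style of score reported above."""
    pos = neg = 0
    for post in posts:
        for tok in post.lower().split():
            if tok in POSITIVE:
                pos += 1
            elif tok in NEGATIVE:
                neg += 1
    return 100.0 * pos / (pos + neg) if pos + neg else 0.0
```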

Keywords: Indian Elections-2024, NDA, INDIA, sentiment analysis, social media, democracy

Procedia PDF Downloads 15
378 Disease Trajectories in Relation to Poor Sleep Health in the UK Biobank

Authors: Jiajia Peng, Jianqing Qiu, Jianjun Ren, Yu Zhao

Abstract:

Background: Insufficient sleep has come to be regarded as a public health epidemic. However, a comprehensive analysis of the disease trajectories associated with unhealthy sleep habits is still lacking. Objective: This study sought to clarify the disease trajectories related to the overall poor sleep pattern and to the individual unhealthy sleep behaviors separately. Methods: 410,682 participants with available information on sleep behaviors were drawn from the UK Biobank at the baseline visit (2006-2010). These participants were classified as having a high or low risk for each sleep behavior and were followed from 2006 to 2020 to identify the increased risks of diseases. We used Cox regression to estimate the associations of high-risk sleep behaviors with elevated disease risks, and further established disease trajectories using the significant diseases, with low-risk sleep behaviors as the reference. We then examined the trajectory of diseases linked with the overall poor sleep pattern by combining all of the unhealthy sleep behaviors. Network analysis was used to visualize these trajectories. Results: During a median follow-up of 12.2 years, we noted 12 medical conditions related to unhealthy sleep behaviors and the overall poor sleep pattern among the 410,682 participants (median age 58.0 years). The majority of participants had unhealthy sleep behaviors; in particular, 75.62% reported frequent sleeplessness and 72.12% had abnormal sleep durations. A total of 16,032 individuals with an overall poor sleep pattern were identified. In general, three major disease clusters were associated with overall poor sleep status and unhealthy sleep behaviors according to the disease trajectory and network analyses, mainly in the digestive, musculoskeletal and connective tissue, and cardiometabolic systems.
Of note, two circulatory disease pairs (I25→I20 and I48→I50) showed the highest risks following these unhealthy sleep habits. Additionally, significant differences in disease trajectories were observed in relation to sex and sleep medication among individuals with poor sleep status. Conclusions: We identified the major disease clusters and high-risk diseases following overall poor sleep health and unhealthy sleep behaviors, respectively. This may suggest the need to investigate potential interventions targeting these key pathways.
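The trajectory-building step can be sketched simply: ordered disease pairs A→B are counted whenever A is diagnosed strictly before B in a participant's record, and frequent pairs become directed edges in the network (toy ICD-10 histories; the actual study additionally required Cox-significant associations before a pair entered a trajectory):

```python
def trajectory_pairs(histories):
    """Count ordered disease pairs A -> B across patient histories,
    where A was diagnosed strictly before B.
    Each history is a list of (time, icd_code) events."""
    counts = {}
    for events in histories:
        events = sorted(events)  # chronological order
        for i, (_, a) in enumerate(events):
            for _, b in events[i + 1:]:
                if a != b:
                    counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts
```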

Keywords: sleep, poor sleep, unhealthy sleep behaviors, disease trajectory, UK Biobank

Procedia PDF Downloads 64
377 Simultaneous Measurement of Displacement and Roll Angle of Object

Authors: R. Furutani, K. Ishii

Abstract:

Laser interferometers are now widely used for length and displacement measurement. In conventional methods, the optical path difference between two mirrors, one a reference mirror and the other a target mirror, is measured, as in Michelson interferometry; or two target mirrors are set up and the optical path difference between the two targets is measured, as in differential interferometry. In these interferometers, the two laser beams pass through different optical elements, so the measurement result is affected by vibration and other disturbances in the optical paths. In addition, it is difficult to measure the roll angle around the optical axis. The proposed interferometer simultaneously measures both the translational motion along the optical axis and the roll motion around it by combining the retroreflective principle of a ball lens (BL) with polarization. It detects the interferogram formed by two beams traveling along an identical optical path from the beam source to the BL, a principle that is expected to reduce external influences. The proposed interferometer uses a BL so that the reflected light travels on the identical optical path as the incident light. After reaching the aperture of the He-Ne laser oscillator, the reflected light is reflected by a highly reflective mirror installed in the aperture and is irradiated back toward the BL. The first laser beam that enters the BL and the second laser beam that enters the BL after the round trip interfere with each other, enabling the measurement of displacement along the optical axis. For the measurement of the roll motion, a quarter-wave plate is installed in the optical path to change the polarization state of the laser; the polarization states of the first and second laser beams differ by the roll angle of the target.
As a result, this system can measure the displacement and the roll angle of the BL simultaneously, as verified by both simulation and experiment.
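The displacement readout of such an interferometer follows the standard round-trip relation: because the beam traverses the target path twice, a full 2π of interference phase corresponds to half a wavelength of target motion. A minimal sketch (assuming the common He-Ne wavelength of 632.8 nm; the exact conversion factor for this particular double-pass geometry is an illustrative assumption, not taken from the paper):

```python
import math

def displacement_from_phase(delta_phase_rad, wavelength_m=632.8e-9):
    """Round-trip interferometry: 2*pi of phase change corresponds to
    lambda/2 of target displacement, hence d = phi * lambda / (4*pi)."""
    return delta_phase_rad * wavelength_m / (4.0 * math.pi)
```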

Keywords: common path interferometer, displacement measurement, laser interferometer, simultaneous measurement, roll angle measurement

Procedia PDF Downloads 67
376 Monitoring Spatial Distribution of Blue-Green Algae Blooms with Underwater Drones

Authors: R. L. P. De Lima, F. C. B. Boogaard, R. E. De Graaf-Van Dinther

Abstract:

Blue-green algae blooms (cyanobacteria) are currently a relevant ecological problem being addressed by most water authorities in the Netherlands. They can affect recreation areas by producing unpleasant smells and toxins that can poison humans and animals (e.g., fish, ducks, dogs). Contamination events usually take place during the summer months, and their frequency is increasing with climate change. Traditional monitoring of these bacteria is expensive and labor-intensive, and provides only limited (point-sampling) information about the spatial distribution of algae concentrations. Recently, a novel handheld sensor has allowed water authorities to speed up their algae surveying and alarm systems. This study converted that algae sensor into a mobile platform by combining it with an underwater remotely operated vehicle (also equipped with other sensors and cameras). This provides a spatial visualization (mapping) of algae concentration variations within the area covered by the drone, and also in depth. Measurements took place at different locations in the Netherlands: i) a lake with thick silt layers at the bottom, a very eutrophic former seabed, and a frequent/intense mowing regime; ii) the outlet of wastewater into a large reservoir; iii) an urban canal system. The results made it possible to identify probable dominant causes of blooms (i), provide recommendations for the placement of an outlet and observe day-night differences in algae behavior (ii), and pinpoint areas of higher algae concentration (iii). Although further research is still needed to fully characterize these processes and to optimize the measuring tool (underwater drone developments/improvements), the method presented here can already provide valuable information about algae behavior and spatial/temporal variability, and shows potential as an efficient monitoring system.
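The mapping step can be sketched as simple spatial binning: point samples along the drone track are averaged per grid cell to produce a concentration map (the coordinates and cell size are illustrative; a real survey would also bin by depth):

```python
def grid_average(samples, cell=1.0):
    """Bin (x, y, concentration) point samples into square grid cells and
    average them, turning sparse drone readings into a concentration map."""
    sums, counts = {}, {}
    for x, y, c in samples:
        key = (int(x // cell), int(y // cell))
        sums[key] = sums.get(key, 0.0) + c
        counts[key] = counts.get(key, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}
```

Cells with high averages are exactly the "pinpointed" higher-concentration areas mentioned for the urban canal case.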

Keywords: blue-green algae, cyanobacteria, underwater drones / ROV / AUV, water quality monitoring

Procedia PDF Downloads 182
375 Multimedia Container for Autonomous Car

Authors: Janusz Bobulski, Mariusz Kubanek

Abstract:

The main goal of the research is to develop a multimedia container structure holding three types of images - RGB, lidar and infrared - properly calibrated to each other. An additional goal is to develop program libraries for creating, saving and restoring this type of file. It will also be necessary to develop a method of synchronizing data from the lidar, RGB and infrared cameras. This type of file could be used in autonomous vehicles, which would certainly facilitate data processing by the intelligent autonomous vehicle management system. Autonomous cars are increasingly entering our consciousness. No one seems to have any doubts that self-driving cars are the future of motoring. Manufacturers promise that the first of them will reach showrooms within the next few years. Many experts believe that a network of communicating autonomous cars will be able to completely eliminate accidents. However, to make this possible, it is necessary to develop effective methods of detecting objects around the moving vehicle. In bad weather conditions, this task is difficult on the basis of the RGB (red, green, blue) image alone. Therefore, in such situations, the system should be supported by information from other sources, such as lidar or infrared cameras. The problem is the different data formats that individual types of devices return, along with the synchronization and formatting of these data. The goal of the project is to develop a file structure that can contain different types of data. Such a file is called a multimedia container: a container holding many data streams, which allows complete multimedia material to be stored in one file. Among the data streams in such a container are image, video, sound and subtitle streams, as well as additional information, i.e., metadata. As shown by preliminary studies, combining RGB and infrared images with lidar data allows for easier data analysis. Thanks to this application, it will be possible to display the distance to an object in a color photo. Such information can be very useful for drivers and for systems in autonomous cars.
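A minimal sketch of such a container might pair all three modalities under one timestamp, so that downstream code can read them as a single synchronized frame. The structure below is purely illustrative: the class names, fields and pickle-based serialization are our assumptions, not the authors' actual library or file format.

```python
from dataclasses import dataclass, field
import pickle


@dataclass
class Frame:
    """One synchronized capture: all three modalities share a timestamp."""
    timestamp_us: int   # capture time in microseconds
    rgb: bytes          # encoded RGB image
    infrared: bytes     # encoded infrared image
    lidar_points: list  # list of (x, y, z, intensity) tuples


@dataclass
class MultimediaContainer:
    """Container holding calibrated, time-aligned sensor streams."""
    calibration: dict = field(default_factory=dict)  # inter-sensor calibration metadata
    frames: list = field(default_factory=list)

    def add_frame(self, frame: Frame) -> None:
        self.frames.append(frame)

    def save(self, path: str) -> None:
        with open(path, "wb") as f:
            pickle.dump(self, f)

    @staticmethod
    def load(path: str) -> "MultimediaContainer":
        with open(path, "rb") as f:
            return pickle.load(f)
```

A production container would of course use a streaming binary layout rather than pickling the whole object, but the key design point survives: one timestamp per frame, three calibrated payloads.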

Keywords: an autonomous car, image processing, lidar, obstacle detection

Procedia PDF Downloads 201
374 Creating Complementary Bi-Modal Learning Environments: An Exploratory Study Combining Online and Classroom Techniques

Authors: Justin P. Pool, Haruyo Yoshida

Abstract:

This research focuses on the effects of creating an English as a foreign language curriculum that combines online learning and classroom teaching in a complementary manner. Through pre- and post-test results, teacher observation, and learner reflection, it will be shown that learners can benefit from online programs focusing on receptive skills if combined with a communicative classroom environment that encourages learners to develop their productive skills. Much research has lamented the fact that many modern mobile assisted language learning apps do not take advantage of the affordances of modern technology by focusing only on receptive skills rather than inviting learners to interact with one another and develop communities of practice. This research takes into account the realities of the state of such apps and focuses on how to best create a curriculum that complements apps which focus on receptive skills. The research involved 15 adult learners working for a business in Japan simultaneously engaging in 1) a commercial online English language learning application that focused on reading, listening, grammar, and vocabulary and 2) a 15-week class focused on communicative language teaching, presentation skills, and mitigation of error aversion tendencies. Participants of the study experienced large gains on a standardized test, increased motivation and willingness to communicate, and asserted that they felt more confident regarding English communication. Moreover, learners continued to study independently at higher rates after the study than they had before the onset of the program. This paper will include the details of the program, reveal the improvement in test scores, share learner reflections, and critically view current evaluation models for mobile assisted language learning applications.

Keywords: adult learners, communicative language teaching, mobile assisted language learning, motivation

Procedia PDF Downloads 117
373 Green Crypto Mining: A Quantitative Analysis of the Profitability of Bitcoin Mining Using Excess Wind Energy

Authors: John Dorrell, Matthew Ambrosia, Abilash

Abstract:

This paper employs econometric analysis to quantify the potential profit wind farms can earn by allocating excess wind energy to power bitcoin mining machines. Cryptocurrency mining consumes a substantial amount of electricity worldwide, and wind farms lose a significant amount of energy because of the intermittent nature of the resource: supply does not always match consumer demand. By combining the weaknesses of these two technologies, we can improve efficiency and create a sustainable path for mining cryptocurrencies. This paper uses historical wind energy data from the ERCOT network in Texas and cryptocurrency data from 2000-2021 to create 4-year return-on-investment projections. Our research model incorporates the price of bitcoin, the price of the miner, the hash rate of the miner relative to the network hash rate, the block reward, the bitcoin transaction fees awarded to miners, the mining pool fees, the cost of electricity and the percentage of time the miner will be running, in order to demonstrate that wind farms generate enough excess energy to mine bitcoin profitably. Excess wind energy can be used as a financial battery, converting wasted electricity into economic value. Our findings show that wind energy producers can earn a profit while taking little, if any, electricity away from the grid. According to our results, bitcoin mining using wind farm curtailment could give as much as a 1347% and an 805% return on investment with starting dates of November 1, 2021, and November 1, 2022, respectively. This paper is helpful to policymakers and investors in determining efficient and sustainable ways to power our economic future. It proposes a practical solution to the problem of crypto mining energy consumption and contributes to a more sustainable energy future for Bitcoin.
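The profitability model described in the abstract can be sketched as a single function. The formula and parameter names below are our reconstruction from the listed inputs (bitcoin price, miner price, hash rate share, block reward, transaction fees, pool fees, electricity cost and uptime), not the authors' exact model:

```python
def mining_profit(btc_price, miner_cost, hashrate_share, block_reward,
                  fees_per_block, pool_fee_rate, electricity_price_kwh,
                  power_kw, uptime_fraction, hours):
    """Projected profit in USD over `hours` of operation.

    hashrate_share: miner hash rate / network hash rate, i.e. the expected
    fraction of blocks this miner wins. Bitcoin targets ~6 blocks per hour.
    """
    blocks = hours * 6
    btc_earned = blocks * hashrate_share * (block_reward + fees_per_block) * uptime_fraction
    revenue = btc_earned * btc_price * (1 - pool_fee_rate)
    energy_cost = power_kw * hours * uptime_fraction * electricity_price_kwh
    return revenue - energy_cost - miner_cost


def roi_percent(profit, miner_cost):
    """Return on investment relative to the hardware cost."""
    return 100.0 * profit / miner_cost
```

With curtailed wind the marginal electricity price approaches zero, which is exactly why `energy_cost` can vanish from the profit equation while the revenue terms remain.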

Keywords: bitcoin, mining, economics, energy

Procedia PDF Downloads 7
372 Valorization of Waste and By-products for Protein Extraction and Functional Properties

Authors: Lorena Coelho, David Ramada, Catarina Nobre, Joaquim Gaião, Juliana Duarte

Abstract:

The development of processes that allow the valorization of waste and by-products generated by industry is crucial to promote symbiotic relationships between different sectors and is mandatory to "close the loop" in the circular economy paradigm. In recent years, by-products and waste from the agro-food and forestry sectors have attracted attention due to their potential applications and technical characteristics. The extraction of bio-based active compounds for reuse is in line with circular bioeconomy trends, combining the use of renewable resources with process circularity, aiming at waste reduction and encouraging reuse and recycling. Among the different types of bio-based materials that are being explored and can be extracted, protein fractions are becoming an attractive new raw material. Within this context, the BioTrace4Leather project, a collaboration between two technological centres - CeNTI and CTIC - and a leather tanning and finishing company - Curtumes Aveneda - aims to develop innovative and biologically sustainable solutions for the leather industry and to meet market circularity trends. Specifically, it aims at the valorisation of waste and by-products from the tannery industry through protein extraction and the development of innovative and biologically sustainable materials. The results achieved show that keratin, gelatine and collagen fractions can be successfully extracted from bovine hair and leather waste. These products could be reintegrated into the industrial manufacturing process to obtain innovative and functional textile and leather substrates. Acknowledgement: this work has been developed under the scope of BioTrace4Leather, a project co-funded by the Operational Program for Competitiveness and Internationalization (COMPETE) of PORTUGAL2020, through the European Regional Development Fund (ERDF), under grant agreement Nº POCI-01-0247-FEDER-039867.

Keywords: leather by-products, circular economy, sustainability, protein fractions

Procedia PDF Downloads 133
371 Reallocation of Bed Capacity in a Hospital Combining Discrete Event Simulation and Integer Linear Programming

Authors: Muhammed Ordu, Eren Demir, Chris Tofallis

Abstract:

The number of inpatient admissions in the UK has increased significantly over the past decade. These increases cause bed occupancy rates to exceed the target level (85%) set by the Department of Health in England. Hospital service managers are therefore struggling to better manage key resources such as beds. Moreover, this severe demand pressure can lead to confusion in wards; for example, patients may be admitted to the ward of another inpatient specialty due to a lack of resources (i.e., beds). This study aims to develop a simulation-optimization model to reallocate the available beds in a mid-sized hospital in the UK. A hospital simulation model was developed to capture the stochastic behaviour of the hospital, taking into account the accident and emergency department, all outpatient and inpatient services, and the interactions between them. Several outputs of the simulation model (e.g., average length of stay and revenue) were generated as inputs for the optimization model. An integer linear programming model was developed under a number of constraints (financial, demand, target bed occupancy rate and staffing level) with the aim of maximizing the number of admitted patients. In addition, a sensitivity analysis was carried out taking into account unexpected increases in inpatient demand over the next 12 months. The approach proposed in this study optimally reallocates the available beds for each inpatient specialty and reveals that 74 beds are idle. The findings also indicate that the hospital wards will be able to cope with at most a 14% demand increase in the projected year. In conclusion, this paper sheds new light on how best to reallocate beds in order to cope with current and future demand for healthcare services.
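The optimization step can be illustrated with a toy integer program. A real study would use a proper MILP solver; here a tiny instance is solved by exhaustive enumeration, and the demand and admissions-per-bed "turnover" figures are invented purely for illustration:

```python
from itertools import product


def optimal_bed_allocation(total_beds, demand, turnover):
    """Solve a tiny integer program by enumeration:

    maximize  sum_i admitted_i
    where     admitted_i = min(demand_i, beds_i * turnover_i)
    subject to sum_i beds_i <= total_beds, beds_i integer >= 0.

    demand[i]:   annual admission requests for specialty i
    turnover[i]: patients one bed can serve per year in specialty i
    """
    n = len(demand)
    best_alloc, best_admitted = None, -1
    for alloc in product(range(total_beds + 1), repeat=n):
        if sum(alloc) > total_beds:
            continue  # infeasible: exceeds the bed budget
        admitted = sum(min(d, b * t) for d, b, t in zip(demand, alloc, turnover))
        if admitted > best_admitted:
            best_alloc, best_admitted = alloc, admitted
    return best_alloc, best_admitted
```

Enumeration only works for a handful of specialties; the point is that the objective and bed-budget constraint mirror the structure the abstract describes, with simulation outputs (length of stay, hence turnover) feeding the optimization.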

Keywords: bed occupancy rate, bed reallocation, discrete event simulation, inpatient admissions, integer linear programming, projected usage

Procedia PDF Downloads 124
370 Integrated Mass Rapid Transit System for Smart City Project in Western India

Authors: Debasis Sarkar, Jatan Talati

Abstract:

This paper is an attempt to develop an Integrated Mass Rapid Transit System (MRTS) for a smart city project in Western India. Integrated transportation is one of the enablers of smart transportation, providing a seamless intercity as well as regional-level transportation experience. The success of a smart city project at the city level, with respect to transportation, lies in properly integrating the different mass rapid transit modes by way of integrating information, physical infrastructure, networks of routes, fares, etc. The methodology adopted for this study was primary data research through a questionnaire survey. The respondents gave their perceptions of ways and means to improve public transport services in urban cities. They were also required to identify the factors and attributes that might motivate more people to shift towards the public mode, and were questioned about the factors they feel might restrain the integration of the various MRTS modes. Furthermore, this study develops a utility equation for respondents with the help of multiple linear regression analysis, estimating their probability of shifting to public transport based on the factors listed in the questionnaire. It was observed that the most important factors for shifting to public transport were travel time savings and comfort rating. An integrated MRTS can be obtained by combining metro rail with BRTS, metro rail with monorail, monorail with BRTS, and metro rail with Indian Railways. Providing transport users with a common smart card for accessing all the available modes would be a pragmatic step towards integration of the available MRTS modes.
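The regression step can be sketched as follows. The two predictors (travel time saving and comfort rating) come from the abstract, but the fitting procedure, the logistic mapping from utility to shift probability, and all coefficient values are our illustrative assumptions:

```python
import numpy as np


def fit_utility(X, y):
    """Fit U = b0 + b1*time_saving + b2*comfort by ordinary least squares."""
    A = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef


def shift_probability(coef, time_saving, comfort):
    """Map the linear utility to a 0-1 probability via a logistic transform."""
    u = coef[0] + coef[1] * time_saving + coef[2] * comfort
    return 1.0 / (1.0 + np.exp(-u))
```

In the study itself the coefficients would be estimated from the questionnaire responses; here a synthetic dataset stands in for them.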

Keywords: mass rapid transit systems, smart city, metro rail, bus rapid transit system, multiple linear regression, smart card, automated fare collection system

Procedia PDF Downloads 247
369 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in educational data mining techniques to find new hidden information in students' learning behavior, particularly to uncover early symptoms of at-risk pupils. On the other hand, data with noise, outliers and irrelevant information may lead to incorrect conclusions. By identifying the features of students' data that have the potential to improve prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data was gathered from two different systems, a student information system and an e-learning system, for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization and transformation), feature selection, pattern extraction and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine and Artificial Neural Network are the supervised learning techniques. Hyperparameters of the ensemble learning systems were fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
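The Majority Vote combination itself is straightforward: each base model predicts a label per student, and the most common label wins. A minimal sketch (the label names are invented for illustration):

```python
from collections import Counter


def majority_vote(predictions):
    """Combine per-model predictions by hard voting.

    `predictions` is a list of label lists, one inner list per base model,
    all of equal length (one label per sample). For each sample, the label
    predicted by the most base models is returned.
    """
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        votes = Counter(model[i] for model in predictions)
        combined.append(votes.most_common(1)[0][0])
    return combined
```

An odd number of base models (as when combining, say, a decision tree, an SVM and a neural network) avoids ties for binary labels.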

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 88
368 Use of Giant Magneto Resistance Sensors to Detect Micron to Submicron Biologic Objects

Authors: Manon Giraud, Francois-Damien Delapierre, Guenaelle Jasmin-Lebras, Cecile Feraudet-Tarisse, Stephanie Simon, Claude Fermon

Abstract:

Early diagnosis, and the detection of harmful substances at low levels, is a growing field of high interest. The ideal test should be cheap, easy to use, quick, reliable and specific, with a very low detection limit. By combining the high specificity of antibody-functionalized magnetic beads used to immuno-capture biological objects with the high sensitivity of GMR-based sensors, it is possible to detect these biological objects one by one - for example a cancerous cell, a bacterium or a disease biomarker. The simplicity of the detection process makes it usable even by untrained staff. Giant magnetoresistance (GMR) is an effect consisting of a change in the electrical resistance of certain conductive multilayers when they are exposed to a magnetic field. This effect allows the detection of very small magnetic field variations (typically a few tens of nanotesla). Magnetic nanobeads coated with antibodies targeting the analytes are mixed with a biological sample (blood, saliva) and incubated for 45 min. The mixture is then injected into a very simple microfluidic chip and circulates above a GMR sensor that detects changes in the surrounding magnetic field. Individual magnetic particles do not create a field sufficient to be detected; therefore, only biological objects surrounded by several antibody-functionalized magnetic beads (captured by the complementary antigens) are detected as they move above the sensor. Proof of concept has been carried out on NS1 mouse cancer cells diluted in PBS and bound to 200 nm magnetic particles. Signals were detected in cell-containing samples, while none were recorded for negative controls. A binary response was hence established for this first biological model. Precise quantification of the analytes and detection in highly diluted solutions are the steps now in progress.

Keywords: early diagnosis, giant magnetoresistance, lab-on-a-chip, submicron particle

Procedia PDF Downloads 230
367 Energy-Led Sustainability Assessment Approach for Energy-Efficient Manufacturing

Authors: Aldona Kluczek

Abstract:

In recent years, manufacturing processes have had to engage with sustainability issues in cost-effective ways that minimize energy use, reduce negative impacts on the environment and are safe for society. However, attention has largely focused on separate sustainability assessment methods considering energy and material flow, energy consumption, and emission release or process control. In this paper, an energy-led sustainability assessment approach combining three methods - energy Life Cycle Assessment (LCA) to assess environmental impact, Life Cycle Cost (LCC) to analyze costs, and Social Life Cycle Assessment through an 'energy LCA-based value stream map' - is used to assess the energy sustainability of the hardwood lumber manufacturing process in terms of its technologies. The approach, integrating environmental, economic and social issues, can be visualized for the considered energy-efficient technologies on a map of energy LCA-related (input and output) inventory data. It enables the identification of the most efficient technology for a given process through an effective analysis of energy flow. Interventions in the considered technology should accordingly focus on environmental and economic improvements to achieve energy sustainability. The results indicate that the most intense energy losses are caused by the cogeneration technology; the environmental impact analysis shows that a substantial reduction of 34% can be achieved by improving it. From the LCC point of view, the improvement appears cost-effective when implemented at the plant where it is used. On the social dimension, every component of plant labor energy used in the life cycle of lumber production has positive energy benefits. The energy required to install the energy-efficient technology amounts to 30.32 kJ, the highest value among the components of plant labor energy in terms of energy-related social indicators. The paper uses the example of hardwood lumber production to demonstrate the applicability of the sustainability assessment method.

Keywords: energy efficiency, energy life cycle assessment, life cycle cost, social life cycle analysis, manufacturing process, sustainability assessment

Procedia PDF Downloads 229
366 Plant as an Alternative for Anti Depressant Drugs St John's Wort

Authors: Mahdi Akhbardeh

Abstract:

St John's wort can help to treat depression by decreasing its symptoms, as it shares some features of Prozac (fluoxetine HCl). People suffering from mild depression who fear the side effects of antidepressants can use St John's wort drops under a doctor's supervision. This method of treatment is proposed especially for women who are going through menopause or suffering depression resulting from this period. Recent research, including work administered at a clinical research center in Washington, has compared St John's wort, as a traditional and herbal medicine for treating mood disorders, with the drug Prozac (fluoxetine HCl) in treating depression. Objective: the aim of this study is to find an alternative treatment for people suffering from depression who are treated with Prozac (fluoxetine HCl). Almost 70 percent of treatment failures with Prozac in patients suffering from mild to moderate depression are due to intensive side effects, including a decrease in blood pressure and reduced sexual desire, and 30 percent are due to the drug's ineffectiveness, which leads patients to abandon treatment. The effects of the Hypericum plant are very similar to those of antidepressants: an increase in the amount of serotonin at the brain's synaptic terminals prolongs the presence of this substance there. In fact, the two drugs function similarly, and the side effects of the Hypericum plant (St John's wort), including headache and slight nausea, are tolerable. Results: St John's wort can be used alone in mild to moderate depression in patients who avoid Prozac (fluoxetine HCl) because of its side effects. In severe depression, where patients generally do not show a positive response to the drug, relative or even complete recovery can probably be expected by combining antidepressant drugs with this plant. This treatment method has been investigated and confirmed in clinical tests and research.

Keywords: depression, St John's wort, Prozac, antidepressant

Procedia PDF Downloads 463
365 Phelipanche Ramosa (L. - Pomel) Control in Field Tomato Crop

Authors: G. Disciglio, F. Lops, A. Carlucci, G. Gatta, A. Tarantino, L. Frabboni, F. Carriero, F. Cibelli, M. L. Raimondo, E. Tarantino

Abstract:

Tomato is an important crop whose cultivation in the Mediterranean basin is severely constrained by the phytoparasitic weed Phelipanche ramosa. The semiarid regions of the world are considered the main center of this parasitic weed, where heavy infestation is due to its ability to produce high numbers of seeds (up to 500,000 per plant) that remain viable for extended periods (more than 19 years). In this paper, 12 parasitic weed control treatments, including chemical, agronomic, biological and biotechnological methods, were carried out. In 2014 a trial was performed at Foggia (southern Italy) on processing tomato (cv Docet) grown in a field infested by Phelipanche ramosa. Tomato seedlings were transplanted on May 5, 2014 into a clay-loam soil (USDA) fertilized with 100 kg ha-1 of N, 60 kg ha-1 of P2O5 and 20 kg ha-1 of S. Afterwards, top dressing was performed with 70 kg ha-1 of N. A randomized block design with 3 replicates was adopted. During the growing cycle of the tomato, at 56, 78 and 92 days after transplantation, the number of parasitic shoots emerged in each plot was recorded. At harvest, on August 18, the major quantitative and qualitative yield parameters were determined (marketable yield, mean fruit weight, dry matter, pH, soluble solids and fruit color). All data were subjected to analysis of variance (ANOVA) using the JMP software (SAS Institute Inc., Cary, NC, USA), and Tukey's test was used for the comparison of means. None of the treatments studied provided complete control of Phelipanche ramosa. However, among the 12 tested methods, Fusarium, glyphosate, Radicon biostimulant and the Red Setter tomato cv (an improved genotype obtained by TILLING technology) proved to mitigate the virulence of Phelipanche ramosa attacks. It is assumed that these effects can be improved by combining some of these treatments with one another, especially for a gradual and continuing reduction of the parasite's seed bank in the soil.

Keywords: control methods, Phelipanche ramosa, tomato crop, mediterranean basin

Procedia PDF Downloads 546
364 The Effect of Technology on International Marketing Trading Researches and Analysis

Authors: Karim Monir Halim Salib

Abstract:

This article discusses the use of modern technology to achieve environmental marketing goals in business and customer relations, and aims to show the possibilities of applying such technology. In B2C relationships, marketing departments face challenges arising from the need to quickly segment customers and share information across multiple systems, which seriously hinders the achievement of marketing objectives. The article therefore argues that modern IT solutions should be used in the marketing of business activities, taking environmental objectives into account. For this reason, their importance in the economic and social development of developing countries has increased. While traditional companies emphasize profit as the most important business principle, social enterprises must address social issues at the expense of profit. This mindset positions social enterprises better than traditional businesses to meet the needs of those at the bottom of the pyramid. It also poses a great challenge, as a social business must serve the public good on the one hand and maintain financial stability on the other; otherwise, the enterprise cannot survive. Culture is also involved in business communication and research. Using the example of language in international relations, the article poses the problem of cultural discourse in management and in linguistic and cultural studies. After reviewing current research on language in international relations, the article presents communication methods in the international economy from a linguistic perspective and attempts to explain communication problems in business from the perspective of linguistic research - a step towards multidisciplinary research combining management and linguistics.

Keywords: international marketing, marketing mix, marketing research, small and medium-sized enterprises, strategic marketing, B2B digital marketing strategy, digital marketing, digital marketing maturity model, SWOT analysis, consumer behavior, experience marketing, employee organizational performance, internal marketing, internal customer, direct marketing, mobile phones, mobile marketing, SMS advertising

Procedia PDF Downloads 21
363 Performance Assessment of Carrier Aggregation-Based Indoor Mobile Networks

Authors: Viktor R. Stoynov, Zlatka V. Valkova-Jarvis

Abstract:

The intelligent management and optimisation of radio resource technologies will lead to a considerable improvement in the overall performance in Next Generation Networks (NGNs). Carrier Aggregation (CA) technology, also known as Spectrum Aggregation, enables more efficient use of the available spectrum by combining multiple Component Carriers (CCs) in a virtual wideband channel. LTE-A (Long Term Evolution–Advanced) CA technology can combine multiple adjacent or separate CCs in the same band or in different bands. In this way, increased data rates and dynamic load balancing can be achieved, resulting in a more reliable and efficient operation of mobile networks and the enabling of high bandwidth mobile services. In this paper, several distinct CA deployment strategies for the utilisation of spectrum bands are compared in indoor-outdoor scenarios, simulated via the recently-developed Realistic Indoor Environment Generator (RIEG). We analyse the performance of the User Equipment (UE) by integrating the average throughput, the level of fairness of radio resource allocation, and other parameters, into one summative assessment termed a Comparative Factor (CF). In addition, comparison of non-CA and CA indoor mobile networks is carried out under different load conditions: varying numbers and positions of UEs. The experimental results demonstrate that the CA technology can improve network performance, especially in the case of indoor scenarios. Additionally, we show that an increase of carrier frequency does not necessarily lead to improved CF values, due to high wall-penetration losses. The performance of users under bad-channel conditions, often located in the periphery of the cells, can be improved by intelligent CA location. Furthermore, a combination of such a deployment and effective radio resource allocation management with respect to user-fairness plays a crucial role in improving the performance of LTE-A networks.
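The abstract does not give the exact formula for the Comparative Factor, but a summative metric combining average throughput with fairness of radio resource allocation might be sketched as below, using Jain's fairness index and an assumed weighted sum (the weights and normalization are our guesses, not the paper's definition):

```python
def jain_fairness(throughputs):
    """Jain's fairness index: 1.0 when all UEs get equal throughput,
    approaching 1/n when a single UE takes everything."""
    s = sum(throughputs)
    return s * s / (len(throughputs) * sum(t * t for t in throughputs))


def comparative_factor(throughputs, w_thr=0.5, w_fair=0.5, max_thr=100.0):
    """Illustrative summative assessment: weighted sum of normalized
    mean throughput (Mbps, capped at `max_thr`) and fairness."""
    mean_thr = sum(throughputs) / len(throughputs)
    return w_thr * (mean_thr / max_thr) + w_fair * jain_fairness(throughputs)
```

A metric of this shape rewards CA deployments that lift average throughput without starving cell-edge users, which is the trade-off the paper's indoor scenarios probe.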

Keywords: comparative factor, carrier aggregation, indoor mobile network, resource allocation

Procedia PDF Downloads 158
362 Enhancement of Long Term Peak Demand Forecast in Peninsular Malaysia Using Hourly Load Profile

Authors: Nazaitul Idya Hamzah, Muhammad Syafiq Mazli, Maszatul Akmar Mustafa

Abstract:

The peak demand forecast is crucial for identifying the future generation plant-up needed in long-term capacity planning analysis for Peninsular Malaysia, as well as for transmission and distribution network planning activities. Currently, the peak demand forecast (in megawatts) is derived from the generation forecast by using a load factor assumption. However, forecasts using this method have underperformed due to structural changes in the economy, emerging trends and weather uncertainty. The dynamic changes in these drivers can result in many possible outcomes for Peninsular Malaysia's peak demand. This paper looks into an independent model of peak demand forecasting. The model begins with the selection of driver variables to capture long-term growth. This selection and construction of variables, which include econometric, emerging-trend and energy variables, have an impact on the peak forecast. The framework begins with the development of system energy and load shape forecasts using the system's hourly data. The shape forecast represents the system shape assuming all embedded technology and use patterns continue into the future. This is necessary to identify movements in the peak hour or changes in the system load factor. The next step is developing the peak forecast, which involves an iterative process to explore model structures and variables. The final step is combining the system energy, shape and peak forecasts into the hourly system forecast and then modifying it with forecast adjustments, which include, among others, sales forecasts for electric vehicles and solar. The framework results in an hourly forecast that captures growth, peak usage and new technologies. The advantage of this approach over the current methodology is that the peaks capture the impacts of new technologies that change the load shape.
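The load factor assumption the current method relies on, and the hourly-profile alternative, can both be expressed in a few lines (numbers below are illustrative only):

```python
def peak_from_energy(annual_energy_mwh, load_factor, hours=8760):
    """Current practice: derive peak demand (MW) from the energy forecast.

    load_factor = average demand / peak demand, so
    peak = (annual energy / hours) / load_factor.
    """
    average_mw = annual_energy_mwh / hours
    return average_mw / load_factor


def peak_from_hourly_profile(hourly_load_mw):
    """Independent approach sketched in the abstract: take the peak
    directly from an hourly system load forecast, so shape changes
    (EVs, solar) are reflected rather than assumed away."""
    return max(hourly_load_mw)
```

The weakness of the first function is visible in its signature: any structural change that moves the load factor invalidates the peak, whereas the second picks the peak up from the shape itself.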

Keywords: hourly load profile, load forecasting, long term peak demand forecasting, peak demand

Procedia PDF Downloads 138
361 The Use of Space Syntax in Urban Transportation Planning and Evaluation: Limits and Potentials

Authors: Chuan Yang, Jing Bie, Yueh-Lung Lin, Zhong Wang

Abstract:

Transportation planning is an integrative academic discipline combining research and practice, with the aim of improving mobility and accessibility at both the strategic policy-making level and the operational dimension of practical planning. Transportation planning can build the linkage between traffic and social development goals, for instance economic benefits and environmental sustainability. Transportation planning analysis and evaluation tend to apply empirical quantitative approaches under the guidance of fundamental principles such as efficiency, equity, safety and sustainability. Space syntax theory has been applied to the spatial distribution of pedestrian movement and to vehicle flow analysis; however, little has been written about its application in transportation planning. Correlations between space syntax variables and real-world observations have shown that urban configuration has a significant effect on urban dynamics, for instance land value, building density, traffic and crime. This research aims to explore the potential of applying space syntax methodology to evaluate urban transportation planning by studying the effects of urban configuration on cities' transportation performance. Through a literature review, this paper discusses the effects that urban configurations with different degrees of integration and accessibility have on three elementary components of transportation planning - transportation efficiency, transportation safety, and economic agglomeration development - via intensifying and stabilizing the natural movement generated by the street network. The potential and limits of space syntax theory for studying the performance of urban transportation and transportation planning are then discussed. In practical terms, this research will help future studies explore the effects of urban design on transportation performance and identify which patterns of urban street networks allow for the most efficient and safe transportation performance with the highest economic benefits.

Keywords: transportation planning, space syntax, economic agglomeration, transportation efficiency, transportation safety

Procedia PDF Downloads 170
360 Comparison and Effectiveness of Cranial Electrical Stimulation Treatment, Brain Training and Their Combination on Language and Verbal Fluency of Patients with Mild Cognitive Impairment: A Single Subject Design

Authors: Firoozeh Ghazanfari, Kourosh Amraei, Parisa Poorabadi

Abstract:

Mild cognitive impairment is a neurocognitive disorder involving a decline in cognitive functions beyond what is expected for age, but not severe enough to interfere with daily activities. This study aimed to investigate and compare the effectiveness of cranial electrical stimulation treatment, brain training, and their combination on the language and verbal fluency of elderly patients with mild cognitive impairment. This is a single-subject study with a comparative intervention design. Four patients with a definitive diagnosis of mild cognitive impairment by a psychiatrist were selected via purposive and convenience sampling. Addenbrooke's Cognitive Examination Scale (2017) was used to assess language and verbal fluency. Two groups were formed with different orderings of cranial electrical stimulation treatment, paper-and-pencil brain training, and their combination, and two patients were randomly assigned to each group. The first group received cranial electrical stimulation, then brain training, then the combination; the second group received the combination, then cranial electrical stimulation, then brain training. The treatment plan followed an A1, B, A2, C, A3, D, A4 design, where electrical stimulation treatment was delivered in ten 30-minute sessions (5 mA, 0.5-500 Hz) and brain training in ten 30-minute sessions. Each baseline phase lasted four weeks. Patients in the first group, who received cranial electrical stimulation treatment first, showed a higher percentage of improvement on the language and verbal fluency subscale of Addenbrooke's Cognitive Examination than patients in the second group.
Based on the results, it seems that cranial electrical stimulation, through its effect on neurotransmitters and cerebral blood flow, especially in the brain stem, may prepare the brain at the neurochemical and molecular level for greater effectiveness of brain training at the behavioral level, and that administering electrical stimulation alone first may be more effective than combining it from the outset with paper-and-pencil brain training.

Keywords: cranial electrical stimulation, treatment, brain training, verbal fluency, cognitive impairment

Procedia PDF Downloads 67