Search results for: drying techniques
3718 N₂O₂ Salphen-Like Ligand and Its Pd(II), Ag(I) and Cu(II) Complexes as Potential Anticancer Agents: Design, Synthesis, Antimicrobial, CT-DNA Binding and Molecular Docking
Authors: Laila H. Abdel-Rahman, Mohamed Shaker S. Adam, Ahmed M. Abu-Dief, Hanan El-Sayed Ahmed
Abstract:
In this investigation, Cu(II), Pd(II) and Ag(I) complexes with the tetradentate DSPH Schiff base ligand were synthesized. The DSPH Schiff base and its complexes were characterized using different physicochemical and spectral analyses. The results revealed that the metal ions coordinate with the DSPH ligand through the azomethine nitrogen and phenolic oxygen. The Cu(II), Pd(II) and Ag(I) complexes are present in a 1:1 molar ratio. The Pd(II) and Ag(I) complexes have square planar geometries, while Cu(II) has a distorted octahedral (Oh) geometry. All investigated complexes are non-electrolytes. The investigated compounds were tested against different strains of bacteria and fungi, and showed good inhibition against the selected pathogenic microorganisms. Moreover, the interaction of the investigated complexes with CT-DNA was studied via various techniques; the binding modes are mainly intercalative and groove binding. The Molecular Operating Environment (MOE) package was used to perform docking studies for the investigated complexes to explore the potential binding modes and energies. Furthermore, the growth inhibitory effect of the investigated compounds was examined on some cancer cell lines.
Keywords: tetradentate, antimicrobial, CT-DNA interaction, docking, anticancer
Procedia PDF Downloads 244
3717 Optimization of Biodiesel Production from Palm Oil over Mg-Al Modified K-10 Clay Catalyst
Authors: Muhammad Ayoub, Abrar Inayat, Bhajan Lal, Sintayehu Mekuria Hailegiorgis
Abstract:
Biodiesel, which comes from purely renewable resources, provides an alternative fuel option for the future, given limited fossil fuel resources and environmental concerns. The transesterification of vegetable oils for biodiesel production is a promising process to overcome the coming energy crisis. The use of heterogeneous catalysts greatly simplifies the technological process by facilitating the separation of the post-reaction mixture. The purpose of the present work was to examine a heterogeneous catalyst, in particular Mg-Al modified K-10 clay, to produce methyl esters of palm oil. The prepared catalyst was well characterized by several up-to-date techniques. In this study, the transesterification of palm oil with methanol was studied in a heterogeneous system in the presence of Mg-Al modified K-10 clay as a solid base catalyst, and the results were then optimized with the help of Design of Experiments software. The results showed that methanol is the best alcohol for this reaction condition. The maximum conversion of triglyceride (88%) was noted after 8 h of reaction at 60 °C, with a 6:1 molar ratio of methanol to palm oil and 3 wt% of the prepared catalyst.
Keywords: palm oil, transesterification, clay, biodiesel, mesoporous clay, K-10
Procedia PDF Downloads 396
3716 D6tions: A Serious Game to Learn Software Engineering Process and Design
Authors: Hector G. Perez-Gonzalez, Miriam Vazquez-Escalante, Sandra E. Nava-Muñoz, Francisco E. Martinez-Perez, Alberto S. Nunez-Varela
Abstract:
The software engineering teaching process has been the subject of many studies. To improve this process, researchers have proposed merely illustrative techniques in the classroom, such as topic presentations and dynamics between students, on the one hand, or attempts to involve students in real projects with companies and institutions to expose them to real software development problems, on the other. Simulators and serious games have been used as auxiliary tools to introduce students to topics that are too abstract when presented in the traditional way. Most of these tools cover a limited area of the huge software engineering scope. To address this problem, we have developed D6tions, an educational serious game that simulates the software engineering process and is designed to let players experience the different stages a software engineer (playing roles such as project leader, developer or designer) goes through while participating in a software project. We describe previous approaches to this problem, how D6tions was designed, its rules and directions, and the results we obtained from undergraduate students playing the game.
Keywords: serious games, software engineering, software engineering education, software engineering teaching process
Procedia PDF Downloads 493
3715 The Patterns Designation by the Inspiration from Flower at Suan Sunandha Palace
Authors: Nawaporn Srisarankullawong
Abstract:
This research concerns creating designs inspired by the flowers that were once planted at Suan Sunandha Palace. The researcher has studied the history of Suan Sunandha Palace and the flowers planted in the palace's garden, in order to use this research to create new designs in the future. The objectives are as follows: 1. To study the shapes and patterns of the flowers at Suan Sunandha Palace, in order to select a few of them as models for new designs. 2. To create flower designs from the flowers at Suan Sunandha Palace, using current photographs of the flowers once planted inside the palace and using Adobe Illustrator and Adobe Photoshop to create the patterns and models. The result of the research: the researcher selected three types of flowers as pattern models: Allamanda, Orchids and Flamingo Plant. The details of the flowers were reduced to show their simplicity, and each flower became a pattern model; the three models were then developed into six patterns using universal artist techniques, so the patterns created are modern and can be used for further decoration.
Keywords: patterns design, Suan Sunandha Palace, pattern of the flowers, visual arts and design
Procedia PDF Downloads 374
3714 Towards a Distributed Computation Platform Tailored for Educational Process Discovery and Analysis
Authors: Awatef Hicheur Cairns, Billel Gueni, Hind Hafdi, Christian Joubert, Nasser Khelifa
Abstract:
Given the ever-changing needs of the job markets, education and training centers are increasingly held accountable for student success. Therefore, education and training centers have to focus on ways to streamline their offers and educational processes in order to achieve the highest level of quality in curriculum contents and managerial decisions. Educational process mining is an emerging field in the educational data mining (EDM) discipline, concerned with developing methods to discover, analyze and provide a visual representation of complete educational processes. In this paper, we present our distributed computation platform, which allows different education centers and institutions to load their data and access advanced data mining and process mining services. To this end, we also present a comparative study of the different clustering techniques developed in the context of process mining to efficiently partition educational traces. Our goal is to find the best strategy for distributing heavy analysis computations over the many processing nodes of our platform.
Keywords: educational process mining, distributed process mining, clustering, distributed platform, educational data mining, ProM
Procedia PDF Downloads 454
3713 Reimagining Landscapes: Psychological Responses and Behavioral Shifts in the Aftermath of the Lytton Creek Fire
Authors: Tugba Altin
Abstract:
In an era where the impacts of climate change resonate more pronouncedly than ever, communities globally grapple with events bearing both tangible and intangible ramifications. Situating this within the evolving landscapes of Psychological and Behavioral Sciences, this research probes the profound psychological and behavioral responses evoked by such events. The Lytton Creek Fire of 2021 epitomizes these challenges. While tangible destruction is immediate and evident, the intangible repercussions—emotional distress, disintegration of cultural landscapes, and disruptions in place attachment (PA)—require meticulous exploration. PA, emblematic of the emotional and cognitive affiliations individuals nurture with their environments, emerges as a cornerstone for comprehending how environmental cataclysms influence cultural identity and bonds to land. This study, harmonizing the core tenets of an interpretive phenomenological approach with a hermeneutic framework, underscores the pivotal nature of this attachment. It delves deep into the realm of individuals' experiences post the Lytton Creek Fire, unraveling the intricate dynamics of PA amidst such calamity. The study's methodology deviates from conventional paradigms. Instead of traditional interview techniques, it employs walking audio sessions and photo elicitation methods, granting participants the agency to immerse, re-experience, and vocalize their sentiments in real-time. Such techniques shed light on spatial narratives post-trauma and capture the otherwise elusive emotional nuances, offering a visually rich representation of place-based experiences. Central to this research is the voice of the affected populace, whose lived experiences and testimonies form the nucleus of the inquiry. As they renegotiate their bonds with transformed environments, their narratives reveal the indispensable role of cultural landscapes in forging place-based identities. 
Such revelations accentuate the necessity of integrating both tangible and intangible trauma facets into community recovery strategies, ensuring they resonate more profoundly with affected individuals. Bridging the domains of environmental psychology and behavioral sciences, this research accentuates the intertwined nature of tangible restoration with the imperative of emotional and cultural recuperation post-environmental disasters. It advocates for adaptation initiatives that are rooted in the lived realities of the affected, emphasizing a holistic approach that recognizes the profundity of human connections to landscapes. This research advocates the interdisciplinary exchange of ideas and strategies in addressing post-disaster community recovery strategies. It not only enriches the climate change discourse by emphasizing the human facets of disasters but also reiterates the significance of an interdisciplinary approach, encompassing psychological and behavioral nuances, for fostering a comprehensive understanding of climate-induced traumas. Such a perspective is indispensable for shaping more informed, empathetic, and effective adaptation strategies.
Keywords: place attachment, community recovery, disaster response, restorative landscapes, sensory response, visual methodologies
Procedia PDF Downloads 59
3712 Prediction Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy
Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie
Abstract:
In recent years, there has been an explosion in the use of technology that helps discover diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnoses and to help in finding the right treatment and cure for many diseases. Various classification algorithms can be applied on such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time among eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces the best results and takes the lowest time to build the model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best, due to its high accuracy and lowest model-building time.
Keywords: data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data
Procedia PDF Downloads 320
3711 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is an outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer that combines the join-idle-queue and join-shortest-queue approaches. The authors used the CloudAnalyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling
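The two-level dispatch policy this abstract describes can be sketched in a few lines. The toy below is an illustration only (server names, task labels and the in-memory queue model are invented here), not the authors' CloudAnalyst setup: tier one sends a task to an idle server if one exists (join idle queue); tier two falls back to the least-loaded server (join shortest queue).

```python
class Server:
    def __init__(self, name):
        self.name = name
        self.queue = []  # pending tasks

    @property
    def idle(self):
        return not self.queue

def dispatch(servers, task):
    """Two-level dispatch: prefer an idle server (join idle queue);
    otherwise pick the shortest queue (join shortest queue)."""
    idle_servers = [s for s in servers if s.idle]
    target = idle_servers[0] if idle_servers else min(servers, key=lambda s: len(s.queue))
    target.queue.append(task)
    return target

servers = [Server("s1"), Server("s2"), Server("s3")]
for t in range(5):
    dispatch(servers, f"task{t}")
print([len(s.queue) for s in servers])  # -> [2, 2, 1]
```

The first three tasks land on idle servers; once no server is idle, tasks go to whichever queue is shortest, so load stays balanced.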
Procedia PDF Downloads 431
3710 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques
Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah
Abstract:
Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological, machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported and recommendations for future studies proposed.
Keywords: BIM, construction projects, cost estimation, NRM, ontology
Procedia PDF Downloads 551
3709 Optimization of Lean Methodologies in the Textile Industry Using Design of Experiments
Authors: Ahmad Yame, Ahad Ali, Badih Jawad, Daw Al-Werfalli Mohamed Nasser, Sabah Abro
Abstract:
Industries in general generate a lot of waste. The wool textile company in Baniwalid, Libya has many complex problems that have led to enormous waste, due to a lack of lean strategies, expertise, technical support and commitment. To successfully address waste at the wool textile company, this study will attempt to develop a methodical approach that integrates lean manufacturing tools to optimize performance characteristics such as lead time and delivery. This methodology will utilize Value Stream Mapping (VSM) techniques to identify the process variables that affect production. Once these variables are identified, Design of Experiments (DOE) methodology will be used to determine the significantly influential process variables; these variables are then controlled and set at their optimal levels to achieve optimal productivity, quality, agility, efficiency and delivery, and the outputs of the simulation model are analyzed for different lean configurations. The goal of this research is to investigate how the tools of lean manufacturing can be adapted from the discrete to the continuous manufacturing environment and to evaluate their benefits at a specific industrial site.
Keywords: lean manufacturing, DOE, value stream mapping, textiles
Procedia PDF Downloads 455
3708 Nanohybrids for Energy Storage Devices
Authors: O. Guellati, A. Harat, F. Djefaflia, N. Habib, A. Nait-Merzoug, J. El Haskouri, D. Momodu, N. Manyala, D. Bégin, M. Guerioune
Abstract:
We report a facile, low-cost, template-free synthesis method used to prepare mesoporous smart multifunctional nanohybrids based on graphene/PANI nanofiber micro/nanostructures with very interesting physicochemical properties; the faradaic electrochemical behavior of these products was investigated. The nanohybrid products were characterized quantitatively and qualitatively using different techniques, such as XRD/FTIR, Raman and XPS spectroscopy, field-emission SEM and high-resolution TEM microscopy, BET textural analysis, and electrochemical measurements (CV, CD, EIS). Moreover, the electrochemical measurements performed in a 6 M KOH aqueous electrolyte showed excellent electrochemical performance, ascribed to the optimized composition of hydroxides and PANI nanofibers. An exceptionally notable specific capacitance between 800 and 2000 F g⁻¹ was obtained at a 5 mV s⁻¹ scan rate for the synthesized products, depending on the optimized growth conditions. We found much better nanohybrids by reinforcing hydroxides or conducting polymer nanofibers with carbonaceous nanomaterials, depicting their potential as suitable materials for energy storage devices.
Keywords: nanohybrid materials, conducting polymers, carbonaceous nanomaterials, supercapacitors, energy storage
Procedia PDF Downloads 71
3707 Vibrancy in The City: The Problem of Sidi-Gaber Station Zone in Alexandria, Egypt
Authors: Gihan Mosaad, Bakr Gomaa, Rana Elbadri
Abstract:
Modern parts of Alexandria lack vibrancy, causing a number of problems such as urban areas with poor security and a weak economy. Vibrancy provides a livable, attractive and secure environment; it also boosts the city's economy and social life. A vibrant city is a city full of energy and life. To achieve this, a number of resources are needed, namely a specific urban density, the availability of alternative modes of transportation and, finally, diversity of land uses. The literature review shows no comprehensive study that assesses vibrancy in the streets of modern Alexandria. This study aims to measure the vibrancy potential in the Sidi-Gaber station area through the assessment of existing resource performance. Methods include literature review, a survey of the existing case, a questionnaire, and GIS techniques. Expected results include GIS maps defining the vibrancy potentials in land use and density, and a statistical study of public transportation use in the area.
Keywords: Alexandria, density, mixed use, transportation, vibrancy
Procedia PDF Downloads 293
3706 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach
Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann
Abstract:
Generating automatic image descriptions through natural language is a challenging task. Image captioning is a task that consistently describes an image by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures: Convolutional Neural Networks (CNN) extract the characteristics of the images, and Recurrent Neural Networks (RNN) generate the descriptive sentences. However, cutting-edge approaches still suffer from generating incorrect captions and accumulating errors in the decoders. To solve this problem, we propose a model based on the encoder-decoder structure, introducing a module that generates weights according to the importance of each word in forming the sentence, using part-of-speech (PoS) information. The results demonstrate that our model surpasses state-of-the-art models.
Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech
Procedia PDF Downloads 102
3705 The Application of Conceptual Metaphor Theory to the Treatment of Depression
Abstract:
Conceptual Metaphor Theory (CMT) proposes that metaphor is fundamental to human thought. CMT draws on embodied cognition, in that emotions are conceptualized as effects on the body because of a coupling of one's bodily experiences and one's somatosensory system. Time perception is a function of embodied cognition and conceptual metaphor, in that one's experience of time is inextricably dependent on one's perception of the surrounding world. A hallmark of depressive disorders is distortion in one's perception of time, associated with neurological dysfunction and psychomotor retardation, and yet, to the author's best knowledge, previous studies have not linked CMT, embodied cognition, and depressive disorders. Therefore, the focus of this paper is the investigation of how applications of CMT and embodied cognition (especially regarding time perception) show promise in improving current techniques to treat depressive disorders. This paper aimed to extend, through a thorough review of the literature, the theoretical basis required for further research into CMT and embodied cognition's application in treating the time-distortion-related symptoms of depressive disorders. Future research could include the development of brain training technologies that capitalize on the principles of CMT, with the aim of promoting cognitive remediation and cognitive activation to mitigate symptoms of depressive disorders.
Keywords: depression, conceptual metaphor theory, embodied cognition, time
Procedia PDF Downloads 162
3704 Neural Network Approach to Classifying Truck Traffic
Authors: Ren Moses
Abstract:
The process of classifying vehicles on a highway is here viewed as a pattern recognition problem in which connectionist techniques such as artificial neural networks (ANN) can be used to assign vehicles to their correct classes and hence to establish optimum axle spacing thresholds. In the United States, vehicles are typically classified into 13 classes using a methodology commonly referred to as "Scheme F". In this research, an ANN model was developed, trained, and applied to field data of vehicles. The data comprised three vehicular features: axle spacing, number of axles per vehicle, and overall vehicle weight. The ANN reduced the classification error rate from 9.5 percent to 6.2 percent when compared to an existing classification algorithm that is not ANN-based and that uses two vehicular features for classification, namely axle spacing and number of axles. The inclusion of overall vehicle weight as a third classification variable further reduced the error rate from 6.2 percent to only 3.0 percent. The promising results from the neural networks were used to set up new thresholds that reduce the classification error rate.
Keywords: artificial neural networks, vehicle classification, traffic flow, traffic analysis, highway operations
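As a rough illustration of how the three features above (axle spacing, axle count, overall weight) can drive class assignment, here is a minimal nearest-centroid sketch. The class prototypes, feature scales, and labels are invented for illustration; they are neither the paper's trained ANN nor the actual Scheme F thresholds.

```python
# Hypothetical prototypes: (total axle spacing in m, axle count, weight in tonnes).
CENTROIDS = {
    "class2_car":   (2.8, 2, 1.5),
    "class5_truck": (5.0, 2, 8.0),
    "class9_semi":  (16.0, 5, 30.0),
}

def normalize(v, scale=(20.0, 6.0, 40.0)):
    """Scale each feature to roughly [0, 1] so no feature dominates the distance."""
    return tuple(x / s for x, s in zip(v, scale))

def classify(features):
    """Assign the vehicle to the class with the nearest (normalized) prototype."""
    f = normalize(features)
    return min(CENTROIDS,
               key=lambda c: sum((a - b) ** 2
                                 for a, b in zip(normalize(CENTROIDS[c]), f)))

print(classify((15.2, 5, 28.0)))  # -> class9_semi
```

A trained network learns its decision boundaries from data instead of fixed prototypes, but the input/output shape of the problem is the same: three features in, one of the vehicle classes out.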
Procedia PDF Downloads 309
3703 River Bank Erosion Studies: A Review on Investigation Approaches and Governing Factors
Authors: Azlinda Saadon
Abstract:
This paper provides a detailed review of river bank erosion studies with respect to their processes, methods of measurement and the factors governing river bank erosion. Bank erosion processes are commonly associated with the initiation and development of river changes, through width adjustment and planform evolution. They consist of two main types: basal erosion due to fluvial hydraulic force, and bank failure under the influence of gravity. Most studies have focused on only one factor rather than integrating both. Evidence from previous works has shown interaction between both processes of fluvial hydraulic force and bank failure. Bank failure is often treated as a probabilistic phenomenon without regard to the physical characteristics and geotechnical aspects of the bank. This review summarizes the findings of previous investigators with respect to measurement techniques and prediction rates of river bank erosion through field investigation, physical model and numerical model approaches. Factors governing river bank erosion, considering the physical characteristics of fluvial erosion, are defined.
Keywords: river bank erosion, bank erosion, dimensional analysis, geotechnical aspects
Procedia PDF Downloads 435
3702 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe there are many chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics such as accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer, with an accuracy rate of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
Procedia PDF Downloads 40
3701 KCBA, A Method for Feature Extraction of Colonoscopy Images
Authors: Vahid Bayrami Rad
Abstract:
In recent years, the use of artificial intelligence techniques, tools, and methods in processing medical images and health-related applications has been highlighted, and a lot of research has been done in this regard. For example, colonoscopy and the diagnosis of colon lesions are cases in which the diagnostic process can be improved by using image processing and artificial intelligence algorithms, which help doctors considerably. Due to the lack of accurate measurements and the variety of injuries in colonoscopy images, diagnosing the type of lesion is difficult even for expert doctors. Therefore, by using appropriate software and image processing, doctors can be helped to increase the accuracy of their observations and ultimately improve their diagnoses; automatic methods can likewise improve the process of diagnosing the type of disease. In this paper, a deep learning framework called KCBA is proposed to classify colonoscopy lesions; it is composed of several methods, namely k-means clustering, bag of features, and a deep auto-encoder. Finally, the experimental results depict the proposed method's performance in classifying colonoscopy images with respect to the accuracy criterion.
Keywords: colorectal cancer, colonoscopy, region of interest, narrow band imaging, texture analysis, bag of features
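To make the bag-of-features idea concrete, here is a minimal pure-Python sketch of its two core steps: building a visual vocabulary with k-means, then encoding an image as a histogram of nearest-word assignments. The 1-D scalar "descriptors" are a toy stand-in for real patch descriptors, and this is a simplification of the KCBA pipeline, which additionally uses a deep auto-encoder.

```python
def kmeans(points, k, iters=20):
    """Toy k-means on scalar descriptors; deterministic init with the first k points."""
    cents = list(points[:k])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each descriptor to its nearest centroid
            clusters[min(range(k), key=lambda j: (p - cents[j]) ** 2)].append(p)
        # recompute centroids as cluster means (keep old centroid if cluster empties)
        cents = [sum(c) / len(c) if c else cents[i] for i, c in enumerate(clusters)]
    return cents

def bof_histogram(descriptors, cents):
    """Encode an 'image' as counts of its descriptors' nearest vocabulary words."""
    hist = [0] * len(cents)
    for d in descriptors:
        hist[min(range(len(cents)), key=lambda j: (d - cents[j]) ** 2)] += 1
    return hist

vocab = kmeans([1.0, 1.2, 0.9, 8.0, 8.2, 7.9], k=2)
print(bof_histogram([1.1, 0.8, 8.1], vocab))  # -> [2, 1]
```

The resulting fixed-length histogram is what a downstream classifier consumes, regardless of how many raw descriptors each image produced.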
Procedia PDF Downloads 57
3700 Application of Two Stages Adaptive Neuro-Fuzzy Inference System to Improve Dissolved Gas Analysis Interpretation Techniques
Authors: Kharisma Utomo Mulyodinoto, Suwarno, A. Abu-Siada
Abstract:
Dissolved gas analysis is an effective technique to detect and predict internal faults of transformers using the gases dissolved in a transformer oil sample. A number of methods are used to interpret the dissolved gases from a transformer oil sample: the Doernenberg ratio method, the IEC (International Electrotechnical Commission) ratio method, and the Duval triangle method. While the assessment of dissolved gases within transformer oil samples has been standardized over the past two decades, analysis of the results is not always straightforward, as it depends on personnel expertise more than on mathematical formulas. To get over this limitation, this paper aims at improving the interpretation of the Doernenberg ratio method, the IEC ratio method, and the Duval triangle method using a two-stage Adaptive Neuro-Fuzzy Inference System (ANFIS). Dissolved gas analysis data from 520 faulty transformers were analyzed to establish the proposed ANFIS model. Results show that the developed ANFIS model is accurate and can standardize the dissolved gas interpretation process with an accuracy higher than 90%.
Keywords: ANFIS, dissolved gas analysis, Doernenberg ratio method, Duval triangle method, IEC ratio method, transformer
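For readers unfamiliar with the ratio-based interpretation the ANFIS model is meant to improve, the IEC ratio method can be sketched roughly as below. Only two fault codes are shown, with ratio limits paraphrased from the commonly cited IEC 60599 table; consult the standard for the authoritative limits and the full code set, and note that the paper's ANFIS stage itself is omitted here.

```python
def iec_ratios(h2, ch4, c2h6, c2h4, c2h2):
    """Three basic gas ratios used by the IEC method (gas concentrations in ppm)."""
    eps = 1e-9  # guard against division by zero
    return (c2h2 / (c2h4 + eps),   # R1 = C2H2 / C2H4
            ch4 / (h2 + eps),      # R2 = CH4 / H2
            c2h4 / (c2h6 + eps))   # R3 = C2H4 / C2H6

def diagnose(h2, ch4, c2h6, c2h4, c2h2):
    r1, r2, r3 = iec_ratios(h2, ch4, c2h6, c2h4, c2h2)
    if r1 > 1 and 0.1 <= r2 <= 0.5 and r3 > 1:
        return "D1: low-energy discharge"
    if r1 < 0.2 and r2 > 1 and r3 > 4:
        return "T3: thermal fault above 700 C"
    return "unresolved"  # remaining fault codes omitted in this sketch

print(diagnose(h2=100, ch4=200, c2h6=50, c2h4=300, c2h2=10))  # -> T3 branch
```

The hard ratio boundaries are exactly where expert judgment (and, in this paper, fuzzy membership functions) comes in: a sample sitting just outside a band is "unresolved" for the crisp rules but can still be graded by an ANFIS.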
Procedia PDF Downloads 147
3699 Questioning Eugenics and the Dignity of the Human Person in the Age of Science Technology
Authors: Ephraim Ibekwe
Abstract:
The field of biomedical science has offered modern man more options to choose from than ever before about what future children will be or look like. Today, embryo selection techniques, for instance, have availed many people the power to choose the sex of their child, to avoid the birth of a child with a disability, or even to choose deliberately to create a disabled child. With new biotechnological tools emerging daily, many people deem parents personally and socially responsible for the results of their choosing to bear children, i.e. all tests should be done, and parents are responsible only for "keeping" healthy children. Some fear parents may soon be left to their own devices if they have children who require extra time and social spending. As with other discoveries in the area of genetic engineering, such possibilities raise important ethical issues: questions about which of these choices are morally permissible and which are morally wrong. Hence, the preoccupation of this article is to understand the extent to which the questions that eugenics poses about the human person can be answered with keen clarity. With an analytical posture, this article, while not deriding the impact of biotechnology and the medical sciences, argues for human dignity in its strictest consideration.
Keywords: dignity, eugenics, human person, technology and biomedical science
Procedia PDF Downloads 140
3698 Imports of Intermediate Inputs: A Study of the Main Research Streams
Authors: Marta Fernández Olmos, Jorge Fleta, Talia Gómez
Abstract:
This article shares the results of a temporal analysis of the literature on imports of intermediate inputs based on review techniques. The aim of this paper is to identify the main lines of research, their trends and topics, and the research agenda. The internationalization field has attracted considerable attention from scholars and practitioners in recent years and has grown rapidly, resulting in a large body of knowledge scattered across different areas of specialization. However, there are no studies restricted entirely to imports, intermediate inputs and innovation performance. The performance analysis provides an updated overview of the evolution of the importing literature from 1970 to 2022 and quantitatively identifies the most productive and influential journals, articles, authors, and countries. The results show that the current topics are mainly based on modes of importing, the innovation performance of importing intermediate inputs, and collaborations. Future lines of research are identified from topics with lower co-occurrence, such as artificial intelligence, entrepreneurship, and alternative business models such as multinational enterprises (MNEs) versus non-MNEs.
Keywords: imports, intermediate inputs, innovation performance, review
Procedia PDF Downloads 74
3697 Chinese Language Teaching as a Second Language: Immersion Teaching
Authors: Lee Bih Ni, Kiu Su Na
Abstract:
This paper discusses the teaching of Chinese as a second language, focusing on immersion teaching. The researchers used a narrative literature review to describe the current state of both art and science in the focused areas of inquiry and to build a scientific knowledge base. Immersion teaching comes with a standard that teachers must reliably meet. Chinese language-immersion instruction consists of language and content lessons, including functional usage of the language, academic language, authentic language, and correct Chinese sociocultural language. The researchers collected all the important points of discussion and present them here with reference to the specific field on which this paper is based. The findings show that Chinese immersion teaching is not like a standard foreign language classroom; an immersion setting provides more opportunities to teach students colloquial language than academic language. Immersion techniques also introduce a language's cultural and social contexts in a meaningful and memorable way. It is particularly important that immersion teachers connect classwork with real-life experiences. Immersion also includes more elements of discovery and inquiry-based learning than other kinds of instructional practices, and students consistently interpret conclusions and context clues.
Keywords: a second language, Chinese language teaching, immersion teaching, instructional strategies
Procedia PDF Downloads 452
3696 CFD Study on the Effect of Primary Air on Combustion of Simulated MSW Process in the Fixed Bed
Authors: Rui Sun, Tamer M. Ismail, Xiaohan Ren, M. Abd El-Salam
Abstract:
Incineration of municipal solid waste (MSW) is one of the key scopes in the global clean energy strategy. A computational fluid dynamics (CFD) model was established in order to reveal the features of the combustion process in a fixed porous bed of MSW. Transport equations and process rate equations of the waste bed were set up to describe the incineration process, according to the local thermal conditions and waste property characteristics. Gas-phase turbulence was modeled using the k-ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The heterogeneous reaction rates were determined using Arrhenius eddy dissipation and Arrhenius-diffusion reaction rates. The effects of primary air flow rate and temperature on the burning process of simulated MSW were investigated experimentally and numerically. The simulation results agree well with the experimental data. The model provides detailed information on the burning processes in the fixed bed, which is otherwise very difficult to obtain by conventional experimental techniques.
Keywords: computational fluid dynamics (CFD) model, waste incineration, municipal solid waste (MSW), fixed bed, primary air
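The Arrhenius-type rate expressions used for the heterogeneous reactions follow the standard form k = A·exp(−Eₐ/(R·T)); a minimal sketch with illustrative kinetic parameters (not the paper's fitted values):

```python
import math

def arrhenius_rate(A, E_a, T):
    """Arrhenius rate constant k = A * exp(-E_a / (R * T))."""
    R = 8.314  # universal gas constant, J/(mol K)
    return A * math.exp(-E_a / (R * T))

# Illustrative (hypothetical) kinetic parameters for a char oxidation step
A = 1.0e7      # pre-exponential factor, 1/s
E_a = 1.2e5    # activation energy, J/mol
k_low = arrhenius_rate(A, E_a, 900.0)    # bed temperature 900 K
k_high = arrhenius_rate(A, E_a, 1100.0)  # bed temperature 1100 K
# Preheating the primary air raises the bed temperature and thus the rate
ratio = k_high / k_low
```

The exponential temperature sensitivity is why varying primary air temperature has such a pronounced effect on the burning rate in the bed.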
Procedia PDF Downloads 402
3695 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition
Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can
Abstract:
To effectively combat climate change, many countries around the world have committed to decarbonising their electricity supply, along with promoting a large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity to effectively combat climate change, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the necessary network infrastructure to supply the projected demand in a cost-efficient way, considering the evolution of the new generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty in the power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must be considered within the TNEP as well, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially when applied to realistic-sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems, such as the TNEP. In particular, the use of AI along with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method.
One of the challenges of using Column Generation for solving the TNEP problem is that the subproblems are of mixed-integer nature, and solving them therefore requires significant amounts of time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results obtained from the linearized version. A key feature of the proposal is that we integrate the binary classifier into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% for estimating the values of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than its integer programming counterpart, integrating the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required to solve the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique, and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models.
Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning
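As a toy stand-in for the proposed classifier (the abstract does not specify its model), even a single learned threshold on the LP-relaxed variable values illustrates the idea of predicting the integer-optimal decisions from the relaxation; the training data below are hypothetical:

```python
def train_threshold_classifier(relaxed_values, labels):
    """Pick the cutoff on the LP-relaxed value that best separates
    binary variables that are 0 at integer optimality from those that are 1."""
    best_t, best_acc = 0.5, -1.0
    for t in sorted(set(relaxed_values)):
        preds = [1 if v >= t else 0 for v in relaxed_values]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# Hypothetical training data: relaxed line-investment variable values and
# the integer-optimal decisions from previously solved subproblems
relaxed = [0.05, 0.10, 0.42, 0.58, 0.85, 0.97]
optimal = [0,    0,    0,    1,    1,    1]
t, acc = train_threshold_classifier(relaxed, optimal)
```

In the actual proposal a richer classifier would be trained, and its predictions are embedded in the Column Generation loop with a safeguard so that optimality is still guaranteed; the sketch only shows why the cheap relaxed solution carries enough signal to predict the binaries.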
Procedia PDF Downloads 85
3694 A Geographical Framework for Studying the Territorial Sustainability Based on Land Use Change
Authors: Miguel Ramirez, Ivan Lizarazo
Abstract:
The emergence of various interpretations of sustainability, including weak and strong paradigms, can be traced back to the definition of sustainable development provided in the 1987 Brundtland report and the subsequent evolution of the sustainability concept. However, there has been limited scholarly attention given to clarifying the concept of sustainability within the theoretical and conceptual framework of geography. The discipline has predominantly focused on understanding the diverse conceptions of sustainability within its epistemological boundaries, resulting in tensions between sustainability paradigms and their associated dimensions, including the incorporation of political perspectives, with particular emphasis on environmental geography's epistemology. In response to this gap, a conceptual framework for sustainability is proposed, effectively integrating spatial and territorial concepts. This framework aims to enhance geography's role in contributing to sustainability by utilizing land system theory, which is based on the dynamics of land use change. Such an integrated conceptual framework enables the incorporation of methodological tools such as remote sensing, encompassing various earth observations and fusion methods, and supervised classification techniques. Additionally, it seeks better integration of socioecological information, thereby capturing essential population-related features.
Keywords: geography, sustainability, land change science, territorial sustainability
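The supervised classification step mentioned above can be sketched with a minimal nearest-centroid rule on hypothetical two-band pixel spectra; real land-use-change workflows would use full multispectral imagery and richer classifiers:

```python
import math

def nearest_centroid(train, pixel):
    """Classify a pixel spectrum by its closest class centroid
    (minimum Euclidean distance): a minimal supervised classifier."""
    centroids = {}
    for label, spectra in train.items():
        n = len(spectra)
        centroids[label] = [sum(band) / n for band in zip(*spectra)]
    return min(centroids, key=lambda c: math.dist(centroids[c], pixel))

# Hypothetical training reflectances (red, near-infrared) per land-cover class
train = {
    "forest": [(0.05, 0.45), (0.06, 0.50)],
    "urban":  [(0.30, 0.28), (0.35, 0.25)],
}
label = nearest_centroid(train, (0.07, 0.48))
```

Running such a classifier on imagery from two dates and differencing the resulting maps is the basic mechanism by which land use change, the framework's core signal, is detected.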
Procedia PDF Downloads 80
3693 Developing a Methodology to Examine Psychophysiological Responses during Stress Exposure and Relaxation: An Experimental Paradigm
Authors: M. Velana, G. Rinkenauer
Abstract:
Nowadays, nurses are facing unprecedented pressure due to ongoing global health demands. Work-related stress can cause a high physical and psychological workload, which can, in turn, lead to burnout. On the physiological level, stress triggers an initial activation of the sympathetic nervous and adrenomedullary systems, resulting in increases in cardiac activity. Furthermore, activation of the hypothalamus-pituitary-adrenal axis provokes endocrine and immune changes leading to the release of cortisol and cytokines in an effort to re-establish body balance. Based on the current state of the literature, resilience and mindfulness exercises among nurses can effectively decrease stress and improve mood. However, it is still unknown which relaxation techniques would be suitable, and to what extent they would be effective, in decreasing psychophysiological arousal deriving from either a physiological or a psychological stressor. Moreover, although cardiac activity and cortisol are promising candidates for examining the effectiveness of relaxation in reducing stress, the role of cytokines in this process must still be clarified in order to thoroughly understand the body's response to stress and to relaxation. Therefore, the main aim of the present study is to develop a comprehensive experimental paradigm and assess different relaxation techniques, namely progressive muscle relaxation and a mindfulness exercise originating from cognitive therapy, by means of biofeedback under highly controlled laboratory conditions. An experimental between-subject design will be employed, where 120 participants will be randomized to either a physiological or a psychological stress-related experiment. In particular, the cold pressor test is a procedure in which the participants have to immerse their non-dominant hands into ice water (2-3 °C) for 3 min. The participants are requested to keep their hands in the water for the whole duration.
However, they can terminate the test immediately if it becomes intolerable. A pre-test anticipation phase and a post-stress period, of 3 min each, are planned. The Trier Social Stress Test will be employed to induce psychological stress. During this laboratory stressor, the participants are instructed to give a 5-min speech in front of a committee of communication specialists. Before the main task, there is a 10-min anticipation period. Subsequently, participants are requested to perform an unexpected arithmetic task. After stress exposure, the participants will perform one of the relaxation exercises (treatment condition) or watch a neutral video (control condition). Electrocardiography, salivary samples, and self-reports will be collected at different time points. The preliminary results of the pilot study showed that the aforementioned paradigm can effectively induce stress reactions and that relaxation may decrease the impact of stress exposure. It is of utmost importance to assess how the human body responds under different stressors and relaxation exercises so that an evidence-based intervention can be transferred to a clinical setting to improve nurses' general health. Based on the forthcoming laboratory findings, the research group plans to conduct a pilot-level randomized study to decrease stress and promote well-being among nurses who work in the stress-riddled environment of a hospital located in Northern Germany.
Keywords: nurses, psychophysiology, relaxation, stress
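The cardiac measures derived from the electrocardiography recordings can be illustrated with standard time-domain heart-rate-variability indices; the RR-interval series below are invented for illustration, not study data:

```python
import math

def mean_hr(rr_ms):
    """Mean heart rate in beats per minute from RR intervals in milliseconds."""
    return 60000.0 / (sum(rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms),
    a standard time-domain heart-rate-variability index."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR-interval series for a stress phase and a relaxation phase
rr_stress = [620, 600, 615, 590, 605]   # shorter, less variable intervals
rr_relax  = [840, 880, 820, 900, 860]   # longer, more variable intervals
```

Under sympathetic activation, heart rate rises and variability falls; relaxation is expected to reverse both, which is exactly the contrast these two indices capture.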
Procedia PDF Downloads 110
3692 A Framework for Blockchain Vulnerability Detection and Cybersecurity Education
Authors: Hongmei Chi
Abstract:
Blockchain has become a necessity for many societal industries and for ordinary life, including cryptocurrency technology, supply chains, health care, public safety, and education. Therefore, training future blockchain developers to recognize blockchain programming vulnerabilities, and building IT students' cybersecurity skills, is in high demand. In this work, we propose a framework including learning modules and hands-on labs to guide future IT professionals towards developing secure blockchain programming habits and mitigating source code vulnerabilities at the early stages of the software development lifecycle, following the concept of the Secure Software Development Life Cycle (SSDLC). In this research, our goal is to make blockchain programmers and IT students aware of the vulnerabilities of blockchains. In summary, we develop a framework that will (1) improve students' skills and awareness of blockchain source code vulnerabilities, detection tools, and mitigation techniques, (2) integrate concepts of blockchain vulnerabilities for IT students, and (3) improve future IT workers' ability to master the concepts of blockchain attacks.
Keywords: software vulnerability detection, hands-on lab, static analysis tools, vulnerabilities, blockchain, active learning
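A hands-on lab of the kind proposed might start from a toy pattern scanner over Solidity source; the rule set below is a hypothetical teaching example, not one of the framework's actual detection tools:

```python
import re

# Hypothetical rule set: regex pattern -> vulnerability it may indicate
RULES = {
    r"\btx\.origin\b": "authentication via tx.origin",
    r"\.call\{value:": "low-level call (reentrancy risk)",
    r"\bblock\.timestamp\b": "timestamp dependence",
}

def scan_source(source):
    """Flag source lines matching known vulnerability patterns: a toy
    static analyzer for teaching, not a replacement for real tools."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for pattern, issue in RULES.items():
            if re.search(pattern, line):
                findings.append((lineno, issue))
    return findings

contract = """function withdraw() public {
    require(tx.origin == owner);
    msg.sender.call{value: balance}("");
}"""
issues = scan_source(contract)
```

Having students extend such a rule set, and then compare its hits and misses against a mature static analysis tool, makes the limits of pattern matching concrete.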
Procedia PDF Downloads 99
3691 An Overview of Risk Types and Risk Management Strategies to Improve Financial Performance
Authors: Azar Baghtaghi
Abstract:
Financial risk management is critically important because it enables companies to maintain stability and profitability amidst market fluctuations and unexpected events. It involves the precise identification of risks that could affect investments, assets, and potential revenues. By implementing effective risk management strategies, companies can protect themselves against adverse market changes and prevent potential losses. In today's era, where markets are highly complex and influenced by factors such as macroeconomic policies, exchange rate fluctuations, and natural disasters, the need for meticulous planning to cope with these uncertainties is more pronounced than ever. Ultimately, financial risk management means being prepared for the future and able to sustain business in changing environments. A company capable of managing its risks not only achieves sustainable profitability but also gains the confidence of shareholders, investors, and business partners, enhancing its competitive position in the market. This article investigates the types of financial risk and the risk management strategies that improve financial performance. By identifying the risks described in this article and applying the corresponding evaluation techniques, an organization can improve its financial performance.
Keywords: strategy, risk, risk management, financial performance
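One widely used evaluation technique of the kind alluded to above is historical Value-at-Risk; a minimal sketch on hypothetical daily returns (the article does not prescribe this specific method):

```python
def historical_var(returns, confidence=0.95):
    """Historical Value-at-Risk: the loss threshold not exceeded with
    the given confidence, read off the sorted return history."""
    ordered = sorted(returns)               # worst return first
    index = int((1.0 - confidence) * len(ordered))
    return -ordered[index]                  # report the loss as a positive number

# Hypothetical daily portfolio returns (as fractions)
returns = [-0.031, -0.012, 0.004, 0.008, -0.020, 0.015,
           0.002, -0.005, 0.010, -0.008, 0.006, 0.001,
           -0.015, 0.009, 0.003, -0.002, 0.007, -0.010,
           0.012, 0.000]
var_95 = historical_var(returns, 0.95)  # 95% one-day VaR
```

A 95% VaR of 0.02 here reads as: on 95% of days, the daily loss should not exceed 2% of the portfolio, which gives management a concrete number to set capital buffers against.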
Procedia PDF Downloads 9
3690 Relating Symptoms with Protein Production Abnormality in Patients with Down Syndrome
Authors: Ruolan Zhou
Abstract:
Trisomy of human chromosome 21 is the primary cause of Down Syndrome (DS), and this genetic disease has significantly burdened families and countries, causing great controversy. To address this problem, this research explores the relationship between the genetic abnormality and the disease's symptoms, adopting several techniques, including data analysis and enrichment analysis. It also draws on open-source databases, such as NCBI, DAVID, SOURCE, STRING, and UCSC, to complement its results. The research analyzed the genes on human chromosome 21 with simple coding and, through this analysis, specified the protein-coding genes, their functions, and their locations. Using enrichment analysis, the paper found an abundance of keratin-production-related protein-coding genes on human chromosome 21. Drawing on past research, this study attempts to disclose the relationship between trisomy of human chromosome 21 and keratin production abnormality, which might underlie common conditions in patients with Down Syndrome. Finally, by addressing the advantages and limitations of this research, the discussion provides specific directions for future work.
Keywords: Down Syndrome, protein production, genome, enrichment analysis
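The enrichment analysis step can be sketched with a one-sided hypergeometric test, the standard statistic behind tools such as DAVID; the gene counts below are hypothetical, not the study's:

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """One-sided hypergeometric p-value: the probability of drawing k or
    more annotated genes when n genes are sampled without replacement
    from a background of N genes, of which K carry the annotation."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Hypothetical counts: 10 keratin-associated genes in a 200-gene background;
# 5 of the 20 chromosome-21 genes in the test set are keratin-related
p = hypergeom_enrichment_p(N=200, K=10, n=20, k=5)
```

A small p-value indicates that keratin-related genes are over-represented in the chromosome-21 set beyond what chance would predict, which is the sense in which the paper reports an "abundance" of such genes.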
Procedia PDF Downloads 126
3689 Bandwidth Efficient Cluster Based Collision Avoidance Multicasting Protocol in VANETs
Authors: Navneet Kaur, Amarpreet Singh
Abstract:
In Vehicular Ad hoc Networks (VANETs), data dissemination is a challenging task. A number of techniques, types, and protocols are available for disseminating data, but the need to preserve limited bandwidth while disseminating as much data as possible makes the task more challenging. There are broadcasting-, multicasting-, and geocasting-based protocols, and multicasting-based protocols are found to be best for conserving bandwidth. One such protocol, named BEAM, improves the performance of VANETs by reducing the number of in-network message transactions and thereby efficiently utilizing the bandwidth during an emergency situation. However, this protocol may result in multi-car chain collisions, as it includes no V2V communication. This paper therefore proposes a new protocol, named Enhanced Bandwidth Efficient Cluster Based Multicasting Protocol (EBECM), that overcomes the limitations of the existing BEAM protocol. Simulation results show the improved performance of EBECM in terms of routing overhead, throughput, and PDR when compared with the BEAM protocol.
Keywords: BEAM, data dissemination, emergency situation, vehicular adhoc network
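The cluster formation underlying cluster-based protocols such as EBECM can be sketched with a simple segment-based grouping; the positions, segment size, and head-selection rule below are illustrative assumptions, not the protocol's actual specification:

```python
def form_clusters(positions, cell=100.0):
    """Group vehicles into fixed road segments and pick as cluster head
    the vehicle closest to its segment's centroid: a minimal stand-in
    for the cluster formation step of a cluster-based VANET protocol."""
    clusters = {}
    for vid, x in positions.items():
        clusters.setdefault(int(x // cell), []).append(vid)
    heads = {}
    for seg, members in clusters.items():
        centroid = sum(positions[v] for v in members) / len(members)
        heads[seg] = min(members, key=lambda v: abs(positions[v] - centroid))
    return clusters, heads

# Hypothetical vehicle positions (metres along the road)
positions = {"v1": 20.0, "v2": 60.0, "v3": 95.0, "v4": 140.0, "v5": 180.0}
clusters, heads = form_clusters(positions)
```

Routing emergency messages through one head per cluster, instead of having every vehicle rebroadcast, is what lets a cluster-based multicast scheme cut in-network message transactions and conserve bandwidth.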
Procedia PDF Downloads 348