Search results for: edge application framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13069

9949 ISMARA: Completely Automated Inference of Gene Regulatory Networks from High-Throughput Data

Authors: Piotr J. Balwierz, Mikhail Pachkov, Phil Arnold, Andreas J. Gruber, Mihaela Zavolan, Erik van Nimwegen

Abstract:

Understanding the key players and interactions in the regulatory networks that control gene expression and chromatin state across different cell types and tissues in metazoans remains one of the central challenges in systems biology. Our laboratory has pioneered a number of methods for automatically inferring core gene regulatory networks directly from high-throughput data by modeling gene expression (RNA-seq) and chromatin state (ChIP-seq) measurements in terms of genome-wide computational predictions of regulatory sites for hundreds of transcription factors and micro-RNAs. These methods have now been completely automated in an integrated webserver called ISMARA that allows researchers to analyze their own data by simply uploading RNA-seq or ChIP-seq data sets and provides results in an integrated web interface as well as in downloadable flat form. For any data set, ISMARA infers the key regulators in the system, their activities across the input samples, the genes and pathways they target, and the core interactions between the regulators. We believe that by empowering experimental researchers to apply cutting-edge computational systems biology tools to their data in a completely automated manner, ISMARA can play an important role in developing our understanding of regulatory networks across metazoans.
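
Illustrative sketch: the core of the approach described above is a linear model that explains expression across samples in terms of predicted binding-site counts and unknown motif activities. The minimal Python (NumPy) sketch below assumes a hypothetical gene-by-motif site-count matrix N and gene-by-sample expression matrix E, and estimates activities by ridge regression; it is a simplification for illustration, not the ISMARA implementation.

import numpy as np

def infer_activities(E, N, ridge=1.0):
    """Estimate motif activities A (motifs x samples) from expression E
    (genes x samples) and predicted site counts N (genes x motifs),
    using the linear model E ~= N @ A with ridge regularisation.
    Illustrative simplification only."""
    E = E - E.mean(axis=1, keepdims=True)   # remove gene-specific means
    N = N - N.mean(axis=0, keepdims=True)   # remove motif-specific means
    M = N.T @ N + ridge * np.eye(N.shape[1])
    return np.linalg.solve(M, N.T @ E)      # A = (N'N + ridge*I)^-1 N'E

# toy usage with random data
rng = np.random.default_rng(0)
N = rng.poisson(1.0, size=(500, 20)).astype(float)   # 500 genes, 20 motifs
A_true = rng.normal(size=(20, 6))                    # 6 samples
E = N @ A_true + rng.normal(scale=0.5, size=(500, 6))
print(infer_activities(E, N).shape)                  # (20, 6)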

Keywords: gene expression analysis, high-throughput sequencing analysis, transcription factor activity, transcription regulation

Procedia PDF Downloads 49
9948 A Practical Approach Towards Disinfection Challenges in Sterile Manufacturing Area

Authors: Doris Lacej, Eni Bushi

Abstract:

Cleaning and disinfection procedures are essential for maintaining the cleanliness status of the pharmaceutical manufacturing environment, particularly of the cleanrooms and the sterile unit area. The Good Manufacturing Practice (GMP) Annex 1 recommendation strictly requires the implementation of standard and validated cleaning and disinfection protocols. However, environmental monitoring has shown that even a validated cleaning method with certified agents may result in the presence of atypical microorganism colonies that exceed GMP limits for a specific cleanroom area. In response to this issue, this case study aims to arrive at the root cause of the microbial contamination observed in the sterile production environment of the Profarma pharmaceutical industry in Albania by applying a problem-solving practical approach that ensures the appropriate sterility grade. The guidelines and literature emphasize the importance of several factors in the prevention of possible microbial contamination occurring in grade A and C cleanrooms. These factors are integrated into a practical framework to identify the root cause of the presence of an Aspergillus niger colony in the sterile production environment of the Profarma pharmaceutical industry in Albania. In addition, the application of a semi-automatic disinfecting system such as H2O2 FOG in sterile grade A and grade C cleanrooms has been an effective solution for eliminating the atypical colony of Aspergillus niger. Selecting the appropriate detergents and disinfectants at the right concentration, frequency, and combination; the presence of updated and standardized guidelines for cleaning and disinfection; and continuous training of operators on these practices in accordance with the updated GMP guidelines are some of the identified factors that influence the success of achieving the required sterility grade. However, to ensure environmental sustainability, it is important to be prepared to identify the source of contamination and to make the appropriate decision. The proposed case-based practical approach may help pharmaceutical companies to achieve sterile production and sustain environmental cleanliness in challenging situations. Apart from integrating valid agents and standardized cleaning and disinfection protocols according to GMP Annex 1, pharmaceutical companies must be careful to investigate the source and all the steps that can influence the results of an abnormal situation. Subsequently, apart from identifying the root cause, it is important to solve the problem with a successful alternative approach.

Keywords: cleanrooms, disinfectants, environmental monitoring, GMP Annex 1

Procedia PDF Downloads 201
9947 Moderate Electric Field and Ultrasound as Alternative Technologies to Raspberry Juice Pasteurization Process

Authors: Cibele F. Oliveira, Debora P. Jaeschke, Rodrigo R. Laurino, Amanda R. Andrade, Ligia D. F. Marczak

Abstract:

Raspberry is well-known as a good source of phenolic compounds, mainly anthocyanins. Some studies have pointed out the importance of the consumption of these bioactive compounds, which is related to a decrease in the risk of cancer and cardiovascular diseases. The most consumed raspberry products are juices, yogurts, ice creams and jellies and, to ensure the safety of these products, raspberry is commonly pasteurized for enzyme and microorganism inactivation. Despite being efficient, the pasteurization process can lead to degradation reactions of the bioactive compounds, decreasing the products' health benefits. Therefore, the aim of the present work was to evaluate the application of moderate electric field (MEF) and ultrasound (US) technologies in the pasteurization process of raspberry juice and to compare the results with the conventional pasteurization process. For this, phenolic compounds, anthocyanin content and physical-chemical parameters (pH, color changes, titratable acidity) of the juice were evaluated before and after the treatments. Moreover, microbiological analyses of aerobic mesophilic microorganisms, molds and yeasts were performed on the samples before and after the treatments to verify the potential of these technologies to inactivate microorganisms. All the pasteurization processes were performed in triplicate for 10 min, using a cylindrical Pyrex® vessel with a water jacket. The conventional pasteurization was performed at 90 °C using a hot water bath connected to the extraction cell. The US-assisted pasteurization was performed using 423 and 508 W cm-2 (75 and 90 % of ultrasound intensity). It is important to mention that during US application the temperature was kept below 35 °C; for this, the water jacket of the extraction cell was connected to a water bath with cold water. MEF-assisted pasteurization experiments were performed similarly to the US experiments, using 25 and 50 V. Control experiments were performed at the maximum temperature of the US and MEF experiments (35 °C) to evaluate only the effect of the aforementioned technologies on the pasteurization. The results showed that the phenolic compound concentration in the juice was not affected by US and MEF application. However, it was observed that the US-assisted pasteurization performed at the highest intensity decreased the anthocyanin content by 33 % (compared to the in natura juice). This result was possibly due to the cavitation phenomenon, which can lead to the formation and accumulation of free radicals in the medium; these radicals can react with anthocyanins, decreasing the content of these antioxidant compounds in the juice. Physical-chemical parameters did not present statistical differences for samples before and after the treatments. Microbiological analyses showed that all the pasteurization treatments decreased the microorganism content by two logarithmic cycles. However, as the values were lower than 1000 CFU mL-1, it was not possible to verify the efficacy of each treatment. Thus, MEF and US were considered potential alternative technologies for the pasteurization process, since under the right conditions the application of these technologies decreased the microorganism content in the juice and did not affect the phenolic and anthocyanin content, as well as the physical-chemical parameters. However, more studies are needed regarding the influence of MEF and US processes on microorganism inactivation.
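
Illustrative sketch: a "two logarithmic cycle" decrease is simply a log10 reduction computed from viable counts before and after treatment. The short Python sketch below uses hypothetical counts for illustration only; the figures are not taken from the study.

import math

def log_reduction(n_before, n_after):
    """Log10 reduction in viable counts (e.g. CFU/mL)."""
    return math.log10(n_before / n_after)

# hypothetical counts, for illustration only
print(log_reduction(1.0e5, 1.0e3))  # 2.0 -> a two-log-cycle reduction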

Keywords: MEF, microorganism inactivation, anthocyanin, phenolic compounds

Procedia PDF Downloads 225
9946 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Authors: Ronal Muresano, Andrea Pagano

Abstract:

Nowadays, mathematical/statistical applications are developed with greater complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to be executed faster. In this sense, multicore environments are playing an important role in improving and optimizing the execution time of these applications. These environments allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communications, data locality, memory sizes (cache and RAM), synchronizations, data dependencies in the model, etc. These issues become more important when we wish to improve the application's performance and scalability. Hence, this paper describes an optimization method for the Systemic Model of Banking Originated Losses (SYMBOL) tool developed by the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore. All these improvements are done in an automatic and transparent manner with the aim of improving the performance metrics of our tool. Finally, experimental evaluations show the effectiveness of our new optimized version, in which we have achieved a considerable improvement in execution time. The time has been reduced by around 96% for the best case tested, between the original serial version and the automatic parallel version.
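
Illustrative sketch: the tool itself is parallelised with OpenMP (see keywords below); purely to illustrate the general idea of spreading independent Monte Carlo trials across cores, the hedged Python sketch below uses the multiprocessing module with hypothetical bank-loss draws, not the SYMBOL model.

import numpy as np
from multiprocessing import Pool

def simulate_chunk(args):
    """Run one chunk of independent loss simulations (illustrative only)."""
    n_trials, seed = args
    rng = np.random.default_rng(seed)
    # hypothetical portfolio of 50 banks with lognormal losses
    losses = rng.lognormal(mean=0.0, sigma=1.0, size=(n_trials, 50))
    return losses.sum(axis=1)

if __name__ == "__main__":
    n_workers, trials_per_worker = 4, 250_000
    with Pool(n_workers) as pool:
        chunks = pool.map(simulate_chunk,
                          [(trials_per_worker, s) for s in range(n_workers)])
    total_losses = np.concatenate(chunks)
    print("99.9% loss quantile:", np.quantile(total_losses, 0.999))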

Keywords: algorithm optimization, bank failures, OpenMP, parallel techniques, statistical tool

Procedia PDF Downloads 353
9945 Intelligence Failures and Infiltration: The Case of the Ethiopian Army 1977-1991

Authors: Fantahun Ibrahim

Abstract:

The Ethiopian army was one of the largest and most heavily armed ground forces in Africa between 1974 and 1991. It scored a decisive victory over Somalia’s armed forces in March 1978. It, however, failed to withstand the combined onslaught of the northern insurgents from Tigray and Eritrea and finally collapsed in 1991. At the heart of the problem was the army’s huge intelligence failure. The northern insurgents, on the other hand, had a cutting edge in intelligence gathering. Among other things they infiltrated the army high command and managed to get top secrets about the army. Commanders who had fallen into the hands of the insurgents in several battles were told to send letters to their colleagues in the command structure and persuade them to work secretly for the insurgents. Some commanders did work for the insurgents and played a great role in the undoing of military operations. Insurgent commanders were able to warn their fighters about air strikes before jet fighters took off from airfields in the northern theatre. It was not uncommon for leaders of insurgents to get the full details of military operations days before their implementation. Such intelligence failures led to major military disasters like the fall of Afabet (March, 1988), Enda Sellase (February, 1989), Massawa and Debre Tabor (February, 1990), Karra Mishig, Meragna and Alem Ketema (June, 1990). This paper, therefore, seeks to investigate the army’s intelligence failures using untapped archival documents kept at the Ministry of National Defence in Addis Ababa and interviewing key former commanders of the army and ex-leaders of the insurgents.

Keywords: Ethiopian army, intelligence, infiltration, insurgents

Procedia PDF Downloads 289
9944 Influence of Agroforestry Trees Leafy Biomass and Nitrogen Fertilizer on Crop Growth Rate and Relative Growth Rate of Maize

Authors: A. B. Alarape, O. D. Aba

Abstract:

The use of legume tree prunings as mulch in agroforestry systems is a common practice to maintain soil organic matter and improve soil fertility in the tropics. The study was conducted to determine the influence of agroforestry tree leafy biomass and nitrogen fertilizer on the crop growth rate and relative growth rate of maize. The experiments were laid out as a 3 x 4 x 2 factorial in a split-split plot design with three replicates. The control and biomass species (Parkia biglobosa and Albizia lebbeck) were considered as main plots, nitrogen rates (0, 40, 80, 120 kg N ha⁻¹) as sub-plots, and maize varieties (DMR-ESR-7 and 2009 EVAT) as sub-sub plots. Data were analyzed using descriptive and inferential statistics (ANOVA) at α = 0.05. Incorporation of leafy biomass had a significant effect in 2015 on the Relative Growth Rate (RGR), while nitrogen application had a significant effect on the Crop Growth Rate (CGR). 2009 EVAT had a higher CGR in 2015 at 4-6 and 6-8 WAP. Incorporation of Albizia leaves enhanced the growth of maize more than Parkia leaves. Farmers are, therefore, encouraged to use Albizia leaves as mulch to enrich their soil for maize production, particularly in view of the availability of inorganic fertilizers. Nevertheless, producing maize with biomass incorporation and the application of 120 kg N ha⁻¹ will bring about better maize growth.

Keywords: agroforestry trees, fertilizer, growth, incorporation, leafy biomass

Procedia PDF Downloads 168
9943 Studying the Effectiveness of Using Narrative Animation on Students’ Understanding of Complex Scientific Concepts

Authors: Atoum Abdullah

Abstract:

The purpose of this research is to determine the extent to which computer animation and narration affect students' understanding of complex scientific concepts and improve their exam performance, compared to traditional lectures that include PowerPoint slides with text and static images. A mixed-method design was used for data collection, including quantitative and qualitative data. Quantitative data were collected using a pre- and post-test method and a close-ended questionnaire. Qualitative data were collected through an open-ended questionnaire. A pre- and post-test strategy was used to measure the level of students' understanding with and without the use of animation. The test included multiple-choice questions to test factual knowledge, open-ended questions to test conceptual knowledge, and label-the-diagram questions to test application knowledge. The results showed that students, on average, performed significantly higher on the post-test compared to the pre-test in all areas of acquired knowledge. However, the increase in the post-test score with respect to the acquisition of conceptual and application knowledge was higher than the increase with respect to the acquisition of factual knowledge. This result demonstrates that animation is more beneficial when acquiring deeper, conceptual, and cognitive knowledge than when only factual knowledge is acquired.
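
Illustrative sketch: the abstract does not name the statistical test behind "performed significantly higher"; one plausible check for a pre/post design of this kind is a paired t-test, shown below in Python with hypothetical scores for ten students (not the study's data).

from scipy import stats

# hypothetical pre- and post-test scores (illustration only)
pre  = [45, 52, 38, 60, 55, 47, 50, 42, 58, 49]
post = [60, 66, 51, 72, 70, 59, 64, 55, 73, 62]

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")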

Keywords: animation, narration, science, teaching

Procedia PDF Downloads 158
9942 Biomass and Biogas Yield of Maize as Affected by Nitrogen Rates with Varying Harvesting under Semi-Arid Condition of Pakistan

Authors: Athar Mahmood, Asad Ali

Abstract:

Management considerations, including harvesting time and nitrogen application, considerably influence biomass yield, quality and biogas production. Therefore, a field study was conducted to determine the effect of various harvesting times and nitrogen rates on the biomass yield, quality and biogas yield of the maize crop. The experiment consisted of various harvesting times, i.e., harvesting at 45, 55 and 65 days after sowing (DAS), and nitrogen rates of 0, 100, 150 and 200 kg ha⁻¹, respectively. The data indicated that the maximum plant height, leaf area, dry matter (DM) yield, protein, acid detergent fiber, neutral detergent fiber, crude fiber contents and biogas yield were recorded at 65 days after sowing, while the lowest values were recorded at 45 days after sowing. In contrast, significantly higher chlorophyll contents were observed at 45 DAS. In the case of nitrogen rates, the maximum plant height, leaf area, DM yield, protein contents, ash contents, acid detergent fiber, neutral detergent fiber, crude fiber contents and chlorophyll contents were determined with nitrogen at the rate of 200 kg ha⁻¹, while the minimum was observed when no N was applied. Therefore, harvesting at 65 DAS and N application at the rate of 200 kg ha⁻¹ can be suitable for obtaining higher biomass and biogas production.

Keywords: chemical composition, fiber contents, biogas, nitrogen, harvesting time

Procedia PDF Downloads 140
9941 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduce significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive integrated moving average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long short-term memory (LSTMs) and Gated Recurrent Units (GRUs), have been widely adopted for modeling sequential data. However, they often suffer from the locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle with capturing relationships between distant time points due to the locality of one-dimensional convolution kernels. Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that parallelly integrates 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021) under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. 
Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
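
Illustrative sketch: a minimal Python version of the two parallel 2D views described above, assuming a univariate series x: a spectrogram for the frequency-domain view and a first/second-derivative heatmap, folded by an assumed period, for the time-domain view. The exact construction used by Times2D may differ; this only illustrates the idea.

import numpy as np
from scipy.signal import spectrogram

def to_2d_views(x, fs=1.0, period=24):
    """Return (spectrogram, derivative heatmap) for a 1D series x.
    Illustrative sketch only; not the authors' implementation."""
    # frequency-domain view: short-time spectrogram captures periodicity
    _, _, spec = spectrogram(x, fs=fs, nperseg=min(64, len(x)))
    # time-domain view: first and second derivatives highlight sharp
    # fluctuations and turning points; fold them into (cycles x period)
    d1, d2 = np.gradient(x), np.gradient(np.gradient(x))
    n = (len(x) // period) * period
    heat = np.stack([d1[:n].reshape(-1, period),
                     d2[:n].reshape(-1, period)], axis=0)
    return spec, heat

t = np.arange(24 * 30, dtype=float)
x = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.default_rng(0).normal(size=t.size)
spec, heat = to_2d_views(x)
print(spec.shape, heat.shape)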

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 25
9940 Foliar Feeding of Methyl Jasmonate Induces Resistance in Normal and Salinity Stressed Tomato Plants, at Different Stages

Authors: Abdul Manan, Choudhary Muhammad Ayyub, Rashid Ahmad, Muhammad Adnan Bukhari

Abstract:

A project was designed to investigate the effect of foliar application of methyl jasmonate (MeJA) on the physiological, biochemical and ionic attributes of salinity-stressed and normal tomato plants at different stages. Salinity stress at every stage markedly reduced the net photosynthetic rate, stomatal conductance, transpiration rate, water relations parameters, protein contents, total free amino acids and potassium (K+) contents, while antioxidant enzymes (peroxidase (POX) and catalase (CAT)), sodium (Na+) contents and proline contents increased substantially. Foliar application of MeJA ameliorated the drastic effects of the salinity regime through the recovery of physiological and biochemical attributes by enhanced production of antioxidant enzymes and osmoprotectants. Application of MeJA at the very initial stage (15 days after sowing (15 DAS)) proved more effective in attenuating the deleterious effects of salinity stress than at the other stages (15 days after transplanting (15 DAT) and 30 days after transplanting (30 DAT)). To the best of our knowledge, different times of foliar feeding of MeJA were examined for the first time for the amelioration of salinity stress in tomato plants, which would be of pivotal significance for scientists to better understand the dynamics of physiological and biochemical processes in tomato.

Keywords: methyl jasmonate, osmoregulation, salinity stress, stress tolerance, tomato

Procedia PDF Downloads 294
9939 Floating Building Potential for Adaptation to Rising Sea Levels: Development of a Performance Based Building Design Framework

Authors: Livia Calcagni

Abstract:

Most of the largest cities in the world are located in areas that are vulnerable to coastal erosion and flooding, both linked to climate change and rising sea levels (RSL). Nevertheless, more and more people are moving to these vulnerable areas as cities keep growing. Architects, engineers and policy makers are called to rethink the way we live and to provide timely and adequate responses, not only by investigating measures to improve the urban fabric, but also by developing strategies capable of planning change and exploring unusual and resilient frontiers of living, such as floating architecture. Since the beginning of the 21st century, we have seen a dynamic growth of water-based architecture. At the same time, the shortage of land available for urban development has also led to reclaiming the seabed or building floating structures. In light of these considerations, the time is ripe to consider floating architecture not only as a full-fledged building typology but especially as a full-fledged adaptation solution for RSL. Currently, there is no global international legal framework for urban development on water and there is no structured performance based building design (PBBD) approach for floating architecture in most countries, let alone national regulatory systems. Thus, the research intends to identify the technological, morphological, functional, economic and managerial requirements that must be considered in the development of the PBBD framework, conceived as a meta-design tool. As floating urban development is most likely to take place as an extension of coastal areas, the needs and design criteria are definitely more similar to those of the urban environment than to those of the offshore industry. Therefore, the identification and categorization of parameters takes the urban-architectural guidelines and regulations as the starting point, taking the missing aspects, such as hydrodynamics, from the offshore and shipping regulatory frameworks. This study is carried out through an evidence-based assessment of performance guidelines and regulatory systems that are effective in different countries around the world, addressing on-land and on-water architecture as well as the offshore and shipping industries. It involves evidence-based research and logical argumentation methods. Overall, this paper highlights how inhabiting water is not only a viable response to the problem of RSL, and thus a resilient frontier for urban development, but also a response to energy insecurity, clean water and food shortages, environmental concerns and urbanization, in line with Blue Economy principles and the 2030 Agenda. Moreover, the discipline of architecture is presented as a fertile field for investigating solutions to cope with climate change and its effects on life safety and quality. Future research involves the development of a decision support system as an information tool to guide the user through the decision-making process, emphasizing the logical interaction between the different potential choices, based on the PBBD.

Keywords: adaptation measures, floating architecture, performance based building design, resilient architecture, rising sea levels

Procedia PDF Downloads 74
9938 Dissimilarity Measure for General Histogram Data and Its Application to Hierarchical Clustering

Authors: K. Umbleja, M. Ichino

Abstract:

Symbolic data mining has been developed to analyze data in very large datasets. It is also useful in cases when entry-specific details should remain hidden. Symbolic data mining is quickly gaining popularity as datasets in need of analysis are becoming ever larger. One type of such symbolic data is the histogram, which enables saving huge amounts of information in a single variable with a high level of granularity. Other types of symbolic data can also be described by histograms, making the histogram a very important and general symbolic data type: a method developed for histograms can also be applied to other types of symbolic data. Due to its complex structure, analyzing histograms is complicated. This paper proposes a method that allows comparing two histogram-valued variables and therefore finding a dissimilarity between two histograms. The proposed method uses the Ichino-Yaguchi dissimilarity measure for mixed feature-type data analysis as a base and develops a dissimilarity measure specifically for histogram data, which allows comparing histograms with different numbers of bins and bin widths (so-called general histograms). The proposed dissimilarity measure is then used as a measure for clustering. Furthermore, a linkage method based on weighted averages is proposed, with the concept of cluster compactness used to measure the quality of the clustering. The method is then validated through application on real datasets. As a result, the proposed dissimilarity measure is found to produce adequate and comparable results with general histograms, without loss of detail or the need to transform the data.
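
Illustrative sketch: the Ichino-Yaguchi-based measure developed in the paper is not reproduced here; as a simplified illustration of comparing two general histograms with different bin edges and widths, the Python sketch below rebins both onto a common grid and computes a cumulative (Wasserstein-like) distance.

import numpy as np

def rebin(edges, counts, grid):
    """Redistribute histogram counts onto a fine common grid,
    assuming counts are uniform within each original bin."""
    density = np.zeros(len(grid) - 1)
    for (lo, hi), c in zip(zip(edges[:-1], edges[1:]), counts):
        overlap = np.clip(np.minimum(grid[1:], hi) - np.maximum(grid[:-1], lo), 0, None)
        density += c * overlap / (hi - lo)
    return density / density.sum()

def histogram_dissimilarity(edges_a, counts_a, edges_b, counts_b, n_grid=200):
    """Simplified dissimilarity between two general histograms
    (different bin numbers/widths). Not the paper's measure."""
    lo = min(edges_a[0], edges_b[0])
    hi = max(edges_a[-1], edges_b[-1])
    grid = np.linspace(lo, hi, n_grid + 1)
    pa = rebin(np.asarray(edges_a, float), np.asarray(counts_a, float), grid)
    pb = rebin(np.asarray(edges_b, float), np.asarray(counts_b, float), grid)
    # area between the two cumulative distributions
    return float(np.sum(np.abs(np.cumsum(pa) - np.cumsum(pb))) * (hi - lo) / n_grid)

print(histogram_dissimilarity([0, 1, 2, 4], [5, 3, 2], [0, 2, 4], [6, 4]))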

Keywords: dissimilarity measure, hierarchical clustering, histograms, symbolic data analysis

Procedia PDF Downloads 150
9937 Preparation of Metallic Nanoparticles with the Use of Reagents of Natural Origin

Authors: Anna Drabczyk, Sonia Kudlacik-Kramarczyk, Dagmara Malina, Bozena Tyliszczak, Agnieszka Sobczak-Kupiec

Abstract:

Nowadays, nano-sized materials are a very popular group of materials among scientists, and they find application in a wide range of areas. Therefore, a constantly increasing demand for nanomaterials, including metallic nanoparticles such as silver or gold ones, is observed, and new routes for their preparation are sought. Considering the potential applications of nanoparticles, it is important to select an adequate methodology for their preparation because it determines their size and shape. Among the most commonly applied methods of preparation of nanoparticles, chemical and electrochemical techniques are leading. However, currently, growing attention is directed to the biological or biochemical aspects of the syntheses of metallic nanoparticles. This is associated with a trend of developing new routes of preparation of given compounds according to the principles of green chemistry. These principles involve, e.g., the reduction of the use of toxic compounds in the synthesis as well as the reduction of the energy demand or minimization of the generated waste. As a result, a growing popularity of the use of such components as natural plant extracts, infusions or essential oils is observed. Such natural substances may be used both as a reducing agent for metal ions and as a stabilizing agent for the formed nanoparticles; therefore, they can replace the synthetic compounds previously used for the reduction of metal ions or for the stabilization of the obtained nanoparticle suspensions. Methods that proceed in the presence of the previously mentioned natural compounds are environmentally friendly and proceed without the application of any toxic reagents. Methodology: The presented research involves the preparation of silver nanoparticles using selected plant extracts, e.g., artichoke extract. Extracts of natural origin were used as reducing and stabilizing agents at the same time. Furthermore, the syntheses were carried out in the presence of an additional polymeric stabilizing agent. Next, such features of the obtained nanoparticle suspensions as the total antioxidant activity and the content of phenolic compounds were characterized. The first of the mentioned studies involved the reaction with the DPPH (2,2-diphenyl-1-picrylhydrazyl) radical. The content of phenolic compounds was determined using the Folin-Ciocalteu technique. Furthermore, an essential issue was also determining the stability of the formed nanoparticle suspensions. Conclusions: In the research, it was demonstrated that metallic nanoparticles may be obtained using plant extracts or infusions as the stabilizing or reducing agent. The methodology applied, i.e., the type of plant extract used during the synthesis, had an impact on the content of phenolic compounds as well as on the size and polydispersity of the obtained nanoparticles. What is more, it is possible to prepare nano-sized particles characterized by properties desirable from the viewpoint of their potential application, and such an effect may be achieved with the use of non-toxic reagents of natural origin. Furthermore, the proposed methodology stays in line with the principles of green chemistry.

Keywords: green chemistry principles, metallic nanoparticles, plant extracts, stabilization of nanoparticles

Procedia PDF Downloads 113
9936 Efficient Ni(II)-Containing Layered Triple Hydroxide-Based Catalysts: Synthesis, Characterisation and Their Role in the Heck Reaction

Authors: Gabor Varga, Krisztina Karadi, Zoltan Konya, Akos Kukovecz, Pal Sipos, Istvan Palinko

Abstract:

Nickel can efficiently replace palladium in the Heck, Suzuki and Negishi reactions. This study focuses on the synthesis and catalytic application of Ni(II)-containing layered double hydroxides (LDHs) and layered triple hydroxides (LTHs). Our goals were to incorporate Ni(II) ions among the layers of LDHs or LTHs, or binding it to their surface or building it into their layers in such a way that their catalytic activities are maintained or even increased. The LDHs and LTHs were prepared by the co-precipitation method using ethylene glycol as co-solvent. In several cases, post-synthetic modifications (e.g., thermal treatment) were performed. After optimizing the synthesis conditions, the composites displayed good crystallinity and were free of byproducts. The success of the syntheses and the post-synthetic modifications was confirmed by relevant characterization methods (XRD, SEM, SEM-EDX and combined IR techniques). Catalytic activities of the produced and well-characterized solids were investigated through the Heck reaction. The composites behaved as efficient, recyclable catalysts in the Heck reaction between 4-bromoanisole and styrene. Through varying the reaction parameters, we were able to obtain acceptable conversions under mild conditions. Our study highlights the possibility of the application of Ni(II)-containing composites as efficient catalysts in coupling reactions.

Keywords: layered double hydroxide, layered triple hydroxide, heterogeneous catalysis, heck reaction

Procedia PDF Downloads 157
9935 Additive Carbon Dots Nanocrystals for Enhancement of the Efficiency of Dye-Sensitized Solar Cell in Energy Applications Technology

Authors: Getachew Kuma Watiro

Abstract:

The need for solar energy is constantly increasing, and it is widely available on the earth's surface. Photovoltaic technology is one of the most capable of all viable energy technologies and is seen as a promising approach for the current era, as it is readily available and has zero carbon emissions. Inexpensive and versatile dye-sensitized solar cells have achieved good conversion efficiency and long lifetimes, improving the conversion of sunlight to electricity. DSSCs have received a lot of attention for various potential commercial uses, such as mobile devices and portable electronic devices, as well as integrated solar cell modules. Systematic reviews were used to show the critical impact of additive C-dots in dye-sensitized solar cells for energy application technology. This research focuses on the following methods for synthesizing the nanoparticles: facile, polyol, calcination, and hydrothermal techniques. In addition, C-dots were added by the hydrothermal method. This study deals with the progressive development of DSSCs in photovoltaic technology. Single- and heterojunction-structure device technologies were used (ZnO, NiO, SnO2, and NiO/ZnO/N719), some C-dot additives were applied (ZnO/C-dots/N719, NiO/C-dots/N719, SnO2/C-dots/N719 and NiO/ZnO/C-dots/N719), and the effects of the C-dots were reviewed. Above all, DSSC technology with C-dots enhances efficiency. Finally, recommendations have been made for future research on the application of DSSCs with the use of these additives.

Keywords: dye-sensitized solar cells, heterojunction’s structure, carbon dot, conversion efficiency

Procedia PDF Downloads 106
9934 Comparison of Computer Software for Swept Path Analysis on Example of Special Paved Areas

Authors: Ivana Cestar, Ivica Stančerić, Saša Ahac, Vesna Dragčević, Tamara Džambas

Abstract:

On special paved areas, such as road intersections, vehicles usually move through horizontal curves with smaller radii and occupy a considerably greater area compared to open road segments. The planning procedure for these areas is mainly an iterative process that consists of designing project elements, assembling those elements into a design project, and analyzing swept paths for the design vehicle. If the applied elements do not fulfill the swept path requirements for the design vehicle, the process must be carried out again. The application of specialized computer software for swept path analysis significantly facilitates the planning procedure for special paved areas. There are various software packages of this kind available on the global market, and each of them has different specifications. In this paper, a comparison of two software packages commonly used in Croatia (Auto TURN and Vehicle Tracking) is presented, their advantages and disadvantages are described, and their applicability to a particular paved area is discussed. In order to reveal which of the analysed software packages is more favorable in terms of swept path widths, which one includes input parameters that are more relevant for this kind of analysis, and which one is more suitable for application on a certain special paved area, the analysis shown in this paper was conducted on a number of different intersection types.

Keywords: software comparison, special paved areas, swept path analysis, swept path input parameters

Procedia PDF Downloads 309
9933 Buildings Founded on Thermal Insulation Layer Subjected to Earthquake Load

Authors: David Koren, Vojko Kilar

Abstract:

The modern energy-efficient houses are often founded on a thermal insulation (TI) layer placed under the building’s RC foundation slab. The purpose of the paper is to identify the potential problems of the buildings founded on TI layer from the seismic point of view. The two main goals of the study were to assess the seismic behavior of such buildings, and to search for the critical structural parameters affecting the response of the superstructure as well as of the extruded polystyrene (XPS) layer. As a test building a multi-storeyed RC frame structure with and without the XPS layer under the foundation slab has been investigated utilizing nonlinear dynamic (time-history) and static (pushover) analyses. The structural response has been investigated with reference to the following performance parameters: i) Building’s lateral roof displacements, ii) Edge compressive and shear strains of the XPS, iii) Horizontal accelerations of the superstructure, iv) Plastic hinge patterns of the superstructure, v) Part of the foundation in compression, and vi) Deformations of the underlying soil and vertical displacements of the foundation slab (i.e. identifying the potential uplift). The results have shown that in the case of higher and stiff structures lying on firm soil the use of XPS under the foundation slab might induce amplified structural peak responses compared to the building models without XPS under the foundation slab. The analysis has revealed that the superstructure as well as the XPS response is substantially affected by the stiffness of the foundation slab.

Keywords: extruded polystyrene (XPS), foundation on thermal insulation, energy-efficient buildings, nonlinear seismic analysis, seismic response, soil–structure interaction

Procedia PDF Downloads 283
9932 A Review of Critical Framework Assessment Matrices for Data Analysis on Overheating in Buildings Impact

Authors: Martin Adlington, Boris Ceranic, Sally Shazhad

Abstract:

In an effort to reduce carbon emissions, changes in UK regulations, such as Part L (Conservation of Fuel and Power), dictate improved thermal insulation and enhanced air tightness. These changes were a direct response to the UK Government being fully committed to achieving its carbon targets under the Climate Change Act 2008. The goal is to reduce emissions by at least 80% by 2050. Factors such as climate change are likely to exacerbate the problem of overheating, as this phenomenon is expected to increase the frequency of extreme heat events exemplified by stagnant air masses and successive high minimum overnight temperatures. However, climate change is not the only concern relevant to overheating; as research signifies, location, design, occupation, construction type and layout can also play a part. Because of this growing problem, research points to possible health effects on the occupants of buildings. Increases in temperature can have a direct impact on the human body's ability to maintain thermoregulation, and therefore the effects of heat-related illnesses such as heat stroke, heat exhaustion, heat syncope and even death can be imminent. This review paper presents a comprehensive evaluation of the current literature on the causes and health effects of overheating in buildings and examines the differing assessment approaches applied to measure the concept. Firstly, an overview of the topic is presented, followed by an examination of overheating research from the last decade. These papers form the body of the article and are grouped into a framework matrix summarizing the source material and identifying the differing methods of analysis of overheating. Cross-case evaluation has identified systematic relationships between different variables within the matrix. Key areas of focus include building types and country, occupant behavior, health effects, simulation tools, and computational methods.

Keywords: overheating, climate change, thermal comfort, health

Procedia PDF Downloads 339
9931 Algorithm for Quantification of Pulmonary Fibrosis in Chest X-Ray Exams

Authors: Marcela de Oliveira, Guilherme Giacomini, Allan Felipe Fattori Alves, Ana Luiza Menegatti Pavan, Maria Eugenia Dela Rosa, Fernando Antonio Bacchim Neto, Diana Rodrigues de Pina

Abstract:

It is estimated that each year one death every 10 seconds (about 2 million deaths) in the world is attributed to tuberculosis (TB). Even after effective treatment, TB leaves sequelae such as, for example, pulmonary fibrosis, compromising the quality of life of patients. Evaluations of the aforementioned sequelae are usually performed subjectively by radiology specialists. Subjective evaluation may show inter- and intra-observer variations. The chest X-ray examination is the diagnostic imaging method most commonly performed in the monitoring of patients diagnosed with TB and the one of least cost to the institution. The application of computational algorithms is of utmost importance for a more objective quantification of pulmonary impairment in individuals with tuberculosis. The purpose of this research is the use of computer algorithms to quantify the pulmonary impairment pre- and post-treatment of patients with pulmonary TB. The X-ray images of 10 patients with a TB diagnosis confirmed by examination of sputum smears were studied. Initially, the segmentation of the total lung area was performed (posteroanterior and lateral views), and the analysis was then targeted to the region compromised by the pulmonary sequela. Through morphological operators and the application of a signal-to-noise tool, it was possible to determine the compromised lung volume. The largest difference found pre- and post-treatment was 85.85% and the smallest was 54.08%.
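
Illustrative sketch: the abstract describes segmenting the total lung area and then delimiting the compromised region with morphological operators. A much-simplified Python (scikit-image) sketch of that style of pipeline is given below with hypothetical criteria and synthetic data, only to show how a compromised-area fraction might be quantified; it is not the authors' algorithm.

import numpy as np
from skimage import filters, morphology

def compromised_fraction(image, lung_mask):
    """Estimate the fraction of the lung area flagged as compromised.
    Hypothetical criteria, for illustration only."""
    lung_pixels = image[lung_mask]
    # hypothetical criterion: fibrotic tissue appears denser (brighter) on X-ray
    thr = filters.threshold_otsu(lung_pixels)
    dense = (image > thr) & lung_mask
    # clean up small speckles with a morphological opening
    dense = morphology.binary_opening(dense, morphology.disk(2))
    return dense.sum() / lung_mask.sum()

# toy example with a synthetic image and a circular "lung" mask
rng = np.random.default_rng(1)
img = rng.normal(0.3, 0.05, size=(256, 256))
yy, xx = np.mgrid[:256, :256]
mask = (yy - 128) ** 2 + (xx - 128) ** 2 < 100 ** 2
img[100:140, 100:140] += 0.4          # hypothetical dense (fibrotic) patch
print(f"compromised fraction: {compromised_fraction(img, mask):.2%}")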

Keywords: algorithm, radiology, tuberculosis, x-rays exam

Procedia PDF Downloads 403
9930 Visualization of PM₂.₅ Time Series and Correlation Analysis of Cities in Bangladesh

Authors: Asif Zaman, Moinul Islam Zaber, Amin Ahsan Ali

Abstract:

In recent years of industrialization, South Asian countries have been affected by air pollution due to a severe increase in fine particulate matter 2.5 (PM₂.₅). Among them, Bangladesh is one of the most polluted countries. In this paper, statistical analyses were conducted on time series of PM₂.₅ from various districts in Bangladesh, mostly around Dhaka city. Research has been conducted on the dynamic interactions and relationships between PM₂.₅ concentrations in different zones. The study is conducted toward understanding the characteristics of PM₂.₅, such as spatial-temporal characterization and the correlation with other contributors behind air pollution, such as human activities, driving factors and environmental casualties. Clustering the data gave insight into groups of districts based on their AQI frequency, yielding representative districts. Seasonality analysis at hourly and monthly frequencies found higher concentrations of fine particles at nighttime and in the winter season, respectively. Cross-correlation analysis discovered correlations among cities based on time-lagged series of air particle readings, and a visualization framework was developed for observing the interaction in PM₂.₅ concentrations between cities. Significant time-lagged correlations were discovered between the PM₂.₅ time series in different city groups throughout the country by cross-correlation analysis. Additionally, seasonal heatmaps depict that the pooled series correlations are less significant in warmer months and among cities of greater geographic distance, and that the time lag magnitude and direction of the best-shifted correlated particulate matter time series among districts change seasonally. The geographic map visualization demonstrates the spatial behaviour of air pollution among districts around Dhaka city and the significant effect of wind direction as a vital actor behind the correlated shifted time series. The visualization framework has multipurpose usage, from gathering insight into the general and seasonal air quality of Bangladesh to determining the pathways of regional transport of air pollution.
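
Illustrative sketch: the time-lagged cross-correlation idea used above can be shown with a few lines of Python; the sketch below uses synthetic hourly series standing in for real PM₂.₅ readings and finds the lag with the strongest Pearson correlation between two districts.

import numpy as np

def best_lag_correlation(a, b, max_lag=48):
    """Return (lag, correlation) maximising corr(a[t], b[t + lag]).
    A positive lag means series b follows series a by `lag` hours."""
    best = (0, -np.inf)
    n = min(len(a), len(b))
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[:n - lag], b[lag:n]
        else:
            x, y = a[-lag:n], b[:n + lag]
        r = np.corrcoef(x, y)[0, 1]
        if r > best[1]:
            best = (lag, r)
    return best

# synthetic hourly series: city B follows city A with a 6-hour delay
rng = np.random.default_rng(2)
a = np.sin(np.arange(24 * 60) * 2 * np.pi / 24) + rng.normal(0, 0.3, 24 * 60)
b = np.roll(a, 6) + rng.normal(0, 0.3, a.size)
print(best_lag_correlation(a, b))   # expected lag near +6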

Keywords: air quality, particles, cross correlation, seasonality

Procedia PDF Downloads 98
9929 Context-Aware Alert Method in Hajj Pilgrim Location-Based Tracking System

Authors: Syarif Hidayat

Abstract:

As millions of people with different backgrounds perform hajj every year in Saudi Arabia, several problems arise. Missing persons is among the many crucial problems that need to be addressed. Some people might have insufficient knowledge of using tracking system equipment. Others might become victims of an accident, lose consciousness, or even die, preventing them from performing certain activities. For those reasons, such people cannot send a proper SOS message. The major contribution of this paper is the application of diverse alert methods in a pilgrim tracking system. It offers a simple yet robust solution for pilgrims to send an SOS message during Hajj. Knowledge of context-aware computing is assumed herein. This study presents four methods that could be utilized by pilgrims to send an SOS. The first method is a simple mobile application containing only a button. The second method is based on behavior analysis of anomalies in GPS location movement. The third method introduces a pressing pattern on a smartwatch's physical button as a panic button. The fourth method identifies certain accelerometer patterns as a sign of emergency situations. The methods presented in this paper would be an important part of a pilgrim tracking system. The discussion provided here includes an easy-to-use design while maintaining tracking accuracy, privacy, and security for its users.
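
Illustrative sketch: as a hedged illustration of the fourth method (treating a characteristic accelerometer pattern as an emergency sign), the Python sketch below flags a hypothetical free-fall-followed-by-impact pattern in an acceleration-magnitude stream; the real system's thresholds and pattern are not specified in the abstract.

import numpy as np

def detect_fall(acc_magnitude, free_fall_g=0.4, impact_g=2.5, window=25):
    """Return True if a low-g episode is followed by a high-g spike
    within `window` samples. Hypothetical thresholds, illustration only."""
    acc = np.asarray(acc_magnitude)
    low = np.where(acc < free_fall_g)[0]
    for i in low:
        if np.any(acc[i:i + window] > impact_g):
            return True
    return False

# synthetic accelerometer magnitude (in g): normal activity, then a fall
signal = np.concatenate([
    1.0 + 0.1 * np.sin(np.linspace(0, 20, 200)),  # normal activity
    np.full(15, 0.2),                              # brief free fall
    [3.1, 2.8],                                    # impact spike
    np.full(50, 1.0),                              # lying still
])
if detect_fall(signal):
    print("Emergency pattern detected -> send SOS with last known GPS fix")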

Keywords: context aware computing, emergency alert system, GPS, hajj pilgrim tracking, location-based services

Procedia PDF Downloads 204
9928 Application to Monitor the Citizens for Corona and Get Medical Aids or Assistance from Hospitals

Authors: Vathsala Kaluarachchi, Oshani Wimalarathna, Charith Vandebona, Gayani Chandrarathna, Lakmal Rupasinghe, Windhya Rankothge

Abstract:

It is the fundamental function of a monitoring system to allow users to collect and process data. A worldwide threat, the corona outbreak has wreaked havoc in Sri Lanka, and the situation has gotten out of hand. Since the epidemic, the Sri Lankan government has been unable to establish a systematic system for monitoring corona patients and providing emergency care in the event of an outbreak. Most patients have been kept at home because of the high number of patients reported in the nation, but they do not yet have access to a functioning medical system. This has resulted in an increase in the number of patients who have been left untreated because of a lack of medical care. The absence of competent medical monitoring is the biggest cause of mortality for many people nowadays, according to our survey. As a result, a smartphone app for analyzing the patient's state and determining whether they should be hospitalized will be developed. Using the data supplied, we aim to send an alert letter or SMS to the hospital once the system recognizes such a case. Since we know what those patients need and when they need it, we will set up a desktop program at the hospital to monitor their progress. Deep learning, image processing and application development, natural language processing, and blockchain management are some of the components of the research solution. The purpose of this research paper is to introduce a mechanism to connect hospitals and patients even when they are physically apart. Furthermore, data security and user-friendliness are enhanced through blockchain and NLP.

Keywords: blockchain, deep learning, NLP, monitoring system

Procedia PDF Downloads 124
9927 Corporate Sustainability Practices in Asian Countries: Pattern of Disclosure and Impact on Financial Performance

Authors: Santi Gopal Maji, R. A. J. Syngkon

Abstract:

The changing attitude of corporate enterprises from maximizing economic benefit to corporate sustainability after the publication of the Brundtland Report has attracted the interest of researchers to investigate the sustainability practices of firms and their impact on financial performance. To enrich the empirical literature in the Asian context, this study examines the disclosure pattern of corporate sustainability and the influence of sustainability reporting on the financial performance of firms from four Asian countries (Japan, South Korea, India and Indonesia) that published sustainability reports continuously from 2009 to 2016. The study has used a content analysis technique based on the Global Reporting Initiative (GRI 3 and 3.1) reporting framework to compute the disclosure score of corporate sustainability and its components. While a dichotomous coding system has been employed to compute the overall quantitative disclosure score, a four-point scale has been used to assess the quality of the disclosure. For analysing the disclosure pattern of corporate sustainability, box plots have been used. Further, the Pearson chi-square test has been used to examine whether there is any difference in the proportion of disclosure between the countries. Finally, a quantile regression model has been employed to examine the influence of corporate sustainability reporting at different locations of the conditional distribution of firm performance. The findings of the study indicate that Japan occupies first position in terms of disclosure of sustainability information, followed by South Korea and India. In the case of Indonesia, the quality of the disclosure score is considerably lower compared to the other three countries. Further, the gap between the quality and quantity of the disclosure score is comparatively smaller in Japan and South Korea than in India and Indonesia. The same is evident in respect of the components of sustainability. The results of the quantile regression indicate that the positive impact of corporate sustainability becomes stronger at upper quantiles in the case of Japan and South Korea. However, the study fails to extract any definite pattern of the impact of corporate sustainability disclosure on the financial performance of firms from Indonesia and India.
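
Illustrative sketch: a minimal Python (statsmodels) version of the quantile-regression step, using a hypothetical data frame with a firm-performance measure ("roa"), a disclosure score and a size control; the actual model specification and variables of the study are not reproduced here.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical firm-level data (illustration only)
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "disclosure": rng.uniform(0, 1, 300),
    "size": rng.normal(10, 2, 300),
})
df["roa"] = 0.02 + 0.05 * df["disclosure"] + 0.001 * df["size"] \
            + rng.normal(0, 0.03, 300) * (0.5 + df["disclosure"])

# estimate the disclosure effect at several points of the conditional distribution
for q in (0.25, 0.50, 0.75, 0.90):
    fit = smf.quantreg("roa ~ disclosure + size", df).fit(q=q)
    print(f"q={q:.2f}: disclosure coef = {fit.params['disclosure']:.4f}")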

Keywords: corporate sustainability, quality and quantity of disclosure, content analysis, quantile regression, Asian countries

Procedia PDF Downloads 186
9926 Co-Development of an Assisted Manual Harvesting Tool for Peach Palm That Avoids the Harvest in Heights

Authors: Mauricio Quintero Angel, Alexander Pereira, Selene Alarcón

Abstract:

One of the elements of greatest importance in agricultural production is harvesting, an activity associated with different occupational health risks, such as harvesting at heights, the transport of heavy materials and the application of excessive muscle strain that leads to musculoskeletal disorders. Therefore, there is an urgent need to improve and validate interventions to reduce the exposure and risk of harvesters. This article has the objective of describing the co-development, under an ergonomic analysis framework, of an assisted manual harvesting tool for peach palm oriented toward reducing the risk of death and accidents, as it avoids harvesting at heights. The peach palm is a palm tree that is cultivated in Colombia, Peru, Brazil and Costa Rica, among other countries, and that reaches heights of over 20 m, with stipes covered with spines. The fruits are drupes of variable size. For the harvesting of peach palm in Colombia, farmers use the “Marota” or “Climber”, a tool in a closed X shape built in wood, with two supports adjusted to the stipe that are raised alternately until reaching a point high enough to grab the bunch, which is brought down using a rope. This is a high-risk activity, since it is done at height without any type of protection or safety measures. The Marota is alternated with a rod, which has a variable height between 5 and 12 meters, with a harness system at one end to hold the bunch that is lowered with the whole system (bamboo bunch). The rod is used from the ground or from the Marota at height. As an alternative to the traditional tools, the Bajachonta was co-developed with farmers: a tool that employs a traditional bamboo hook system with modifications so that it can be held with a rope that passes through a pulley. Once the bunch is hitched, the hook system is detached and stays attached to the peduncle of the palm tree; afterwards, through a pulling force exerted towards the ground by tensioning the rope, the bunch comes loose and is taken down to the ground using the rope and the pulley system, reducing the risk and effort in the operation. The Bajachonta was evaluated in three productive zones of Colombia, with innovative farmers, where adoption is highly probable, with some modifications to improve its efficiency and effectiveness, keeping in mind that the farmers perceive in it an advantage in the reduction of deaths and accidents by not having to harvest at heights.

Keywords: assisted harvesting, ergonomics, harvesting in high altitudes, participative design, peach palm

Procedia PDF Downloads 389
9925 Application of Acid Base Accounting to Predict Post-Mining Drainage Quality in Coalfields of the Main Karoo Basin and Selected Sub-Basins, South Africa

Authors: Lindani Ncube, Baojin Zhao, Ken Liu, Helen Johanna Van Niekerk

Abstract:

Acid Base Accounting (ABA) is a tool used to assess the total amount of acidity or alkalinity contained in a specific rock sample, and is based on the total S concentration and the carbonate content of the sample. A preliminary ABA test was conducted on 14 sandstone and 5 coal samples taken from coalfields representing the Main Karoo Basin (Highveld, Vryheid and Molteno/Indwe Coalfields) and the Sub-basins (Witbank and Waterberg Coalfields). The results indicate that sandstone and coal from the Main Karoo Basin have the potential to generate Acid Mine Drainage (AMD), as they contain sufficient pyrite to generate acid, with the final pH of the samples relatively low upon complete oxidation of the pyrite. Sandstone from collieries representing the Main Karoo Basin is characterised by elevated contents of reactive S%. All the studied samples except two were characterised by an Acid Potential (AP) that is less than the Neutralizing Potential (NP). The results further indicate that the sandstone from the Main Karoo Basin is more prone to acid generation than the sandstone from the Sub-basins, whereas the coal has a relatively low potential for generating any acid. The application of ABA in this study contributes to an understanding of the complexities governing water-rock interactions. In general, the coalfields of the Main Karoo Basin have a much higher potential to produce AMD during mining processes than the coalfields in the Sub-basins.
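
Illustrative sketch: the ABA bookkeeping itself is simple arithmetic; the Python sketch below shows the standard screening calculation of acid potential from total sulphur and the resulting net neutralising potential and ratio, using hypothetical sample values rather than the study's data.

def acid_base_account(total_s_percent, np_kg_caco3_per_t):
    """Standard ABA screening numbers for one sample.
    AP = 31.25 * total S (%), expressed as kg CaCO3 equivalent per tonne."""
    ap = 31.25 * total_s_percent
    nnp = np_kg_caco3_per_t - ap          # net neutralising potential
    npr = np_kg_caco3_per_t / ap if ap else float("inf")
    return ap, nnp, npr

# hypothetical sandstone sample: 0.8 % total S, NP of 15 kg CaCO3/t
ap, nnp, npr = acid_base_account(0.8, 15.0)
print(f"AP={ap:.1f}, NNP={nnp:.1f}, NPR={npr:.2f}")
# NPR < 1 (NP < AP) would flag the sample as potentially acid generating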

Keywords: Main Karoo Basin, sub-basin, coal, sandstone, acid base accounting (ABA)

Procedia PDF Downloads 420
9924 Micro-Arc Oxidation Titanium and Post Treatment by Cold Plasma and Graft Polymerization of Acrylic Acid for Biomedical Application

Authors: Shu-Chuan Liao, Chia-Ti Chang, Ko-Shao Chen

Abstract:

Titanium and its alloys are widely used in many fields such as dentistry and orthopaedics due to their high strength, low elastic modulus, chemical inertness and bio-inertness. Micro-arc oxidation was used to form a microporous ceramic oxide film on the titanium surface and also to improve its corrosion resistance. To improve biocompatibility, reactive groups need to be introduced onto the bio-inert micro-arc oxidation surface. We introduced a boundary layer by plasma-enhanced chemical vapor deposition of hexamethyldisilazane (HMDS) and an organic active layer by UV-light grafting of the reactive monomer acrylic acid (AAc); therefore, chondroitin sulphate could easily be immobilized on the surface by EDC/NHS crosslinking. The surface properties and composition of the modified layer were measured by scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS), X-ray diffraction (XRD) and water contact angle measurements. The water contact angle of the plasma-treated Ti surface decreases from 60° to 38°, which is an indication of hydrophilicity. The results of electrochemical polarization analysis showed that the micro-arc oxidized sample has the best corrosion resistance after plasma treatment. The results showed that chondroitin sulfate can be successfully immobilized by a series of modifications, and the MTT assay indicated that the biocompatibility was improved in this study.

Keywords: MAO, plasma, graft polymerization, biomedical application

Procedia PDF Downloads 245
9923 Islamic Research Methodology (I-Restmo): Eight Series Research Module with Islamic Value Concept

Authors: Noraizah Abu Bakar, Norhayati Alais, Nurdiana Azizan, Fatimah Alwi, Muhammad Zaky Razaly

Abstract:

This paper presents a concise research module built on an Islamic values concept and proposed to researchers, potential researchers, and PhD and Master scholars to prepare them for their studies. The module was designed to help and guide Malaysian citizens through their postgraduate studies, in line with the 10th Malaysia Plan and MyBrain15. MyBrain15 is a financial aid scheme for Malaysian citizens pursuing PhD and Master programmes and forms part of the Ministry of Education's strategic plan to ensure that, by 2013, there will be 60,000 PhD scholars in Malaysia. The module is suitable for social science researchers, but it can also be a useful tool for science and technology researchers in disciplines such as engineering and information technology. It consists of eight series that provide a proper flow of information for conducting research, with an Islamic value application provided in each series. The module is designed to produce future researchers with a comprehensive knowledge of humankind and the hereafter. Its uniqueness lies in its design around an Islamic values concept: researchers are able to understand the proper research process and, at the same time, to understand Islam more closely. Applying Islamic values in each series can also trigger broader ideas for researchers to examine knowledge related to the humanities in greater depth.

Keywords: eight series research module, Islamic values concept, teaching methodology, flow of information, epistemology of research

Procedia PDF Downloads 382
9922 Breaking the Barrier of Service Hostility: A Lean Approach to Achieve Operational Excellence

Authors: Mofizul Islam Awwal

Abstract:

Due to globalization, industries are growing rapidly throughout the world, which has given rise to a large number of manufacturing organizations. Recently, however, service industries have begun to emerge in large numbers in almost all parts of the world, including some developing countries. In this context, organizations need a strong competitive advantage over their rivals to achieve their strategic business goals, and manufacturing industries adopt many methods and techniques to gain such an edge. Over the last decades, manufacturing industries have successfully practised the lean concept to optimize their production lines, and owing to this huge success in the manufacturing context, lean has made its way into the service industry. Nevertheless, very little attention has been paid to service in the area of operations management, and service industries lag far behind manufacturing industries in terms of operations improvement. Transferring the lean concept from the production floor to the service back and front office is a demanding task, although it would clearly yield improvement: service processes are not as visible as production processes and can be very complex, and the lack of research in this area has made lean adoption difficult for service industries, since there are no standardized frameworks for successfully implementing the lean concept in a service organization. The purpose of this research paper is to capture the present scenario of the service industry in terms of lean implementation. A thorough analysis of the past literature will be carried out on the applicability and understanding of lean in service structures; research papers will be classified, and critical factors for implementing lean in the service industry to achieve operational excellence will be unveiled.

Keywords: lean service, lean literature classification, lean implementation, service industry, service excellence

Procedia PDF Downloads 359
9921 Development of a General Purpose Computer Programme Based on Differential Evolution Algorithm: An Application towards Predicting Elastic Properties of Pavement

Authors: Sai Sankalp Vemavarapu

Abstract:

This paper discusses the application of machine learning in the field of transportation engineering for predicting engineering properties of pavement more accurately and efficiently. Predicting the elastic properties helps us assess current road conditions and take appropriate measures to avoid inconvenience to commuters; this improves the longevity and sustainability of the pavement layer while reducing its overall life-cycle cost. As an example, we have implemented differential evolution (DE) in the back-calculation of the elastic moduli of multi-layered pavement. The proposed DE global optimization back-calculation approach is integrated with a forward response model and treats back-calculation as a global optimization problem in which the cost function to be minimized is defined as the root mean square error between measured and computed deflections. The optimal solution, in this case the elastic moduli, is searched for in the solution space by the DE algorithm. The best DE parameter combinations and the optimum values are reported so that the results are reproducible whenever the need arises. The algorithm's performance in varied scenarios was analysed by changing the input parameters. The predictions were well within the permissible error, establishing the supremacy of DE.
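As a rough illustration of this back-calculation framing, the Python sketch below minimises the root mean square error between measured and computed deflections using a differential evolution search. The forward response model, the layer bounds and the deflection values are hypothetical placeholders, and SciPy's differential_evolution routine stands in for the general purpose programme described in the paper.

import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical measured FWD surface deflections (mm) at five sensor offsets.
measured = np.array([0.52, 0.41, 0.30, 0.21, 0.14])

def forward_response(moduli):
    # Placeholder forward model mapping three layer moduli (MPa) to deflections.
    # A real implementation would call a layered-elastic solver.
    e1, e2, e3 = moduli
    stiffness = np.array([e1, 0.8 * e1 + e2, e2 + e3, 0.5 * e2 + e3, e3])
    return 200.0 / stiffness  # crude inverse-stiffness proxy, for illustration only

def rmse(moduli):
    # Cost function: root mean square error between measured and computed deflections.
    computed = forward_response(moduli)
    return np.sqrt(np.mean((measured - computed) ** 2))

# Hypothetical search bounds (MPa) for the three layer moduli.
bounds = [(100.0, 3000.0), (50.0, 1000.0), (20.0, 500.0)]

result = differential_evolution(rmse, bounds, seed=42, tol=1e-6)
print("Back-calculated layer moduli (MPa):", np.round(result.x, 1))
print("Deflection RMSE:", round(result.fun, 4))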

Keywords: cost function, differential evolution, falling weight deflectometer, genetic algorithm, global optimization, metaheuristic algorithm, multilayered pavement, pavement condition assessment, pavement layer moduli back calculation

Procedia PDF Downloads 150
9920 APP-Based Language Teaching Using Mobile Response System in the Classroom

Authors: Martha Wilson

Abstract:

With the peak of Computer-Assisted Language Learning slowly passing and Mobile-Assisted Language Learning at times somewhat lacking in the communicative department, we are now faced with a challenging question: how can we engage the interest of our digital-native students and, most importantly, sustain it? Our classrooms are experiencing an influx of “digital natives”, people who have grown up using and having unlimited access to technology. While modernizing our curriculum and digitalizing our classrooms are necessary to accommodate this new learning style, they impose a heavy financial burden and a massive undertaking on language institutes. Instead, opting for a more compact, simple, yet multidimensional pedagogical tool may be the solution to the issue at hand. This paper aims to give a brief overview of an existing type of device referred to as the Student Response System (SRS) and to expand this notion to a new prototype of response system, designed as a mobile application to eliminate the need for costly hardware and software. Additionally, an analysis of recent attempts by other institutes to develop a Mobile Response System (MRS) and of customer reviews of existing MRSs will be provided, along with the lessons learned from those projects. Finally, while the new model of MRS is still in its infancy, this paper discusses the implications of incorporating such an application as a tool to support and enrich traditional techniques and also offers practical classroom applications using the existing response systems that are immediately available on the market.

Keywords: app, clickers, mobile app, mobile response system, student response system

Procedia PDF Downloads 356