Search results for: optimization techniques
190 Very Large Scale Integration Architecture of Finite Impulse Response Filter Implementation Using Retiming Technique
Authors: S. Jalaja, A. M. Vijaya Prakash
Abstract:
Recursive application of the Karatsuba multiplication algorithm is exploited to design generalized transpose and parallel Finite Impulse Response (FIR) filters. Mid-range Karatsuba multiplication and a Carry Save Adder based on Karatsuba multiplication reduce the time complexity of higher-order multiplication, implemented up to n bits. On this basis, a modified N-tap transpose and parallel symmetric FIR filter structure using the Karatsuba algorithm is designed, and the mathematical formulation of the FFA filter is derived. The proposed architecture involves a significantly smaller area-delay product (ADP) than the existing block implementation, and the hardware cost is reduced further by adopting the retiming technique. The filter architecture is designed using a 90 nm technology library and implemented with the Cadence EDA tool. The synthesis results show better performance for different word lengths and block sizes. The design achieves switching-activity reduction and low power consumption, evaluated with and without retiming for different circuit configurations, and achieves more than half the power reduction of the earlier design structure. As a proof of concept, for block size 16 and filter length 64, the CKA method achieves 51% and 70% less power when the retiming technique is applied, and the CSA method achieves 57% and 77% less power, compared to the previously proposed design.
Keywords: Carry save adder, Karatsuba multiplication, mid-range Karatsuba multiplication, modified FFA, transposed filter, retiming.
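The abstract builds its multipliers on the Karatsuba recursion. As a point of reference only, the following minimal Python sketch shows the software form of that recursion (three sub-multiplications per split); it illustrates the underlying algorithm, not the authors' hardware architecture or their mid-range and carry-save variants.

```python
# Illustrative sketch (not the paper's RTL): recursive Karatsuba multiplication,
# the algorithm on which the proposed FIR filter multipliers are based.
def karatsuba(x: int, y: int) -> int:
    """Multiply two non-negative integers with three recursive sub-multiplications."""
    if x < 16 or y < 16:                      # small operands: direct multiplication
        return x * y
    n = max(x.bit_length(), y.bit_length())
    half = n // 2
    mask = (1 << half) - 1
    x_hi, x_lo = x >> half, x & mask
    y_hi, y_lo = y >> half, y & mask
    a = karatsuba(x_hi, y_hi)                 # high parts
    b = karatsuba(x_lo, y_lo)                 # low parts
    c = karatsuba(x_hi + x_lo, y_hi + y_lo)   # combined term
    return (a << (2 * half)) + ((c - a - b) << half) + b

assert karatsuba(12345, 6789) == 12345 * 6789
```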
189 Detection of Transgenes in Cotton (Gossypium hirsutum L.) by Using Biotechnology/Molecular Biological Techniques
Authors: Ahmad Ali Shahid, Muhammad Shakil Shaukat, Kamran Shehzad Bajwa, Abdul Qayyum Rao, Tayyab Husnain
Abstract:
Agriculture is the backbone of Pakistan's economy, and cotton is the major agricultural export and the prime source of raw fiber for the textile industry. To combat severe insect and weed problems, a combination of three genes, namely Cry1Ac, Cry2A and EPSPS, was transferred into the locally cultivated cotton variety MNH-786 using Agrobacterium-mediated genetic transformation. The present study focused on the molecular screening of transgenic cotton plants at the T3 generation in order to confirm integration and expression of all three genes (Cry1Ac, Cry2A and EPSP synthase) in the cotton genome. Initially, a glyphosate spray assay was used to screen transgenic cotton plants containing the EPSP synthase gene at the T3 generation; plants that remained healthy and showed no leaf damage 7 days after spraying were selected. For molecular analysis in the laboratory, genomic DNA of these transgenic plants was isolated and subjected to amplification of the three genes: seventeen out of twenty plants (Cry1Ac gene), ten out of twenty (Cry2A gene) and all twenty (EPSP synthase gene) produced positive amplification. On the basis of PCR amplification, ten transgenic plant samples were subjected to protein expression analysis by ELISA, which showed that eight out of ten plants were actively expressing the three transgenes. Real-time PCR was also performed to quantify the mRNA expression levels of the Cry1Ac and EPSP synthase genes. Finally, eight plants were confirmed for the presence and active expression of all three genes at the T3 generation.
Keywords: Agriculture, Cotton, Transformation, Cry Genes, ELISA and PCR.
188 A Case Study on Appearance Based Feature Extraction Techniques and Their Susceptibility to Image Degradations for the Task of Face Recognition
Authors: Vitomir Struc, Nikola Pavesic
Abstract:
Over the past decades, automatic face recognition has become a highly active research area, mainly due to the countless application possibilities in both the private and the public sector. Numerous algorithms have been proposed in the literature to cope with the problem of face recognition; nevertheless, a group of methods commonly referred to as appearance based has emerged as the dominant solution to the face recognition problem. Many comparative studies concerned with the performance of appearance based methods have already been presented in the literature, not rarely with inconclusive and often with contradictory results. No consensus has been reached within the scientific community regarding the relative ranking of the efficiency of appearance based methods for the face recognition task, let alone regarding their susceptibility to appearance changes induced by various environmental factors. To tackle these open issues, this paper assesses the performance of the three dominant appearance based methods: principal component analysis, linear discriminant analysis and independent component analysis, and compares them on an equal footing (i.e., with the same preprocessing procedure, with parameters optimized for the best possible performance, etc.) in face verification experiments on the publicly available XM2VTS database. In addition to the comparative analysis on the XM2VTS database, ten degraded versions of the database are also employed in the experiments to evaluate the susceptibility of the appearance based methods to various image degradations which can occur in "real-life" operating conditions. Our experimental results suggest that linear discriminant analysis ensures the most consistent verification rates across the tested databases.
Keywords: Biometrics, face recognition, appearance based methods, image degradations, the XM2VTS database.
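For orientation, the sketch below outlines an appearance-based pipeline of the kind compared in the paper: PCA ("eigenface") features followed by a cosine-similarity verification score. The data, dimensionality and threshold are placeholders, not the XM2VTS protocol or the authors' optimized settings.

```python
# Minimal, hedged sketch of a PCA-based face verification pipeline (illustrative only).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
gallery = rng.random((100, 64 * 64))   # 100 enrolled face images, flattened (synthetic stand-in)
probe   = rng.random((1, 64 * 64))     # one probe image

pca = PCA(n_components=50).fit(gallery)      # learn the face subspace
g_feat = pca.transform(gallery)              # project gallery into the subspace
p_feat = pca.transform(probe)                # project the probe

# verification score: cosine similarity against a claimed identity's template
claimed_id = 7
template = g_feat[claimed_id]
score = np.dot(template, p_feat[0]) / (np.linalg.norm(template) * np.linalg.norm(p_feat[0]))
accept = score > 0.5                         # threshold chosen purely for illustration
```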
187 Groundwater Potential Zone Identification in Unconsolidated Aquifer Using Geophysical Techniques around Tarbela Ghazi, District Haripur, Pakistan
Authors: Syed Muzyan Shahzad, Liu Jianxin, Asim Shahzad, Muhammad Sharjeel Raza, Sun Ya, Fanidi Meryem
Abstract:
An electrical resistivity investigation was conducted in the vicinity of Tarbela Ghazi in order to study the subsurface layers with a view to determining the depth to the aquifer and the thickness of groundwater potential zones. Vertical Electrical Sounding (VES) using a Schlumberger array was carried out at 16 VES stations, and well logging data at four tube wells were used to mark the super-saturated zones with high discharge rates. The present paper gives a geoelectrical identification of the lithology and an estimate of the relationship between the resistivity and the Dar Zarrouk parameters (transverse unit resistance and longitudinal unit conductance). The VES results revealed both homogeneous and heterogeneous nature of the subsurface strata. The aquifer is unconfined to confined in nature, and a perched aquifer was identified at a few locations; groundwater potential zones are developed in unconsolidated deposit layers, and more than seven geoelectric layers are observed at some VES locations. The thickness of the saturated zones ranges from 5 m to 150 m, and in a few areas the aquifer extends beyond 150 m. The average anisotropy, transverse resistance and longitudinal conductance values are 0.86 %, 35750.9821 Ω.m2 and 0.729 Siemens, respectively. The transverse unit resistance values fluctuate across the aquifer system, with high values observed below a particular depth that are significantly associated with the high-transmissivity zones. The groundwater quality parameters in all analyzed samples are below the permissible limits of the World Health Organization (WHO) standards.
Keywords: Geoelectric layers, Dar Zarrouk parameters, Aquifer, Electro-stratigraphic.
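For reference, the Dar Zarrouk quantities reported in the abstract are conventionally defined, for an n-layer geoelectric column with thicknesses h_i and resistivities rho_i, as follows (standard textbook definitions, not taken from the paper):

```latex
% Standard Dar Zarrouk definitions for an n-layer geoelectric column.
\[
  R = \sum_{i=1}^{n} h_i \rho_i \quad\text{(transverse unit resistance, } \Omega\,\mathrm{m}^2\text{)},
  \qquad
  S = \sum_{i=1}^{n} \frac{h_i}{\rho_i} \quad\text{(longitudinal unit conductance, S)},
\]
\[
  \lambda = \frac{\sqrt{R\,S}}{H}, \qquad H = \sum_{i=1}^{n} h_i ,
\]
```

where H is the total column thickness and lambda is the electrical anisotropy coefficient.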
186 Frequency Response of Complex Systems with Localized Nonlinearities
Authors: E. Menga, S. Hernandez
Abstract:
Finite Element Models (FEMs) are widely used to study and predict the dynamic properties of structures, and usually the prediction can be obtained with much more accuracy for a single component than for assemblies. Especially for structural dynamics studies in the low and middle frequency range, most complex FEMs can be seen as assemblies of linear components joined together at interfaces. From a modelling and computational point of view, these joints can be seen as localized sources of stiffness and damping and can be modelled as lumped spring/damper elements, most of the time characterized by nonlinear constitutive laws. On the other hand, most FE programs are able to run nonlinear analysis in the time domain. They treat the whole structure as nonlinear even if there is only one nonlinear degree of freedom (DOF) out of thousands of linear ones, making the analysis unnecessarily expensive from a computational point of view. In this work, a methodology is presented for obtaining the nonlinear frequency response of structures whose nonlinearities can be considered as localized sources. The work extends the well-known Structural Dynamic Modification Method (SDMM) to a nonlinear set of modifications and allows obtaining the Nonlinear Frequency Response Functions (NLFRFs) through an 'updating' process of the Linear Frequency Response Functions (LFRFs). A brief summary of the analytical concepts is given, starting from the linear formulation and examining the implications of the nonlinear one. The response of the system is formulated in both the time and frequency domains. First the modal database is extracted and the linear response is calculated; secondly, the nonlinear response is obtained through the nonlinear SDMM by updating the underlying linear behavior of the system. The methodology, implemented in MATLAB, has been successfully applied to estimate the nonlinear frequency response of two systems: a two-DOF spring-mass-damper system and a full aircraft FE model. In spite of the different levels of complexity, both examples show the reliability and effectiveness of the method. The results highlight a feasible and robust procedure which allows a quick estimation of the effect of localized nonlinearities on the dynamic behavior. The method is particularly powerful when most of the FE model can be considered as acting linearly and the nonlinear behavior is restricted to a few degrees of freedom. The procedure is very attractive from a computational point of view because the FEM needs to be run just once, which allows faster nonlinear sensitivity analysis and easier implementation of optimization procedures for the calibration of nonlinear models.
Keywords: Frequency response, nonlinear dynamics, structural dynamic modification, softening effect, rubber.
185 Control of Vibrations in Flexible Smart Structures using Fast Output Sampling Feedback Technique
Authors: T.C. Manjunath, B. Bandyopadhyay
Abstract:
This paper features the modeling and design of a Fast Output Sampling (FOS) feedback control technique for the Active Vibration Control (AVC) of a smart flexible aluminium cantilever beam for a Single Input Single Output (SISO) case. Controllers are designed for the beam by bonding patches of piezoelectric layer as sensor/actuator to the master structure at different locations along the length of the beam, retaining the first 2 dominant vibratory modes. The entire structure is modeled in state space form using the concepts of piezoelectric theory, Euler-Bernoulli beam theory, the Finite Element Method (FEM) and state space techniques, by dividing the structure into 3, 4 or 5 finite elements, thus giving rise to three types of systems: system 1 (beam divided into 3 finite elements), system 2 (4 finite elements) and system 3 (5 finite elements). The effect of placing the sensor/actuator at various locations along the length of the beam is observed for all 3 types of systems, and conclusions are drawn for the best performance and for the smallest magnitude of the control input required to control the vibrations of the beam. Simulations are performed in MATLAB. The open loop responses, closed loop responses and tip displacements with and without the controller are obtained, and the performance of the proposed smart system is evaluated for vibration control.
Keywords: Smart structure, Finite element method, State space model, Euler-Bernoulli theory, SISO model, Fast output sampling, Vibration control, LMI.
184 Metal Inert Gas Welding-Based-Shaped Metal Deposition in Additive Layered Manufacturing: A Review
Authors: Adnan A. Ugla, Hassan J. Khaudair, Ahmed R. J. Almusawi
Abstract:
Shaped Metal Deposition (SMD) as an additive layered manufacturing technique is a promising alternative to traditional manufacturing for producing large, expensive metal components with complex geometry, as well as free-form structures, by building up material layer by layer. The present paper is a comprehensive review of the literature and the latest rapid manufacturing technologies of the SMD technique. The aim of this paper is to review the most prominent issues that researchers have dealt with in SMD techniques, especially those associated with cold wire feed. The review covers metal deposition processes and their classifications, including SMD using Wire + Arc Additive Manufacturing (WAAM), which divides into wire + tungsten inert gas (TIG), metal inert gas (MIG), or plasma. Extensive details are presented on bead geometry, process parameters and the heat input or arc energy resulting from the deposition process for both MIG and Tandem-MIG in the SMD process. Furthermore, SMD may be performed using Single Wire-MIG (SW-MIG) welding or Double Wire-MIG (DW-MIG) welding. The present review shows that deposition with the DW-MIG process can be considered a distinctive and low-cost method to produce large metal components, owing to high deposition rates as well as reduced heat input during deposition and reduced distortions. However, the accuracy and surface finish of MIG-SMD are lower than those of electron and laser beam techniques.
Keywords: Shaped metal deposition, additive manufacturing, double-wire feed, cold feed wire.
183 A CFD Study of Turbulent Convective Heat Transfer Enhancement in Circular Pipeflow
Authors: Perumal Kumar, Rajamohan Ganesan
Abstract:
Addition of milli- or micro-sized particles to the heat transfer fluid is one of many techniques employed for improving the heat transfer rate. Though this looks simple, the method has practical problems such as high pressure loss, clogging and erosion of the material of construction. These problems can be overcome by using nanofluids, which are dispersions of nanosized particles in a base fluid. Nanoparticles increase the thermal conductivity of the base fluid manifold, which in turn increases the heat transfer rate. Nanoparticles also increase the viscosity of the base fluid, resulting in a higher pressure drop for the nanofluid compared to the base fluid. So it is imperative that the Reynolds number (Re) and the volume fraction be optimum for better thermal-hydraulic effectiveness. In this work, heat transfer enhancement using aluminium oxide nanofluid at low and high volume fractions in turbulent pipe flow with constant wall temperature has been studied by computational fluid dynamic modeling of the nanofluid flow adopting the single-phase approach. Nanofluid, up to a volume fraction of 1%, is found to be an effective heat transfer enhancement technique. The Nusselt number (Nu) and friction factor predictions for the low volume fractions (i.e., 0.02%, 0.1% and 0.5%) agree very well with the experimental values of Sundar and Sharma (2010), while predictions for the high volume fraction nanofluids (i.e., 1%, 4% and 6%) show reasonable agreement with both experimental and numerical results available in the literature. So the computationally inexpensive single-phase approach can be used for heat transfer and pressure drop prediction of new nanofluids.
Keywords: Heat transfer intensification, nanofluid, CFD, friction factor.
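As a rough illustration of the single-phase approach mentioned in the abstract, the sketch below treats the nanofluid as a homogeneous fluid with effective properties and applies a standard turbulent-pipe correlation. The Maxwell and Brinkman property models and the Dittus-Boelter correlation are common textbook choices assumed here for illustration, not necessarily the models used in the study.

```python
# Hedged, back-of-the-envelope sketch of the single-phase nanofluid approach.
def effective_properties(phi, k_f, mu_f, rho_f, cp_f, k_p, rho_p, cp_p):
    k_eff = k_f * (k_p + 2*k_f + 2*phi*(k_p - k_f)) / (k_p + 2*k_f - phi*(k_p - k_f))  # Maxwell
    mu_eff = mu_f / (1.0 - phi) ** 2.5                                                 # Brinkman
    rho_eff = (1 - phi) * rho_f + phi * rho_p
    cp_eff = ((1 - phi) * rho_f * cp_f + phi * rho_p * cp_p) / rho_eff
    return k_eff, mu_eff, rho_eff, cp_eff

def nusselt_dittus_boelter(re, pr):
    return 0.023 * re ** 0.8 * pr ** 0.4      # heating, fully developed turbulent pipe flow

# Example: 1% Al2O3 in water at a nominal Reynolds number (all numbers illustrative)
k, mu, rho, cp = effective_properties(0.01, 0.613, 1.0e-3, 997.0, 4179.0, 40.0, 3970.0, 765.0)
re, pr = 10_000, mu * cp / k
print(round(nusselt_dittus_boelter(re, pr), 1))
```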
182 Security Analysis of Password Hardened Multimodal Biometric Fuzzy Vault
Authors: V. S. Meenakshi, G. Padmavathi
Abstract:
Biometric techniques are gaining importance for personal authentication and identification compared to traditional authentication methods, but biometric templates are vulnerable to a variety of attacks due to their inherent nature. When a person's biometric is compromised, his identity is lost; in contrast to a password, a biometric is not revocable. Therefore, providing security to the stored biometric template is crucial. Crypto-biometric systems are authentication systems which blend the ideas of cryptography and biometrics. The fuzzy vault is a proven crypto-biometric construct used to secure biometric templates. However, the fuzzy vault suffers from certain limitations such as non-revocability and cross matching, and its security is affected by the non-uniform nature of biometric data. A fuzzy vault hardened with a password overcomes these limitations: the password provides an additional layer of security and enhances user privacy. The retina has certain advantages over other biometric traits; retinal scans are used in high-end security applications such as access control to areas or rooms in military installations, power plants, and other high-risk security areas. This work applies the idea of the fuzzy vault to the retinal biometric template. Multimodal biometric systems perform well compared to single-modal biometric systems. The proposed multimodal biometric fuzzy vault combines feature points from retina and fingerprint, and the combined vault is hardened with a user password to achieve a high level of security. The security of the combined vault is measured using min-entropy. The proposed password-hardened multi-biometric fuzzy vault is robust against stored biometric template attacks.
Keywords: Biometric Template Security, Crypto Biometric Systems, Hardening Fuzzy Vault, Min-Entropy.
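For readers unfamiliar with the security metric named in the abstract, the min-entropy of a guessing distribution is H_min = -log2(max_i p_i). The toy computation below illustrates the definition only; the paper's vault-specific entropy analysis is more involved.

```python
# Hedged illustration of min-entropy (bits) for a discrete guessing distribution.
import math

def min_entropy(probabilities):
    """Min-entropy in bits: -log2 of the most likely outcome's probability."""
    return -math.log2(max(probabilities))

# e.g. an attacker's best single guess over 2**16 equally likely vault decodings
uniform = [1 / 2**16] * 2**16
print(min_entropy(uniform))   # 16.0 bits
```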
181 The Shifting Urban Role of Buildings’ Facades: A Diachronic Analysis of El Korba
Authors: Virginia Bassily, Sherif Goubran
Abstract:
In heritage conservation and revival, much of the focus is placed on the techniques and methods to preserve, restore, and revive heritage structures and locations. However, more attention needs to be drawn to how deterioration happens and to its effect on an area's character and socio-economic status. To this end, this research examines the decline of the El Korba area in Heliopolis, Cairo, Egypt, and its effects. El Korba was designed with a unique architectural character to stimulate social and economic life. However, the area has been on a path of physical deterioration that is corroding the social life on its streets. This research applies a diachronic analysis of Ibrahim El-Lakkani Boulevard in El Korba, based on a previously developed framework that connects buildings' architectural features to the degree of social interaction in the street, to document the changes that the building deterioration could have caused. Architectural features of the street level in both the original state (1906) and the current state (2021) are broken down and categorized into the framework's six parameters to understand their decline or improvement over time. We find that the parameters that have decreased over the years and caused the deterioration are complexity and architectural character, permeability, territoriality and personalization, and physical comfort. Based on these findings, revival projects can focus on physical parameters that create synergistic benefits by preserving and renewing heritage locations and revitalizing their socio-economic potential.
Keywords: Architectural character, heritage building conservation, enclosure, ground-floor use, El Korba, visual and physical permeability, personalization, physical comfort, social life, territoriality.
180 Advanced Stochastic Models for Partially Developed Speckle
Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije
Abstract:
Speckled images arise when coherent microwave, optical, or acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model where an underlying Poisson point process modulates a Gram-Charlier series of Laguerre-weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise, as demonstrated by the Central Limit Theorem.
Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound.
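The limiting behaviour stated in the abstract, intensity tending to an exponential law as the mean number of scatterers per cell grows, can be reproduced with a small Monte-Carlo experiment such as the hedged sketch below; the scatterer counts and sample sizes are illustrative only.

```python
# Monte-Carlo sketch: random-phasor sums per resolution cell, with Poisson scatterer counts.
# For fully developed (exponential) speckle the intensity contrast std/mean tends to 1;
# partially developed speckle deviates from it.
import numpy as np

rng = np.random.default_rng(1)

def speckle_intensity(mean_scatterers, n_cells=20_000):
    counts = rng.poisson(mean_scatterers, n_cells)        # scatterers per resolution cell
    intensities = np.empty(n_cells)
    for i, k in enumerate(counts):
        phases = rng.uniform(0.0, 2.0 * np.pi, k)         # uniformly random phases
        field = np.sum(np.exp(1j * phases))               # complex sum of unit phasors
        intensities[i] = np.abs(field) ** 2
    return intensities

for n in (2, 5, 50):
    I = speckle_intensity(n)
    print(n, round(I.std() / I.mean(), 2))                # approaches 1.0 as n grows
```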
179 Effect of Laser Power and Powder Flow Rate on Properties of Laser Metal Deposited Ti6Al4V
Authors: Mukul Shukla, Rasheedat M. Mahamood, Esther T. Akinlabi, Sisa. Pityana
Abstract:
Laser Metal Deposition (LMD) is an additive manufacturing process whose capabilities include producing a new part directly from a 3-Dimensional Computer Aided Design (3D CAD) model, building new features on an existing component, and repairing existing high-value components that would have been discarded in the past. With all these capabilities and its advantages over other additive manufacturing techniques, the underlying physics of the LMD process is yet to be fully understood, probably because of the high interaction between the processing parameters; studying many parameters at the same time makes the process even more complex to understand. In this study, the effects of laser power and powder flow rate on the physical properties (deposition height and deposition width), metallurgical property (microstructure) and mechanical property (microhardness) of laser-deposited Ti6Al4V, the most widely used aerospace alloy, are studied. Also, because Ti6Al4V is very expensive and LMD is capable of reducing the buy-to-fly ratio of aerospace parts, the material utilization efficiency is also studied. Four sets of experiments were performed and repeated to establish repeatability, using laser powers of 1.8 kW and 3.0 kW and powder flow rates of 2.88 g/min and 5.67 g/min, keeping the gas flow rate and scanning speed constant at 2 l/min and 0.005 m/s, respectively. The deposition height and width are found to increase with increasing laser power and increasing powder flow rate. Material utilization is favoured by higher power, while a higher powder flow rate reduces material utilization. The results are presented and fully discussed.
Keywords: Laser Metal Deposition, Material Efficiency, Microstructure, Ti6Al4V.
178 Information Retrieval: Improving Question Answering Systems by Query Reformulation and Answer Validation
Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour
Abstract:
Question answering (QA) aims at retrieving precise information from a large collection of documents. Most question answering systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems in reformulating questions. Moreover, the answer processing module is an emerging topic in QA systems, where these systems are often required to rank and validate candidate answers. These techniques, aiming at finding short and precise answers, are often based on semantic relations and co-occurring keywords. This paper discusses a new model for question answering which improves the two main modules, question processing and answer processing, both of which affect the evaluation of the system's operation. Two components form the basis of question processing: the first is question classification, which specifies the types of question and answer; the second is reformulation, which converts the user's question into a question understandable by the QA system in a specific domain. The objective of an answer validation task is to judge the correctness of an answer returned by a QA system according to the text snippet given to support it. For validating answers, we apply candidate answer filtering and candidate answer ranking, followed by a final validation stage based on user voting. The paper also describes a new architecture of the question and answer processing modules, with modeling, implementation and evaluation of the system. The system differs from most question answering systems in its answer validation model, which makes it more suitable for finding exact answers. Evaluation results show that, over a total of 50 asked questions, the model improves the system's decisions by 92%.
Keywords: Answer processing, answer validation, classification, question answering, query reformulation.
177 Sustainability Impact Assessment of Construction Ecology to Engineering Systems and Climate Change
Authors: Moustafa Osman Mohammed
Abstract:
The construction industry, as one of the main contributors to the depletion of natural resources, influences climate change. This paper discusses the incremental and evolutionary development of proposed models for optimizing a life-cycle analysis into an explicit strategy for evaluation systems. The main categories inevitably introduce uncertainties; the work takes up the composite structure model (CSM), as used in environmental management systems (EMSs), as a practical science for evaluating small and medium-sized enterprises (SMEs). The model simplifies complex systems to reflect how the inputs, outputs and outcomes of natural systems influence the framework measures, and gives a maximum likelihood estimate of how elements are simulated over the composite structure. The traditional approach to modeling is based on physical dynamic and static patterns of the parameters that influence the environment. The model unifies methods to demonstrate, from a management perspective, how construction systems ecology is interrelated and how engineering systems ultimately affect ecology as unified technologies across an extensive range beyond construction impacts, such as energy systems. Sustainability broadens the socioeconomic parameters to a practice science that meets recovery performance, while engineering reflects the generic control of protective systems. When the environmental model is employed properly, the management decision process in governments or corporations can address policy for accomplishing strategic plans precisely. The management and engineering limitation focuses on autocatalytic control as a closed cellular system that naturally balances anthropogenic insertions or aggregations in structural systems toward equilibrium as steady, stable conditions. Thereby, construction systems ecology incorporates the engineering and management scheme as a midpoint stage between biotic and abiotic components to predict construction impacts. The resulting theory of environmental obligation suggests a procedure or technique that is achieved in the sustainability impact of construction system ecology (SICSE), ultimately as a relative mitigation measure of deviation control.
Keywords: Sustainability, constructions ecology, composite structure model, design structure matrix, environmental impact assessment, life cycle analysis, climate change.
176 Screen of MicroRNA Targets in Zebrafish Using Heterogeneous Data Sources: A Case Study for Dre-miR-10 and Dre-miR-196
Authors: Yanju Zhang, Joost M. Woltering, Fons J. Verbeek
Abstract:
It has been established that microRNAs (miRNAs) play an important role in gene expression through post-transcriptional regulation of messenger RNAs (mRNAs). However, the precise relationships between microRNAs and their target genes, in terms of numbers, types and biological relevance, remain largely unclear. Dissecting the miRNA-target relationships will render more insight for miRNA target identification and validation, and therefore promote the understanding of miRNA function. In miRBase, miRanda is the key algorithm used for target prediction in Zebrafish. This algorithm is high-throughput but produces many false positives (noise). Since large-scale validation of targets through laboratory experiments is very time consuming, computational methods for miRNA target validation should be developed. In this paper, we present an integrative method to investigate several aspects of the relationships between miRNAs and their targets, with the final purpose of extracting high-confidence targets from the miRanda-predicted target pool. This is achieved using techniques ranging from statistical tests to clustering and association rules. Our research focuses on Zebrafish. It was found that validated targets do not necessarily associate with the highest sequence matching. Besides, for some miRNA families, the frequency of their predicted targets is significantly higher in the genomic region near their own physical location. Finally, in a case study of dre-miR-10 and dre-miR-196, it was found that the predicted target genes hoxd13a, hoxd11a, hoxd10a and hoxc4a of dre-miR-10, and hoxa9a, hoxc8a and hoxa13a of dre-miR-196, have similar characteristics to validated target genes and therefore represent high-confidence target candidates.
Keywords: MicroRNA targets validation, microRNA-target relationships, dre-miR-10, dre-miR-196.
175 Laplace Transformation on Ordered Linear Space of Generalized Functions
Authors: K. V. Geetha, N. R. Mangalambal
Abstract:
Aim. We have introduced the notion of order to multinormed spaces and countable union spaces and their duals, with the topology of bounded convergence assigned to the dual spaces. The aim of this paper is to develop the theory of the ordered topological linear spaces La,b, L(w, z), the dual spaces of the ordered multinormed spaces La,b and the ordered countable union spaces L(w, z), with the topology of bounded convergence assigned to the dual spaces. We apply the Laplace transformation to the ordered linear space of Laplace transformable generalized functions. We ultimately aim at finding solutions to non-homogeneous n-th order linear differential equations with constant coefficients in terms of generalized functions and comparing different solutions evolved out of different initial conditions.
Method. The above aim is achieved by:
• Defining the spaces La,b, L(w, z).
• Assigning an order relation on these spaces by identifying a positive cone on them and studying the properties of the cone.
• Defining an order relation on the dual spaces La,b, L(w, z) of La,b, L(w, z) and assigning a topology to these dual spaces which makes the order dual and the topological dual the same.
• Defining the adjoint of a continuous map on these spaces and studying its behaviour when the topology of bounded convergence is assigned to the dual spaces.
• Applying the two-sided Laplace transformation on the ordered linear space of generalized functions W and studying some properties of the transformation which are used in solving differential equations.
Result. The above techniques are applied to solve non-homogeneous n-th order linear differential equations with constant coefficients in terms of generalized functions and to compare different solutions of the differential equation.
Keywords: Laplace transformable generalized function, positive cone, topology of bounded convergence.
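As a small worked instance of the class of problems targeted in the abstract, the following shows a non-homogeneous second-order constant-coefficient equation solved by the Laplace transform in the distributional setting; the particular equation and initial data are chosen only for illustration.

```latex
% Illustrative example (not from the paper): an impulse-forced equation solved by Laplace transform.
\[
  y'' + 3y' + 2y = \delta(t), \qquad y(0^-)=0,\; y'(0^-)=0 .
\]
% Transforming term by term with $\mathcal{L}\{y\}=Y(s)$ gives
\[
  (s^{2}+3s+2)\,Y(s)=1
  \quad\Longrightarrow\quad
  Y(s)=\frac{1}{(s+1)(s+2)}=\frac{1}{s+1}-\frac{1}{s+2},
\]
% so the generalized-function (distributional) solution is
\[
  y(t) = \bigl(e^{-t}-e^{-2t}\bigr)\,u(t),
\]
% where $u(t)$ is the Heaviside step.
```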
174 Direction to Manage OTOP Entrepreneurship Based on Local Wisdom
Authors: Witthaya Mekhum
Abstract:
The OTOP enterprises that used to create a substantial source of income for local Thai communities are now in a state of exigency that requires assistance from the public sector, owing to an excess of entrepreneurs pursuing duplicative ideas, inability to adjust costs and prices, lack of innovation, and inadequate quality control. Moreover, there is a recurrent problem of middlemen who constantly corner the OTOP market. Local OTOP producers become easy prey since they do not know how to add value, how to create and maintain their own brand name, and how to create proper packaging and labeling. The suggested solutions for local OTOP producers are to adopt modern management techniques, to acquire the know-how to add value to products, and to unravel other marketing problems. The objectives of this research are to study the prevalent OTOP product management and to discover directions for managing OTOP products so as to enhance the effectiveness of OTOP entrepreneurship in Nonthaburi Province, Thailand. There were 113 participants in this study. The research tools can be divided into two parts: the first is a questionnaire survey of the prevalent OTOP entrepreneurship management; the second is a focus group conducted to encapsulate ideas and local wisdom. Data analysis is performed using frequency, percentage, mean, and standard deviation, as well as the synthesis of several small group discussions. The findings reveal that: 1) Business Resources: the quality of the product is most important and the marketing of the product is least important; 2) Business Management: leadership is most important and raw material planning is least important; 3) Business Readiness: communication is most important and packaging is least important; 4) Support from the public sector: certification from the government is most important and the source of raw material is least important.
Keywords: Management, OTOP Entrepreneurship, Local Wisdom.
173 A Corporate Social Responsibility Project to Improve the Democratization of Scientific Education in Brazil
Authors: Denise Levy
Abstract:
Nuclear technology is part of our everyday life and its beneficial applications help to improve the quality of our lives. Nevertheless, in Brazil, the media and social networks most often tend to associate radiation with nuclear weapons and major accidents, and there is still great misunderstanding about the peaceful applications of nuclear science. The Educational Portal Radioatividades (Radioactivities) is a corporate social responsibility initiative that takes advantage of the growing impact of the Internet to offer high quality scientific information for teachers and students throughout Brazil. This web-based initiative focusses on the positive applications of nuclear technology, presenting the several contributions of ionizing radiation in different contexts, such as nuclear medicine, agricultural techniques, food safety and electric power generation, showing nuclear technology as part of modern life and a means to improve the quality of our lifestyle. This educational project aims to contribute to the democratization of scientific education and to social inclusion, bringing society closer to scientific knowledge, promoting critical thinking and inspiring further reflection. The website offers a wide variety of ludic activities such as curiosities, interactive exercises and short courses. Moreover, teachers are offered free web-based material with full instructions to be developed in class. Since 2013, the project has been developed and improved according to a comprehensive study of the realistic scenario of ICT infrastructure in Brazilian schools and in full compliance with the best national and international e-learning recommendations.
Keywords: Information and communication technologies, nuclear technology, science communication, society and education.
172 The Latency-Amplitude Binomial of Waves Resulting from the Application of Evoked Potentials for the Diagnosis of Dyscalculia
Authors: Maria Isabel Garcia-Planas, Maria Victoria Garcia-Camba
Abstract:
Recent advances in cognitive neuroscience have allowed a step forward in understanding the processes involved in learning, from the point of view of both the acquisition of new information and the modification of existing mental content. The evoked potentials technique reveals how basic brain processes interact to achieve adequate and flexible behaviours. The objective of this work, using evoked potentials, is to study whether it is possible to distinguish if a patient suffers from a specific type of learning disorder, in order to decide on the possible therapies to follow. The methodology used in this work is to analyze the dynamics of different brain areas during a cognitive activity in order to find the relationships between the areas analyzed and thereby better understand the functioning of neural networks. The latest advances in neuroscience have also revealed the existence of different brain activity in the learning process that can be highlighted through the use of non-invasive, innocuous, low-cost and easily accessible techniques such as, among others, evoked potentials, which can help to detect possible neurodevelopmental difficulties early for their subsequent assessment and therapy. From the study of the amplitudes and latencies of the evoked potentials, it is possible to detect brain alterations in the learning process, specifically in dyscalculia, and to define specific corrective measures for the application of personalized psycho-pedagogical plans that allow an optimal integral development of the affected people.
Keywords: Dyscalculia, neurodevelopment, evoked potentials, learning disabilities, neural networks.
171 Influence of Strengthening with Perforated Steel Plates on the Behavior of Infill Walls and RC Frame
Authors: Eray Ozbek, Ilker Kalkan, S. Oguzhan Akbas, Sabahattin Aykac
Abstract:
The contribution of infill walls to the overall earthquake response of a structure is limited, and this contribution is generally ignored in analyses. Strengthening of infill walls through different techniques has been and is being studied extensively in the literature to increase this limited contribution and to increase the ductility and energy absorption capacity of the infill walls, creating non-structural components in which the earthquake-induced energy can be absorbed without damaging the bearing components of the structural frame. The present paper summarizes an extensive research project dedicated to investigating the effects of strengthening the brick infill walls of a reinforced concrete (RC) frame on its lateral earthquake response. Perforated steel plates were used in strengthening for several reasons, including the ductility and high deformation capacity of these plates; the fire-resistant, recyclable and non-carcinogenic nature of mild steel; and the ease of installing and removing the plates from the wall with the help of anchor bolts only. Furthermore, epoxy, which increases the cost and labor of the strengthening process, is not needed in this technique. The individual behavior of the strengthened walls under monotonic diagonal and reversed cyclic lateral loading was investigated within the scope of the study. Upon achieving promising results, RC frames with strengthened infill walls have been and are being tested to examine the influence of this strengthening technique on the overall behavior of the RC frames. Tests on the wall and frame specimens indicated that the perforated steel plates contribute to a major extent to the lateral strength, rigidity, ductility and energy absorption capacity of the wall and the infilled frame.
Keywords: Infill wall, Strengthening, External plate, Earthquake Behavior.
170 Feature Point Reduction for Video Stabilization
Authors: Theerawat Songyot, Tham Manjing, Bunyarit Uyyanonvara, Chanjira Sinthanayothin
Abstract:
Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and should be performed at a reasonable rate. This paper presents an algorithm for discarding irrelevant feature points and maintaining the rest for future use, so as to improve the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling; corner detection is required only when the feature points are insufficiently accurate for future modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, thus significantly improving the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, which also reduces the computational cost. In addition, the feature points remaining after reduction are sufficient for background object tracking, as demonstrated in the simple video stabilizer based on our proposed algorithm.
Keywords: background object tracking, feature point reduction, low cost tracking, video stabilization.
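A hedged sketch of the maintenance loop described in the abstract, written with OpenCV and NumPy building blocks, is given below. The detector, thresholds and the studentized-residual approximation are placeholder choices, not the authors' exact implementation.

```python
# Illustrative maintenance loop: re-detect corners only when needed, track with optical flow,
# fit a simplified affine model by least squares, and drop moving-object outliers.
import cv2
import numpy as np

MIN_POINTS = 50
RESID_T = 3.0   # studentized-residual cutoff (illustrative)

def update(prev_gray, gray, points):
    if points is None or len(points) < MIN_POINTS:                 # re-detect only when needed
        points = cv2.goodFeaturesToTrack(prev_gray, 200, 0.01, 10)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    p0 = points[status.ravel() == 1].reshape(-1, 2)
    p1 = nxt[status.ravel() == 1].reshape(-1, 2)
    M = np.vstack([np.eye(2), np.zeros((1, 2))])                   # identity affine as fallback
    while len(p0) >= 3:
        A = np.hstack([p0, np.ones((len(p0), 1))])                 # rows [x y 1]
        M, *_ = np.linalg.lstsq(A, p1, rcond=None)                 # least-squares affine model
        r = np.linalg.norm(p1 - A @ M, axis=1)
        z = (r - r.mean()) / (r.std() + 1e-9)                      # crude studentized residuals
        keep = z < RESID_T
        if keep.all():
            break
        p0, p1 = p0[keep], p1[keep]                                # discard moving-object outliers
    return M, p1.reshape(-1, 1, 2).astype(np.float32)              # model + maintained set
```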
169 Multiple Targets Classification and Fuzzy Logic Decision Fusion in Wireless Sensor Networks
Authors: Ahmad Aljaafreh
Abstract:
This paper proposes a hierarchical hidden Markov model (HHMM) to model the detection of M vehicles in a wireless sensor network (WSN). The HHMM contains an extra level of hidden Markov model to model the temporal transitions of each state of the first HMM. By modeling the temporal transitions, only those hypotheses with nonzero transition probabilities need to be tested; thus, the method efficiently reduces the computational load, which is preferable in WSN applications. The paper integrates several techniques to optimize the detection performance. The output of the states of the first HMM is modeled as a Gaussian Mixture Model (GMM), where the number of states and the number of Gaussians are determined experimentally, while the other parameters are estimated using Expectation Maximization (EM). The HHMM is used to model the sequence of local decisions, which are based on multiple hypothesis testing with a maximum likelihood approach. The states in the HHMM represent various combinations of vehicles of different types. Owing to the statistical advantages of multisensor data fusion, we propose a heuristic based on fuzzy weighted majority voting to enhance cooperative classification of moving vehicles within a region monitored by a wireless sensor network. A fuzzy inference system weighs each local decision based on the signal-to-noise ratio of the acoustic signal for target detection and the signal-to-noise ratio of the radio signal for sensor communication. The spatial correlation among the observations of neighboring sensor nodes is efficiently utilized, as well as the temporal correlation. Simulation results demonstrate the efficiency of this scheme.
Keywords: Classification, decision fusion, fuzzy logic, hidden Markov model
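The fusion heuristic described in the abstract can be illustrated roughly as follows: each node's local decision is weighted by a fuzzy score derived from its acoustic and radio SNRs, and the class with the largest weighted vote wins. The membership shapes and breakpoints below are invented for illustration, not the paper's tuned inference system.

```python
# Hedged sketch of fuzzy-weighted majority voting over local node decisions.
import numpy as np

def ramp(x, lo, hi):
    """Simple fuzzy membership: 0 below lo, 1 above hi, linear in between."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def fuse(local_decisions, acoustic_snr_db, radio_snr_db, n_classes):
    votes = np.zeros(n_classes)
    for cls, a_snr, r_snr in zip(local_decisions, acoustic_snr_db, radio_snr_db):
        weight = min(ramp(a_snr, 0, 20), ramp(r_snr, 5, 25))   # conservative AND of the two scores
        votes[cls] += weight
    return int(votes.argmax())

# three nodes reporting classes for {0: car, 1: truck, 2: tracked vehicle}
print(fuse([1, 1, 2], [18, 6, 25], [20, 10, 24], 3))   # -> 1
```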
168 Stereo Motion Tracking
Authors: Yudhajit Datta, Jonathan Bandi, Ankit Sethia, Hamsi Iyer
Abstract:
Motion tracking and stereo vision are complicated, albeit well-understood, problems in computer vision. Existing software that combines the two approaches to perform stereo motion tracking typically employs complicated and computationally expensive procedures. The purpose of this study is to create a simple and effective solution capable of combining the two approaches. The study explores a strategy to combine the two techniques of two-dimensional motion tracking using a Kalman filter and depth detection of objects using stereo vision. In conventional approaches, objects in the scene of interest are observed using a single camera. For stereo motion tracking, however, the scene of interest is observed using video feeds from two calibrated cameras, and from the two simultaneous measurements the depth of the object from the plane containing the cameras is calculated. The approach attempts to capture the entire three-dimensional spatial information of each object in the scene and represent it through a software estimator object. In discrete intervals, the estimator tracks object motion in the plane parallel to the plane containing the cameras and updates the perpendicular distance of the object from that plane as depth. The ability to efficiently track the motion of objects in three-dimensional space using a simplified approach could prove to be an indispensable tool in a variety of surveillance scenarios. The approach may find application from high-security surveillance scenes, such as the premises of bank vaults, prisons or other detention facilities, to low-cost applications in supermarkets and car parking lots.
Keywords: Kalman Filter, Stereo Vision, Motion Tracking, Matlab, Object Tracking, Camera Calibration, Computer Vision System Toolbox.
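For orientation, the sketch below combines the two ingredients named in the abstract: depth of a tracked point from a rectified stereo pair via Z = f*B/d, and a constant-velocity Kalman filter for the in-plane motion. The camera parameters and noise levels are assumed placeholder values, not those of the study.

```python
# Hedged sketch: triangulated depth plus a constant-velocity Kalman filter (illustrative values).
import numpy as np

FOCAL_PX = 800.0      # focal length in pixels (assumed)
BASELINE_M = 0.12     # camera separation in metres (assumed)

def depth_from_disparity(x_left_px, x_right_px):
    disparity = x_left_px - x_right_px
    return FOCAL_PX * BASELINE_M / disparity          # metres, for a rectified pair

# constant-velocity Kalman filter over the state [x, y, vx, vy]
dt = 1.0 / 30.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 1e-2
R = np.eye(2) * 4.0
x, P = np.zeros(4), np.eye(4) * 100.0

def kalman_step(x, P, z):
    x, P = F @ x, F @ P @ F.T + Q                     # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                    # gain
    x = x + K @ (z - H @ x)                           # update with pixel measurement z = [u, v]
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = kalman_step(x, P, np.array([320.0, 240.0]))
print(round(depth_from_disparity(320.0, 300.0), 2))   # ~4.8 m for a 20 px disparity
```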
167 Revisiting Domestication and Foreignisation Methods: Translating the Quran by the Hybrid Approach
Authors: Aladdin Al-Tarawneh
Abstract:
The Quran, as the sacred book of Islam, considered the literal word of God (Allah) in Arabic, has been widely translated into many languages; however, the foreignising or literal approach excessively stains the quality and discredits the final product in the eyes of its receptors. Such an approach fails to capture the intended meaning of the Quran and to communicate it in any language. Therefore, this study is conducted to propose a different approach that involves other methods according to a hybrid model. Indeed, this study challenges the binary adherence that is prevalent in Translation Studies (TS) in general and in the translation of the Quran in particular. Drawing on the fact that the Quran can be communicated in any language in terms of meaning, and that the translation itself is not sacred, this paper approaches the translation of the Quran by blending different methods, such as domestication and foreignisation, in a systematic way, avoiding the binary choice made by many translators. To reach this aim, the paper has a conceptual part that seeks to elucidate and clarify the main methods employed in TS, and to criticise and modify them in order to propose the new hybrid approach (the hybrid model); that is, the deductive method. To support and validate the outcome of the previous part, a comparative model is employed in order to highlight the differences between the suggested translation and other widely used ones; that is, the inductive method. By applying this methodology, the paper shows that there is a deficiency in communicating the original meaning of the Quran under the foreignising approach. In conclusion, the paper suggests that producing a Quran translation has to take into account the adoption of many techniques to express the meaning of the Quran as understood in the original, and to offer this understanding in English in the most native-like manner to serve the intended target readers.
Keywords: Quran translation, hybrid approach, domestication, foreignisation, hybrid model.
166 Space Telemetry Anomaly Detection Based on Statistical PCA Algorithm
Authors: B. Nassar, W. Hussein, M. Mokhtar
Abstract:
The critical concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, there is continuous improvement of TM monitoring systems to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important for responding to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the problem above coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, this paper presents a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft: data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions, normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm can successfully differentiate between the two operating conditions. Furthermore, the algorithm provides competent prediction as well as adding more insight and physical interpretation to the ADCS operation.
Keywords: Space telemetry monitoring, multivariate analysis, PCA algorithm, space operations.
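A minimal sketch of PCA-based telemetry monitoring of the kind the abstract describes is given below: fit PCA on data recorded in the normal state, then flag new samples whose Hotelling T2 or squared prediction error (SPE/Q) exceeds an empirical control limit. The synthetic data, component count and limits are placeholders, not the ADCS parameters of the study.

```python
# Hedged sketch of PCA monitoring with Hotelling T^2 and SPE statistics (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 12))          # 500 samples x 12 telemetry parameters (synthetic)

scaler = StandardScaler().fit(normal)
Xn = scaler.transform(normal)
pca = PCA(n_components=4).fit(Xn)

def statistics(X):
    scores = pca.transform(X)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)      # Hotelling's T^2
    resid = X - pca.inverse_transform(scores)
    spe = np.sum(resid**2, axis=1)                                # Q statistic / SPE
    return t2, spe

t2_lim = np.percentile(statistics(Xn)[0], 99)                     # empirical 99% control limits
spe_lim = np.percentile(statistics(Xn)[1], 99)

fault = rng.normal(size=(10, 12)) + np.r_[np.zeros(6), 5 * np.ones(6)]  # injected offset fault
t2, spe = statistics(scaler.transform(fault))
print((t2 > t2_lim) | (spe > spe_lim))        # True flags indicate anomalous samples
```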
165 Importance of Risk Assessment in Managers' Decision-Making Process
Authors: Mária Hudáková, Vladimír Míka, Katarína Hollá
Abstract:
Decision-making is the core of management and the result of conscious activities carried out in a particular environment and under concrete conditions. Managers decide about goals, procedures and methods of responding to changes and to problems that have developed. Their decisions affect the effectiveness, quality, economy and overall success of every organisation. In spite of this fact, they do not pay sufficient attention to the individual steps of the decision-making process. They emphasise how to handle the individual methods and techniques of decision-making and forget about how to handle analysing the problem or assessing the individual solution variants. In many cases, underestimating the analytical phase can lead to an incorrect assessment of the problem, which can then negatively influence its further solution. Based on our analysis of the theoretical work of individual authors dealing with this area, and on the research carried out in Slovakia and abroad, we can recognise an insufficient interest of managers in assessing risks in the decision-making process. The goal of this paper is to assess the risks in the managers' decision-making process relating to the conditions of the environment, to the subject's activity (the manager's personality), to insufficient assessment of individual variants for solving problems, and also to situations when the arising problem is not solved. The benefit of this paper is the effort to increase managers' awareness of the need to deal with risks during the decision-making process. It is important for every manager to assess the risks in his or her decision-making process and to make efforts to take decisions which reflect the basic conditions, states and development of the environment in the best possible way, and especially decisions which contribute to achieving the determined goals of the organisation as effectively as possible.
Keywords: Risk, decision-making, manager, process, analysis, source of risk.
164 The Impact of HIV/AIDS on Micro-enterprise Development in Kenya: A Study of Obunga Slum in Kisumu
Authors: C. A. Oloo, C. Ojwang
Abstract:
The performance of small and medium enterprises has stagnated in the last two decades, mainly due to the emergence of HIV/AIDS. The disease has had a detrimental effect on the general economy of the country, leading to morbidity and mortality of the Kenyan workforce in their prime age. The present study sought to establish the economic impact of HIV/AIDS on micro-enterprise development in Obunga slum, Kisumu, in terms of production loss and increasing labor-related costs, and to establish possible strategies to address the impact of HIV/AIDS on micro-enterprises. The study was necessitated by the observation that most micro-enterprises in the slum are facing a severe economic and social crisis due to the impact of HIV/AIDS; they get depleted and close down within a short time due to the death of a skilled and experienced workforce. The study was carried out between June 2008 and June 2009 in Obunga slum. Data were subjected to computer-aided statistical analysis that included descriptive statistics, chi-squared and ANOVA techniques. Chi-squared analysis of the micro-enterprise owners' opinions on the impact of HIV/AIDS on the depletion of micro-enterprises, compared to other diseases, indicated high levels of negative effects of the disease at significance levels of P<0.01. Analysis of variance of the impact of HIV/AIDS on the performance and productivity of micro-enterprises also indicated a negative effect on the general performance of micro-enterprises at significance levels of P<0.01. Therefore, to reduce the negative impacts of HIV/AIDS on micro-enterprise development, there is a need to improve the socioeconomic environment, mobilize donors and stakeholders in training and funding, and review the current strategies for addressing the disease. Further conclusive research should also be conducted on a bigger scale.
Keywords: Entrepreneurship, HIV-AIDS, Micro-enterprise, Poverty.
163 Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method
Authors: Hadas Sopher, Davide Schaumann, Yehuda E. Kalay
Abstract:
This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments prior to the construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done only post-construction, or is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly to testing ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities, to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors' internal status. The model has been applied in a simulation of hospital wards and showed adaptability to a wide variety of situated behaviors and interactions.
162 Modeling and FOS Feedback Based Control of SISO Intelligent Structures with Embedded Shear Sensors and Actuators
Authors: T. C. Manjunath, B. Bandyopadhyay
Abstract:
Active vibration control is an important problem in structures; its objective is to reduce the vibration of a system by automatic modification of the system's structural response. In this paper, the modeling and design of a fast output sampling feedback controller for a smart flexible beam system embedded with shear sensors and actuators, for a SISO system using Timoshenko beam theory, is proposed. FEM theory, Timoshenko beam theory and state space techniques are used to model the aluminium cantilever beam. For the SISO case, the beam is divided into 5 finite elements; the control actuator is placed at finite element position 1, whereas the sensor is varied from position 2 to 5, i.e., from near the fixed end to the free end. Controllers are designed using the FOS method, and the performance of the designed FOS controller is evaluated for vibration control for 4 SISO models of the same plant. The effect of placing the sensor at different locations along the beam is observed, and the performance of the controller is evaluated for vibration control. Some of the limitations of the Euler-Bernoulli theory, such as the neglect of shear and axial displacement, are addressed here, thus giving rise to an accurate beam model. Embedded shear sensors and actuators have been considered in this paper instead of surface-mounted sensors and actuators for vibration suppression because of their many advantages. In controlling the vibration modes, the first three dominant modes of vibration of the system are considered.
Keywords: Smart structure, Timoshenko beam theory, Fast output sampling feedback control, Finite Element Method, State space model, SISO, Vibration control, LMI.
161 The Pedagogical Integration of Digital Technologies in Initial Teacher Training
Authors: Vânia Graça, Paula Quadros-Flores, Altina Ramos
Abstract:
The use of digital technologies in teaching and learning processes is currently a reality, namely in initial teacher training. This study aims to understand the digital reality of students in initial teacher training in order to improve training in the educational use of ICT and to promote digital technology integration strategies in an educational context. It is part of the IFITIC Project "Innovate with ICT in Initial Teacher Training to Promote Methodological Renewal in Pre-school Education and in the 1st and 2nd Basic Education Cycle", which involves the School of Education, Polytechnic of Porto and the Institute of Education, University of Minho. The project aims to rethink educational practice with ICT in the initial training of future teachers in order to promote methodological innovation in Pre-school Education and in the 1st and 2nd Cycles of Basic Education. A qualitative methodology was used, in which a questionnaire survey was applied to teachers in initial training. For data analysis, content analysis techniques with the support of NVivo software were used. The results point to the following: a) future teachers recognize that they have more technical than pedagogical knowledge about ICT; this result makes sense if we consider the objective of the Basic Education degree, so the gaps can be filled in the Master's course by students who wish to pursue teaching; b) the respondents are aware that the integration of digital resources contributes positively to students' learning and to the life of children and young people, and also promotes preparation for life; c) being a teacher in the digital age requires the development of digital literacy, lifelong learning and the adoption of new ways of teaching how to learn. Thus, this study aims to contribute to a reflection on the teaching profession in the digital age.
Keywords: Digital technologies, initial teacher training, pedagogical use of ICT, skills.