Search results for: parallel evolution
2386 Black-Hole Dimension: A Distinct Methodology of Understanding Time, Space and Data in Architecture
Authors: Alp Arda
Abstract:
Inspired by Nolan's ‘Interstellar’, this paper delves into speculative architecture, asking, ‘What if an architect could traverse time to study a city?’ It unveils the ‘Black-Hole Dimension,’ a groundbreaking concept that redefines urban identities beyond traditional boundaries. Moving past linear time narratives, this approach draws from the gravitational dynamics of black holes to enrich our understanding of urban and architectural progress. By envisioning cities and structures as influenced by black hole-like forces, it enables an in-depth examination of their evolution through time and space. The Black-Hole Dimension promotes a temporal exploration of architecture, treating spaces as narratives of their current state interwoven with historical layers. It advocates for viewing architectural development as a continuous, interconnected journey molded by cultural, economic, and technological shifts. This approach not only deepens our understanding of urban evolution but also empowers architects and urban planners to create designs that are both adaptable and resilient. Echoing themes from popular culture and science fiction, this methodology integrates the captivating dynamics of time and space into architectural analysis, challenging established design conventions. The Black-Hole Dimension champions a philosophy that welcomes unpredictability and complexity, thereby fostering innovation in design. In essence, the Black-Hole Dimension revolutionizes architectural thought by emphasizing space-time as a fundamental dimension. It reimagines our built environments as vibrant, evolving entities shaped by the relentless forces of time, space, and data. This groundbreaking approach heralds a future in architecture where the complexity of reality is acknowledged and embraced, leading to the creation of spaces that are both responsive to their temporal context and resilient against the unfolding tapestry of time.
Keywords: black-hole, timeline, urbanism, space and time, speculative architecture
Procedia PDF Downloads 70
2385 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research, leveraging existing high performance computing resources and analysis techniques currently available or being developed. It builds these into The Ark, an open-source web-based system designed to manage medical data. SPARK provides a next-generation biomedical data management solution that is based upon a novel Micro-Service architecture and Big Data technologies. The system serves to demonstrate the applicability of Micro-Service architectures for the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e., importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets and enabling cutting edge analysis approaches that have previously been out of reach for many medical researchers.
Keywords: biomedical research, genomics, information systems, software
Procedia PDF Downloads 269
2384 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)-Graphics Processing Unit (GPU) Heterogeneous Computing
Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou
Abstract:
The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, the embedded system-on-a-chip (SoC) contains a coarse-granularity multi-core CPU (central processing unit) and a mobile GPU (graphics processing unit) that can be used as general-purpose accelerators. The motivation is that algorithms of various parallel characteristics can be efficiently mapped to the heterogeneous architecture coupled with these three processors. The CPU and GPU offload part of the computationally intensive tasks from the FPGA to reduce the resource consumption and lower the overall cost of the system. However, in present common scenarios, applications always utilize only one type of accelerator because development approaches that support the collaboration of heterogeneous processors face challenges. Therefore, a systematic approach is needed that takes advantage of write-once-run-anywhere portability and high execution performance of the modules mapped to various architectures, and that facilitates the exploration of the design space. In this paper, a servant-execution-flow model is proposed for the abstraction of the cooperation of the heterogeneous processors, which supports task partition, communication and synchronization. At its first run, the intermediate language represented by the data flow diagram can generate the executable code of the target processor or can be converted into high-level programming languages. The instantiation parameters efficiently control the relationship between the modules and computational units, including two hierarchical processing-unit mappings and the adjustment of data-level parallelism. An embedded system of a three-dimensional waveform oscilloscope is selected as a case study. The performance of algorithms such as contrast stretching is analyzed with implementations on various combinations of these processors. The experimental results show that the heterogeneous computing system, with less than 35% of the resources, achieves performance similar to the pure FPGA implementation and approximately equal energy efficiency.
Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation
Procedia PDF Downloads 116
2383 Cloud Design for Storing Large Amount of Data
Authors: M. Strémy, P. Závacký, P. Cuninka, M. Juhás
Abstract:
The main goal of this paper is to introduce our design of a private cloud for storing large amounts of data, especially pictures, and to provide a good technological backend for data analysis based on parallel processing and business intelligence. We have tested hypervisors, cloud management tools, storage for all data, and Hadoop to provide data analysis on unstructured data. Providing high availability, virtual network management, logical separation of projects, and rapid deployment of physical servers to our environment was also needed.
Keywords: cloud, glusterfs, hadoop, juju, kvm, maas, openstack, virtualization
Procedia PDF Downloads 351
2382 Time, Uncertainty, and Technological Innovation
Authors: Xavier Everaert
Abstract:
Ever since the publication of “The Problem of Social Cost”, Coasean insights on externalities, transaction costs, and the reciprocal nature of harms have been widely debated. What has been largely neglected, however, is the role of technological innovation in the mitigation of negative externalities or transaction costs. Incorporating future uncertainty about negligence standards or expected restitution costs, and the profit opportunities these uncertainties reveal to entrepreneurs, allows us to frame problems regarding social costs within the reality of rapid technological evolution.
Keywords: environmental law and economics, entrepreneurship, commons, pollution, wildlife
Procedia PDF Downloads 419
2381 Quality is the Matter of All
Authors: Mohamed Hamza, Alex Ohoussou
Abstract:
At JAWDA, our primary focus is on ensuring the satisfaction of our clients worldwide. We are committed to delivering new features on our SaaS platform as quickly as possible while maintaining high-quality standards. In this paper, we highlight two key aspects of testing that represent an evolution of current methods and a potential trend for the future, which have enabled us to uphold our commitment effectively. These aspects are: "One Sandbox per Pull Request" (dynamic test environments instead of static ones) and "QA for All".
Keywords: QA for all, dynamic sandboxes, QAOPS, CICD, continuous testing, all testers, QA matters for all, 1 sandbox per PR, utilization rate, coverage rate
Procedia PDF Downloads 28
2380 Ant-Tracking Attribute: A Model for Understanding Production Response
Authors: Prince Suka Neekia Momta, Rita Iheoma Achonyeulo
Abstract:
The Ant Tracking seismic attribute, applied over a 4-second seismic volume, revealed structural features triggered by clay diapirism, growth fault development, rapid deltaic sedimentation and intense drilling. The attribute was extracted on vertical seismic sections and time slices. Mega tectonic structures such as growth faults and clay diapirs are visible on vertical sections with obscured minor lineaments or fractures. Fractures are distinctively visible on time slices, yielding recognizable patterns that corroborate established geologic models. This seismic attribute model enabled the understanding of fluid flow characteristics and production responses. Three structural patterns recognized in the field include: major growth faults, minor faults or lineaments, and networks of fractures. Three growth faults mapped on the seismic section form major deformation bands delimiting the area into three blocks or depocenters. The growth faults trend E-W, dip down-to-south in the basin direction, and cut across the study area. The faults, initiating from about 2000 ms, extended up to 500 ms and tend to progress parallel and opposite to the growth direction of an upsurging diapiric structure. The diapiric structures form the major deformational bands originating from great depths (below 2000 ms) and rising to about 1200 ms, where series of sedimentary layers onlap and pinch out stratigraphically against the diapir. Several other secondary faults or lineaments that form parallel streaks to one another also accompanied the growth faults. The fracture networks have no particular trend but form a network surrounding the well area. Faults identified in the study area have potential for structural hydrocarbon traps, whereas the presence of fractures created a fractured-reservoir condition that enhanced rapid fluid flow, especially of water. High aquifer flow potential, aided by possible fracture permeability, resulted in a rapid decline in oil rate. Through the application of the Ant Tracking attribute, it is possible to obtain detailed interpretation of structures that can have a direct influence on oil and gas production.
Keywords: seismic, attributes, production, structural
Procedia PDF Downloads 68
2379 Learning Recomposition after the Remote Period with Finalist Students of the Technical Course in the Environment of the IFPA, Paragominas Campus, Pará State, Brazilian Amazon
Authors: Liz Carmem Silva-Pereira, Raffael Alencar Mesquita Rodrigues, Francisco Helton Mendes Barbosa, Emerson de Freitas Ferreira
Abstract:
Due to the Covid-19 pandemic declared in March 2020 by the World Health Organization, the way of social coexistence across the planet was affected, especially in educational processes, with the implementation of the remote modality as a teaching strategy. This teaching-learning modality caused a change in the routine and learning of basic education students, which resulted in serious consequences for the return to face-to-face teaching in 2021. In 2022, finalist students of the technical course in the environment at the Federal Institute of Education, Science and Technology of Pará (IFPA) – Campus Paragominas had their training process severely affected, having studied the initial half of their training in the remote modality, which compromised the carrying out of practical classes, technical visits and field classes, essential for the students' formation as environmental technicians. With the objective of promoting the recomposition of these students' learning after returning to the face-to-face modality, an educational strategy was developed in the last period of the course. Research as an educational principle, the integrative project, and parallel recovery were used as teaching methodologies, applied jointly, aiming at recomposing the basic knowledge of the natural sciences together with the technical knowledge of the environmental area applied to the course. The project assisted 58 finalist students of the environmental technical course. A research instrument was elaborated with environmental quality evaluation parameters for study at 19 collection points in the Uraim River urban hydrographic basin, Paragominas City – Pará – Brazilian Amazon. Students were separated into groups under the professors' and laboratory assistants' orientation, and in the field, they observed and evaluated the places' environmental conditions and collected physical data and water samples, which were taken to the chemistry and biology laboratories at Campus Paragominas for further analysis. With the results obtained, each group prepared a technical report on the environmental conditions of each evaluated point. This work methodology enabled the practical application of theoretical knowledge received in various disciplines during the remote teaching modality, contemplating the integration of knowledge, people, skills, and abilities for the best technical training of the finalist students. At the end of the activity, the satisfaction of the students involved in the project was evaluated through a form, with the signing of an informed consent term, using a Likert scale as the evaluation parameter. The results obtained in the satisfaction survey were: on the use of research projects within the disciplines attended, 82% satisfaction; regarding the revision of contents in the execution of the project, 84% satisfaction; regarding the acquired field experience, 76.9% satisfaction; regarding the laboratory experience, 86.2% satisfaction; and regarding the use of this methodology as parallel recovery, 71.8% satisfaction. In addition to the excellent performance of the students in acquiring knowledge, it was possible to remedy the deficiencies caused by the absence of practical classes, technical visits, and field classes during the remote teaching modality, fulfilling the desired educational recomposition.
Keywords: integrative project, parallel recovery, research as an educational principle, teaching-learning
Procedia PDF Downloads 63
2378 Transverse Momentum Dependent Factorization and Evolution for Spin Physics
Authors: Bipin Popat Sonawane
Abstract:
After the 1988 announcement by the European Muon Collaboration (EMC) of its measurement of the spin-dependent structure function, the need to understand the spin structure of the hadron became apparent. In the study of the three-dimensional spin structure of a proton, we need to understand the foundation of quantum field theory in terms of the electroweak and strong theories, using rigorous mathematical theories and models. In the process of understanding the inner dynamical structure of the proton, we need to understand the mathematical formalism of perturbative quantum chromodynamics (pQCD). In QCD processes like proton-proton collisions at high energy, we calculate cross sections using conventional collinear factorization schemes. In these calculations, parton distribution functions (PDFs) and fragmentation functions (FFs) are used, which provide information about the probability density of finding quarks and gluons (partons) inside the proton and the probability density of finding the final hadronic state from the initial partons. Transverse momentum dependent (TMD) PDFs and FFs, collectively called TMDs, account for the intrinsic transverse motion of partons. TMD factorization in the calculation of cross sections provides a scheme of hadronic and partonic states in the given QCD process. In this study, we review the TMD factorization scheme using the Collins-Soper-Sterman (CSS) formalism. The CSS formalism considers the transverse momentum dependence of the partons; in this formalism, the cross section is written as a Fourier transform over a transverse position variable which has a physical interpretation as the impact parameter. Along with this, we compare this formalism with the improved CSS formalism. In this work, we study the TMD evolution schemes and their comparison with other schemes. This would provide a description of the process of measurement of the transverse single spin asymmetry (TSSA) in hadro-production and electro-production of the J/psi meson at RHIC, LHC and ILC energy scales. This would surely help us to understand the J/psi production mechanism, which is an appropriate test of QCD.
Procedia PDF Downloads 69
2377 Modified Montgomery for RSA Cryptosystem
Authors: Rupali Verma, Maitreyee Dutta, Renu Vig
Abstract:
Encryption and decryption in RSA are done by modular exponentiation, which is achieved by repeated modular multiplication. Hence, the efficiency of modular multiplication directly determines the efficiency of the RSA cryptosystem. This paper designs a modified Montgomery modular multiplication in which the addition of operands is computed by a 4:2 compressor. The basic logic operations in the addition are partitioned over two iterations such that parallel computations are performed. This reduces the critical path delay of the proposed Montgomery design. The proposed design and RSA are implemented on Virtex 2 and Virtex 5 FPGAs. The two factors, partitioning and parallelism, have improved the frequency and throughput of the proposed design.
Keywords: RSA, montgomery modular multiplication, 4:2 compressor, FPGA
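The abstract's contribution is a hardware redesign of the adder stage (the 4:2 compressor), which it does not detail; purely as background, the following Python sketch shows the Montgomery reduction and the square-and-multiply exponentiation it accelerates. The modulus, exponent, and word size below are illustrative values, not taken from the paper.

```python
def montgomery_setup(n, k):
    # R = 2^k must satisfy R > n, with n odd so that gcd(R, n) = 1
    r = 1 << k
    n_prime = -pow(n, -1, r) % r      # n' such that n * n' ≡ -1 (mod R)
    return r, n_prime

def redc(t, n, r, n_prime, k):
    # Montgomery reduction: returns t * R^(-1) mod n with no division by n
    m = ((t & (r - 1)) * n_prime) & (r - 1)   # (t mod R) * n' mod R
    u = (t + m * n) >> k                      # exact division by R
    return u - n if u >= n else u

def mont_mul(a_bar, b_bar, n, r, n_prime, k):
    # modular product of two Montgomery-domain operands
    return redc(a_bar * b_bar, n, r, n_prime, k)

def mod_exp(base, exp, n, k=16):
    # square-and-multiply exponentiation, the core of RSA encryption/decryption
    r, n_prime = montgomery_setup(n, k)
    result = r % n                     # 1 mapped into the Montgomery domain
    b_bar = (base * r) % n             # base mapped into the Montgomery domain
    while exp:
        if exp & 1:
            result = mont_mul(result, b_bar, n, r, n_prime, k)
        b_bar = mont_mul(b_bar, b_bar, n, r, n_prime, k)
        exp >>= 1
    return redc(result, n, r, n_prime, k)   # map the result back out

assert mod_exp(7, 13, 23459) == pow(7, 13, 23459)   # small illustrative check
```

The point of the reduction is visible in `redc`: the only operations are multiplications, masks, and a shift by k, which is why the multiplier's adder tree (the target of the paper's 4:2-compressor optimization) dominates the critical path.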
Procedia PDF Downloads 411
2376 Heat and Mass Transfer of an Oscillating Flow in a Porous Channel with Chemical Reaction
Authors: Zahra Neffah, Henda Kahalerras
Abstract:
A numerical study is made in a parallel-plate porous channel subjected to an oscillating flow and an exothermic chemical reaction on its walls. The flow field in the porous region is modeled by the Darcy–Brinkman–Forchheimer model, and the finite volume method is used to solve the governing equations. The effects of the modified Frank-Kamenetskii (FKm) and Damköhler (Dm) numbers, the amplitude of oscillation (A), and the Strouhal number (St) are examined. The main results show an increase of heat and mass transfer rates with A and St, and their decrease with FKm and Dm.
Keywords: chemical reaction, heat and mass transfer, oscillating flow, porous channel
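The abstract cites the Darcy–Brinkman–Forchheimer model without writing it out; one commonly used form of the momentum equation (the paper's exact formulation, including the reaction coupling, may differ) is

\[
\frac{\rho_f}{\varepsilon}\left[\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\frac{\mathbf{u}}{\varepsilon}\right] = -\nabla p + \mu_{\mathrm{eff}}\nabla^{2}\mathbf{u} - \frac{\mu}{K}\mathbf{u} - \frac{\rho_f C_F}{\sqrt{K}}\,\lvert\mathbf{u}\rvert\,\mathbf{u},
\]

where \(\mathbf{u}\) is the velocity, \(\varepsilon\) the porosity, \(K\) the permeability, \(\mu_{\mathrm{eff}}\) the effective (Brinkman) viscosity, and \(C_F\) the Forchheimer inertial coefficient; the last two terms are the Darcy and Forchheimer drag contributions that distinguish porous-medium flow from the plain Navier–Stokes equations.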
Procedia PDF Downloads 411
2375 Proposal of a Rectenna Built by Using Paper as a Dielectric Substrate for Electromagnetic Energy Harvesting
Authors: Ursula D. C. Resende, Yan G. Santos, Lucas M. de O. Andrade
Abstract:
The recent and fast development of the internet, wireless and telecommunication technologies, and low-power electronic devices has led to an expressive amount of electromagnetic energy available in the environment and to the expansion of smart applications technology. These applications have been used in Internet of Things devices and in 4G and 5G solutions. The main feature of this technology is the use of wireless sensors. Although these sensors are low-power loads, their use imposes huge challenges in terms of an efficient and reliable power supply that avoids the traditional battery. Radio frequency based energy harvesting technology is especially suitable for wirelessly powering sensors by using a rectenna, since it can be completely integrated into the distributed hosting sensor structure, reducing its cost, maintenance and environmental impact. The rectenna is an equipment composed of an antenna and a rectifier circuit. The antenna function is to collect as much radio frequency radiation as possible and transfer it to the rectifier, which is a nonlinear circuit that converts the very low input radio frequency energy into direct current voltage. In this work, a set of rectennas, mounted on a paper substrate, which can be used for the inner coating of buildings and simultaneously harvest electromagnetic energy from the environment, is proposed. Each proposed individual rectenna is composed of a 2.45 GHz patch antenna and a voltage doubler rectifier circuit, built on the same paper substrate. The antenna contains a rectangular radiator element and a microstrip transmission line that was designed and optimized by using the Computer Simulation Technology (CST) software in order to obtain values of the S11 parameter below -10 dB at 2.45 GHz. In order to increase the amount of harvested power, eight individual rectennas, incorporating metamaterial cells, were connected in parallel, forming a system denominated the Electromagnetic Wall (EW). In order to evaluate the EW performance, it was positioned at a variable distance from an internet router, and a 27 kΩ resistive load was fed. The results obtained showed that if more than one rectenna is associated in parallel, an adequate power level can be achieved in order to feed very low consumption sensors. The 0.12 m² EW proposed in this work was able to harvest 0.6 mW from the environment. It was also observed that the use of metamaterial structures provides an expressive growth in the amount of electromagnetic energy harvested, which was increased from 0.2 mW to 0.6 mW.
Keywords: electromagnetic energy harvesting, metamaterial, rectenna, rectifier circuit
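The abstract quotes harvested power levels but not the link budget behind them; as a rough upper bound, the power available to a single rectenna in free space follows the standard Friis equation (an idealization that neglects polarization mismatch, multipath, and rectifier conversion loss):

\[
P_r = P_t\, G_t\, G_r \left(\frac{\lambda}{4\pi d}\right)^{2},
\]

where \(P_t\) and \(G_t\) are the transmitter (here, the router) power and gain, \(G_r\) is the rectenna gain, \(d\) is the separation, and \(\lambda \approx 12.2\ \text{cm}\) at 2.45 GHz. The rapid \(1/d^2\) fall-off is why parallel association of several rectennas, as in the proposed EW, is needed to reach usable power levels.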
Procedia PDF Downloads 165
2374 Enhancement Effect of Electromagnetic Field on Separation of Edible Oil from Oil-Water Emulsion
Authors: Olfat A. Fadali, Mohamed S. Mahmoud, Omnia H. Abdelraheem, Shimaa G. Mohammed
Abstract:
The effect of an electromagnetic field (EMF) on the removal of edible oil from an oil-in-water emulsion by means of electrocoagulation was investigated in a rectangular batch electrochemical cell with DC current. Iron (Fe) plate anodes and stainless steel cathodes were employed as electrodes. The effects of different magnetic field intensities (1.9, 3.9 and 5.2 tesla), three different positions of the EMF (below, perpendicular and parallel to the electrocoagulation cell), as well as operating time, were investigated. The application of the electromagnetic field (5.2 tesla) raises the percentage of oil removal from 72.4% for traditional electrocoagulation to 90.8% after 20 min.
Keywords: electrocoagulation, electromagnetic field, oil-water emulsion, edible oil
Procedia PDF Downloads 530
2373 Temperature Investigations in Two Types of Crimped Connections Using Experimental Determinations
Authors: C. F. Ocoleanu, A. I. Dolan, G. Cividjian, S. Teodorescu
Abstract:
In this paper, we make temperature investigations in two types of superposed crimped connections using experimental determinations. All the samples use 8 copper wires of 7.1 x 3 mm², crimped by two methods: the first method uses one crimp indent, and the second is a proposed method with two crimp indents. The ferrule is a parallel one. We study the influence of the number and position of the crimp indents. The samples are heated with A.C. current at different current values until a steady-state heating regime is reached. After obtaining the temperature values, we compare them and present the conclusions.
Keywords: crimped connections, experimental determinations, temperature, heat transfer
Procedia PDF Downloads 269
2372 Evolution of Floating Photovoltaic System Technology and Future Prospect
Authors: Young-Kwan Choi, Han-Sang Jeong
Abstract:
The floating photovoltaic system is a technology that combines photovoltaic power generation with a floating structure. However, since floating technology has not been widely utilized in photovoltaic generation, there are no standardized criteria; it is separately developed and used by different installation bodies. This paper aims to discuss the evolution of floating photovoltaic system technology based on examples of floating photovoltaic systems installed in Korea.
Keywords: floating photovoltaic system, floating PV installation, ocean floating photovoltaic system, tracking type floating photovoltaic system
Procedia PDF Downloads 559
2371 Error Estimation for the Reconstruction Algorithm with Fan Beam Geometry
Authors: Nirmal Yadav, Tanuja Srivastava
Abstract:
Shannon theory is an exact method to recover a band-limited signal from its sampled values in a discrete implementation, using sinc interpolators. But sinc-based results are not very satisfactory for band-limited calculations, so convolution with a window function having compact support has been introduced. The convolution backprojection algorithm with a window function is an approximation algorithm. In this paper, the error that arises due to the approximate nature of the reconstruction algorithm has been calculated. This result is derived for fan beam projection data, which is faster than parallel beam projection.
Keywords: computed tomography, convolution backprojection, radon transform, fan beam
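The abstract does not restate the reconstruction formula whose error is analyzed; for reference, windowed convolution backprojection for parallel-beam data takes the form

\[
f(x,y) \;\approx\; \int_{0}^{\pi}\!\int_{-\infty}^{\infty} P_{\theta}(t)\, h_{w}\!\left(x\cos\theta + y\sin\theta - t\right)\, dt\, d\theta,
\]

where \(P_{\theta}\) is the projection (Radon transform) at angle \(\theta\) and \(h_{w}\) is the ramp filter tapered by the compactly supported window \(w\). The fan-beam version treated in the paper follows the same convolve-then-backproject structure, with an additional distance-dependent weighting of the projection data.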
Procedia PDF Downloads 490
2370 RA-Apriori: An Efficient and Faster MapReduce-Based Algorithm for Frequent Itemset Mining on Apache Flink
Authors: Sanjay Rathee, Arti Kashyap
Abstract:
Extraction of useful information from large datasets is one of the most important research problems. Association rule mining is one of the best methods for this purpose. Finding possible associations between items in large transaction-based datasets (finding frequent patterns) is the most important part of association rule mining. There exist many algorithms to find frequent patterns, but the Apriori algorithm always remains a preferred choice due to its ease of implementation and natural tendency to be parallelized. Many single-machine-based Apriori variants exist, but the massive amount of data available these days is beyond the capacity of a single machine. Therefore, to meet the demands of this ever-growing huge data, there is a need for a multiple-machine-based Apriori algorithm. For these types of distributed applications, MapReduce is a popular fault-tolerant framework. Hadoop is one of the best open-source software frameworks with the MapReduce approach for distributed storage and distributed processing of huge datasets using clusters built from commodity hardware. However, the heavy disk I/O operation at each iteration of a highly iterative algorithm like Apriori makes Hadoop inefficient. A number of MapReduce-based platforms have been developed for parallel computing in recent years. Among them, two platforms, namely Spark and Flink, have attracted a lot of attention because of their inbuilt support for distributed computations. Earlier, we proposed a Reduced-Apriori algorithm on the Spark platform which outperforms parallel Apriori, first because of the use of Spark and second because of the improvement we proposed to standard Apriori. Therefore, this work is a natural sequel of our work and targets implementing, testing and benchmarking Apriori, Reduced-Apriori and our new algorithm ReducedAll-Apriori on Apache Flink, and compares them with the Spark implementation. Flink, a streaming dataflow engine, overcomes the disk I/O bottlenecks in MapReduce, providing an ideal platform for distributed Apriori. Flink's pipelining-based structure allows starting the next iteration as soon as partial results of an earlier iteration are available; therefore, there is no need to wait for all reducers' results to start the next iteration. We conduct in-depth experiments to gain insight into the effectiveness, efficiency and scalability of the Apriori and RA-Apriori algorithms on Flink.
Keywords: apriori, apache flink, MapReduce, spark, Hadoop, R-Apriori, frequent itemset mining
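For readers unfamiliar with the base algorithm, the minimal single-machine Apriori below (Python) shows the join/prune/count loop that the distributed variants parallelize. The transactions and support threshold are illustrative, and the sketch deliberately omits the reduction tricks of R-Apriori and RA-Apriori as well as the Flink dataflow itself.

```python
from itertools import combinations

def apriori(transactions, min_support):
    # returns every frequent itemset together with its support count
    transactions = [frozenset(t) for t in transactions]
    counts = {}
    for t in transactions:                    # level 1: count single items
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    frequent = {s: c for s, c in counts.items() if c >= min_support}
    all_frequent = dict(frequent)
    k = 2
    while frequent:
        prev = list(frequent)
        # join step: union pairs of frequent (k-1)-itemsets into k-itemsets
        candidates = {a | b for a in prev for b in prev if len(a | b) == k}
        # prune step: every (k-1)-subset of a candidate must itself be frequent
        candidates = {c for c in candidates
                      if all(frozenset(s) in frequent for s in combinations(c, k - 1))}
        # count step: one full pass over the transactions per level
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {s: n for s, n in counts.items() if n >= min_support}
        all_frequent.update(frequent)
        k += 1
    return all_frequent

txns = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
print(apriori(txns, min_support=3))   # all singletons and pairs, but not {a, b, c}
```

The one-pass-per-level structure is exactly what makes disk-backed MapReduce expensive and pipelined engines like Flink attractive: each level can begin as soon as the previous level's frequent sets start arriving.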
Procedia PDF Downloads 294
2369 Evaluation of Teaching Team Stress Factors in Two Engineering Education Programs
Authors: Kari Bjorn
Abstract:
Team learning has been studied and modeled with the double-loop model and its variations. Also, metacognition has been suggested as a concept to describe the nature of team learning as being more than a simple sum of the individual learning of the team members. Team learning has a positive correlation with both the individual motivation of its members and the collective factors within the team. The team learning of previously very independent members of two teaching teams is analyzed. Applied science universities are training future professionals with ever more diversified and multidisciplinary skills. The size of the units of teaching and learning is increasingly larger for several reasons. First, multidisciplinary skill development requires more active learning and richer learning environments and learning experiences; this occurs in student teams. Secondly, the teaching of multidisciplinary skills requires multidisciplinary, team-based teaching from the teachers as well. Team formation phases have been identified and are widely accepted. Team role stress has been analyzed in project teams, which typically have a well-defined goal and organization. This paper explores the team stress of two teacher teams in two course units of engineering education running in parallel. The first is Industrial Automation Technology and the second is Development of Medical Devices. The courses have separate student groups and are on different campuses. Both are run in parallel within an 8-week period. Both of them are taught by a group of four teachers with several years of teaching experience, but individually. The team role stress scale survey is administered to both teaching groups at the beginning and at the end of the course. The inventory of questions covers the factors of ambiguity, conflict, quantitative role overload and qualitative role overload. Some comparison to the study on project teams can be drawn. The team development stage of the two teaching groups is different. Relating the team role stress factors to the development stage of the group can reveal the potential of management actions to promote team building and help understand the maturity of functional and well-established teams. Mature teams indicate higher job satisfaction and deliver higher performance. In particular, teaching teams, who deliver the highly intangible results of learning outcomes, are sensitive to issues of job satisfaction and team conflict. Because team teaching is increasing, the paper provides a review of the relevant theories and initial comparative and longitudinal results of the team role stress factors applied to teaching teams.
Keywords: engineering education, stress, team role, team teaching
Procedia PDF Downloads 224
2368 Using Photogrammetric Techniques to Map the Mars Surface
Authors: Ahmed Elaksher, Islam Omar
Abstract:
For many years, the Mars surface has been a mystery for scientists. Lately, with the help of geospatial data and photogrammetric procedures, researchers were able to capture some insights about this planet. Two of the most imperative data sources to explore Mars are the High Resolution Imaging Science Experiment (HiRISE) and the Mars Orbiter Laser Altimeter (MOLA). HiRISE is one of six science instruments carried by the Mars Reconnaissance Orbiter, launched August 12, 2005, and managed by NASA. The MOLA sensor is a laser altimeter carried by the Mars Global Surveyor (MGS) and launched on November 7, 1996. In this project, we used MOLA-based DEMs to orthorectify HiRISE optical images for generating a more accurate and trustworthy surface of Mars. The MOLA data was interpolated using the kriging interpolation technique. Corresponding tie points were digitized from both datasets and employed in co-registering the datasets using GIS analysis tools. We employed three different 3D-to-2D transformation models: the parallel projection (3D affine) transformation model, the extended parallel projection transformation model, and the Direct Linear Transformation (DLT) model. The digitized tie points were split into two sets: Ground Control Points (GCPs), used to estimate the transformation parameters using least squares adjustment techniques, and check points (ChkPs), used to evaluate the computed transformation parameters. Results were evaluated using the RMSEs between the precise horizontal coordinates of the digitized check points and those estimated through the transformation models using the computed transformation parameters. For each set of GCPs, three different configurations of GCPs and check points were tested, and average RMSEs are reported. It was found that for the 2D transformation models, average RMSEs were in the range of five meters. Increasing the number of GCPs from six to ten points improved the accuracy of the results by about two and a half meters. Further increasing the number of GCPs didn't improve the results significantly. Using the 3D-to-2D transformation parameters provided two to three meters accuracy. Best results were reported using the DLT transformation model; however, increasing the number of GCPs didn't have a substantial effect. The results support the use of the DLT model as it provides the required accuracy for ASPRS large scale mapping standards. However, a well-distributed set of GCPs is key to providing such accuracy. The model is simple to apply and doesn't need substantial computations.
Keywords: Mars, photogrammetry, MOLA, HiRISE
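The abstract names the DLT model without its equations; the sketch below, in Python with NumPy, fits the standard 11-parameter DLT by linear least squares from GCP correspondences and evaluates check-point RMSE, mirroring the workflow described. Function and variable names are illustrative, not from the project.

```python
import numpy as np

def dlt_fit(obj_pts, img_pts):
    # estimate the 11 DLT parameters from >= 6 (3D ground, 2D image) point pairs
    A, b = [], []
    for (X, Y, Z), (x, y) in zip(obj_pts, img_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z]); b.append(x)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z]); b.append(y)
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return L

def dlt_project(L, obj_pts):
    # x = (L1 X + L2 Y + L3 Z + L4) / (L9 X + L10 Y + L11 Z + 1), likewise for y
    P = np.asarray(obj_pts, float)
    den = P @ L[8:11] + 1.0
    x = (P @ L[0:3] + L[3]) / den
    y = (P @ L[4:7] + L[7]) / den
    return np.column_stack([x, y])

def check_rmse(L, obj_chk, img_chk):
    # RMSE between measured and DLT-predicted check-point coordinates
    d = dlt_project(L, obj_chk) - np.asarray(img_chk, float)
    return float(np.sqrt((d ** 2).sum(axis=1).mean()))
```

Because each correspondence yields two linear rows, six GCPs give twelve equations for eleven unknowns, which is why the abstract's minimum configuration starts at six points.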
Procedia PDF Downloads 56
2367 A Model for Analysis of the Induced Voltage of 115 kV On-Line Acting on Neighboring 22 kV Off-Line
Authors: Sakhon Woothipatanapan, Surasit Prakobkit
Abstract:
This paper presents a model for analyzing the induced voltage of transmission lines (energized) acting on neighboring distribution lines (de-energized). Because of environmental restrictions, 22 kV distribution lines need to be installed under 115 kV transmission lines. With the installation of two parallel circuits like this, an induced voltage is produced which can cause harm to operators. This work was performed with ATP-EMTP modeling to analyze such a phenomenon before field testing. Simulation results are used to find solutions to prevent danger to operators who are on the pole.
Keywords: transmission system, distribution system, induced voltage, off-line operation
Procedia PDF Downloads 605
2366 On the Approximate Solution of Continuous Coefficients for Solving Third Order Ordinary Differential Equations
Authors: A. M. Sagir
Abstract:
This paper derives four new schemes which are combined in order to form an accurate and efficient block method for the parallel or sequential solution of third order ordinary differential equations of the form \(y''' = f(x, y, y', y'')\), \(y(\alpha) = y_0\), \(y'(\alpha) = \beta\), \(y''(\alpha) = \mu\), with associated initial or boundary conditions. The implementation strategies of the derived method show that the block method is consistent and zero-stable, and hence convergent. The derived schemes were tested on stiff and non-stiff ordinary differential equations, and the numerical results obtained compared favorably with the exact solution.
Keywords: block method, hybrid, linear multistep, self-starting, third order ordinary differential equations
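The block schemes themselves are not reproduced in the abstract; as a conventional baseline for the same initial-value problem (not the authors' self-starting block method), the Python sketch below rewrites \(y''' = f(x, y, y', y'')\) as the first-order system \(u = (y, y', y'')\) and integrates it with classical RK4.

```python
import numpy as np

def solve_third_order(f, alpha, y0, beta, mu, h, n_steps):
    # integrate y''' = f(x, y, y', y'') with y(alpha)=y0, y'(alpha)=beta, y''(alpha)=mu
    def rhs(x, u):
        return np.array([u[1], u[2], f(x, u[0], u[1], u[2])])
    x = alpha
    u = np.array([y0, beta, mu], dtype=float)
    xs, ys = [x], [u[0]]
    for _ in range(n_steps):
        k1 = rhs(x, u)
        k2 = rhs(x + h / 2, u + h / 2 * k1)
        k3 = rhs(x + h / 2, u + h / 2 * k2)
        k4 = rhs(x + h, u + h * k3)
        u = u + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        x += h
        xs.append(x)
        ys.append(u[0])
    return np.array(xs), np.array(ys)

# illustrative problem: y''' = -y with y(0) = 1, y'(0) = y''(0) = 0
xs, ys = solve_third_order(lambda x, y, yp, ypp: -y, 0.0, 1.0, 0.0, 0.0, 0.01, 500)
```

A block method advances several grid points per step simultaneously rather than one at a time, which is what makes the authors' schemes amenable to the parallel solution mentioned in the abstract.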
Procedia PDF Downloads 269
2365 Growth of Droplet in Radiation-Induced Plasma of Own Vapour
Authors: P. Selyshchev
Abstract:
A theoretical approach is developed to describe the change of drops in an atmosphere of their own steam and buffer gas under irradiation. It is shown that the irradiation influences the size of the stable droplet and the conditions under which the droplet exists. Under irradiation, the change of the drop becomes more complex: a non-monotonic and periodic change of the drop size becomes possible. All possible solutions are represented by means of a phase portrait. All qualitatively different phase portraits are found as functions of the critical parameters: the cluster generation rate and the substance density.
Keywords: irradiation, steam, plasma, cluster formation, liquid droplets, evolution
Procedia PDF Downloads 439
2364 Chemotrophic Signal Exchange between the Host Plant Helianthemum sessiliflorum and Terfezia boudieri
Authors: S. Ben-Shabat, T. Turgeman, O. Leubinski, N. Roth-Bejerano, V. Kagan-Zur, Y. Sitrit
Abstract:
The ectomycorrhizal (ECM) desert truffle Terfezia boudieri produces edible fruit bodies and forms symbiosis with its host plant Helianthemum sessiliflorum (Cistaceae) in the Negev desert of Israel. The symbiosis is vital for both partners' survival under desert conditions. Under desert habitat conditions, ECMs must form symbiosis before entering the dry season. To secure a successful encounter, in the course of evolution, both partners have responded by evolving a special signal exchange that facilitates recognition. Members of the Cistaceae family serve as host plants for many important truffles. Conceivably, during evolution a common molecule present in Cistaceae plants was recruited to facilitate a successful encounter with ectomycorrhizas. Vesicular-arbuscular (AM) fungi are promiscuous in their host preferences; in contrast, ECM fungi show specificity to host plants. Accordingly, we hypothesize that H. sessiliflorum secretes a chemotrophic signaling molecule which is common to plants hosting ECM fungi belonging to the Pezizales. However, thus far no signaling molecules have been identified in ECM fungi. We developed a bioassay for chemotrophic activity. Fractionation of root exudates revealed a substance with chemotrophic activity and a molecular mass of 534. Following the above concept, screening the transcriptome of Terfezia grown under chemoattraction discovered genes showing high homology to G protein-coupled receptors of plant pathogens involved in positive chemotaxis and chemotaxis suppression. This study aimed to identify the active molecule using analytical methods (LC-MS, NMR, etc.). This should contribute to our understanding of how ECM fungi communicate with their hosts in the rhizosphere. In line with the ability of Terfezia to also form endomycorrhizal symbiosis like AM fungi, analysis of the mechanisms may likewise be applicable to AM fungi. Developing methods to manipulate fungal growth by the chemoattractant can open new ways to improve the inoculation of plants.
Keywords: chemotrophic signal, Helianthemum sessiliflorum, Terfezia boudieri, ECM
Procedia PDF Downloads 408
2363 An Analytical Approach for the Fracture Characterization in Concrete under Fatigue Loading
Authors: Bineet Kumar
Abstract:
Many civil engineering infrastructures frequently encounter repetitive loading during their service life. Due to the inherent complexity observed in concrete, as in quasi-brittle materials, understanding the fatigue behavior of concrete still poses a challenge. Moreover, the fracture process zone characteristics ahead of the crack tip have been observed to be different under fatigue loading than in the monotonic case. Therefore, it is crucial to comprehend the energy dissipation associated with the fracture process zone (FPZ) due to repetitive loading. It is well known that stiffness degradation due to cyclic loading provides a better understanding of the fracture behavior of concrete. Under repetitive load cycles, concrete members exhibit a two-stage stiffness degradation process. Experimentally, it has been observed that the stiffness decreases initially with an increase in crack length and subsequently increases. In this work, an attempt has been made to propose an analytical expression to predict the energy dissipation, and subsequently the stiffness degradation, as a function of crack length. Three-point bend specimens have been considered in the present work to derive the formulations. In this approach, the expression for the resultant stress distribution below the neutral axis has been derived by correlating the bending stress with the cohesive stresses developed ahead of the crack tip due to the existence of the fracture process zone. This resultant stress expression is utilized to estimate the dissipated energy due to crack propagation as a function of crack length. Further, the formulation for the stiffness degradation has been developed by relating the dissipated energy to the work done. It can be used to predict the critical crack length and fatigue life. An attempt has been made to understand the influence of stress amplitude on the damage pattern by using the information on the rate of stiffness degradation. It has been demonstrated that with an increase in the stress amplitude, the damage/FPZ proceeds more in the direction of crack propagation compared to the damage in the direction parallel to the span of the beam, which causes a lower rate of stiffness degradation per incremental crack length. Further, the effect of loading frequency has been investigated in terms of stiffness degradation. Under low-frequency loading, the damage/FPZ has been found to spread more in the direction parallel to the span, in turn reducing the critical crack length and fatigue life. In such a case, a higher rate of stiffness degradation has been observed in comparison to the high-frequency loading case.
Keywords: fatigue life, fatigue, fracture, concrete
Procedia PDF Downloads 95
2362 Bioinformatics High Performance Computation and Big Data
Authors: Javed Mohammed
Abstract:
Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the crazy amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists for the first time to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates the indispensability of HPC in meeting the scientific and engineering challenges of the twenty-first century, and how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC that provides sufficient capability for evaluating or solving more limited but meaningful instances. This article also indicates solutions to optimization problems and the benefits for Big Data and computational biology. The article illustrates the current state of the art and future generations of HPC computing with Big Data.
Keywords: high performance, big data, parallel computation, molecular data, computational biology
Procedia PDF Downloads 362
2361 Amplification of Electromagnetic Pulse by Conducting Cone
Authors: E. S. Manuylovich, V. A. Astapenko, P. A. Golovinsky
Abstract:
The dispersion relation binding the propagation constant and the frequency is calculated for a silver cone. The evolution of the electric field of an ultrashort pulse during its propagation in the conical structure is considered. An increase of the electric field during pulse propagation toward the tip of the cone is observed. A reduction of the pulse duration at a certain distance is observed. The dependence of the minimum pulse duration on the initial chirp and the cone angle is investigated.
Keywords: ultrashort pulses, surface plasmon polariton, dispersion, silver cone
Procedia PDF Downloads 433
2360 Prosthesis Design for Bilateral Hip Disarticulation Management
Authors: Mauricio Plaza, Willian Aperador
Abstract:
Hip disarticulation is an amputation through the hip joint capsule, removing the entire lower extremity, with closure of the remaining musculature over the exposed acetabulum. Tumors of the distal and proximal femur have been treated by total femur resection; a hip disarticulation is sometimes performed for massive trauma with crush injuries to the lower extremity. This article discusses the design of a system for the rehabilitation of a patient with bilateral hip disarticulations. The prosthetics designed allowed the patient to walk with a natural gait, suspended between parallel articulated crutches, with the body weight supported between the crutches. The care of this patient was a challenge due to the bilateral amputations at such a high level and the patient's special mobility needs.
Keywords: amputation, prosthesis, mobility, hemipelvectomy
Procedia PDF Downloads 412
2359 The Origins of Representations: Cognitive and Brain Development
Authors: Athanasios Raftopoulos
Abstract:
In this paper, an attempt is made to explain the evolution or development of the human representational arsenal from its humble beginnings to its modern abstract symbols. Representations are physical entities that represent something else. To represent a thing (in a general sense of “thing”) means to use, in the mind or in an external medium, a sign that stands for it. The sign can be used as a proxy for the represented thing when the thing is absent. Representations come in many varieties, from signs that perceptually resemble their representata to abstract symbols that are related to their representata through conventions. Relying on the distinction among indices, icons, and symbols, it is explained how symbolic representations gradually emerged from indices and icons. To understand the development or evolution of our representational arsenal, the development of the cognitive capacities that enabled the gradual emergence of representations of increasing complexity and expressive capability should be examined. The examination of these factors should rely on a careful assessment of the available empirical neuroscientific and paleo-anthropological evidence. These pieces of evidence should be synthesized to produce arguments whose conclusions provide clues concerning the developmental process of our representational capabilities. The analysis of the empirical findings in this paper shows that Homo erectus was able to use both icons and symbols. Icons were used as external representations, while symbols were used in language. The first step in the emergence of representations is that a purely causal sensory-motor schema involved in indices is decoupled from its normal causal sensory-motor functions and serves as a representation of the object that initially called it into play. Sensory-motor schemas are tied to specific contexts of the organism-environment interactions and are activated only within these contexts. For a representation of an object to be possible, this schema must be de-contextualized so that the same object can be represented in different contexts; a decoupled schema loses its direct ties to reality and becomes mental content. The analysis suggests that symbols emerged due to selection pressures of the social environment. The need to establish and maintain social relationships in ever-enlarging groups that would benefit the group was a sufficient environmental pressure to lead to the appearance of the symbolic capacity. Symbols could serve this need because they can express abstract relationships, such as marriage or monogamy. Icons, by being firmly attached to what can be observed, could not go beyond surface properties to express abstract relations. The cognitive capacities that are required for having iconic and then symbolic representations were present in Homo erectus, which had a language that started without syntactic rules but was structured so as to mirror the structure of the world. This language became increasingly complex, and grammatical rules started to appear to allow for the construction of more complex expressions required to keep up with the increasing complexity of social niches. This created evolutionary pressures that eventually led to increasing cranial size and a restructuring of the brain that allowed more complex representational systems to emerge.
Keywords: mental representations, iconic representations, symbols, human evolution
Procedia PDF Downloads 54
2358 The State of Urban Neighbourhood Research
Authors: Gideon Baffoe
Abstract:
The concept of neighbourhood remains highly relevant in urban studies. However, until now, no attempt has been made to statistically chart the field. This study aims to provide a macroscopic overview, using bibliometric analysis, of the main characteristics of neighbourhood research in order to understand the academic landscape. The study analyses the emergence and evolution of the concept of neighbourhood in published research, its conceptual and intellectual structures, as well as scholarly collaboration. It is found that topics related to the local economy of neighbourhoods are sparse, suggesting a major gap in the literature.
Keywords: neighbourhood, global south, bibliometric analysis, scholarship
Procedia PDF Downloads 134
2357 Opportunities and Challenges: Tracing the Evolution of India's First State-led Curriculum-based Media Literacy Intervention
Authors: Ayush Aditya
Abstract:
In today's digitised world, the extent of an individual's social involvement is largely determined by their interaction over the internet. The internet has emerged as a primary source of information consumption and a reliable medium for receiving updates on everyday activities. Owing to this change in the information consumption pattern, the internet has also emerged as a hotbed of misinformation. Experts are of the view that media literacy has emerged as one of the most effective strategies for addressing the issue of misinformation. This paper aims to study the evolution of the Kerala government's media literacy policy, its implementation strategy, challenges and opportunities. The objective of this paper is to create a conceptual framework containing details of the implementation strategy based on the Kerala model. Extensive secondary research of literature, newspaper articles, and other online sources was carried out to locate the timeline of this policy. This was followed by semi-structured interview discussions with government officials from Kerala to trace the origin and evolution of this policy. Preliminary findings based on the collected data suggest that this policy is a case of policy by chance, as the officer who headed this policy during the state-level implementation was the one who had already piloted a media literacy program in a district called Kannur as the district collector. Through this paper, an attempt is made to trace the history of the media literacy policy starting from the Kannur intervention in 2018, which was started to address the issue of vaccine hesitancy around measles-rubella (MR) vaccination. If not for the vaccine hesitancy, this program would not have been rolled out in Kannur. Interviews with government officials suggest that when authorities decided to take up this initiative in 2020, the huge amount of misinformation emerging during the COVID-19 pandemic was the trigger. There was misinformation regarding government orders, healthcare facilities, vaccination, and lockdown regulations, which affected everyone, unlike the case of Kannur, where it affected only a certain age group of kids. As a solution to this problem, the state government decided to create a media literacy curriculum to be taught in all government schools of the state, starting from standard 8 till graduation. This was a tricky task, as a new course had to be immediately introduced in the school curriculum amid all the disruptions in the education system caused by the pandemic. It was revealed during the interviews that in the case of the state-wide implementation, every step involved multiple checks and balances, unlike the earlier program, where stakeholders were roped in as and when the need emerged. On the pedagogy, while the training during the pilot could be managed through PowerPoint presentations, designing a state-wide curriculum involved multiple iterations and expert approvals. The reason for this is that COVID-19-related misinformation has lost its significance. In the next phase of the research, an attempt will be made to compare other aspects of the pilot implementation with the state-wide implementation.
Keywords: media literacy, digital media literacy, curriculum based media literacy intervention, misinformation
Procedia PDF Downloads 91