Search results for: STEP fault

2819 The Effectiveness of Environmental Policy Instruments for Promoting Renewable Energy Consumption: Command-and-Control Policies versus Market-Based Policies

Authors: Mahmoud Hassan

Abstract:

Understanding the impact of market- and non-market-based environmental policy instruments on renewable energy consumption (REC) is crucial for the design and choice of policy packages. This study aims to empirically investigate the effect of the environmental policy stringency index (EPS) and its components on REC in 27 OECD countries over the period from 1990 to 2015, and then use the results to identify what an appropriate environmental policy mix should look like. Relying on the two-step system GMM estimator, we provide evidence that increasing environmental policy stringency as a whole promotes renewable energy consumption in these 27 developed economies. Moreover, policymakers are able, through market- and non-market-based environmental policy instruments, to increase the use of renewable energy. However, not all of these instruments are effective for achieving this goal. The results indicate that R&D subsidies and trading schemes have a positive and significant impact on REC, while taxes, feed-in tariffs and emission standards do not have a significant effect. Furthermore, R&D subsidies are more effective than trading schemes for stimulating the use of clean energy. These findings proved to be robust across the three alternative panel techniques used.
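
As an illustration of the estimator named above, the following Python sketch implements a generic two-step GMM fit of a linear model with instruments; it is a simplified stand-in, not the authors' system GMM specification for the dynamic panel, and the simulated variables are hypothetical.

```python
# Minimal two-step GMM for a linear model y = X b + u with instruments Z.
# Illustrative only: the study uses a system GMM estimator for a linear
# dynamic panel, which builds its instrument set from lagged levels/differences.
import numpy as np

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=(n, 3))                    # hypothetical instruments
x = z @ np.array([0.6, 0.3, 0.1]) + rng.normal(scale=0.5, size=n)
y = 1.0 + 0.8 * x + rng.normal(scale=1.0, size=n)

X = np.column_stack([np.ones(n), x])           # regressors incl. intercept
Z = np.column_stack([np.ones(n), z])           # instrument matrix

def gmm_step(X, Z, y, W):
    """b = (X'Z W Z'X)^-1 X'Z W Z'y for a given weight matrix W."""
    XZ, Zy = X.T @ Z, Z.T @ y
    return np.linalg.solve(XZ @ W @ XZ.T, XZ @ W @ Zy)

# Step 1: 2SLS-type weight matrix (Z'Z)^-1.
b1 = gmm_step(X, Z, y, np.linalg.inv(Z.T @ Z))

# Step 2: efficient weights built from step-1 residuals.
u = y - X @ b1
S = (Z * u[:, None]).T @ (Z * u[:, None]) / n
b2 = gmm_step(X, Z, y, np.linalg.inv(S))
print("two-step GMM coefficients:", b2)
```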

Keywords: environmental policy stringency, renewable energy consumption, two-step system-GMM estimation, linear dynamic panel data model

Procedia PDF Downloads 180
2818 Developing Teachers as Change Agents: A Qualitative Study of Master of Education Graduates in Pakistan

Authors: Mir Afzal Tajik

Abstract:

The 'Strengthening Teacher Education in Pakistan' (STEP) is an innovative programme jointly funded by the Government of Canada and the Aga Khan Foundation Canada and implemented by the Aga Khan University - Institute for Educational Development (AKU-IED) in partnership with the local governments, education departments and communities in the provinces of Balochistan, Sindh and Gilgit-Baltistan in Pakistan. One of the key components of the programme is the professional development of teachers, headteachers and teacher educators through a variety of teacher education programmes, including a two-year Master of Education (MEd) Programme offered by AKU-IED. A number of teachers, headteachers and teacher educators from these provinces have been developed through the MEd Programme. This paper discusses a qualitative research study conducted to explore the nature, relevance, rigor and richness of the experiences of the MEd graduates, and how these experiences have fostered their own professional development and their ability to bring about positive changes in their schools. The findings of the study provide useful insights into the graduates’ self-actualization, the transformation of their professional beliefs and practices, the difference they have made in their schools, and the challenges they face. The study also provides recommendations for policy and practice related to teacher education programmes.

Keywords: STEP, teacher education, Pakistan, Canada, Aga Khan Foundation

Procedia PDF Downloads 347
2817 Using Simulation Modeling Approach to Predict USMLE Steps 1 and 2 Performances

Authors: Chau-Kuang Chen, John Hughes, Jr., A. Dexter Samuels

Abstract:

The prediction models for United States Medical Licensing Examination (USMLE) Steps 1 and 2 performances were constructed with a Monte Carlo simulation modeling approach via linear regression. The purpose of this study was to build robust simulation models that accurately identify the most important predictors and yield valid range estimates of the Steps 1 and 2 scores. The application of the simulation modeling approach was deemed an effective way to predict student performance on licensure examinations. Sensitivity analysis (also known as what-if analysis) in the simulation models was used to predict how Steps 1 and 2 scores are affected by changes in the National Board of Medical Examiners (NBME) Basic Science Subject Board scores. In addition, the study results indicated that the Medical College Admission Test (MCAT) Verbal Reasoning score and the Step 1 score were significant predictors of Step 2 performance. Hence, institutions could screen qualified applicants for interviews and document the effectiveness of the basic science education program based on the simulation results.
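
The sketch below shows, in Python, what a Monte Carlo range estimation around a linear regression prediction looks like, including a simple what-if analysis over an NBME-style score; the data, coefficients and score scales are synthetic placeholders, not the study's actual NBME/MCAT/USMLE data.

```python
# Monte Carlo range estimation around a linear regression prediction, with a
# what-if sweep over one predictor. All data below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
n = 300
nbme = rng.normal(500, 80, n)                  # hypothetical NBME subject score
mcat_vr = rng.normal(9, 2, n)                  # hypothetical MCAT Verbal Reasoning
step1 = 60 + 0.25 * nbme + 3.0 * mcat_vr + rng.normal(0, 12, n)

X = np.column_stack([np.ones(n), nbme, mcat_vr])
beta, *_ = np.linalg.lstsq(X, step1, rcond=None)
resid_sd = np.std(step1 - X @ beta, ddof=X.shape[1])

# Sensitivity ("what-if") analysis: simulate Step 1 for a candidate profile
# while the NBME score is varied, drawing residual noise each time.
sims = 10_000
for nbme_val in (450, 500, 550):
    draws = beta @ np.array([1.0, nbme_val, 10.0]) + rng.normal(0, resid_sd, sims)
    lo, hi = np.percentile(draws, [2.5, 97.5])
    print(f"NBME={nbme_val}: predicted Step 1 range ~ [{lo:.0f}, {hi:.0f}]")
```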

Keywords: prediction model, sensitivity analysis, simulation method, USMLE

Procedia PDF Downloads 339
2816 Aligning Cultural Practices through Information Exchange: A Taxonomy in Global Manufacturing Industry

Authors: Hung Nguyen

Abstract:

With the rise of global supply chain networks, the choice of supply chain orientation is critical. The alignment between cultural similarity and supply chain information exchange could help identify appropriate supply chain orientations, which would differentiate the stronger competitors and performers from the weaker ones. Through developing a taxonomy, this study examined whether the choices of action programs and manufacturing performance differ depending on the attained levels of cultural similarity and information exchange. The study employed statistical tests on a large-scale dataset consisting of 680 manufacturing plants from various cultures and industries. Firms need to align cultural practices with the level of information exchange in order to achieve good overall business performance. Three major orientations appeared consistently: the Proactive, the Initiative and the Reactive. The firms experiencing higher payoffs from various improvements are the ones that successfully align information exchange with cultural similarity. The findings provide step-by-step decision making for supply chain information exchange and offer guidance, especially for global supply chain managers. By including both cultural similarity and information exchange, this paper adds greater comprehensiveness and richness to the supply chain literature.

Keywords: culture, information exchange, supply chain orientation, similarity

Procedia PDF Downloads 359
2815 Formulation of a Rapid Earthquake Risk Ranking Criteria for National Bridges in the National Capital Region Affected by the West Valley Fault Using GIS Data Integration

Authors: George Mariano Soriano

Abstract:

In this study, a Rapid Earthquake Risk Ranking Criteria was formulated by integrating various existing maps and databases from the Department of Public Works and Highways (DPWH) and the Philippine Institute of Volcanology and Seismology (PHIVOLCS). Utilizing Geographic Information System (GIS) software, the above-mentioned maps and databases were used to extract seismic hazard parameters and bridge vulnerability characteristics in order to rank the seismic damage risk of bridges in the National Capital Region.

Keywords: bridge, earthquake, GIS, hazard, risk, vulnerability

Procedia PDF Downloads 409
2814 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework

Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi

Abstract:

There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task. Therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework. Hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to key-frames. The OCR output and detected slide text line types are used for keyword extraction, by which both video- and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.
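
The mapper/reducer logic behind the keyword-extraction step can be sketched in plain Python as below; the records and stop-word list are hypothetical, and the actual Hadoop/HDFS deployment, video segmentation and OCR stages are omitted.

```python
# MapReduce-style keyword indexing of OCR text from lecture-video key-frames.
# Pure-Python sketch of the mapper/reducer logic only; on a real cluster the
# same functions would be run on Hadoop (e.g., via Hadoop Streaming).
from collections import defaultdict

# (video_id, segment_id, OCR text of a key-frame) -- hypothetical records
ocr_records = [
    ("lec01", 0, "gradient descent convergence rate"),
    ("lec01", 1, "stochastic gradient descent mini batch"),
    ("lec02", 0, "support vector machines kernel trick"),
]
STOPWORDS = {"the", "a", "of", "and"}

def mapper(record):
    """Emit ((video, segment, keyword), 1) pairs for each OCR'd word."""
    video, segment, text = record
    for word in text.lower().split():
        if word not in STOPWORDS:
            yield (video, segment, word), 1

def reducer(pairs):
    """Sum counts per key to build segment-level keyword frequencies."""
    index = defaultdict(int)
    for key, count in pairs:
        index[key] += count
    return index

index = reducer(pair for rec in ocr_records for pair in mapper(rec))
# Segment-level keywords roll up to video-level keywords for search/browsing.
print(sorted(index.items(), key=lambda kv: -kv[1])[:5])
```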

Keywords: video lectures, big video data, video retrieval, hadoop

Procedia PDF Downloads 533
2813 A Feature Clustering-Based Sequential Selection Approach for Color Texture Classification

Authors: Mohamed Alimoussa, Alice Porebski, Nicolas Vandenbroucke, Rachid Oulad Haj Thami, Sana El Fkihi

Abstract:

Color and texture are highly discriminant visual cues that provide essential information in many types of images. Color texture representation and classification is therefore one of the most challenging problems in computer vision and image processing applications. Color textures can be represented in different color spaces by using multiple image descriptors which generate a high-dimensional set of texture features. In order to reduce the dimensionality of the feature set, feature selection techniques can be used. The goal of feature selection is to find a relevant subset of an original feature space that can improve the accuracy and efficiency of a classification algorithm. Traditionally, feature selection focuses on removing irrelevant features, neglecting the possible redundancy between relevant ones. This is why some feature selection approaches prefer to use feature clustering analysis to aid and guide the search. These techniques can be divided into two categories. i) Feature clustering-based ranking algorithms use feature clustering as an analysis that comes before feature ranking. Indeed, after dividing the feature set into groups, these approaches perform a feature ranking in order to select the most discriminant feature of each group. ii) Feature clustering-based subset search algorithms can use feature clustering following one of three strategies: as an initial step that comes before the search, bound to and combined with the search, or as an alternative to and replacement of the search. In this paper, we propose a new feature clustering-based sequential selection approach for the purpose of color texture representation and classification. Our approach is a three-step algorithm. First, irrelevant features are removed from the feature set thanks to a class-correlation measure. Then, introducing a new automatic feature clustering algorithm, the feature set is divided into several feature clusters. Finally, a sequential search algorithm, based on a filter model and a separability measure, builds a relevant and non-redundant feature subset: at each step, a feature is selected, and the features of the same cluster are removed and thus not considered thereafter. This significantly speeds up the selection process since a large number of redundant features is eliminated at each step. The proposed algorithm uses the clustering bound to and combined with the search. Experiments using a combination of two well-known texture descriptors, namely Haralick features extracted from Reduced Size Chromatic Co-occurrence Matrices (RSCCMs) and features extracted from Local Binary Pattern (LBP) image histograms, on five color texture data sets (Outex, NewBarktex, Parquet, Stex and USPtex), demonstrate the efficiency of our method compared to seven state-of-the-art methods in terms of accuracy and computation time.
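
A compact sketch of the three-step idea (relevance filtering, feature clustering, sequential search that discards a selected feature's whole cluster) is given below in Python; it uses random data, a plain correlation filter, k-means and a Fisher score as stand-ins, not the authors' actual descriptors, clustering algorithm or separability measure.

```python
# Three-step feature selection sketch: (1) drop weakly class-correlated features,
# (2) cluster the remaining features, (3) greedily pick features and discard the
# rest of the chosen feature's cluster. Random data; the criterion is a simple
# Fisher score, not the authors' exact separability measure.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 40))                 # 200 samples, 40 texture features
y = rng.integers(0, 2, 200)                    # two texture classes
X[y == 1, :8] += 1.0                           # make the first 8 features relevant

# Step 1: relevance filter via |correlation| with the class label.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
relevant = np.where(corr > 0.1)[0]

# Step 2: cluster relevant features by their profiles (redundant ones co-cluster).
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X[:, relevant].T)

# Step 3: sequential search with a Fisher-score criterion; selecting a feature
# removes its whole cluster from further consideration.
def fisher(j):
    a, b = X[y == 0, j], X[y == 1, j]
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)

selected, remaining = [], list(relevant)
while remaining:
    best = max(remaining, key=fisher)
    selected.append(best)
    best_cluster = labels[list(relevant).index(best)]
    remaining = [f for f in remaining
                 if labels[list(relevant).index(f)] != best_cluster]
print("selected features:", selected)
```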

Keywords: feature selection, color texture classification, feature clustering, color LBP, chromatic co-occurrence matrix

Procedia PDF Downloads 136
2812 The Alliance for Grassland Renewal: A Model for Teaching Endophyte Technology

Authors: C. A. Roberts, J. G. Andrae, S. R. Smith, M. H. Poore, C. A. Young, D. W. Hancock, G. J. Pent

Abstract:

To the authors’ best knowledge, there are no published reports of effective methods for teaching fescue toxicosis and grass endophyte technology in the USA. To address this need, a group of university scientists, industry representatives, government agents, and livestock producers formed an organization called the Alliance for Grassland Renewal. One goal of the Alliance was to develop a teaching method that could be employed across all regions of the USA and all sectors of the agricultural community. The first step in developing this method was the identification of experts familiar with the science and management of fescue toxicosis. The second step was curriculum development. Experts wrote a curriculum that addressed all aspects of toxicosis and management, including toxicology, animal nutrition, pasture management, economics, and mycology. The curriculum was created for presentation in lectures, laboratories, and in the field. The curriculum was unique in that it could be delivered across state lines, regardless of particular in-state recommendations. The curriculum was also unique in that it was unanimously supported by private companies otherwise in competition with each other. The final step in developing this teaching method was formulating a delivery plan. All experts, from universities, industry, government, and production, volunteered to travel from any state in the USA, converge in one location, teach a 1-day workshop, and then travel to the next location. The results of this teaching method indicate widespread success. Since 2012, experts across the entire USA have converged to teach Alliance workshops in Kansas, Oklahoma, Missouri, Kentucky, Georgia, South Carolina, North Carolina, and Virginia, with ongoing workshops in Arkansas and Tennessee. Data from post-workshop surveys indicate that instruction has been effective, as at least 50% of the participants stated their intention to adopt the endophyte technology presented in these workshops. The teaching method developed by the Alliance for Grassland Renewal has proved to be effective, and the Alliance continues to expand across the USA.

Keywords: endophyte, Epichloe coenophiala, ergot alkaloids, fescue toxicosis, tall fescue

Procedia PDF Downloads 121
2811 COVID-19 Teaches Probability Risk Assessment

Authors: Sean Sloan

Abstract:

Probability Risk Assessment (PRA) can be a difficult concept for students to grasp. So, in searching for different ways to describe PRA and relate it to students' lives, COVID-19 came up. The parallels are striking. Soon students began analyzing acceptable risk with the virus, which helped them quantify just how dangerous is dangerous. The original lesson was set aside and, for the remainder of the period, the probability of risk and the lethality of risk became the topic. Spreading events, such as a COVID carrier on an airliner, became analogous to single-fault casualties such as a tsunami. Odds of spreading became odds of backup-diesel-generator failure, as at Fukushima Daiichi. Fatalities from the disease became expected fatalities due to radiation spread. Quantification moved the discussion from hyperbole and emotion to one on which guidelines could be rationally based. It has been one of the most effective educational devices observed.

Keywords: COVID, education, probability, risk

Procedia PDF Downloads 152
2810 Seismic Impact and Design on Buried Pipelines

Authors: T. Schmitt, J. Rosin, C. Butenweg

Abstract:

Seismic design of buried pipeline systems for energy and water supply is not only important for plant and operational safety, but in particular for the maintenance of the supply infrastructure after an earthquake. Past earthquakes have shown the vulnerability of pipeline systems. After the Kobe earthquake in Japan in 1995, for instance, the water supply in some regions was interrupted for almost two months. The present paper addresses the effects of seismic waves on buried pipelines, describes calculation methods, proposes approaches and gives calculation examples. Buried pipelines are exposed to different seismic effects. This paper considers the effects of transient displacement differences and the resulting stresses within the pipeline due to the wave propagation of the earthquake. Other effects are permanent displacements due to fault rupture displacements at the surface, soil liquefaction, landslides and seismic soil compaction. The presented model can also be used to calculate fault-rupture-induced displacements. Based on a three-dimensional finite element model, parameter studies are performed to show the influence of several parameters such as incoming wave angle, wave velocity, soil depth and selected displacement time histories. In the computer model, the interaction between the pipeline and the surrounding soil is modeled with non-linear soil springs. A propagating wave is simulated affecting the pipeline point by point, independently in time and space. The resulting stresses are mainly caused by displacement differences of neighboring pipeline segments and by soil-structure interaction. The calculation examples focus on pipeline bends as the most critical parts. Special attention is given to the calculation of long-distance heat pipeline systems. Here, expansion bends are arranged at regular distances to accommodate movements of the pipeline due to high temperature. Such expansion bends are usually designed with small bending radii, which in the event of an earthquake lead to high bending stresses at the cross-section of the pipeline. Therefore, Kármán's flexibility factors, as well as the stress intensification factors for curved pipe sections, must be taken into account. The seismic verification of the pipeline for wave propagation in the soil can be achieved by checking normative strain criteria. Finally, an interpretation of the results and recommendations are given, taking into account the most critical parameters.
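
For orientation, a commonly used closed-form wave-passage check is sketched below; it is not the paper's 3D finite element model with non-linear soil springs, and the input numbers are hypothetical.

```python
# Closed-form wave-passage check often used alongside FE models of buried
# pipelines: the axial ground strain from a travelling wave is roughly the peak
# ground velocity over the apparent propagation velocity (divided by ~2 for
# shear waves in common guidelines). Numbers are hypothetical; the paper itself
# uses a 3D FE model with non-linear soil springs and displacement time histories.
pgv = 0.5            # peak ground velocity [m/s]
c_apparent = 1000.0  # apparent S-wave propagation velocity along the pipe [m/s]
alpha = 2.0          # wave-type factor (~2 for S-waves, ~1 for R-waves)

ground_strain = pgv / (alpha * c_apparent)   # axial ground strain
E_steel = 210e9                              # Young's modulus of steel [Pa]
axial_stress = E_steel * ground_strain       # upper bound: pipe follows soil exactly

print(f"axial ground strain ~ {ground_strain:.2e}")
print(f"upper-bound axial stress ~ {axial_stress / 1e6:.0f} MPa")
```

Because the pipe can slip relative to the soil, the actual pipe strain is lower than this ground-strain bound, which is one reason the non-linear soil springs matter in the finite element model.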

Keywords: buried pipeline, earthquake, seismic impact, transient displacement

Procedia PDF Downloads 187
2809 Dyeing of Polyester/Cotton Blends with Reverse-Micelle Encapsulated High Energy Disperse/Reactive Dye Mixture

Authors: Chi-Wai Kan, Yanming Wang, Alan Yiu-Lun Tang, Cheng-Hao Lee

Abstract:

Dyeing of polyester/cotton blend fabrics with various polyester/cotton ratios (32/68, 40/60 and 65/35) was investigated using poly(ethylene glycol) (PEG)-based reverse micelles. High-energy disperse dyes and warm-type reactive dyes were encapsulated and applied to polyester/cotton blend fabrics in a one-bath, one-step dyeing process. Reverse micellar-based and aqueous-based (water-based) dyeing were compared in terms of colour reflectance. Experimental findings revealed that the colour shade of fabrics dyed in the reverse micellar non-aqueous system at a lower dyeing temperature of 98°C is slightly lighter than that of the conventional aqueous dyeing system in a two-step process (130°C for disperse dyeing and 70°C for reactive dyeing). The exhaustion of dye in the polyester/cotton blend fabrics, in terms of colour reflectance, was found to fluctuate considerably at a dyeing temperature of 98°C.

Keywords: one-bath dyeing, polyester/cotton blends, disperse/reactive dyes, reverse micelle

Procedia PDF Downloads 150
2808 Reliability Enhancement by Parameter Design in Ferrite Magnet Process

Authors: Won Jung, Wan Emri

Abstract:

Ferrite magnets are widely used in many automotive components such as motors and alternators. Magnets used inside these components must be of good quality to ensure a high level of performance. The purpose of this study is to design input parameters that optimize the ferrite magnet production process so as to ensure the quality and reliability of the manufactured products. Design of Experiments (DOE) and Statistical Process Control (SPC) are used as mutual supplements to optimize the process. DOE and SPC are quality tools used in industry to monitor and improve manufacturing process conditions. These tools are used in practice to keep the process on target and within the limits of natural variation. A mixed Taguchi method is utilized for optimization as part of the DOE analysis. SPC with proportion data is applied to assess the output parameters and determine the optimal operating conditions. An example case involving the monitoring and optimization of the ferrite magnet process is presented to demonstrate the effectiveness of this approach. Through the utilization of these tools, reliable magnets can be produced by following the step-by-step procedures of the proposed framework. One of the main contributions of this study was producing crack-free magnets by applying the proposed parameter design.
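
Two of the small calculations behind such a DOE/SPC framework are sketched below in Python: a Taguchi "smaller-is-better" signal-to-noise ratio for a crack-related response, and p-chart control limits for the crack proportion; all numbers are hypothetical.

```python
# Two calculations underlying the DOE/SPC framework: a Taguchi signal-to-noise
# ratio per parameter setting and p-chart control limits for the crack
# proportion. All data below are hypothetical.
import numpy as np

# Taguchi "smaller-is-better" S/N ratio for a crack-related response measured
# in replicated runs of one parameter setting.
y = np.array([0.12, 0.15, 0.10, 0.14])          # e.g. crack width [mm]
sn_ratio = -10 * np.log10(np.mean(y ** 2))      # larger S/N = more robust setting
print(f"S/N (smaller-is-better) = {sn_ratio:.2f} dB")

# p-chart limits for the fraction of cracked magnets per inspection lot.
defective = np.array([4, 6, 3, 5, 7, 2, 4])     # cracked magnets per lot
lot_size = 200
p_bar = defective.sum() / (lot_size * len(defective))
sigma = np.sqrt(p_bar * (1 - p_bar) / lot_size)
ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
print(f"p-chart: CL={p_bar:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}")
```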

Keywords: ferrite magnet, crack, reliability, process optimization, Taguchi method

Procedia PDF Downloads 517
2807 Climate Adaptive Building Shells for Plus-Energy-Buildings, Designed on Bionic Principles

Authors: Andreas Hammer

Abstract:

Six peculiar architectural designs from Frankfurt University will be discussed within this paper, and the future potential of their adaptive façades with implemented solar thin-film sheets will be shown, acting and reacting to climate and solar changes at their specific sites. The different aspects, as well as limitations with regard to technical and functional restrictions, will be named. The design process for a “multi-purpose building”, a “high-rise building refurbishment” and a “biker’s lodge” in the river Rhine valley has been critically outlined and developed step by step, from an international studentship towards an overall energy strategy that firstly had to push the design to a plus-energy building and secondly had to incorporate bionic aspects into the design of the building skins. Both main parameters needed to be reviewed and refined during the whole design process. Various basic bionic approaches were given (e.g. solar ivyᵀᴹ, flectofinᵀᴹ or hygroskinᵀᴹ) to experiment with, regarding the use of bendable photovoltaic thin-film elements as parts of a hybrid, kinetic façade system.

Keywords: bionic and bioclimatic design, climate adaptive building shells [CABS], energy-strategy, harvesting façade, high-efficiency building skin, photovoltaic in building skins, plus-energy-buildings, solar gain, sustainable building concept

Procedia PDF Downloads 430
2806 FLIME - Fast Low Light Image Enhancement for Real-Time Video

Authors: Vinay P., Srinivas K. S.

Abstract:

Low-light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low-light object detection tasks. Many of the existing deep learning methods are resource-intensive during the inference step and take considerable time for processing. The algorithm should take considerably less than 41 milliseconds in order to process a real-time video feed with 24 frames per second, and even less for a video with 30 or 60 frames per second. The paper presents a fast and efficient solution which has two main advantages: it has the potential to be used for a real-time video feed, and it can be used in low-compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. A custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model and the processing time are discussed in detail, and the quality of the enhanced images using different methods is shown.
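
A minimal Python sketch of the three-step pipeline is shown below, with simple stand-ins for each stage (a gamma curve for the RGB mapping, gray-world white balance, and a percentile stretch for the contrast); the paper's actual mapping function and parameters are not reproduced here.

```python
# Sketch of the three-step pipeline: (1) per-pixel RGB mapping, (2) colour
# balance, (3) contrast adjustment, using simple stand-ins for each stage.
import numpy as np

def enhance_low_light(img: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    """img: HxWx3 float array in [0, 1]; returns enhanced image in [0, 1]."""
    # Step 1: per-pixel RGB mapping (gamma curve brightens dark values).
    out = np.power(np.clip(img, 0.0, 1.0), gamma)

    # Step 2: gray-world colour balance (equalise channel means).
    channel_means = out.reshape(-1, 3).mean(axis=0)
    out = out * (channel_means.mean() / (channel_means + 1e-6))

    # Step 3: contrast adjustment by stretching the 1st-99th percentiles.
    lo, hi = np.percentile(out, [1, 99])
    return np.clip((out - lo) / (hi - lo + 1e-6), 0.0, 1.0)

# Example on a synthetic dark frame; a real-time loop would call this per frame.
dark_frame = np.random.default_rng(3).uniform(0.0, 0.2, size=(120, 160, 3))
print(enhance_low_light(dark_frame).mean())    # mean brightness increases
```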

Keywords: low light image enhancement, real-time video, computer vision, machine learning

Procedia PDF Downloads 204
2805 Focalization Used as a Narrative Strategy Mirroring Fadia Faqir’s Ideology in Pillars of Salt 1996

Authors: Malika Hammouche

Abstract:

The novel Pillars of Salt, written by Fadia Faqir in 1996, is a good example in which storytelling is utilized as a traditional material to underline the author's womanist ideology. A study of narrative can be fruitfully combined with that of ideology in this case. This combination can be demonstrated through the narrative technique used by Fadia Faqir in Pillars of Salt (1996), reflecting her anti-colonial ideology. The first step of this work highlights the storyteller's narrative in the novel, representing, on the one hand, the imperial voice and, on the other, exoticism and orientalism. The second step demonstrates how Faqir's narrative technique uses focalization as a narratological tool to negotiate her space. Faqir gives a voice to the female protagonist of the novel within the androcentric bias of Arab narrative theory to point to and amend the orientalist discourse typical of colonial literature. The orientalist discourse is represented through the voice of the storyteller in the novel. The juxtaposition of the storyteller's and the female protagonist's narratives is borrowed from the Arab literary background. It is a postcolonial counter-discursive strategy used by the author as a traditional material to underline her Arabo-Islamic womanist ideology in this novel.

Keywords: Arabo Islamic womanism, focalization, ideology, narrative technique, orientalist

Procedia PDF Downloads 240
2804 A World Map of Seabed Sediment Based on 50 Years of Knowledge

Authors: T. Garlan, I. Gabelotaud, S. Lucas, E. Marchès

Abstract:

Production of a global sedimentological seabed map was initiated in 1995 to provide the necessary tool for searches of aircraft and boats lost at sea, to give sedimentary information for nautical charts, and to provide input data for acoustic propagation modelling. This original approach had already been initiated a century earlier, when the French hydrographic service and the University of Nancy produced maps of the distribution of marine sediments along the French coasts and then sediment maps of the continental shelves of Europe and North America. The current map of ocean sediments presented here was initiated with UNESCO's general map of the deep ocean floor. This map was adapted using a unique sediment classification to present all types of sediments: from beaches to the deep seabed and from glacial deposits to tropical sediments. In order to allow good visualization and to suit the different applications, only the granularity of sediments is represented. Published seabed maps are reviewed; if they are of interest, the nature of the seabed is extracted from them, the sediment classification is transcribed, and the resulting map is integrated into the world map. Data also come from interpretations of Multibeam Echo Sounder (MES) imagery from large hydrographic surveys of the deep ocean. These allow very high-quality mapping of areas that until then had been represented as homogeneous. The third and principal source of data comes from the integration of regional maps produced specifically for this project. These regional maps are produced using all the bathymetric and sedimentary data of a region. This step makes it possible to produce a regional synthesis map, with generalizations applied in the case of over-precise data. Eighty-six regional maps of the Atlantic Ocean, the Mediterranean Sea and the Indian Ocean have been produced and integrated into the world sedimentary map. This work is ongoing and yields a new digital version every two years, with the integration of new maps. This article describes the choices made in terms of sediment classification, the scale of the source data and the zonation of quality variability. This map is the final step in a system comprising the Shom sedimentary database, enriched by more than one million point and surface data items, and four series of coastal seabed maps at 1:10,000, 1:50,000, 1:200,000 and 1:1,000,000. This step-by-step approach makes it possible to take into account the progress in knowledge made in the field of seabed characterization during the last decades. Thus, the arrival of new seafloor classification systems has improved the recent seabed maps, and the compilation of these new maps with those previously published allows a gradual enrichment of the world sedimentary map. However, there is still much work to do to enhance some regions, which are still based on data acquired more than half a century ago.

Keywords: marine sedimentology, seabed map, sediment classification, world ocean

Procedia PDF Downloads 232
2803 Mesoporous Nanocomposites for Sustained Release Applications

Authors: Daniela Istrati, Alina Morosan, Maria Stanca, Bogdan Purcareanu, Adrian Fudulu, Laura Olariu, Alice Buteica, Ion Mindrila, Rodica Cristescu, Dan Eduard Mihaiescu

Abstract:

Our present work is related to the synthesis, characterization and applications of new nanocomposite materials based on silica mesoporous nanocomposite systems. The nanocomposite support was obtained by using a specific step-by-step multilayer structure build-up synthetic route, characterized by XRD (X-ray diffraction), TEM (transmission electron microscopy), FT-IR (Fourier transform infrared spectrometry) and BET (Brunauer-Emmett-Teller method), and loaded with Salvia officinalis plant extract obtained by a hydro-alcoholic extraction route. The sustained release of the target compounds was studied by a modified LC method, demonstrating low release rates, as expected for a support with a high specific surface area. The obtained results were further correlated with the in vitro / in vivo behavior of the nanocomposite material, recommending the silica mesoporous nanocomposites as good candidates for biomedical applications. Acknowledgements: This study has been funded by the Research Project PN-III-P2-2.1-PTE-2016-0160, 49-PTE / 2016 (PROZECHIMED) and Project Number PN-III-P4-ID-PCE-2016-0884 / 2017.

Keywords: biomedical, mesoporous, nanocomposites, natural products, sustained release

Procedia PDF Downloads 217
2802 Fold and Thrust Belts Seismic Imaging and Interpretation

Authors: Sunjay

Abstract:

Plate tectonics is of very great significance, as it represents the spatial relationships of volcanic rock suites at plate margins, the distribution in space and time of the conditions of different metamorphic facies, the scheme of deformation in mountain belts, or orogens, and the association of different types of economic deposit. Orogenic belts are characterized by extensive thrust faulting, movements along large strike-slip fault zones, and extensional deformation that occurs deep within continental interiors. Within oceanic areas there are also regions of crustal extension and accretion in the backarc basins located on the landward sides of many destructive plate margins. Collisional orogens develop where a continent or island arc collides with a continental margin as a result of subduction. Collisional and non-collisional orogens can be explained by differences in the strength and rheology of the continental lithosphere and by processes that influence these properties during orogenesis. Seismic imaging difficulties: in triangle zones, several factors reduce the effectiveness of seismic methods. The topography in the central part of the triangle zone is usually rugged and is associated with near-surface velocity inversions which degrade the quality of the seismic image. These characteristics lead to a low signal-to-noise ratio, inadequate penetration of energy through the overburden, poor geophone coupling with the surface and wave scattering. Depth seismic imaging techniques: seismic processing refers to the process of altering the seismic data to suppress noise, enhance the desired signal (higher signal-to-noise ratio) and migrate seismic events to their appropriate location in space and depth. Processing steps generally include analysis of velocities, static corrections, moveout corrections, stacking and migration. In exploration seismology, shadow zones are areas with no reflections (dead areas); they are common in the vicinity of faults and other discontinuous areas in the subsurface. Shadow zones result when energy from a reflector is focused on receivers that produce other traces, so reflectors are not shown in their true positions. Subsurface discontinuities: diffractions occur at discontinuities in the subsurface such as faults and velocity discontinuities (as at “bright spot” terminations), while the bow-tie effect is caused by deep-seated synclines. Further topics addressed are seismic imaging of thrust faults and structural damage in deepwater thrust belts, imaging deformation in submarine thrust belts using seismic attributes, imaging thrust and fault zones using 3D seismic image processing techniques, and checking balanced structural cross-sections against seismic interpretation pitfalls. Seismic pitfalls can originate from any or all of the limitations of data acquisition, processing and interpretation of the subsurface geology. Pitfalls and limitations also arise in seismic attribute interpretation of tectonic features: seismic attributes are routinely used to accelerate and quantify the interpretation of tectonic features in 3D seismic data. Coherence (or variance) cubes delineate the edges of megablocks and faulted strata, curvature delineates folds and flexures, while spectral components delineate lateral changes in thickness and lithology. Finally, carbon capture and geological storage require leakage surveillance, because a fault can behave as a seal or as a conduit for hydrocarbon transport to a trap.

Keywords: tectonics, seismic imaging, fold and thrust belts, seismic interpretation

Procedia PDF Downloads 70
2801 Cross Reactivity of Risperidone in Fentanyl Point of Care Devices

Authors: Barry D. Kyle, Jessica Boyd, Robin Pickersgill, Nicole Squires, Cynthia Balion

Abstract:

Background-Aim: Fentanyl is a highly potent synthetic μ-opioid receptor agonist used for exceptional pain management. Its main metabolite, norfentanyl, is typically present in urine at significantly high concentrations (i.e., ~20%), representing an effective target molecule for immunoassay detection. Here, we evaluated the NCS™ One Step Fentanyl Test Device© and the BTNX Rapid Response™ Single Drug Test Strip© point of care (POC) test strips, which target norfentanyl (20 ng/ml) and fentanyl (100 ng/ml), for potential risperidone interference. Methods: The POC tests, calibrated against norfentanyl (20 ng/ml), used immunochromatographic lateral flow devices to provide qualitative results within five minutes of urine sample contact. Results were recorded as negative if lines appeared in both the test and control regions, according to the manufacturer's instructions. Positive results were recorded if no line appeared in the test region (i.e., only the control line was visible). Pooled patient urine (n=20) that screened negative for drugs of abuse (using the NCS One Step Multi-Line Screen) and fentanyl (using the BTNX Rapid Response Strip) was used for the spiking studies. Urine was spiked with risperidone alone and with combinations of fentanyl, norfentanyl and/or risperidone to evaluate cross-reactivity in each test device. Results: A positive screen result was obtained when 8,000 ng/mL of risperidone was spiked into drug-free urine using the NCS test device. Positive screen results were also obtained in spiked urine samples containing fentanyl and norfentanyl combinations below the cut-off concentrations when 4,000 ng/mL risperidone was present, using the NCS testing device. There were no positive screen results using the BTNX test strip with up to 8,000 ng/mL risperidone, alone or in combination with concentrations of fentanyl and norfentanyl below the cut-off. Both devices screened positive when either fentanyl or norfentanyl exceeded the cut-off threshold, in both the absence and presence of risperidone. Conclusion: We report that urine samples containing risperidone may give a false positive result using the NCS One Step Fentanyl Test Device.

Keywords: fentanyl, interferences, point of care test, Risperidone

Procedia PDF Downloads 274
2800 Development of Fuzzy Logic Control Ontology for E-Learning

Authors: Muhammad Sollehhuddin A. Jalil, Mohd Ibrahim Shapiai, Rubiyah Yusof

Abstract:

Nowadays, ontologies are common in many areas such as artificial intelligence, bioinformatics, e-commerce, education and many more. Ontology is one of the focus areas in the field of information retrieval. The purpose of an ontology is to describe a conceptual representation of concepts and their relationships within a particular domain. In other words, an ontology provides a common vocabulary for anyone who needs to share information in the domain. There are ontologies in various fields covering both engineering and non-engineering knowledge. However, only a few ontologies are available for engineering knowledge, and fuzzy logic, as engineering knowledge, is still not available as an ontology domain. In general, fuzzy logic requires step-by-step guidelines and instructions for lab experiments. In this study, we present a domain ontology for Fuzzy Logic Control (FLC) knowledge. We give a Table of Contents (ToC) with a middle-out strategy based on the Uschold and King method to develop the FLC ontology. The proposed framework is developed using Protégé as the ontology tool. Protégé's ontology reasoner, known as the Pellet reasoner, is then used to validate the presented framework. The presented framework offers better performance based on consistency and classification parameter indices. In general, this ontology can provide a platform for anyone who needs to understand FLC knowledge.
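
To make the scope of such an ontology concrete, the following plain-Python sketch lists the kind of FLC concept hierarchy a ToC might capture; the class and relation names are illustrative assumptions, not the authors' actual Protégé/OWL classes, and the toy check is not a substitute for the Pellet reasoner.

```python
# A plain-Python sketch of the kind of FLC concept hierarchy a "Table of
# Contents" can capture. Class and relation names are illustrative only.
flc_ontology = {
    "FuzzyLogicController": {"subclasses": ["Fuzzifier", "RuleBase",
                                            "InferenceEngine", "Defuzzifier"],
                             "relations": {"controls": "Plant"}},
    "LinguisticVariable": {"subclasses": ["InputVariable", "OutputVariable"],
                           "relations": {"hasTerm": "FuzzySet"}},
    "FuzzySet": {"subclasses": [],
                 "relations": {"hasMembershipFunction": "MembershipFunction"}},
    "MembershipFunction": {"subclasses": ["Triangular", "Trapezoidal", "Gaussian"],
                           "relations": {}},
    "Rule": {"subclasses": [],
             "relations": {"hasAntecedent": "FuzzySet",
                           "hasConsequent": "FuzzySet"}},
    "Plant": {"subclasses": [], "relations": {}},
}

def relation_targets_defined(onto):
    """Toy sanity check: every relation must point at a defined concept."""
    concepts = set(onto) | {s for c in onto.values() for s in c["subclasses"]}
    return all(target in concepts
               for c in onto.values() for target in c["relations"].values())

print("all relation targets defined:", relation_targets_defined(flc_ontology))
```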

Keywords: engineering knowledge, fuzzy logic control ontology, ontology development, table of content

Procedia PDF Downloads 299
2799 Deep Supervision Based-Unet to Detect Buildings Changes from VHR Aerial Imagery

Authors: Shimaa Holail, Tamer Saleh, Xiongwu Xiao

Abstract:

Building change detection (BCD) from satellite imagery is an essential topic in urbanization monitoring, agricultural land management, and the updating of geospatial databases. Recently, methods for detecting changes based on deep learning have made significant progress and achieved impressive results. However, they are insensitive to changes in buildings with complex spectral differences, and the extracted features are not discriminative enough, resulting in incomplete buildings and irregular boundaries. To overcome these problems, we propose in this paper a dual Siamese network based on the Unet model with the addition of a deep supervision (DS) strategy. This network consists of a backbone (encoder) based on ImageNet pre-training, a fusion block, and feature pyramid networks (FPN) to enhance, step by step, the information on the changing regions and obtain a more accurate BCD map. To train the proposed method, we created a new dataset (EGY-BCD) of high-resolution, multi-temporal aerial images captured over New Cairo in Egypt to detect building changes. The experimental results showed that the proposed method is effective and performs well on the EGY-BCD dataset in terms of overall accuracy, F1-score, and mIoU, which were 91.6%, 80.1%, and 73.5%, respectively.
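
The deep-supervision idea can be sketched in a few lines of PyTorch: auxiliary change maps predicted at coarser decoder scales are upsampled and added to the loss. The toy network below is a stand-in under that assumption, not the paper's Siamese Unet with ImageNet backbone, fusion block and FPN.

```python
# Minimal sketch of deep supervision for change detection: an auxiliary,
# low-resolution change map contributes a weighted term to the loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyDSChangeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = nn.Conv2d(6, 16, 3, stride=2, padding=1)   # concat of 2 RGB images
        self.enc2 = nn.Conv2d(16, 32, 3, stride=2, padding=1)
        self.head_main = nn.Conv2d(16, 1, 1)    # main head at higher resolution
        self.head_deep = nn.Conv2d(32, 1, 1)    # auxiliary (deeply supervised) head

    def forward(self, img_t1, img_t2):
        x = torch.cat([img_t1, img_t2], dim=1)  # simple early fusion
        f1 = F.relu(self.enc1(x))
        f2 = F.relu(self.enc2(f1))
        return self.head_main(f1), self.head_deep(f2)   # 1/2 and 1/4 resolution logits

model = TinyDSChangeNet()
t1, t2 = torch.rand(2, 3, 64, 64), torch.rand(2, 3, 64, 64)
target = (torch.rand(2, 1, 64, 64) > 0.9).float()       # hypothetical change mask
out_main, out_deep = model(t1, t2)
bce = nn.BCEWithLogitsLoss()
loss = bce(F.interpolate(out_main, size=(64, 64)), target) + \
       0.4 * bce(F.interpolate(out_deep, size=(64, 64)), target)  # DS term
loss.backward()
print(float(loss))
```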

Keywords: building change detection, deep supervision, semantic segmentation, EGY-BCD dataset

Procedia PDF Downloads 120
2798 Ubiquitous Scaffold Learning Environment Using Problem-based Learning Activities to Enhance Problem-solving Skills and Context Awareness

Authors: Noppadon Phumeechanya, Panita Wannapiroon

Abstract:

The purpose of this research is to design a ubiquitous scaffold learning environment using problem-based learning activities that enhances problem-solving skills and context awareness, and to evaluate the suitability of this learning environment. We divide the research procedure into two phases. The first phase is to design the ubiquitous scaffold learning environment using problem-based learning activities, and the second is to evaluate it. The sample group in this study consists of five experts selected using the purposive sampling method. We analyse the data using arithmetic means and standard deviations. The research findings are as follows: the ubiquitous scaffold learning environment using problem-based learning activities consists of three major steps. The first is preparation before learning, which prepares learners to acknowledge details and learn through the u-LMS. The second is the learning process, where learning activities take place in the ubiquitous learning environment and learners learn online with scaffold systems for each step of problem solving. The third step is measurement and evaluation. The experts agree that the ubiquitous scaffold learning environment using problem-based learning activities is highly appropriate.

Keywords: ubiquitous learning environment scaffolding, learning activities, problem-based learning, problem-solving skills, context awareness

Procedia PDF Downloads 498
2797 One-off Separation of Multiple Types of Oil-in-Water Emulsions with Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oily wastewater with complex components, traditional treatment methods (flotation, coagulation, microwave heating, etc.) often incur high operating costs, secondary pollution, and other problems. In order to solve these problems, materials with high flux and stability applied to the separation of surfactant-stabilized emulsions have gained considerable attention in the treatment of oily wastewater. Nevertheless, four kinds of stable oil-in-water emulsions can be formed with different surfactants (surfactant-free, anionic, cationic, and non-ionic), and previous advanced materials can only separate one or a few of them and cannot separate all of them effectively in one step. Herein, we report a facile synthesis method for graphene-based multilevel filter materials (GMFM) that can efficiently separate oil-in-water emulsions stabilized with different surfactants under gravity alone. The prepared materials, which remain stable over 20 cycles, show a high flux of ~5000 L m-2 h-1 with a high separation efficiency of >99.9%. GMFM can effectively separate emulsions stabilized by mixed surfactants as well as oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 80
2796 The Cost of Innovation in Software Development Projects

Authors: Mihai Liviu Despa

Abstract:

The paper tackles the topic of determining the cost of innovation in software development projects. Innovation can be achieved either in a planned or an unplanned manner. The paper approaches scenarios where innovation is planned for. As a starting point, an innovative software development project is analyzed. The project is depicted step by step as it was implemented, from inception to delivery. Costs that are specific to innovation in software development are isolated based on the author's personal experience in managing the above-mentioned project. The innovation cost components identified by the author are then validated in open discussions with software development professionals and project managers in LinkedIn groups. In order to receive relevant feedback, only groups that focus on software development and innovation management are targeted. Additional innovation cost components suggested by software development professionals and project managers are also considered. Based on the identified cost components, an indicator is built. The indicator is meant to formalize the process of determining the cost of innovation in a software development project; it aggregates all the innovation cost components identified in the research process. The process of calculating each cost component is also described. Conclusions are formulated, and new related research topics are submitted for debate.
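
Schematically, the indicator performs an aggregation of the kind sketched below; the component names and amounts are hypothetical placeholders, not the validated component list from the study.

```python
# Schematic aggregation of innovation cost components into a single indicator,
# reported as a share of the total project budget. All figures are hypothetical.
innovation_costs = {            # currency units, hypothetical components
    "research_and_prototyping": 18_000,
    "specialist_training": 6_500,
    "extra_testing_and_rework": 9_200,
    "idea_management_tools": 2_300,
}
total_project_budget = 250_000

innovation_cost = sum(innovation_costs.values())
print(f"innovation cost = {innovation_cost} "
      f"({innovation_cost / total_project_budget:.1%} of budget)")
```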

Keywords: innovation cost, IT project management, software development, innovation management

Procedia PDF Downloads 460
2795 Hamiltonian Related Properties with and without Faults of the Dual-Cube Interconnection Network and Their Variations

Authors: Shih-Yan Chen, Shin-Shin Kao

Abstract:

In this paper, a thorough review of dual-cubes, DCn, the related studies and their variations is given. DCn was introduced as a network which retains the pleasing properties of the hypercube Qn but has a much smaller diameter. In fact, it is constructed so that the number of vertices of DCn is equal to the number of vertices of Q2n+1. However, each vertex in DCn is adjacent to n + 1 neighbors, and so DCn has (n + 1) × 2^2n edges in total, which is roughly half the number of edges of Q2n+1. In addition, the diameter of any DCn is 2n + 2, which is of the same order as that of Q2n+1. For self-completeness, basic definitions, construction rules and symbols are provided. We chronicle the results, presenting eleven significant theorems, and include some open problems at the end.
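
The counts quoted above can be checked by brute force for small n, as in the Python sketch below; it assumes the standard dual-cube construction from the literature (a (2n+1)-bit label whose leading class bit selects which n bits carry the hypercube edges, plus one cross edge flipping the class bit).

```python
# Builds DCn under the standard dual-cube construction and checks the vertex
# count, edge count and diameter quoted above by brute force for small n.
from collections import deque

def dual_cube(n):
    bits = 2 * n + 1
    adj = {v: [] for v in range(1 << bits)}
    for v in adj:
        cls = (v >> (2 * n)) & 1
        dims = range(n) if cls == 0 else range(n, 2 * n)
        for d in list(dims) + [2 * n]:          # n cube edges + 1 cross edge
            adj[v].append(v ^ (1 << d))
    return adj

def diameter(adj):
    best = 0
    for s in adj:                                # BFS from every vertex
        dist, q = {s: 0}, deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        best = max(best, max(dist.values()))
    return best

for n in (1, 2):
    adj = dual_cube(n)
    edges = sum(len(nbrs) for nbrs in adj.values()) // 2
    print(n, len(adj) == 2 ** (2 * n + 1), edges == (n + 1) * 4 ** n,
          diameter(adj) == 2 * n + 2)
```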

Keywords: dual-cubes, dual-cube extensive networks, dual-cube-like networks, hypercubes, fault-tolerant hamiltonian property

Procedia PDF Downloads 468
2794 Perceptions of College Students on Whether an Intelligent Tutoring System Is a Tutor

Authors: Michael Smalenberger

Abstract:

Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate the benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. Developments improving the ease of ITS creation have recently increased their proliferation, leading many K-12 schools and institutions of higher education in the United States to regularly use ITS in classrooms. We investigated how students perceive their experience using an ITS. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course and were subsequently asked for feedback on their experience. Results show that their perceptions of the ITS were generally favorable, and most would seek to use an ITS for both STEM and non-STEM courses in the future. Along with detailed transaction-level data, this feedback also provides insights into the design of user-friendly interfaces, guidance on accessibility for students with impairments, the sequencing of exercises, students' expectations of achievement, and comparisons to other tutoring experiences. We discuss how these findings are important for the creation, implementation, and evaluation of ITS as a mode and method of teaching and learning.

Keywords: college statistics course, intelligent tutoring systems, in vivo study, student perceptions of tutoring

Procedia PDF Downloads 101
2793 Design and Analysis of Adaptive Type-I Progressive Hybrid Censoring Plan under Step Stress Partially Accelerated Life Testing Using Competing Risk

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

Statistical distributions have long been employed in the assessment of semiconductor devices and product reliability. The power function distribution is one of the most important distributions in modern reliability practice and is frequently preferred over mathematically more complex distributions, such as the Weibull and the lognormal, because of its simplicity. Moreover, it may exhibit a better fit for failure data and provide more appropriate information about reliability and hazard rates in some circumstances. This study deals with estimating information about the failure times of items under step-stress partially accelerated life tests for competing risks, based on adaptive type-I progressive hybrid censoring criteria. The life data of the units under test are assumed to follow the Mukherjee-Islam distribution. Point and interval maximum-likelihood estimates are obtained for the distribution parameters and the tampering coefficient. The performance of the resulting estimators of the developed model parameters is evaluated and investigated using a simulation algorithm.
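
A toy sketch of the underlying data model is given below: lifetimes follow a power-function (Mukherjee-Islam) distribution, and items surviving the stress-change time have their remaining life divided by the tampering coefficient. For clarity the sketch ignores censoring and competing risks and treats the scale, change time and tampering coefficient as known, so the shape parameter has a closed-form MLE; it is not the paper's full adaptive progressive hybrid censoring scheme.

```python
# Step-stress PALT toy model: power-function lifetimes F(t) = (t/theta)^p on
# (0, theta); survivors of the change time tau have their remaining life divided
# by the tampering coefficient beta (tampered random variable model).
import numpy as np

rng = np.random.default_rng(4)
theta, p_true, tau, beta = 10.0, 2.0, 4.0, 1.8
n = 2000

t = theta * rng.uniform(size=n) ** (1.0 / p_true)        # inverse-CDF sampling
y = np.where(t <= tau, t, tau + (t - tau) / beta)        # observed SSPALT lifetimes

# Recover normal-stress-equivalent lifetimes and apply the closed-form MLE of p
# (theta, tau and beta treated as known; no censoring, no competing risks).
t_hat = np.where(y <= tau, y, tau + beta * (y - tau))
p_mle = n / np.sum(np.log(theta / t_hat))
print(f"true p = {p_true}, MLE of p = {p_mle:.3f}")
```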

Keywords: adaptive progressive hybrid censoring, competing risk, Mukherjee-Islam distribution, partially accelerated life testing, simulation study

Procedia PDF Downloads 347
2792 One-off Separation of Multiple Types of Oil-In-Water Emulsions With Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oily wastewater with complex components, traditional treatment methods (flotation, coagulation, microwave heating, etc.) often incur high operating costs, secondary pollution, and other problems. In order to solve these problems, materials with high flux and stability applied to the separation of surfactant-stabilized emulsions have gained considerable attention in the treatment of oily wastewater. Nevertheless, four kinds of stable oil-in-water emulsions can be formed with different surfactants (surfactant-free, anionic, cationic, and non-ionic), and previous advanced materials can only separate one or a few of them and cannot separate all of them effectively in one step. Herein, we report a facile synthesis method for graphene-based multilevel filter materials (GMFM) which can efficiently separate oil-in-water emulsions stabilized with different surfactants under gravity alone. The prepared materials, which remain stable over 20 cycles, show a high flux of ~5000 L m-2 h-1 with a high separation efficiency of >99.9%. GMFM can effectively separate emulsions stabilized by mixed surfactants as well as oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 90
2791 Lessons from Vernacular Architecture for Lightweight Construction

Authors: Alireza Taghdiri, Sara Ghanbarzade Ghomi

Abstract:

By reducing the gravity load of structural and non-structural components, lightweight construction can be achieved, along with improvements in efficiency and functional specifications. The advantages of lightweight construction can be examined at two levels. The first is the mass reduction of the load-bearing structure, which increases the internal useful space; the other is the mass reduction of the building, which in turn decreases the effects of seismic load. In order to achieve this goal, the essential specifications of building materials and the optimum load-bearing geometry of structural systems and elements have to be considered; therefore, the selection of lightweight materials, particularly lightweight aggregate for building components, is the first step of lightweight construction. In the next step, prominent examples of Iran's traditional architecture are selected, the process by which these works were improved is analyzed from the viewpoints of structural efficiency and lightweighting, and practical methods of lightweight construction are extracted. The optimum design of the load-bearing geometry of the structural system has to be considered not only in the structural elements but also in their composition, and the selection of dimensions, proportions, forms and orientations can lead to maximum material efficiency in bearing loads and stresses.

Keywords: gravity load, light-weighting structural system, load bearing geometry, seismic behavior

Procedia PDF Downloads 543
2790 An Axiomatic Model for Development of the Allocated Architecture in Systems Engineering Process

Authors: Amir Sharahi, Reza Tehrani, Ali Mollajan

Abstract:

The final step to complete the “Analytical Systems Engineering Process” is the “Allocated Architecture”, in which all Functional Requirements (FRs) of an engineering system must be allocated to their corresponding Physical Components (PCs). At this step, any design for the system’s allocated architecture in which no clear pattern can be found for assigning the exclusive “responsibility” of each PC for fulfilling the allocated FR(s) is considered a poor design that may cause difficulties in determining the specific PC(s) which has (have) failed to satisfy a given FR successfully. The present study utilizes the principles of the Axiomatic Design method to address this problem mathematically and establishes an “Axiomatic Model” as a solution for reaching good alternatives for developing the allocated architecture. This study proposes a “Loss Function” as a quantitative criterion to monetarily compare non-ideal designs for the allocated architecture and choose the one which imposes a relatively lower cost on the system’s stakeholders. For the case study, we use the existing design of the U.S. electricity marketing subsystem, based on data provided by the U.S. Energy Information Administration (EIA). The result for 2012 shows the symptoms of a poor design and ineffectiveness due to coupling among the FRs of this subsystem.
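
The allocation pattern described above can be checked mechanically: with a design matrix whose entry (i, j) marks whether PC j contributes to FR i, a diagonal pattern is an uncoupled design, a triangular one is decoupled, and anything else is coupled. The Python sketch below illustrates this standard axiomatic-design check on a hypothetical matrix, not the EIA electricity-marketing case analysed in the paper.

```python
# Axiomatic-design style check of an FR-to-PC allocation: diagonal = uncoupled,
# triangular = decoupled, otherwise coupled (PC responsibility is ambiguous).
import numpy as np

def classify_design(A: np.ndarray) -> str:
    """A[i, j] != 0 means PC j contributes to FR i (square matrix assumed)."""
    B = A != 0
    if np.array_equal(B, np.diag(np.diag(B))):
        return "uncoupled (ideal: one PC responsible per FR)"
    if np.array_equal(B, np.tril(B)) or np.array_equal(B, np.triu(B)):
        return "decoupled (acceptable if PCs are fixed in sequence)"
    return "coupled (responsibility for FR failures is ambiguous)"

A = np.array([[1, 0, 1],          # hypothetical FR-PC allocation matrix
              [1, 1, 0],
              [0, 0, 1]])
print(classify_design(A))
```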

Keywords: allocated architecture, analytical systems engineering process, functional requirements (FRs), physical components (PCs), responsibility of a physical component, system’s stakeholders

Procedia PDF Downloads 408