Search results for: performance prism model

11595 Characterization of an Extrapolation Chamber for Dosimetry of Low Energy X-Ray Beams

Authors: Fernanda M. Bastos, Teógenes A. da Silva

Abstract:

Extrapolation chambers were designed to be used as primary standard dosimeters for measuring absorbed dose in a medium in beta radiation and low energy x-rays. The International Organization for Standardization established series of reference x-radiations, to be reproduced in metrology laboratories, for calibrating dosimeters and determining their energy dependence. Standardization of low energy x-ray beams with tube potentials lower than 30 kV may be affected by the instrument used for dosimetry. In this work, parameters of a PTW model 23392 extrapolation chamber were determined, aiming at its use in low energy x-ray beams as a reference instrument.

Keywords: extrapolation chamber, low energy x-rays, x-ray dosimetry, X-ray metrology

Procedia PDF Downloads 382
11594 The Benefits of Full Day Kindergarten versus Half Day Kindergarten: Review of Literature

Authors: Majedah Fawzy Abu Alrub

Abstract:

The purpose of this study was to assess the benefits of full-day vs. half-day kindergarten. Research suggests that there is a common trend among full-day kindergarten programs: academic, social, and emotional benefits are evident, as well as preferential trends among parents and teachers. The review began by identifying 20 references of literature on full-day kindergarten published in the last two decades (1997-2017). Of these, 20 passed an initial screening designed to identify research reports that examined academic, social, and emotional outcomes of full-day kindergarten programs as compared with half-day programs. Studies indicated that attendance at full-day kindergarten is positively related to high performance throughout school. There is much evidence to support a full-day program for children. Results indicated that full-day programs have obvious benefits for children; however, they may not be the best program for all children.

Keywords: preschool, full-day kindergarten, academic benefits, social and emotional benefits

Procedia PDF Downloads 155
11593 Design and Performance Analysis of Advanced B-Spline Algorithm for Image Resolution Enhancement

Authors: M. Z. Kurian, M. V. Chidananda Murthy, H. S. Guruprasad

Abstract:

This paper presents an approach to super-resolving a low-resolution (LR) image, which is very useful in multimedia communication, medical image enhancement, and satellite image enhancement, where a clear view of the information in the image is needed. The proposed Advanced B-Spline method generates a high-resolution (HR) image from a single LR image and tries to retain the higher frequency components, such as edges, in the image. The method combines the B-spline technique with crispening. This work is evaluated qualitatively and quantitatively using Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR). The method is also suitable for real-time applications. Different combinations of decimation and super-resolution algorithms are tested in the presence of different noise types and noise factors.
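
The abstract evaluates results with MSE and PSNR; as a minimal illustration of those two metrics only (not the authors' Advanced B-Spline pipeline), the following NumPy sketch computes them for a reference high-resolution image and a super-resolved estimate; the 8-bit peak value of 255 is an assumption.

```python
import numpy as np

def mse(reference: np.ndarray, estimate: np.ndarray) -> float:
    """Mean Square Error between two images of identical shape."""
    diff = reference.astype(np.float64) - estimate.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(reference: np.ndarray, estimate: np.ndarray, peak: float = 255.0) -> float:
    """Peak Signal to Noise Ratio in dB; assumes 8-bit images (peak = 255)."""
    error = mse(reference, estimate)
    if error == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / error)

# Toy usage: compare a reference image with a noisy stand-in for a super-resolved image.
rng = np.random.default_rng(0)
hr = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
sr = hr + rng.normal(0, 5, size=hr.shape)
print(f"MSE  = {mse(hr, sr):.2f}")
print(f"PSNR = {psnr(hr, sr):.2f} dB")
```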

Keywords: advanced b-spline, image super-resolution, mean square error (MSE), peak signal to noise ratio (PSNR), resolution down converter

Procedia PDF Downloads 388
11592 Exergy: An Effective Tool to Quantify Sustainable Development of Biodiesel Production

Authors: Mahmoud Karimi, Golmohammad Khoobbakht

Abstract:

This study focuses on the exergy flow analysis in the transesterification of waste cooking oil (WCO) with methanol to decrease the consumption of materials and energy and promote the use of renewable resources. The exergy analysis performed is based on the thermodynamic performance parameters, namely exergy destruction and exergy efficiency, to investigate the effects of variable parameters on the renewability of transesterification. The experimental variables were the methanol-to-WCO ratio, catalyst concentration, and reaction temperature in the transesterification reaction. The optimum condition, with a yield of 90.2% and an exergy efficiency of 95.2%, was obtained at a methanol-to-oil molar ratio of 8:1 and 1 wt.% of KOH, at 55 °C. In this condition, the total waste exergy was found to be 45.4 MJ for 1 kg of biodiesel production. The high yield at the optimal condition thus resulted in high exergy efficiency in the transesterification of WCO with methanol.
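
For readers unfamiliar with the reported quantities, a generic textbook exergy balance for a process such as transesterification can be written as follows (a standard form, not necessarily the authors' exact formulation):

```latex
\mathrm{Ex}_{\mathrm{in}} = \mathrm{Ex}_{\mathrm{products}} + \mathrm{Ex}_{\mathrm{waste}} + \mathrm{Ex}_{\mathrm{destroyed}},
\qquad
\eta_{\mathrm{ex}} = \frac{\mathrm{Ex}_{\mathrm{products}}}{\mathrm{Ex}_{\mathrm{in}}}
= 1 - \frac{\mathrm{Ex}_{\mathrm{waste}} + \mathrm{Ex}_{\mathrm{destroyed}}}{\mathrm{Ex}_{\mathrm{in}}}.
```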

Keywords: biodiesel, exergy, thermodynamic analysis, transesterification, waste cooking oil

Procedia PDF Downloads 183
11591 Optimization of a Hybrid PV-Diesel Mini-Grid System: A Case Study of Vimtim-Mubi, Nigeria

Authors: Julius Agaka Yusufu

Abstract:

This study undertakes the development of an optimal PV-diesel hybrid power system tailored to the specific energy landscape of Vimtim, Mubi, Nigeria, utilizing real-world wind speed, solar radiation, and diesel cost data. Employing HOMER simulation, the research assesses the technical and financial viability of this hybrid configuration. Additionally, a performance comparison is conducted between the PV-diesel system and the conventional grid-connected alternative. The comparison offers insights into the potential advantages and economic feasibility of adopting hybrid renewable energy solutions in regions grappling with energy access and reliability challenges, with implications for sustainable electrification efforts in similar communities worldwide.

Keywords: Vimtim-Nigeria, HOMER, renewable energy, PV-diesel hybrid system

Procedia PDF Downloads 46
11590 Heritage, Cultural Events and Promises for Better Future: Media Strategies for Attracting Tourism during the Arab Spring Uprisings

Authors: Eli Avraham

Abstract:

The Arab Spring was widely covered in the global media, and the number of Western tourists traveling to the area began to fall. The goal of this study was to analyze which media strategies marketers in Middle Eastern countries chose to employ in their attempts to repair the negative image of the area in the wake of the Arab Spring. Several studies were published concerning image-restoration strategies of destinations during crises around the globe; however, these strategies were not part of an overarching theory, conceptual framework or model from the fields of crisis communication and image repair. The conceptual framework used in the current study was the ‘multi-step model for altering place image’, which offers three types of strategies: source, message and audience. Three research questions were used: 1. What public relations crisis techniques and advertising campaign components were used? 2. What media policies and relationships with the international media were adopted by Arab officials? 3. Which marketing initiatives (such as cultural and sports events) were promoted? This study is based on qualitative content analysis of four types of data: (1) advertising components (slogans, visuals and text); (2) press interviews with Middle Eastern officials and marketers; (3) official media policy adopted by government decision-makers (e.g. boycotting or arresting newspeople); and (4) marketing initiatives (e.g. organizing heritage festivals and cultural events). The data was located in three channels from December 2010, when the events started, to the end of September 2013: (1) Internet and video-sharing websites: YouTube and Middle Eastern countries' national tourism board websites; (2) News reports from two international media outlets, The New York Times and Ha’aretz; these are considered quality newspapers that focus on foreign news and tend to criticize institutions; (3) Global tourism news websites: eTurbo news and ‘Cities and countries branding’. Using the ‘multi-step model for altering place image,’ the analysis reveals that Middle Eastern marketers and officials used three kinds of strategies to repair their countries' negative image: 1. Source (cooperation and media relations; complying with, threatening and blocking the media; and finding alternatives to the traditional media); 2. Message (ignoring, limiting, narrowing or reducing the scale of the crisis; acknowledging the negative effect of an event’s coverage and assuring a better future; promotion of multiple facets, exhibitions and softening the ‘hard’ image; hosting spotlight sporting and cultural events; spinning liabilities into assets; geographic dissociation from the Middle East region; ridiculing the existing stereotype); and 3. Audience (changing the target audience by addressing others; emphasizing similarities and relevance to specific target audiences). It appears that dealing with their image problems will continue to be a challenge for officials and marketers of Middle Eastern countries until the region stabilizes and its regional conflicts are resolved.

Keywords: Arab spring, cultural events, image repair, Middle East, tourism marketing

Procedia PDF Downloads 269
11589 The Generalized Pareto Distribution as a Model for Sequential Order Statistics

Authors: Mahdy Esmailian, Mahdi Doostparast, Ahmad Parsian

Abstract:

In this article, sequential order statistics (SOS) samples under type II censoring coming from the generalized Pareto distribution are considered. Maximum likelihood (ML) estimators of the unknown parameters are derived on the basis of the available multiple SOS data. Necessary conditions for the existence and uniqueness of the derived ML estimates are given. Due to the complexity of the proposed likelihood function, a useful re-parametrization is suggested. For illustrative purposes, a Monte Carlo simulation study is conducted and an illustrative example is analysed.
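
For reference, the generalized Pareto distribution underlying the likelihood in this work is commonly parametrized as below (the standard form; the authors' re-parametrization may differ):

```latex
F(x;\mu,\sigma,\xi) = 1 - \left(1 + \xi\,\frac{x-\mu}{\sigma}\right)^{-1/\xi},
\qquad
f(x;\mu,\sigma,\xi) = \frac{1}{\sigma}\left(1 + \xi\,\frac{x-\mu}{\sigma}\right)^{-1/\xi - 1},
```

valid for 1 + ξ(x − μ)/σ > 0, and reducing to the exponential case F(x) = 1 − exp(−(x − μ)/σ) as ξ → 0.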

Keywords: Bayesian estimation, generalized Pareto distribution, maximum likelihood estimation, sequential order statistics

Procedia PDF Downloads 493
11588 Cubic Trigonometric B-Spline Approach to Numerical Solution of Wave Equation

Authors: Shazalina Mat Zin, Ahmad Abd. Majid, Ahmad Izani Md. Ismail, Muhammad Abbas

Abstract:

The generalized wave equation models various problems in science and engineering. In this paper, a new three-time-level implicit approach based on cubic trigonometric B-splines for the approximate solution of the wave equation is developed. The usual finite difference approach is used to discretize the time derivative, while the cubic trigonometric B-spline is applied as an interpolating function in the space dimension. Von Neumann stability analysis is used to analyze the proposed method. Two problems are discussed to exhibit the feasibility and capability of the method. The absolute errors and maximum error are computed to assess the performance of the proposed method. The results were found to be in good agreement with known solutions and with existing schemes in the literature.
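
As context for the three-time-level idea, the sketch below advances the 1D wave equation u_tt = c² u_xx with the standard explicit central-difference scheme in both time and space; it illustrates only the time discretization described above, not the cubic trigonometric B-spline collocation in space used by the authors.

```python
import numpy as np

# Standard explicit scheme:
# u^{n+1}_i = 2 u^n_i - u^{n-1}_i + C^2 (u^n_{i+1} - 2 u^n_i + u^n_{i-1})
c, L, T = 1.0, 1.0, 1.0
nx, nt = 101, 400
dx, dt = L / (nx - 1), T / nt
C = c * dt / dx                      # Courant number; stability requires C <= 1
assert C <= 1.0, "explicit scheme unstable for C > 1 (cf. von Neumann analysis)"

x = np.linspace(0.0, L, nx)
u_prev = np.sin(np.pi * x)           # assumed initial displacement u(x, 0)
u_curr = u_prev.copy()
# First time step for zero initial velocity (Taylor expansion of order dt^2)
u_curr[1:-1] = u_prev[1:-1] + 0.5 * C**2 * (u_prev[2:] - 2 * u_prev[1:-1] + u_prev[:-2])

for _ in range(nt - 1):
    u_next = np.zeros_like(u_curr)
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + C**2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    # Homogeneous Dirichlet boundaries u(0, t) = u(L, t) = 0
    u_prev, u_curr = u_curr, u_next

exact = np.sin(np.pi * x) * np.cos(np.pi * c * T)
print("max absolute error:", np.max(np.abs(u_curr - exact)))
```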

Keywords: collocation method, cubic trigonometric B-spline, finite difference, wave equation

Procedia PDF Downloads 526
11587 Design and Implementation of Campus Wireless Networking for Sharing Resources in Federal Polytechnic Bauchi, Bauchi State, Nigeria

Authors: Hassan Abubakar

Abstract:

This paper serves as a guide to the good design and implementation of wireless networking for campus institutions in Nigeria, and the approach can be applied across primary, secondary, and tertiary institutions. The paper describes some technical functions, standard configurations, and layouts of the 802.11 wireless LAN (Local Area Network) that can be implemented across the campus network. It also touches upon the wireless infrastructure standards involved with enhanced services, such as voice over wireless and wireless guest hotspots. Finally, the paper discusses the benefits derived from implementing a campus wireless network and sheds some light on how to increase wireless performance and use the campus wireless network to share resources such as software applications, printers, and documents.

Keywords: networking, standards, wireless local area network (WLAN), radio frequency (RF), campus

Procedia PDF Downloads 403
11586 Economic Impacts of Sanctuary and Immigration and Customs Enforcement Policies: Inclusive and Exclusive Institutions

Authors: Alexander David Natanson

Abstract:

This paper focuses on the effect of Sanctuary and Immigration and Customs Enforcement (ICE) policies on local economies. "Sanctuary cities" refers to municipal jurisdictions that limit their cooperation with the federal government's efforts to enforce immigration. Using county-level data from the American Community Survey and ICE data on economic indicators from 2006 to 2018, this study isolates the effects of local immigration policies on U.S. counties. The investigation is accomplished by simultaneously studying the policies' effects in counties where immigrants' families are persecuted via collaboration with Immigration and Customs Enforcement (ICE), in contrast to counties that provide protections. The analysis includes a difference-in-difference and two-way fixed effect model. Results are robust to nearest-neighbor matching, after the random assignment of treatment, after running estimations using different cutoffs for immigration policies, and with a regression discontinuity model comparing bordering counties with opposite policies. Results are also robust after restricting the data to a single-year policy adoption, using the Sun and Abraham estimator, and with event-study estimation to deal with the staggered treatment issue. In addition, the study reverses the estimation to understand what drives the decision to choose policies, in order to detect the presence of reverse causality biases in the estimated policy impact on economic factors. The evidence demonstrates that providing protections to undocumented immigrants increases economic activity. The estimates show gains in per capita income ranging from 3.1 to 7.2 percent, in median wages between 1.7 and 2.6 percent, and in GDP between 2.4 and 4.1 percent. Regarding labor, sanctuary counties saw increases in total employment between 2.3 and 4 percent, and the unemployment rate declined by 12 to 17 percent. The data further show that ICE policies have no statistically significant effects on income, median wages, or GDP but adverse effects on total employment, with declines from 1 to 2 percent, mostly in rural counties, and an increase in unemployment of around 7 percent in urban counties. In addition, results show a decline in the foreign-born population in ICE counties but no changes in sanctuary counties. The study also finds similar results for sanctuary counties when separating the data by urban and rural counties, educational attainment, gender, ethnic groups, economic quintiles, and the number of business establishments. The takeaway from this study is that institutional inclusion creates the dynamic nature of an economy, as inclusion allows for economic expansion due to the extension of fundamental freedoms to newcomers. Inclusive policies show positive effects on economic outcomes with no evident increase in population. To make sense of these results, the hypothesis and theoretical model propose that inclusive immigration policies play an essential role in conditioning the effect of immigration by decreasing uncertainties and constraints on immigrants' interaction in their communities, reducing the costs arising from fear of deportation or the constant fear of criminalization, and allowing immigrants to optimize their human capital.

Keywords: inclusive and exclusive institutions, post matching, fixed effect, time trend, regression discontinuity, difference-in-difference, randomization inference, Sun and Abraham estimator

Procedia PDF Downloads 69
11585 Decentralised Edge Authentication in the Industrial Enterprise IoT Space

Authors: C. P. Autry, A.W. Roscoe

Abstract:

Authentication protocols based on public key infrastructure (PKI) and trusted third parties (TTP) are no longer adequate for industrial-scale IoT networks, thanks to issues such as low compute and power availability, the use of widely distributed and commercial off-the-shelf (COTS) systems, and the increasingly sophisticated attackers and attacks we now have to counter. For example, there is increasing concern about nation-state-based interference and future quantum computing capability. We have examined this space from first principles and have developed several approaches to group and point-to-point authentication for IoT that do not depend on the use of a centralised client-server model. We emphasise the use of quantum-resistant primitives such as strong cryptographic hashing, and the use of multi-factor authentication.
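
To make the "strong cryptographic hashing" ingredient concrete, here is a minimal, hypothetical point-to-point challenge-response sketch built only on a hash-based MAC from Python's standard library; it is not the authors' protocol, and the pre-shared symmetric key it assumes is exactly the kind of detail a decentralised scheme would handle differently.

```python
import hmac
import hashlib
import secrets

# Assumption: the two IoT endpoints already share a symmetric key out of band.
shared_key = secrets.token_bytes(32)

def make_challenge() -> bytes:
    """Verifier sends a fresh random nonce so responses cannot be replayed."""
    return secrets.token_bytes(16)

def respond(key: bytes, challenge: bytes, device_id: bytes) -> bytes:
    """Prover binds its identity to the nonce with an HMAC over SHA-256."""
    return hmac.new(key, device_id + challenge, hashlib.sha256).digest()

def verify(key: bytes, challenge: bytes, device_id: bytes, response: bytes) -> bool:
    expected = hmac.new(key, device_id + challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)   # constant-time comparison

challenge = make_challenge()
tag = respond(shared_key, challenge, b"sensor-042")   # "sensor-042" is a made-up device ID
print("authenticated:", verify(shared_key, challenge, b"sensor-042", tag))
```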

Keywords: authentication, enterprise IoT cybersecurity, PKI/TTP, IoT space

Procedia PDF Downloads 153
11584 Empirical Mode Decomposition Based Denoising by Customized Thresholding

Authors: Wahiba Mohguen, Raïs El’hadi Bekka

Abstract:

This paper presents a denoising method called EMD-Custom that is based on the Empirical Mode Decomposition (EMD) and the modified Customized Thresholding Function (Custom) algorithms. EMD was applied to adaptively decompose a noisy signal into intrinsic mode functions (IMFs). Then, all the noisy IMFs were thresholded by applying the presented thresholding function to suppress noise and to improve the signal-to-noise ratio (SNR). The method was tested on simulated data and a real ECG signal, and the results were compared to EMD-based signal denoising methods using soft and hard thresholding. The results showed the superior performance of the proposed EMD-Custom denoising over the traditional approach. Performance was evaluated in terms of SNR in dB and Mean Square Error (MSE).
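
As a rough illustration of the thresholding step applied to IMFs (using the classical hard and soft rules rather than the authors' customized function), the following NumPy sketch assumes the IMFs have already been produced by an EMD routine and reconstructs the denoised signal by summing the thresholded IMFs; the universal-style threshold estimate is an assumption.

```python
import numpy as np

def hard_threshold(imf: np.ndarray, thr: float) -> np.ndarray:
    """Keep samples whose magnitude exceeds the threshold, zero the rest."""
    return np.where(np.abs(imf) > thr, imf, 0.0)

def soft_threshold(imf: np.ndarray, thr: float) -> np.ndarray:
    """Shrink samples toward zero by the threshold amount."""
    return np.sign(imf) * np.maximum(np.abs(imf) - thr, 0.0)

def denoise(imfs: np.ndarray, rule=soft_threshold) -> np.ndarray:
    """imfs: array of shape (n_imfs, n_samples), e.g. produced by an EMD package.
    A universal-style threshold is estimated per IMF from its median absolute deviation."""
    n = imfs.shape[1]
    out = np.zeros(n)
    for imf in imfs:
        sigma = np.median(np.abs(imf)) / 0.6745      # robust noise-level estimate
        thr = sigma * np.sqrt(2.0 * np.log(n))       # universal threshold
        out += rule(imf, thr)
    return out

def snr_db(clean: np.ndarray, estimate: np.ndarray) -> float:
    """Output SNR in dB, as used to score the denoising result."""
    return 10.0 * np.log10(np.sum(clean**2) / np.sum((clean - estimate)**2))
```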

Keywords: customized thresholding, ECG signal, EMD, hard thresholding, soft-thresholding

Procedia PDF Downloads 292
11583 Adaptive Approach Towards Comprehensive Urban Development Simulation in Coastal Regions: Case Study of New Alamein City, Egypt

Authors: Nada Mohamed, Abdel Aziz Mohamed

Abstract:

Climate change in coastal areas is a global issue that can be felt on a local scale and will persist for decades and centuries to come; it also poses critical risks to a city’s economy, communities, and natural environment. One of these changes that causes a huge risk to coastal cities is sea level rise (SLR). SLR is a result of the degradation of the global environmental system. The main contributors to climate change and global warming are countries with a high development index (HDI), such as Japan and Germany, while medium- and low-HDI countries such as Egypt do not have enough awareness and advanced tactics to adapt to these changes, which destroy urban areas and cause losses in land and economy. This is why climate resilience is one of the UN Sustainable Development Goals 2030, which call for actions to strengthen climate change resilience through mitigation and adaptation. For many reasons, adaptation has received less attention than mitigation, and it is only recently that adaptation has become a focal global point of attention. This adaptation can be achieved through actions such as upgrading the use and design of the land, adjusting the business and activities of people, and increasing community understanding of climate risks. To reach the adaptation goals, a strategic pathway to climate resilience has to be applied: the Urban Bioregionalism Paradigm. Resiliency has been framed as persistence, adaptation, and transformation. A climate resilience decision support system includes a visualization platform where ecological, social, and economic information can be viewed alongside specific geographies; Urban Bioregionalism is therefore treated as a socio-ecological paradigm that has the potential to help move social attitudes toward environmental understanding and deepen human-environment connections within ecological development. The research aim is to achieve an adaptive integrated urban development model through the analysis of tactics and strategies that can be used to adapt urban areas and coastal communities to the challenges of climate change, especially SLR, together with a simulation model, using advanced software, for a coastal city corridor to elaborate the suitable strategy to apply.

Keywords: climate resilience, sea level rise, SLR, coastal resilience, adaptive development simulation

Procedia PDF Downloads 124
11582 Interactive Image Search for Mobile Devices

Authors: Komal V. Aher, Sanjay B. Waykar

Abstract:

Nowadays, almost every individual carries a mobile device. Image search is currently a hot topic with many applications in both computer vision and information retrieval. The proposed intelligent image search system fully utilizes the multimodal and multi-touch functionalities of smartphones, allowing search with image, voice, and text on mobile phones. The system will be especially useful for users who already have pictures in their minds but have no proper descriptions or names to address them. The paper presents a system with the ability to form a composite visual query to express the user’s intention more clearly, which helps deliver more precise and appropriate results. The proposed algorithm considerably improves on existing approaches in different aspects. The system also uses a context-based image retrieval scheme to give significant outcomes, achieving gains in search performance, accuracy, and user satisfaction.
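
One building block named in the keywords, the color histogram used to compare a query image against database images, can be sketched as follows with a generic histogram-intersection similarity; this is illustrative only and not the authors' full multimodal system.

```python
import numpy as np

def color_histogram(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """image: H x W x 3 RGB array; returns a normalized joint color histogram."""
    hist, _ = np.histogramdd(
        image.reshape(-1, 3),
        bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)),
    )
    return hist.ravel() / hist.sum()

def histogram_intersection(h1: np.ndarray, h2: np.ndarray) -> float:
    """Similarity in [0, 1]; 1 means identical color distributions."""
    return float(np.minimum(h1, h2).sum())

# Toy usage: rank two database images against a query by color similarity.
rng = np.random.default_rng(1)
query, img_a, img_b = (rng.integers(0, 256, size=(32, 32, 3)) for _ in range(3))
scores = {"img_a": histogram_intersection(color_histogram(query), color_histogram(img_a)),
          "img_b": histogram_intersection(color_histogram(query), color_histogram(img_b))}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```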

Keywords: color space, histogram, mobile device, mobile visual search, multimodal search

Procedia PDF Downloads 357
11581 Estimation of Maximum Earthquake for Gujarat Region, India

Authors: Ashutosh Saxena, Kumar Pallav, Ramji Dwivedi

Abstract:

The present study estimates the seismicity parameter 'b' and the maximum possible earthquake magnitude (Mmax) for the Gujarat region with three well-established methods, viz. the Kijko parametric model (KP), Kijko-Sellevoll-Bayes (KSB) and Tapered Gutenberg-Richter (TGR), treating the region as a combined seismic source regime. The earthquake catalogue is prepared for the period 1330 to 2013 for the region extending from latitude 20°N to 25°N and longitude 68°E to 75°E, for earthquake moment magnitude (Mw) ≥ 4.0. The 'a' and 'b' values estimated for the region are 4.68 and 0.58. Further, Mmax is estimated as 8.54 (± 0.29), 8.69 (± 0.48), and 8.12 with KP, KSB, and TGR, respectively.
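
The seismicity parameters quoted above come from the Gutenberg-Richter recurrence relation; for orientation, the relation and the widely used Aki maximum-likelihood estimator of b (with the usual binning correction) are given below — standard forms, not necessarily the exact estimators implemented in the cited KP, KSB, and TGR procedures.

```latex
\log_{10} N(M) = a - bM,
\qquad
\hat{b} = \frac{\log_{10} e}{\bar{M} - \left(M_{\min} - \tfrac{\Delta M}{2}\right)},
```

where N(M) is the number of events with magnitude ≥ M, M̄ is the mean catalogue magnitude, M_min the magnitude of completeness, and ΔM the magnitude binning width.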

Keywords: Mmax, seismicity parameter, Gujarat, Tapered Gutenberg-Richter

Procedia PDF Downloads 525
11580 Simulation of Focusing of Diamagnetic Particles in Ferrofluid Microflows with a Single Set of Overhead Permanent Magnets

Authors: Shuang Chen, Zongqian Shi, Jiajia Sun, Mingjia Li

Abstract:

Microfluidics is a technology in which small amounts of fluids are manipulated using channels with dimensions of tens to hundreds of micrometers. At present, this significant technology is required for several applications in fields including disease diagnostics, genetic engineering, and environmental monitoring. Among these applications, the manipulation of microparticles and cells in microfluidic devices, especially separation, has aroused general concern. In a magnetic field, the separation methods include positive and negative magnetophoresis. By comparison, negative magnetophoresis is a label-free technology. It has many advantages, e.g., easy operation, low cost, and simple design. Before the separation of particles or cells, focusing them into a single tight stream is usually a necessary upstream operation. In this work, the focusing of diamagnetic particles in ferrofluid microflows with a single set of overhead permanent magnets is investigated numerically. The geometric model of the simulation is based on the configuration of previous experiments. The straight microchannel is 24 mm long and has a rectangular cross-section 100 μm in width and 50 μm in depth. The spherical diamagnetic particles of 10 μm in diameter are suspended in ferrofluid. The initial concentration of the ferrofluid c₀ is 0.096%, and the flow rate of the ferrofluid is 1.8 mL/h. The magnetic field is induced by five identical rectangular neodymium-iron-boron permanent magnets (1/8 × 1/8 × 1/8 in.), and it is calculated by the equivalent charge source (ECS) method. The flow of the ferrofluid is governed by the Navier–Stokes equations. The trajectories of particles are solved by the discrete phase model (DPM) in the ANSYS FLUENT program. The positions of diamagnetic particles are recorded by transient simulation. Compared with the results of the mentioned experiments, our simulation shows consistent results: diamagnetic particles are gradually focused in ferrofluid under the magnetic field. Besides, diamagnetic particle focusing is studied by varying the flow rate of the ferrofluid. In agreement with the experiment, the diamagnetic particle focusing is better with the increase of the flow rate. Furthermore, it is investigated how diamagnetic particle focusing is affected by other factors, e.g., the width and depth of the microchannel, the concentration of the ferrofluid, and the diameter of the diamagnetic particles.

Keywords: diamagnetic particle, focusing, microfluidics, permanent magnet

Procedia PDF Downloads 118
11579 Magnetron Sputtered Thin-Film Catalysts with Low Noble Metal Content for Proton Exchange Membrane Water Electrolysis

Authors: Peter Kus, Anna Ostroverkh, Yurii Yakovlev, Yevheniia Lobko, Roman Fiala, Ivan Khalakhan, Vladimir Matolin

Abstract:

Hydrogen economy is a concept of a low-emission society which harvests most of its energy from renewable sources (e.g., wind and solar) and, in case of overproduction, electrochemically turns the excess amount into hydrogen, which serves as an energy carrier. Proton exchange membrane water electrolyzers (PEMWE) are the backbone of this concept. By fast-response electricity-to-hydrogen conversion, the PEMWEs will not only stabilize the electrical grid but also provide high-purity hydrogen for a variety of fuel cell powered devices, ranging from consumer electronics to vehicles. Wider commercialization of PEMWE technology is, however, hindered by the high prices of the noble metals which are necessary for catalyzing the redox reactions within the cell: namely, platinum for the hydrogen evolution reaction (HER) running on the cathode, and iridium for the oxygen evolution reaction (OER) on the anode. A possible way to lower the loading of Pt and Ir is to use conductive high-surface-area nanostructures as catalyst supports in conjunction with thin-film catalyst deposition. The presented study discusses an unconventional technique of membrane electrode assembly (MEA) preparation. Noble metal catalysts (Pt and Ir) were magnetron sputtered at very low loadings onto the surface of porous sublayers (located on the gas diffusion layer or directly on the membrane), forming, so to say, a localized three-phase boundary. An ultrasonically sprayed, corrosion-resistant TiC-based sublayer was used as a support material on the anode, whereas magnetron sputtered nanostructured etched nitrogenated carbon (CNx) served the same role on the cathode. By using this configuration, we were able to significantly decrease the amount of noble metals (to a thickness of just tens of nanometers), while keeping the performance comparable to that of average state-of-the-art catalysts. Complex characterization of the prepared supported catalysts includes in-cell performance and durability tests and electrochemical impedance spectroscopy (EIS), as well as scanning electron microscopy (SEM) imaging and X-ray photoelectron spectroscopy (XPS) analysis. Our research proves that magnetron sputtering is a suitable method for thin-film deposition of electrocatalysts. The tested set-up of thin-film supported anode and cathode catalysts with a combined loading of just 120 µg·cm⁻² yields remarkable values of specific current. The described approach of thin-film low-loading catalyst deposition might be relevant when noble metal reduction is the topmost priority.
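
For readers outside electrochemistry, the two half-reactions catalysed by the sputtered Pt and Ir layers in an acidic PEM electrolyser are the textbook ones:

```latex
\text{Cathode (HER):}\quad 2\,\mathrm{H}^{+} + 2\,e^{-} \rightarrow \mathrm{H_2},
\qquad
\text{Anode (OER):}\quad 2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H}^{+} + 4\,e^{-}.
```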

Keywords: hydrogen economy, low-loading catalyst, magnetron sputtering, proton exchange membrane water electrolyzer

Procedia PDF Downloads 151
11578 The Affective Motivation of Women Miners in Ghana

Authors: Adesuwa Omorede, Rufai Haruna Kilu

Abstract:

Affective motivation (motivation that is emotionally laden, usually related to affect, passion, emotions, and moods) in the workplace stimulates individuals to reinforce, persist in, and commit to their tasks, which leads to individual and organizational performance. It drives individuals to reach goals, especially in situations where tasks are highly challenging and hostile. In such situations, individuals are more disposed to be creative and innovative and to see new opportunities in the loopholes of their workplace. However, when individuals feel displaced and less important, an adverse reaction may ensue, which may be detrimental to the organization and its performance. One sector where affective motivation is eminently present and relevant is the mining industry, due to its intense work environment, mostly dominated by men and masculinity cultures, and the deliberate exclusion of women from this environment, which makes the women working in it feel marginalized. In Ghana, the mining industry is mostly seen as a very physical environment, especially underground, and is mostly considered 'no place for a woman'. Despite the fact that these women feel less 'needed' or 'appreciated' in such environments, they still have to juggle intense work shifts with their families and face violence and other health risks, which puts a strain on their affective motivational reactions. Beyond these challenges, however, several mining companies in Ghana today are working towards providing a fair and equal working situation for both men and women miners, by recognizing them as key stakeholders, as well as including them in the stages of mining projects from the planning and designing phase to the evaluation and implementation stage. Drawing from the psychology and gender literature, this study takes a narrative approach to identify and understand the shifting gender dynamics within mine work in Ghana, occasioning a change in the background disposition of miners, which leads to more women taking up mine jobs in the country. In doing so, a qualitative study was conducted using semi-structured interviews from Ghana. Several women working within the mining industry in Ghana shared their experiences and how they felt, and still feel, in their workplace. In addition, archival documents were gathered to support the findings. The results suggest a change in enrolment regimes at a mining and technology university in Ghana, making room for more gender-equal enrolment at the university, a renowned institution that trains and feeds mine work professionals into the industry. The results further acknowledge gender-equal and diversity recruitment policies and initiatives among the mining companies of Ghana. This study contributes to the psychology and gender literature by highlighting the hindrances women face in the mining industry as well as several of their affective reactions towards gender inequality. The study also provides several suggestions for decision makers in the mining industry on what can be done in the future to reduce the gender inequality gap within the industry.

Keywords: affective motivation, gender shape shifting, mining industry, women miners

Procedia PDF Downloads 283
11577 A CFD Analysis of Hydraulic Characteristics of the Rod Bundles in the BREST-OD-300 Wire-Spaced Fuel Assemblies

Authors: Dmitry V. Fomichev, Vladimir V. Solonin

Abstract:

This paper presents the findings from a numerical simulation of the flow in 37-rod fuel assembly models spaced by a double-wire trapezoidal wrapping, as applied to the BREST-OD-300 experimental nuclear reactor. Data on the static pressure distribution within the models, as well as equations for determining the fuel bundle flow friction factors, have been obtained. Recommendations are provided on using the turbulence closure models available in ANSYS Fluent. A comparative analysis has been performed against the existing empirical equations for determining the flow friction factors, and the fit between the calculated and experimental data has been shown. An analysis of the experimental data and the results of the numerical simulation of the BREST-OD-300 fuel rod assembly hydrodynamic performance is presented.
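
For context, the bundle flow friction factor referred to above is conventionally defined from the measured pressure drop via the Darcy relation (standard definition; the paper derives its own correlations for the wire-spaced geometry):

```latex
\Delta p = f\,\frac{L}{D_h}\,\frac{\rho u^2}{2}
\quad\Longrightarrow\quad
f = \frac{2\,\Delta p\, D_h}{\rho u^2 L},
```

where Δp is the pressure drop over length L, D_h the hydraulic diameter of the rod bundle, ρ the coolant density, and u the mean axial velocity.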

Keywords: BREST-OD-300, ware-spaces, fuel assembly, computation fluid dynamics

Procedia PDF Downloads 366
11576 Introducing Data-Driven Learning into Chinese Higher Education English for Academic Purposes Writing Instructional Settings

Authors: Jingwen Ou

Abstract:

Writing for academic purposes in a second or foreign language is one of the most important and the most demanding skills to be mastered by non-native speakers. Traditionally, the EAP writing instruction at the tertiary level encompasses the teaching of academic genre knowledge, more specifically, the disciplinary writing conventions, the rhetorical functions, and specific linguistic features. However, one of the main sources of challenges in English academic writing for L2 students at the tertiary level can still be found in proficiency in academic discourse, especially vocabulary, academic register, and organization. Data-Driven Learning (DDL) is defined as “a pedagogical approach featuring direct learner engagement with corpus data”. In the past two decades, the rising popularity of the application of the data-driven learning (DDL) approach in the field of EAP writing teaching has been noticed. Such a combination has not only transformed traditional pedagogy aided by published DDL guidebooks in classroom use but also triggered global research on corpus use in EAP classrooms. This study endeavors to delineate a systematic review of research in the intersection of DDL and EAP writing instruction by conducting a systematic literature review on both indirect and direct DDL practice in EAP writing instructional settings in China. Furthermore, the review provides a synthesis of significant discoveries emanating from prior research investigations concerning Chinese university students’ perception of Data-Driven Learning (DDL) and the subsequent impact on their academic writing performance following corpus-based training. Research papers were selected from Scopus-indexed journals and core journals from two main Chinese academic databases (CNKI and Wanfang) published in both English and Chinese over the last ten years based on keyword searches. Results indicated an insufficiency of empirical DDL research despite a noticeable upward trend in corpus research on discourse analysis and indirect corpus applications for material design by language teachers. Research on the direct use of corpora and corpus tools in DDL, particularly in combination with genre-based EAP teaching, remains a relatively small fraction of the whole body of research in Chinese higher education settings. Such scarcity is highly related to the prevailing absence of systematic training in English academic writing registers within most Chinese universities' EAP syllabi due to the Chinese English Medium Instruction policy, where only English major students are mandated to submit English dissertations. Findings also revealed that Chinese learners still held mixed attitudes towards corpus tools influenced by learner differences, limited access to language corpora, and insufficient pre-training on corpus theoretical concepts, despite their improvements in final academic writing performance.

Keywords: corpus linguistics, data-driven learning, EAP, tertiary education in China

Procedia PDF Downloads 37
11575 Value Co-Creation in Used-Car Auctions: A Service Scientific Perspective

Authors: Safdar Muhammad Usman, Youji Kohda, Katsuhiro Umemoto

Abstract:

The electronic marketplace plays an important intermediary role in connecting dealers and retail customers. The main aim of this paper is to design a value co-creation model for used-car auctions. More specifically, the study has been designed to describe the process of value co-creation in used-car auctions, to explore the co-created values in used-car auctions, and finally to conclude the paper by indicating future research directions. Our analysis shows that economic values as well as non-economic values are co-created in used-car auctions. In addition, this paper contributes to the academic community by broadening the view of value co-creation in service science.

Keywords: value co-creation, used-car auctions, non-financial values, service science

Procedia PDF Downloads 342
11574 Integration of Constraints Related to Composite Materials in the Design of Industrial Products

Authors: A. Boumedine, K. Benfriha, S. Lecheb

Abstract:

Manufacturing methods for products and structures made of composite materials reduce the number of parts and integrate technical functions; this advantage of composite materials leads to a lot of innovation but also to a reduction of costs and a gain in quality. A material has attributes: its density, its resistance, its cost, its resistance to corrosion. For the design of a product, a certain profile of these attributes is required: low density, high resistance, low cost. The problem is then to identify this attribute profile and to compare it with those of the materials, in order to find the one that comes closest. The aim of this work is to demonstrate the feasibility of characterizing a mini turbine made of 3D-printed fiber-filled composite material produced by additive manufacturing, and then to compare the performance of the alloy turbine with that of the composite turbine according to the results of simulations with the Abaqus software.

Keywords: additive manufacturing, composite materials, design, 3D printer, turbine

Procedia PDF Downloads 115
11573 A New Computational Package for Using in CFD and Other Problems (Third Edition)

Authors: Mohammad Reza Akhavan Khaleghi

Abstract:

This paper shows changes made to the Reduced Finite Element Method (RFEM) whose result will be the most powerful numerical method proposed so far (some forms of this method are so powerful that they can approximate the most complex equations as simply as the Laplace equation!). The Finite Element Method (FEM) is a powerful numerical method that has been used successfully for the solution of existing problems in various scientific and engineering fields, such as CFD. Many algorithms have been expressed based on FEM, but none have been used in popular CFD software. In this area, the Finite Volume Method (FVM) holds a full monopoly due to its better efficiency and adaptability to the physics of problems in comparison with FEM. It does not seem that FEM could compete with FVM unless it were fundamentally changed. This paper shows those changes, and their result will be a powerful method that has much better performance in all respects in comparison with FVM and other computational methods. This method is not meant to compete with the finite volume method but to replace it.

Keywords: reduced finite element method, new computational package, new finite element formulation, new higher-order form, new isogeometric analysis

Procedia PDF Downloads 99
11572 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict the future, or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform™ based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows in a data set. Clusters can be ranked using importance metrics, such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training records are plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, associated with sociodemographics, habits, and activities. Some are intentionally included to get predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. Then the rules are validated with an external testing dataset including 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods utilizing a part of the observations, such as bootstrap resampling with an appropriate sample size.
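
A stripped-down version of the co-occurrence graph described above — nodes for items, arcs weighted by how often two items appear in the same row — can be built with plain Python; this is an illustrative sketch of the idea on made-up baskets, not the commercial analytics platform used in the study.

```python
from collections import Counter
from itertools import combinations

# Each "basket" is one row/observation; items are toy feature values.
baskets = [
    {"male", "smoker", "metabolic_syndrome"},
    {"male", "smoker", "high_bmi", "metabolic_syndrome"},
    {"female", "non_smoker"},
    {"male", "high_bmi", "metabolic_syndrome"},
]

node_freq = Counter()   # node size: how often each item occurs
arc_freq = Counter()    # arc weight: how often two items co-occur in a row

for basket in baskets:
    node_freq.update(basket)
    arc_freq.update(frozenset(pair) for pair in combinations(sorted(basket), 2))

# Rank arcs by an importance metric (here: raw co-occurrence frequency).
for arc, count in arc_freq.most_common(5):
    print(sorted(arc), count)
```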

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 260
11571 Prediction of Alzheimer's Disease Based on Blood Biomarkers and Machine Learning Algorithms

Authors: Man-Yun Liu, Emily Chia-Yu Su

Abstract:

Alzheimer's disease (AD) is the public health crisis of the 21st century. AD is a degenerative brain disease and the most common cause of dementia, a disease that is costly to the healthcare system. Unfortunately, the cause of AD is poorly understood; furthermore, the treatments for AD so far can only alleviate symptoms rather than cure or stop the progression of the disease. Currently, there are several ways to diagnose AD: medical imaging can be used to distinguish between AD, other dementias, and early-onset AD, and cerebrospinal fluid (CSF) can also be analyzed. Compared with other diagnostic tools, a blood (plasma) test has advantages as an approach to population-based disease screening because it is simpler, less invasive, and also cost-effective. In our study, we used the blood biomarker dataset of The Alzheimer’s Disease Neuroimaging Initiative (ADNI), which was funded by the National Institutes of Health (NIH), to do data analysis and develop a prediction model. We used independent analysis of datasets to identify plasma protein biomarkers predicting early-onset AD. Firstly, to compare the basic demographic statistics between the cohorts, we used SAS Enterprise Guide to do data preprocessing and statistical analysis. Secondly, we used logistic regression, neural networks, and decision trees to validate biomarkers with SAS Enterprise Miner. This study used data from ADNI containing 146 blood biomarkers from 566 participants. Participants include cognitively normal (healthy) subjects, subjects with mild cognitive impairment (MCI), and patients suffering from Alzheimer’s disease (AD). Participants’ samples were separated into two groups: healthy vs. MCI and healthy vs. AD. We used the two groups to compare important biomarkers of AD and MCI. In preprocessing, we used a t-test to filter 41/47 features between the two groups (healthy and AD, healthy and MCI) before using machine learning algorithms. Then we built models with four machine learning methods; the best AUC values for the two groups are 0.991 and 0.709, respectively. We want to stress that the simple, less invasive, common blood (plasma) test may also allow early diagnosis of AD. In our opinion, the results provide evidence that blood-based biomarkers might be an alternative diagnostic tool before further examination with CSF and medical imaging. A comprehensive study on the differences in blood-based biomarkers between AD patients and healthy subjects is warranted. Early detection of AD progression will allow physicians the opportunity for early intervention and treatment.
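
The preprocessing-plus-classification workflow described above (t-test feature filtering followed by supervised models evaluated by AUC) can be sketched with scikit-learn on synthetic stand-in data; the ADNI biomarker values themselves, and the SAS tooling used in the study, are not reproduced here.

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in: 566 participants x 146 biomarkers, binary label (healthy vs. AD).
rng = np.random.default_rng(42)
X = rng.normal(size=(566, 146))
y = rng.integers(0, 2, size=566)
X[y == 1, :10] += 0.8          # make a few biomarkers informative

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# t-test filter: keep features that differ between groups at p < 0.05 (analogous to the study's preprocessing).
_, p_values = ttest_ind(X_train[y_train == 0], X_train[y_train == 1], axis=0)
selected = p_values < 0.05

model = LogisticRegression(max_iter=1000).fit(X_train[:, selected], y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test[:, selected])[:, 1])
print(f"{selected.sum()} features kept, AUC = {auc:.3f}")
```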

Keywords: Alzheimer's disease, blood-based biomarkers, diagnostics, early detection, machine learning

Procedia PDF Downloads 311
11570 Distribution Network Optimization by Optimal Placement of Photovoltaic-Based Distributed Generation: A Case Study of the Nigerian Power System

Authors: Edafe Lucky Okotie, Emmanuel Osawaru Omosigho

Abstract:

This paper examines the impacts of introducing distributed energy generation (DEG) technology into the Nigerian power system as an alternative means of energy generation at the distribution end, using the Otovwodo 15 MVA, 33/11 kV injection substation as a case study. The overall idea is to increase the generated energy in the system, improve the voltage profile, and reduce system losses. A photovoltaic-based distributed energy generator (PV-DEG) was considered and was optimally placed in the network using a Genetic Algorithm (GA) in the MATLAB/Simulink environment. The simulation results obtained show that the dynamic performance of the network was optimized with DEG-grid integration.
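
As a toy illustration of using a genetic algorithm to choose a DG placement bus, the sketch below uses a made-up per-bus loss table standing in for the load-flow model that would be computed in MATLAB/Simulink for the real Otovwodo feeder; every number and the operators are illustrative assumptions.

```python
import random

N_BUSES = 20
# Hypothetical per-bus losses (kW) if the PV-DEG were placed there; a real study
# would obtain these from power-flow simulations of the 33/11 kV network.
random.seed(3)
bus_loss = [random.uniform(50.0, 120.0) for _ in range(N_BUSES)]

def fitness(bus: int) -> float:
    return -bus_loss[bus]                       # GA maximizes fitness = minimizes losses

def genetic_algorithm(pop_size=12, generations=30, mutation_rate=0.2) -> int:
    population = [random.randrange(N_BUSES) for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection of parents
        parents = [max(random.sample(population, 3), key=fitness) for _ in range(pop_size)]
        # "Crossover" for an integer gene: midpoint of two parents
        children = [(parents[i] + parents[(i + 1) % pop_size]) // 2 for i in range(pop_size)]
        # Mutation: jump to a random bus
        population = [random.randrange(N_BUSES) if random.random() < mutation_rate else c
                      for c in children]
    return max(population, key=fitness)

best = genetic_algorithm()
print(f"best bus: {best}, losses: {bus_loss[best]:.1f} kW")
```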

Keywords: distributed energy generation (DEG), genetic algorithm (GA), power quality, total load demand, voltage profile

Procedia PDF Downloads 70
11569 On the Utility of Bidirectional Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Construction of computational models to realize genetic circuits is an especially challenging task since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expression in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains that are heavily dependent on the spatial representation of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck with long input sequences. In other words, these architectures are not equipped with the necessary attention mechanisms to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with an attention mechanism. In previous works on genetic circuit design, the traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier is not dependent on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
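
A minimal, hypothetical inference setup for a BERT-style encoder on k-mer-tokenized DNA is sketched below with the Hugging Face transformers library; the checkpoint name is a generic placeholder (a DNA-pretrained encoder such as a DNABERT variant would be substituted in practice), and the real study's tokenization, fine-tuning regime, and labels are not reproduced here.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder checkpoint: swap in a DNA-pretrained encoder for real use.
CHECKPOINT = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=2)

def kmers(seq: str, k: int = 6) -> str:
    """Represent a DNA sequence as space-separated k-mers (DNABERT-style tokenization)."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

sequence = "ATGCGTACGTTAGCATCGATCGTACG"          # toy genetic-circuit fragment
inputs = tokenizer(kmers(sequence), return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits              # shape: (1, num_labels)
probs = torch.softmax(logits, dim=-1)
print("predicted expression class probabilities:", probs.squeeze().tolist())
```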

Keywords: machine learning, classification and regression, gene circuit design, bidirectional transformers

Procedia PDF Downloads 48
11568 Understanding Innovation, Mentorship, and Motivation in Teams, a Design-Centric Approach for Undergraduates

Authors: K. Z. Tang, K. Ameek, K. Kuang

Abstract:

Rapid product development cycles and changing economic conditions compel businesses to find new ways to stay relevant and effective. One of the ways which many companies have adopted is to spur innovations within the various team-based units in the organization. It would be relevant and important to ensure our graduates are ready to excel in such evolving conditions within their professional eco-systems. However, it is not easy to understand the interplays of nurturing team innovation and improving students’ learning, in the context of engineering education. In this study, we seek to understand team innovation and explore ways to improve students’ performance and learning, via motivation and mentorship. Learning goals from a group of students are collected during a carefully designed two-week long summer programme to provide insights on the main themes, within the context of learning and working in a team.

Keywords: team innovation, mentorship, motivation, learning

Procedia PDF Downloads 271
11567 Construction of a Fusion Gene Carrying E10A and K5 with 2A Peptide-Linked by Using Overlap Extension PCR

Authors: Tiancheng Lan

Abstract:

E10A is a replication-defective adenovirus which carries the human endostatin gene to inhibit the growth of tumors. Kringle 5 (K5) has almost the same function as angiostatin in also inhibiting the growth of tumors, since they are both byproducts of the proteolytic cleavage of plasminogen. Tumor growth can be suppressed because both endostatin and K5 can restrain the angiogenesis process. Therefore, in order to improve the treatment effect on tumors, a 2A peptide is used to construct a fusion gene carrying both E10A and K5. Using a 2A peptide is an ideal strategy when a fusion gene is expressed because it can avoid many problems during the expression of more than one kind of protein. Overlap extension PCR is used to connect the 2A peptide with E10A and K5. The final construct of the fusion gene E10A-2A-K5 may provide a possible new method of anti-angiogenesis treatment with better expression performance.

Keywords: E10A, Kringle 5, 2A peptide, overlap extension PCR

Procedia PDF Downloads 136
11566 Intracellular Sphingosine-1-Phosphate Receptor 3 Contributes to Lung Tumor Cell Proliferation

Authors: Michela Terlizzi, Chiara Colarusso, Aldo Pinto, Rosalinda Sorrentino

Abstract:

Sphingosine-1-phosphate (S1P) is a membrane-derived bioactive phospholipid exerting a multitude of effects on respiratory cell physiology and pathology through five S1P receptors (S1PR1-5). Higher levels of S1P have been registered in a broad range of respiratory diseases, including inflammatory disorders and cancer, although its exact role is still elusive. Based on our previous study, in which we found that S1P/S1PR3 is involved in an inflammatory pattern via the activation of Toll-like Receptor 9 (TLR9), which is highly expressed on lung cancer cells, the main goal of the current study was to better understand the involvement of the S1P/S1PR3 pathway/signaling during lung carcinogenesis, taking advantage of a mouse model of first-hand smoke exposure and of carcinogen-induced lung cancer. We used human samples of Non-Small Cell Lung Cancer (NSCLC), a mouse model of first-hand smoking, Benzo(a)pyrene (BaP)-induced tumor-bearing mice, and A549 lung adenocarcinoma cells. We found that the intranuclear, but not the membrane, localization of S1PR3 was associated with the proliferation of lung adenocarcinoma cells, a mechanism that was correlated with human and mouse samples of smoke exposure and carcinogen-induced lung cancer, which were characterized by higher utilization of S1P. Indeed, the inhibition of the membrane S1PR3 did not alter tumor cell proliferation after TLR9 activation. Instead, in accordance with the nuclear localization of sphingosine kinase (SPHK) II, the enzyme responsible for catalyzing the last step of S1P synthesis, the inhibition of the kinase completely blocked the endogenous S1P-induced tumor cell proliferation. These results prove that endogenous TLR9-induced S1P can, on the one hand, favor pro-inflammatory mechanisms in the tumor microenvironment via the activation of cell surface receptors and, on the other, promote tumor progression via the nuclear S1PR3/SPHK II axis, highlighting a novel molecular mechanism that identifies S1P as one of the crucial mediators of lung carcinogenesis-associated inflammatory processes and that could provide differential therapeutic approaches, especially in non-responsive lung cancer patients.

Keywords: sphingosine-1-phosphate (S1P), S1P Receptor 3 (S1PR3), smoking-mice, lung inflammation, lung cancer

Procedia PDF Downloads 188