Search results for: binary number system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25474

13264 Bibliometric Analysis of the Research Progress on Graphene Inks from 2008 to 2018

Authors: Jean C. A. Sousa, Julio Cesar Maciel Santos, Andressa J. Rubio, Edneia A. S. Paccola, Natália U. Yamaguchi

Abstract:

A bibliometric analysis in the Web of Science database was used to identify the overall scientific output on graphene inks to date (2008 to 2018). The objective of this study was to evaluate the evolutionary trend of graphene inks research and to identify its main aspects, aiming to provide data that can guide future work. The contributions of different researchers, languages, thematic categories, periodicals, places of publication, institutes, funding agencies, cited articles, and applications were analyzed. The results revealed a growing number of annual publications; of the 258 papers found, 107 were included because they met the inclusion criteria. Three main applications were identified: synthesis and characterization, electronics, and surfaces. The most relevant research on graphene inks is summarized in this article, and graphene inks for electronic devices was the most prevalent theme according to the research trends during the studied period. It is estimated that this theme will remain prominent and will help direct future research in this area.
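The filter-and-count workflow behind such an analysis can be sketched in a few lines. The record fields, topics, and inclusion criteria below are hypothetical placeholders, not the study's actual criteria or Web of Science export format:

```python
from collections import Counter

# Hypothetical records, as might come from a database export:
# each record has a publication year and a topic used for inclusion screening.
records = [
    {"year": 2008, "topic": "synthesis"},
    {"year": 2012, "topic": "electronics"},
    {"year": 2015, "topic": "surfaces"},
    {"year": 2015, "topic": "unrelated"},
    {"year": 2018, "topic": "electronics"},
]

INCLUDED_TOPICS = {"synthesis", "electronics", "surfaces"}

# Keep only records meeting the inclusion criteria, then tally per year
included = [r for r in records if r["topic"] in INCLUDED_TOPICS]
per_year = Counter(r["year"] for r in included)
print(len(included))             # number of included papers
print(sorted(per_year.items()))  # annual publication counts
```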

Keywords: bibliometric, coating, nanomaterials, scientometrics

Procedia PDF Downloads 159
13263 Life Cycle Assessment of Mass Timber Structure, Construction Process as System Boundary

Authors: Mahboobeh Hemmati, Tahar Messadi, Hongmei Gu

Abstract:

Today, life cycle assessment (LCA) is a leading method for mitigating the environmental impacts arising from the building sector. In this paper, LCA is used to quantify the greenhouse gas (GHG) emissions during the construction phase of the largest mass timber residential structure in the United States, Adohi Hall, a 200,000-square-foot, 708-bed complex located on the campus of the University of Arkansas. The energy used for building operation is the most dominant source of emissions in the building industry. Recently, however, efforts have succeeded in increasing the emissions efficiency of building operation. As a result, attention has now shifted to embodied carbon, which is more noticeable in the building life cycle. Most studies, however, have focused on the manufacturing stage, and few have addressed the construction process to date. Specifically, little data is available on the environmental impacts associated with the construction of mass timber. This study therefore presents an assessment of the environmental impact of the construction processes of the real, newly built mass timber building mentioned above. The system boundary of this study covers modules A4 and A5 of the building LCA standard EN 15978: module A4 includes material and equipment transportation, and module A5 covers the construction and installation process. This research proceeds in two stages: first, quantifying the materials and equipment deployed in the building, and second, determining the embodied carbon associated with running equipment and with transporting construction materials to, and installing them on, the site. The Global Warming Potential (GWP) of the building is the primary metric considered in this research. The outcomes of this study provide a better understanding of emission hotspots during the construction process. Moreover, a comparative analysis of the mass timber construction process with that of a theoretically similar steel building will enable an effective assessment of the environmental efficiency of mass timber.
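As a rough illustration of how modules A4 and A5 aggregate into a GWP figure, transport emissions scale with tonne-kilometres and construction emissions with equipment operating hours. All quantities and emission factors below are invented placeholders, not the Adohi Hall data:

```python
# Hypothetical quantities and emission factors; real values would come from
# EPDs and site records, per EN 15978 modules A4 (transport) and A5 (construction).
transport = [  # (mass in tonnes, distance in km, kg CO2e per tonne-km)
    (1200.0, 350.0, 0.062),   # mass timber panels by truck
    (300.0, 120.0, 0.062),    # connectors and fittings by truck
]
equipment = [  # (hours of operation, kg CO2e per hour)
    (450.0, 18.5),   # crane
    (220.0, 12.0),   # telehandler
]

gwp_a4 = sum(m * d * ef for m, d, ef in transport)  # kg CO2e, module A4
gwp_a5 = sum(h * ef for h, ef in equipment)         # kg CO2e, module A5
total = gwp_a4 + gwp_a5
print(round(gwp_a4), round(gwp_a5), round(total))
```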

Keywords: construction process, GWP, LCA, mass timber

Procedia PDF Downloads 155
13262 E-Learning Platform for School Kids

Authors: Gihan Thilakarathna, Fernando Ishara, Rathnayake Yasith, Bandara A. M. R. Y.

Abstract:

E-learning is a crucial component of intelligent education, and even in the midst of a pandemic it is becoming increasingly important in the educational system. Several e-learning programs are accessible to students; here, we decided to create an e-learning framework for children. We have identified a few issues that teachers face with their online classes. When there are numerous students in an online classroom, how does a teacher recognize a student's focus on academics and below-the-surface behaviors? Some kids do not pay attention in class, others are napping, and the teacher is unable to keep track of each and every student. A key challenge in e-learning is online exams, because students can cheat easily during them; hence, exam proctoring is needed. We propose an automated online exam cheating detection method using a web camera. A further purpose of this project is to present an e-learning platform for math education that includes games for kids as an alternative teaching method for math students. The games are accessible via a web browser, with imagery drawn in a cartoonish style, helping students learn math through play. Everything in this day and age is moving towards automation, yet automatic answer evaluation is only available for MCQ-based questions; as a result, examiners have a difficult time evaluating theory solutions. The current system requires more manpower and takes a long time to evaluate responses, and two identical responses may be marked differently and receive two different grades. This application therefore employs machine learning techniques to provide an automatic evaluation of subjective responses, based on keywords provided to the computer along with the student input, resulting in a fair distribution of marks while saving time and manpower. We used deep learning, machine learning, image processing, and natural language processing technologies to develop these research components.
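A minimal sketch of the keyword-based scoring idea follows; the keywords and weights are invented for illustration, and the actual system layers machine learning on top of such features:

```python
def keyword_score(answer: str, keywords: dict[str, float]) -> float:
    """Score a free-text answer by the weighted fraction of expected
    keywords it contains (a simplified sketch of automatic evaluation)."""
    text = answer.lower()
    earned = sum(w for kw, w in keywords.items() if kw in text)
    return earned / sum(keywords.values())

# Hypothetical marking scheme for a short theory question
keywords = {"fraction": 2.0, "numerator": 1.0, "denominator": 1.0}
print(keyword_score("A fraction has a numerator on top.", keywords))  # 0.75
```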

Keywords: math, education games, e-learning platform, artificial intelligence

Procedia PDF Downloads 139
13261 Significant Factors in Agile Manufacturing and the Role of Product Architecture

Authors: Mehrnoosh Askarizadeh

Abstract:

The agile manufacturing concept was first coined by the Iacocca Institute in 1991 as a new manufacturing paradigm to provide and ensure competitiveness in the emerging global manufacturing order. Since then, a considerable number of studies have been conducted in this area. Reviewing these studies reveals that they mostly focus on agile manufacturing drivers, definitions, and characteristics, but few propose practical solutions to achieve agility. Agile manufacturing is recommended as a successful paradigm, after lean, for 21st-century manufacturing firms. This competitive concept has been developed in response to the continuous changes and uncertainties in today's business environment. To become an agile competitor, a manufacturing firm should focus on enriching its agility capabilities. These capabilities can be categorized into seven groups: proactiveness, customer focus, responsiveness, quickness, flexibility, basic competence, and partnership. A manufacturing firm aiming to achieve agility should first develop its own appropriate agility strategy, which prioritizes the required agility capabilities.

Keywords: agile manufacturing, product architecture, customer focus, responsiveness, quickness, flexibility, basic competence

Procedia PDF Downloads 506
13260 Numerical Modeling of Flow in USBR II Stilling Basin with End Adverse Slope

Authors: Hamidreza Babaali, Alireza Mojtahedi, Nasim Soori, Saba Soori

Abstract:

The hydraulic jump is one of the most effective means of energy dissipation in stilling basins, where energy is highly dissipated by the jump. An adverse slope at the end of the stilling basin increases both energy dissipation and the stability of the hydraulic jump. In this study, an adverse slope was added to the end of the United States Bureau of Reclamation (USBR) II stilling basin in the 1:40-scale hydraulic model of Nazloochay dam, and the flow in the basin was simulated using Flow-3D software. The numerical model was verified against experimental measurements of water depth in the stilling basin. The water level profile, Froude number, pressure, air entrainment, and turbulent dissipation were then investigated for a discharge of 300 m³/s using the k-ε and Re-Normalization Group (RNG) turbulence models. The results showed good agreement between the numerical and experimental models, indicating that the numerical model can be used to optimize stilling basins.
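The jump behavior a basin is designed around can be characterized by the inflow Froude number and the classical sequent-depth (Belanger) relation; the inflow depth and velocity below are hypothetical, not the Nazloochay model data:

```python
import math

def froude_number(velocity: float, depth: float, g: float = 9.81) -> float:
    """Froude number Fr = V / sqrt(g * y) for open-channel flow."""
    return velocity / math.sqrt(g * depth)

def sequent_depth(y1: float, fr1: float) -> float:
    """Sequent depth of a hydraulic jump (Belanger equation):
    y2 = (y1 / 2) * (sqrt(1 + 8 * Fr1^2) - 1)."""
    return y1 / 2.0 * (math.sqrt(1.0 + 8.0 * fr1 ** 2) - 1.0)

# Hypothetical supercritical inflow to the basin
y1 = 0.8    # depth, m
v1 = 12.0   # velocity, m/s
fr1 = froude_number(v1, y1)
print(round(fr1, 2), round(sequent_depth(y1, fr1), 2))
```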

Keywords: experimental and numerical modelling, end adverse slope, flow parameters, USBR II stilling basin

Procedia PDF Downloads 162
13259 Design of a Small and Medium Enterprise Growth Prediction Model Based on Web Mining

Authors: Yiea Funk Te, Daniel Mueller, Irena Pletikosa Cvijikj

Abstract:

Small and medium enterprises (SMEs) play an important role in the economy of many countries. When the overall world economy is considered, SMEs represent 95% of all businesses in the world, accounting for 66% of total employment. Existing studies show that the current business environment is highly turbulent and strongly influenced by modern information and communication technologies, forcing SMEs to face more severe challenges in maintaining their existence and expanding their business. To support SMEs in improving their competitiveness, researchers have recently turned their focus to applying data mining techniques to build risk and growth prediction models. However, the data used to assess risk and growth indicators is primarily obtained via questionnaires, which is very laborious and time-consuming, or is provided by financial institutions and is thus highly sensitive to privacy issues. Recently, web mining (WM) has emerged as a new approach to obtaining valuable insights into the business world. WM enables automatic and large-scale collection and analysis of potentially valuable data from various online platforms, including companies’ websites. While WM methods have been frequently studied to anticipate growth of sales volume for e-commerce platforms, their application to the assessment of SME risk and growth indicators is still scarce. Considering that a vast proportion of SMEs own a website, WM has great potential to reveal valuable information hidden in SME websites, which can further be used to understand SME risk and growth indicators, as well as to enhance current SME risk and growth prediction models. This study aims to develop an automated system that collects business-relevant data from the Web and predicts future growth trends of SMEs by means of WM and data mining techniques. The envisioned system should serve as an 'early recognition system' for future growth opportunities.
In an initial step, we examine how structured and semi-structured Web data from governmental or SME websites can be used to explain the success of SMEs. WM methods are applied to extract Web data in the form of additional input features for the growth prediction model. Data on SMEs provided by a large Swiss insurance company is used as ground truth (i.e., growth-labeled data) to train the growth prediction model. Different machine learning classification algorithms, such as the Support Vector Machine, Random Forest, and Artificial Neural Network, are applied and compared, with the goal of optimizing prediction performance. The results are compared to those from previous studies in order to assess the contribution of growth indicators retrieved from the Web to increasing the predictive power of the model.
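The model-comparison step might look like the following sketch, with synthetic data standing in for the proprietary insurance ground truth and web-mined features; scikit-learn is assumed to be available:

```python
# Sketch of comparing classifiers for growth prediction; X stands in for
# web-mined features and y for binary growth labels (both synthetic here).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

models = {
    "SVM": SVC(),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Neural Network": MLPClassifier(max_iter=2000, random_state=0),
}
for name, model in models.items():
    # 5-fold cross-validated accuracy as the comparison metric
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```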

Keywords: data mining, SME growth, success factors, web mining

Procedia PDF Downloads 248
13258 Technological and Economic Investigation of Concentrated Photovoltaic and Thermal Systems: A Case Study of Iran

Authors: Moloud Torkandam

Abstract:

Cities must be designed and built in a way that minimizes their need for fossil fuels. This principle was implicitly respected in earlier eras, as the modes of construction show. Perhaps only because of the great diversity of materials and new technologies in the contemporary era has this principle been forgotten in buildings. The question of optimizing energy consumption in buildings has attracted a great deal of attention in many countries, and in this way they have been able to cut energy consumption by up to 30 percent. Energy consumption in our country is remarkably higher than global standards, and the most important reason is the undesirable state of buildings from the standpoint of energy consumption. In addition to protecting natural and fuel resources for future generations, reducing the use of fossil energy may also bring about desirable outcomes such as a decrease in greenhouse gases (whose emissions cause global warming, the melting of polar ice, the rise in sea level, and the climatic changes of the planet Earth), a decrease in the destructive effects of contamination in residential complexes and especially urban environments, and progress toward national self-sufficiency, the country's independence, and the preservation of national capital. This research recognizes that, in this modern day and age, living sustainably is a prerequisite for ensuring a bright future and a high quality of life. In acquiring this living standard, we will maintain the functions and ability of our environment to serve and sustain our livelihoods. Electricity is now an integral part of modern life, a basic necessity. In the provision of electricity, we are committed to respecting the environment by reducing the use of fossil fuels through proven technologies that use local renewable and natural resources as their energy source.
This research is therefore concerned with alternative means of energy production, such as solar energy and concentrated photovoltaic and thermal (CPVT) systems.

Keywords: energy, photovoltaic, thermal system, solar energy, CPVT

Procedia PDF Downloads 68
13257 Organizational Learning, Job Satisfaction and Work Performance among Nurses

Authors: Rafia Rafique, Arifa Khadim

Abstract:

This research investigates the moderating role of job satisfaction in the relationship between organizational learning and work performance among nurses. A correlational research design was used. A non-probability purposive sampling technique was utilized to recruit a sample of 110 nurses from public hospitals situated in the city of Lahore. The construct of organizational learning was measured using a subscale of the Integrated Scale for Measuring Organizational Learning. Job satisfaction was measured with the Job Satisfaction Survey. Employee performance (task performance, contextual performance, and counterproductive work behavior) was assessed by the Individual Work Performance Questionnaire. Job satisfaction negatively moderates the relationship between organizational learning and counterproductive work behavior. Education has a significant positive relationship with organizational learning. Age, current hospital experience, marital satisfaction, and salary of the nurses have positive relationships, while number of children has a significant negative relationship, with counterproductive work behavior. These outcomes can be insightful in understanding the dynamics involved in work performance. Based on the results of this study, relevant solutions can be proposed to improve the work performance of nurses.

Keywords: counterproductive work behavior, nurses, organizational learning, work performance

Procedia PDF Downloads 424
13256 A Numerical Model for Simulation of Blood Flow in Vascular Networks

Authors: Houman Tamaddon, Mehrdad Behnia, Masud Behnia

Abstract:

An accurate study of blood flow requires an accurate vascular pattern and the geometrical properties of the organ of interest. Due to the complexity of vascular networks and poor accessibility in vivo, it is challenging to reconstruct the entire vasculature of any organ experimentally. The objective of this study is to introduce an innovative approach for the reconstruction of a full vascular tree from available morphometric data. Our method implements morphometric data on those parts of the vascular tree that are smaller than the resolution of medical imaging methods. This technique reconstructs the entire arterial tree down to the capillaries. Vessels greater than 2 mm are obtained from direct volume and surface analysis using contrast-enhanced computed tomography (CT). Vessels smaller than 2 mm are reconstructed from available morphometric and distensibility data and rearranged by applying Murray's law. Implementing morphometric data to reconstruct the branching pattern, while simultaneously applying Murray's law at every vessel bifurcation, leads to an accurate vascular tree reconstruction. The reconstruction algorithm generates the full arterial tree topography down to the first capillary bifurcation. The geometry of each order of the vascular tree is generated separately to minimize construction and simulation time. The node-to-node connectivity, along with the diameter and length of every vessel segment, is established, and order numbers are assigned according to the diameter-defined Strahler system. During the simulation, we used the averaged flow rate for each order to predict the pressure drop, and once the pressure drop was predicted, the flow rate was corrected to match the computed pressure drop for each vessel. The final results for 3 cardiac cycles are presented and compared to the clinical data.
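Murray's law, applied here at every bifurcation, states that the cube of the parent vessel radius equals the sum of the cubes of the daughter radii. A minimal sketch, with illustrative radii not taken from the morphometric data:

```python
def murray_daughter_radius(parent_r: float, other_daughter_r: float) -> float:
    """Murray's law: r_parent^3 = r_1^3 + r_2^3 at each bifurcation.
    Solve for the second daughter radius given the parent and one daughter."""
    return (parent_r ** 3 - other_daughter_r ** 3) ** (1.0 / 3.0)

# Hypothetical symmetric bifurcation: each daughter carries half the cube
parent = 1.0                    # radius, mm
d1 = 0.5 ** (1.0 / 3.0)         # ≈ 0.7937 mm
d2 = murray_daughter_radius(parent, d1)
print(round(d2, 4))             # ≈ 0.7937 (symmetric, as expected)
```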

Keywords: blood flow, morphometric data, vascular tree, Strahler ordering system

Procedia PDF Downloads 257
13255 FlexPoints: Efficient Algorithm for Detection of Electrocardiogram Characteristic Points

Authors: Daniel Bulanda, Janusz A. Starzyk, Adrian Horzyk

Abstract:

The electrocardiogram (ECG) is one of the most commonly used medical tests, essential for the correct diagnosis and treatment of the patient. While ECG devices generate a huge amount of data, only a small part of it carries valuable medical information. To deal with this problem, many compression algorithms and filters have been developed over the past years. However, the rapid development of new machine learning techniques poses new challenges. To address this class of problems, we created the FlexPoints algorithm, which searches for characteristic points on the ECG signal and ignores all other points that do not carry relevant medical information. The conducted experiments proved that the presented algorithm can significantly reduce the number of data points representing an ECG signal without losing valuable medical information. These sparse but essential characteristic points (flex points) can be a perfect input for modern machine learning models, which work much better using flex points as input instead of raw data or data compressed by many popular algorithms.
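A much-simplified stand-in for the flex-point idea keeps only the samples where the discrete curvature is large; the threshold and toy signal below are illustrative and this is not the FlexPoints algorithm itself:

```python
def flex_points(signal, threshold=0.5):
    """Keep indices where the discrete curvature (second difference)
    exceeds a threshold — a simplified stand-in for characteristic points."""
    kept = [0]  # always keep the first sample
    for i in range(1, len(signal) - 1):
        curvature = abs(signal[i - 1] - 2 * signal[i] + signal[i + 1])
        if curvature > threshold:
            kept.append(i)
    kept.append(len(signal) - 1)  # and the last sample
    return kept

# A crude QRS-like bump on a flat baseline
ecg = [0, 0, 0, 1, 5, 1, 0, 0, 0]
print(flex_points(ecg))  # [0, 2, 3, 4, 5, 6, 8] — the flat run is dropped
```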

Keywords: characteristic points, electrocardiogram, ECG, machine learning, signal compression

Procedia PDF Downloads 151
13254 Flexible Ethylene-Propylene Copolymer Nanofibers Decorated with Ag Nanoparticles as Effective 3D Surface-Enhanced Raman Scattering Substrates

Authors: Yi Li, Rui Lu, Lianjun Wang

Abstract:

With the rapid development of the chemical industry, the consumption of volatile organic compounds (VOCs) has increased extensively. In the production and application of VOCs, plenty of them are transferred to the environment. As a result, this has led to pollution problems not only in soil and ground water but also for human beings. Thus, it is important to develop a sensitive and cost-effective analytical method for trace VOC detection in the environment. Surface-enhanced Raman spectroscopy (SERS), one of the most sensitive optical analytical techniques, with rapid response, pinpoint accuracy, and noninvasive detection, has been widely used for ultratrace analysis. Based on plasmon resonance on the nanoscale metallic surface, SERS can even detect single molecules thanks to abundant nanogaps (i.e., 'hot spots') on the nanosubstrate. In this work, self-supported flexible silver nitrate (AgNO3)/ethylene-propylene copolymer (EPM) hybrid nanofibers were fabricated by electrospinning. After in-situ chemical reduction using ice-cold sodium borohydride as the reducing agent, numerous silver nanoparticles formed on the nanofiber surface. By adjusting the reduction time and AgNO3 content, the morphology and dimension of the silver nanoparticles could be controlled. According to the principles of solid-phase extraction, hydrophobic substances are more likely to partition into the hydrophobic EPM membrane in an aqueous environment, while water and other polar components are excluded from the analytes. Through this enrichment by the EPM fibers, the number of hydrophobic molecules located on the 'hot spots' generated by the criss-crossed nanofibers is greatly increased, which further enhances the SERS signal intensity. The as-prepared Ag/EPM hybrid nanofibers were first employed to detect a common SERS probe molecule (p-aminothiophenol), with a detection limit down to 10⁻¹² M, demonstrating excellent SERS performance. To further study the application of the fabricated substrate for monitoring hydrophobic substances in water, several typical VOCs, such as benzene, toluene, and p-xylene, were selected as model compounds. The results showed that the characteristic peaks of these target analytes in the mixed aqueous solution could be distinguished even at a concentration of 10⁻⁶ M after a multi-peak Gaussian fitting process, including C-H bending (850 cm⁻¹) and C-C ring stretching (1581 cm⁻¹, 1600 cm⁻¹) of benzene; C-H bending (844 cm⁻¹, 1151 cm⁻¹), C-C ring stretching (1001 cm⁻¹), and CH3 bending vibration (1377 cm⁻¹) of toluene; and C-H bending (829 cm⁻¹) and C-C stretching (1614 cm⁻¹) of p-xylene. The SERS substrate has the remarkable advantage of combining the enrichment capacity of EPM with the Raman enhancement of Ag nanoparticles. Meanwhile, the huge specific surface area resulting from electrospinning is beneficial for increasing the number of adsorption sites and promotes 'hot spot' formation. In summary, this work shows powerful potential for rapid, on-site, and accurate detection of trace VOCs using a portable Raman spectrometer.

Keywords: electrospinning, ethylene-propylene copolymer, silver nanoparticles, SERS, VOCs

Procedia PDF Downloads 152
13253 Overview and Post Damage Analysis of Nepal Earthquake 2015

Authors: Vipin Kumar Singhal, Rohit Kumar Mittal, Pavitra Ranjan Maiti

Abstract:

Damage analysis is one of the preliminary activities to be done after an earthquake, so as to enhance seismic building design technologies and prevent similar failures during future earthquakes. This research article investigates the damage patterns and most probable causes of failure by examining photographs of seven major buildings, evenly spread over the region, that collapsed or were damaged during the Mw 7.8 Nepal earthquake of 2015, which was followed by more than 400 aftershocks of Mw 4, one of which reached Mw 7.3. Over 250,000 buildings were damaged, and more than 9,000 people were injured in this earthquake. Photographs of these buildings were collected after the earthquake, the cause of failure was estimated, and the severity of damage and reparability of each structure were assessed. Based on these observations, it was concluded that the damage to reinforced concrete buildings was less than that to masonry structures. The number of damaged buildings was high near the Kathmandu region due to the high building density there. This type of damage analysis can be used as a cost-effective and quick method for damage assessment after earthquakes.

Keywords: Nepal earthquake, damage analysis, damage assessment, damage scales

Procedia PDF Downloads 358
13252 Transportation Accidents Mortality Modeling in Thailand

Authors: W. Sriwattanapongse, S. Prasitwattanaseree, S. Wongtrangan

Abstract:

Transportation accident mortality is a major problem that leads to the loss of human lives and to economic losses. The objective was to identify patterns of statistical modeling for estimating mortality rates due to transportation accidents in Thailand, using data from 2000 to 2009 taken from death certificates in the vital registration database. The numbers of deaths and mortality rates were computed, classified by gender, age, year, and region. There were 114,790 transportation accident deaths. The highest average age-specific transport accident mortality rate was 3.11 per 100,000 per year, in males in the Southern region, and the lowest was 1.79 per 100,000 per year, in females in the North-East region. Linear, Poisson, and negative binomial models were fitted, and the best model was chosen based on the analysis of deviance and AIC. The negative binomial model was clearly the most appropriate fit.
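The usual reason a negative binomial model beats a Poisson model for such counts is overdispersion (variance far exceeding the mean, which Poisson cannot accommodate). The check can be sketched directly; the counts below are hypothetical, not the Thai registry data:

```python
# Poisson assumes variance ≈ mean; accident-death counts typically show
# variance >> mean, motivating the negative binomial. Hypothetical
# annual death counts by region:
counts = [1120, 980, 1475, 1310, 860, 1520, 1990, 1240]

n = len(counts)
mean = sum(counts) / n
variance = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
dispersion = variance / mean
print(round(mean, 1), round(dispersion, 1))  # dispersion >> 1 ⇒ overdispersed
```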

Keywords: transportation accidents, mortality, modeling, analysis of deviance

Procedia PDF Downloads 232
13251 Single Event Transient Tolerance Analysis in 8051 Microprocessor Using Scan Chain

Authors: Jun Sung Go, Jong Kang Park, Jong Tae Kim

Abstract:

As semiconductor manufacturing technology evolves, single event transients become a more significant issue. A single event transient has a critical impact on both combinational and sequential logic circuits, so it is important to evaluate the soft error tolerance of a circuit at the design stage. In this paper, we present a soft error detection simulation using a scan chain. The simulation model generates a single event transient at a random location in the circuit and detects any resulting soft error during the execution of the test patterns. We verified this model by inserting a scan chain into an 8051 microprocessor implemented in 65 nm CMOS technology. While the test patterns generated by an ATPG program pass through the scan chain, we insert a single event transient and count the soft errors per sub-module. The experiments show that the soft error rate per cell area of the SFR module is 277% larger than that of the other modules.
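The fault-injection loop can be sketched in miniature: flip one captured bit in the scan-out and diff it against the golden response. The scan contents below are illustrative toy values; the study itself injects transients into a gate-level 8051 netlist under ATPG patterns:

```python
import random

def inject_transient(scan_bits, rng):
    """Flip one randomly chosen flip-flop value captured in the scan chain,
    emulating a single event transient latched by a sequential element."""
    i = rng.randrange(len(scan_bits))
    flipped = scan_bits.copy()
    flipped[i] ^= 1
    return flipped, i

def detect_soft_errors(expected, observed):
    """Count mismatches between the golden scan-out and the observed one."""
    return sum(e != o for e, o in zip(expected, observed))

rng = random.Random(42)
golden = [1, 0, 1, 1, 0, 0, 1, 0]
observed, where = inject_transient(golden, rng)
print(detect_soft_errors(golden, observed))  # exactly one injected flip → 1
```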

Keywords: scan chain, single event transient, soft error, 8051 processor

Procedia PDF Downloads 330
13250 Empirical Study of Running Correlations in Exam Marks: Same Statistical Pattern as Chance

Authors: Weisi Guo

Abstract:

It is well established that there may be running correlations in sequential exam marks, because students sit in the order of course registration patterns. As such, random, non-sequential sampling of exam marks is a standard recommended practice. Here, the paper examines a large body of exam data, stretching several years across different modules, to see the degree to which this is true. Using the real mark distribution as a generative process, it was found that randomly simulated data had no more sequential structure than the real data; that is to say, the running correlations one often observes are statistically identical to chance. Digging deeper, it was found that some high running correlations do involve students who share a common course history and make similar mistakes. However, at the statistical scale of a module question, the combined effect is statistically similar to a random shuffling of papers. As such, there may not be a need to take random samples of marks, but it remains good practice to mark papers in a random sequence to reduce repetitive marking bias and errors.
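The null comparison described above can be reproduced in a few lines: compute the lag-1 running correlation of a mark sequence, then compare it against the spread obtained from repeated shuffling. The marks here are synthetic draws, not real exam data:

```python
import random
import statistics

def lag1_correlation(marks):
    """Running (lag-1) correlation between consecutive marks."""
    x, y = marks[:-1], marks[1:]
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

rng = random.Random(0)
marks = [rng.gauss(60.0, 12.0) for _ in range(200)]  # i.i.d. marks
observed = lag1_correlation(marks)

# Null distribution: shuffling destroys any sequential ordering
null = []
for _ in range(500):
    rng.shuffle(marks)
    null.append(lag1_correlation(marks))
spread = statistics.stdev(null)
print(round(observed, 3), round(spread, 3))  # observed sits inside null spread
```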

Keywords: data analysis, empirical study, exams, marking

Procedia PDF Downloads 168
13249 Road Safety and Accident Prevention in Third World Countries: A Case Study of NH-7 in India

Authors: Siddegowda, Y. A. Sathish, G. Krishnegowda, T. M. Mohan Kumar

Abstract:

Road accidents are a human tragedy, involving great human suffering and monetary costs in terms of untimely deaths, injuries, and social problems. India has earned the dubious distinction of having the highest number of road accident fatalities in the world. Road safety is emerging as a major social concern around the world, and especially in India because of infrastructure project works. A case study was conducted on NH-7, which connects various major cities and industries. The study shows that most fatalities involve buses, trucks, and high-speed vehicles. The main causes of accidents are high traffic density, unrestricted speeds, use of mobile phones, lack of sign boards, on-road parking, restricted visibility, improper geometric design, road user characteristics, environmental aspects, social aspects, etc. Data analysis and preventive measures are discussed in this paper.

Keywords: accidents, environmental aspects, fatalities, geometric design, road user characteristics

Procedia PDF Downloads 239
13248 Detectability of Malfunction in Turboprop Engine

Authors: Tomas Vampola, Michael Valášek

Abstract:

On the basis of simulation-generated failure states of structural elements of a turboprop engine suitable for the business-jet class of aircraft, an algorithm for early prediction of damage or reduced functionality of structural elements of the engine is designed and verified against real data obtained at dynamometric testing facilities for aircraft engines. Based on an expanding database of experimentally determined data from temperature and pressure sensors during the operation of turboprop engines, this strategy is continually refined with the aim of using the minimum number of sensors to detect an inadmissible or deteriorated operating mode of specific structural elements of an aircraft engine. The assembled algorithm for early prediction of reduced functionality of the aircraft engine contributes significantly to the safety of air traffic and, to a large extent, to the economy of operation, with positive effects on reducing the energy demand of operation and eliminating adverse effects on the environment.

Keywords: detectability of malfunction, dynamometric testing, prediction of damage, turboprop engine

Procedia PDF Downloads 81
13247 Soft Robotic Exoskeletal Glove with Single Motor-Driven Tendon-Based Differential Drive

Authors: M. Naveed Akhter, Jawad Aslam, Omer Gillani

Abstract:

To aid and rehabilitate the increasing number of patients suffering from spinal cord injury (SCI) and stroke, a lightweight, wearable, and 3D-printable exoskeletal glove has been developed. Unlike previously developed metal- or fabric-based exoskeletons, this research presents a soft exoskeletal glove made of thermoplastic polyurethane (TPU). The drive mechanism consists of a single motor-driven antagonistic tendon to perform extension or flexion of the middle and index fingers. A tendon-based differential drive has been incorporated to allow the grasping of irregularly shaped objects. The design features easy 3D printability in TPU without the need for supports. The overall weight of the glove and the actuation unit is approximately 500 g. Performance of the glove was tested on a custom test bench with integrated load cells, and the grip strength was measured to be around 30 N per finger while grasping objects of irregular shape.

Keywords: 3D printable, differential drive, exoskeletal glove, rehabilitation, single motor driven

Procedia PDF Downloads 128
13246 Factors Affecting Human Resource Managers Information Behavior

Authors: Sevim Oztimurlenk

Abstract:

This is an exploratory study on the information behavior of human resource (HR) managers, conducted using a questionnaire survey and interviews. The data were gathered from 140 HR managers who are members of the People Management Association of Turkey (PERYÖN), and the 15 interviewees were chosen randomly from among those 140 survey participants. The goal of this exploratory study is to investigate the impact of several factors (i.e., gender, age, work experience, number of employees reporting, company size, and industry type) on HR managers' information behavior. More specifically, it examines whether there is a relationship between those factors and HR managers' information behavior in terms of what kinds of information sources they consult and review and whom they prefer to communicate with for information sharing. It also aims to identify additional factors influencing the information behavior of HR managers. The results show that, among the factors investigated, age and industry type are the two affecting the information behavior of HR managers in terms of information source, use, and sharing. Moreover, personality, technology, education, organizational culture, and culture are the top five of the 24 additional factors suggested by the HR managers who participated in this study.

Keywords: information behavior, information use, information source, information share, human resource managers

Procedia PDF Downloads 124
13245 Investigating Optical Properties of Unsaturated Polyurethane Matrix and Its Glass Fiber Composite Under Extreme Temperatures

Authors: Saad Ahmed, Sanjeev Khannaa

Abstract:

Glass fiber reinforced polymers are widely used in structural systems as load-bearing elements at both high and low temperatures. This investigation evaluates glass fiber reinforced unsaturated polyurethane under harsh conditions of changing temperature and moisture content, exploring how these parameters affect the optical properties of the polymer matrix and the composite. Using the hand layup method, the polyurethane resin was reinforced with E-glass fibers (15 vol. %) to manufacture the fiber-reinforced composite. The work includes the preparation of glass-like polyurethane resin sheets and the measurement of all light transmittance properties at high and very low temperatures and under wet conditions. All optical properties were retested to evaluate the degree of improvement or degradation. The results show that, compared to the unreinforced specimens, the reinforced composite retains fair optical properties at high temperatures and good performance at low temperatures.

Keywords: unsaturated polyurethane, extreme temperatures, light transmittance, haze number

Procedia PDF Downloads 134
13244 Teratogenic Effect of Bisphenol A in Development of Balb/C Mouse

Authors: Nazihe Sedighi, Mohsen Nokhbatolphoghaei

Abstract:

Bisphenol A (BPA) is a monomer used in the manufacture of polycarbonate plastics. Owing to properties such as transparency and heat and impact resistance, it is used widely in medicine, sports equipment, electronic components, and food containers. It is also used in the production of resins applied for lining cans. BPA is released from resins and polycarbonate when containers are heated or used continuously, from which it can enter the body. Several reports indicate the presence of BPA in the placenta, amniotic fluid, and the embryo itself. While researchers have investigated the teratogenic effect of BPA on embryos, very limited work has been done on its effects when applied from the early stages of development. In this study, the teratogenic effect of BPA was investigated from the earliest preimplantation stage (day zero) through day 15.5 of the development of Balb/C mouse embryos. After pregnancy was confirmed by observing the vaginal plug, pregnant mice were divided into five groups. The three experimental groups received 500, 750, and 1000 mg/kg/d of BPA orally according to body weight. The sham group was treated with sesame oil, which was used as the vehicle, and the control group remained intact. On day 18.5 of gestation, embryos were removed from the uterus. Half of the embryos, chosen at random, were fixed in Bouin's solution for tissue analysis; the other half were prepared for skeletal staining using Alizarin Red and Alcian Blue dyes. The results showed that the embryonic weight and the crown-rump length of embryos decreased significantly (P < 0.05) in all experimental groups compared to the control and sham groups. Skeletal abnormalities such as delayed ossification of the skull and limbs, as well as deviation of the backbone, were also observed. This research suggests that pregnant mothers need to be aware of the possible teratogenic effects of BPA at any stage of pregnancy, especially from the early to mid stages, and may need to avoid polycarbonate plastic containers for food or drink.

Keywords: bisphenol A, development, polycarbonate plastic, skeletal system, teratogenicity

Procedia PDF Downloads 281
13243 Oil Logistics for Refining to Northern Europe

Authors: Vladimir Klepikov

Abstract:

To develop programs to supply crude oil to North European refineries, it is necessary to take into account the refineries' locations, crude refining capacities, and the transport infrastructure capacity. Among the countries of the region, we include those having a marine boundary along the North Sea and the Baltic Sea (from France in the west to Finland in the east). The paper maps the geographic allocation of the refineries and evaluates the refineries' capacities for the region under review. The sustainable operation of refineries in the region is determined by the transportation system's capacity to supply crude oil to them, and this capacity is assessed here. The research covers the period 2005-2015 and uses the quantitative analysis method. The countries are classified by the refineries' aggregate capacities and the crude oil output on their territory. The crude oil output capacities in the region over the period under review are determined, and the capacities of the region's transportation system to supply locally produced crude oil to the refineries are revealed. The analysis suggested that imported raw materials are the main source of oil for the refineries in the region, and the main sources of crude oil supplies to North European refineries are reviewed. The change in refinery capacities for the group of countries and for each particular country, as well as the utilization of refinery capacities in the region over the period, was studied. The data suggest that the bulk of crude oil is supplied by marine and pipeline transport; the paper assesses the share of pipeline transport in the overall crude oil cargo flow. The refineries' production rates for the groups of countries under review and for each particular country were also studied.
Our study revealed a trend towards increased crude oil refining at the refineries of the region and a reduction in crude oil output. If this trend persists in the near future, the flow of imported crude oil and the utilization of the North European logistics infrastructure may increase. According to the study, the existing transport infrastructure in the region is able to handle the increasing imported crude oil flow.

Keywords: European region, infrastructure, oil terminal capacity, pipeline capacity, tanker draft

Procedia PDF Downloads 158
13242 Focusing of Technology Monitoring Activities Using Indicators

Authors: Günther Schuh, Christina König, Toni Drescher

Abstract:

One of the key factors for the competitiveness and market success of technology-driven companies is the timely provision of information about emerging technologies, changes in existing technologies, and relevant related changes in market structures and participants. Therefore, many companies conduct technology intelligence (TI) activities to ensure the early identification of appropriate technologies and other (weak) signals. One base activity of TI is technology monitoring, defined as the systematic tracking of developments within a specified topic of interest, as well as related trends, over a long period of time. Due to the very large number of dynamically changing parameters within the technological and market environment of a company, as well as their possible interdependencies, it is necessary to focus technology monitoring on specific indicators or other criteria that are able to point out technological developments and market changes. In addition to a literature review of existing approaches, which mainly propose patent-based indicators, this paper examines whether indicator systems from other fields, such as risk management or economic research, could be transferred to technology monitoring in order to enable efficient and focused technology monitoring for companies.

Keywords: technology forecasting, technology indicator, technology intelligence, technology management, technology monitoring

Procedia PDF Downloads 461
13241 Regression for Doubly Inflated Multivariate Poisson Distributions

Authors: Ishapathik Das, Sumen Sen, N. Rao Chaganty, Pooja Sengupta

Abstract:

Dependent multivariate count data occur in several research studies. Such data can be modeled by a multivariate Poisson or negative binomial distribution constructed using copulas. However, when some of the counts are inflated, that is, when the number of observations in some cells is much larger than in other cells, the copula-based multivariate Poisson (or negative binomial) distribution may not fit well and is not an appropriate statistical model for the data. There is a need to modify or adjust the multivariate distribution to account for the inflated frequencies. In this article, we consider the situation where the frequencies of two cells are higher than those of the other cells, and develop a doubly inflated multivariate Poisson distribution function using a multivariate Gaussian copula. We also discuss procedures for regression on covariates for doubly inflated multivariate count data. To illustrate the proposed methodologies, we present a real data set containing bivariate count observations with inflation in two cells. Several models and linear predictors with log link functions are considered, and we discuss maximum likelihood estimation of the unknown parameters of the models.
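As a minimal sketch of the doubly inflated construction (using an independence structure in place of the Gaussian copula, with illustrative parameter values rather than the paper's fitted ones), the bivariate pmf adds extra point masses at the two inflated cells and downweights the base product-Poisson component so the distribution still sums to one:

```python
import math

def pois_pmf(k, lam):
    # Univariate Poisson pmf, evaluated in log form for numerical stability
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

def doubly_inflated_pmf(y, lam1, lam2, p1, p2, cell1=(0, 0), cell2=(1, 1)):
    """Doubly inflated bivariate Poisson pmf (independence stand-in for the
    Gaussian copula): extra mass p1 at cell1 and p2 at cell2, the remaining
    1 - p1 - p2 distributed as a product of Poissons."""
    y1, y2 = y
    p = (1 - p1 - p2) * pois_pmf(y1, lam1) * pois_pmf(y2, lam2)
    if y == cell1:
        p += p1
    if y == cell2:
        p += p2
    return p

# In a regression setting the means follow a log link, lam = exp(x'beta);
# here we simply check that the pmf sums to one over an effectively
# complete support for lam1 = 2, lam2 = 3.
total = sum(doubly_inflated_pmf((i, j), 2.0, 3.0, 0.15, 0.10)
            for i in range(60) for j in range(60))
```

The inflated cells carry visibly more mass than the base Poisson component alone would assign them, which is exactly the feature the copula-based model without inflation fails to capture.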

Keywords: copula, Gaussian copula, multivariate distributions, inflated distributions

Procedia PDF Downloads 147
13240 Brain Connectome of Glia, Axons, and Neurons: Cognitive Model of Analogy

Authors: Ozgu Hafizoglu

Abstract:

An analogy is an essential tool of human cognition that connects diffuse and diverse systems through physical, behavioral, and principal relations essential to learning, discovery, and innovation. The Cognitive Model of Analogy (CMA) creates patterns of pathways to transfer information within and between domains in science, just as happens in the brain. The connectome of the brain shows how the brain operates with mental leaps between domains and mental hops within domains, and how the analogical reasoning mechanism operates. This paper presents the CMA as an evolutionary approach to science, technology, and life. The model addresses the challenges of deep uncertainty about the future, emphasizing the need for flexibility so that the reasoning methodology can adapt to changing conditions in the new era, especially post-pandemic. We show how to draw an analogy to scientific research in order to discover new systems that reveal the fractal schema of analogical reasoning within and between systems, as within and between brain regions. The problem-solving process is divided into distinct phases: stimulus, encoding, mapping, inference, and response. Based on brain research to date, the model is related to brain activation in each of these phases, with an emphasis on better visualizing the brain's mechanism in the macro context (brain and spinal cord) and the micro context (glia and neurons), relative to the matching conditions of analogical reasoning: relational information, encoding, mapping, inference and response processes, and verification of perceptual responses in four-term analogical reasoning. Finally, we relate these terms to mental leaps, mental maps, mental hops, and mental loops to make the mental model of the CMA clear.

Keywords: analogy, analogical reasoning, brain connectome, cognitive model, neurons and glia, mental leaps, mental hops, mental loops

Procedia PDF Downloads 156
13239 Multiphysics Coupling Between Hypersonic Reactive Flow and Thermal Structural Analysis with Ablation for the TPS of Space Launchers

Authors: Margarita Dufresne

Abstract:

This study is devoted to the development of a thermal protection system (TPS) for small reusable space launchers; the SIRIUS design is used for the S1 prototype. Multiphysics coupling between the hypersonic reactive flow and the thermo-structural analysis, with and without ablation, is provided by STAR-CCM+ and COMSOL Multiphysics, and by FASTRAN and ACE+. The flow around hypersonic flight vehicles is governed by the interaction of multiple shocks and the interaction of shocks with boundary layers; these interactions can have a very strong impact on the aeroheating experienced by the flight vehicle. Real-gas effects imply gas states in equilibrium and non-equilibrium. The Mach number ranges from 5 to 10 for first-stage flight. The goal of this effort is to validate the iterative coupling of the hypersonic physics models in STAR-CCM+ and FASTRAN with COMSOL Multiphysics and ACE+. COMSOL Multiphysics and ACE+ are used for the thermal structural analysis to simulate conjugate heat transfer, with conduction, free convection, and radiation driven by the heat flux from the hypersonic flow. The reactive simulations involve an air chemistry model of five species: N, N2, NO, O, and O2. Seventeen chemical reactions, involving dissociation and recombination, are included in the Dunn/Kang mechanism, and forward reaction rate coefficients based on a modified Arrhenius equation are computed for each reaction. In the structured case, the reactive equations are solved with a second-order numerical scheme obtained by a MUSCL (Monotone Upstream-centered Schemes for Conservation Laws) extrapolation process, with the coupled inviscid flux evaluated by AUSM+ flux-vector splitting. The MUSCL third-order scheme in STAR-CCM+ provides third-order spatial accuracy, except in the vicinity of strong shocks where, due to limiting, the spatial accuracy is reduced to second order, and provides reduced dissipation compared to the second-order discretization scheme.
The initial unstructured mesh is refined using a pressure-gradient technique for the shock/shock interaction test case. The turbulence model suggested by NASA is k-omega SST with a1 = 0.355 and the QCR (quadratic) constitutive option; k and omega are specified explicitly in the initial conditions and in regions as k = 1E-6*Uinf^2 and omega = 5*Uinf/(mean aerodynamic chord or characteristic length). Modelling practices for hypersonic flow include an automatically coupled solver, adaptive mesh refinement to capture and refine the shock front, the advancing-layer mesher, and a larger prism-layer thickness to capture the shock front on blunt surfaces. Temperatures range from 300 K to 30,000 K and pressures from 1e-4 to 100 atm. FASTRAN and ACE+ are coupled to provide a high-fidelity solution for the hot hypersonic reactive flow with conjugate heat transfer. The results of both approaches agree with the CIRCA wind tunnel results.
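The modified Arrhenius form used for the forward reaction rate coefficients, k_f = A * T^eta * exp(-Ta/T), is straightforward to evaluate. The sketch below uses illustrative coefficients for an O2 dissociation reaction (hypothetical values of Park-mechanism magnitude, not the actual Dunn/Kang set):

```python
import numpy as np

def forward_rate(T, A, eta, Ta):
    """Modified Arrhenius forward rate coefficient k_f = A * T**eta * exp(-Ta/T),
    where Ta is the activation temperature (activation energy over the gas
    constant), in kelvin."""
    return A * T**eta * np.exp(-Ta / T)

# Illustrative coefficients for O2 + M -> 2 O + M (hypothetical, not the
# published Dunn/Kang values); Ta ~ 59500 K reflects the O2 dissociation energy.
T = np.array([300.0, 5000.0, 10000.0])   # K, sampling the 300-30,000 K range
kf = forward_rate(T, A=3.6e18, eta=-1.0, Ta=59500.0)
```

The strong exponential dependence dominates the mildly negative temperature exponent, so the dissociation rate is negligible at 300 K and rises steeply through the shock-layer temperature range.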

Keywords: hypersonic, first stage, high speed compressible flow, shock wave, aerodynamic heating, conjugate heat transfer, conduction, free convection, radiation, fastran, ace+, comsol multiphysics, star-ccm+, thermal protection system (tps), space launcher, wind tunnel

Procedia PDF Downloads 51
13238 Solution of the Nonrelativistic Radial Wave Equation of Hydrogen Atom Using the Green's Function Approach

Authors: F. U. Rahman, R. Q. Zhang

Abstract:

This work aims to develop a systematic numerical technique that can be easily extended to the many-body problem. The Lippmann-Schwinger equation (the integral form of the Schrodinger wave equation) is solved for the nonrelativistic radial wave of the hydrogen atom using an iterative integration scheme. As the unknown wave function appears on both sides of the Lippmann-Schwinger equation, an approximate wave function is used to start the solution. The Green's function is obtained by the method of Laplace transform for the radial wave equation with the potential term excluded. Using the Lippmann-Schwinger equation, the product of the approximate wave function, the Green's function, and the potential term is integrated iteratively. Finally, the wave function is normalized and plotted against the standard radial wave for comparison. The computed wave function converges to the standard wave function as the number of iterations increases. Results are verified for the first fifteen states of the hydrogen atom. The method is efficient and consistent and can be applied to complex systems in the future.
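A minimal numerical sketch of such a scheme for the hydrogen 1s state (atomic units; the uniform grid and rectangle-rule quadrature are simplifying assumptions, not the paper's exact discretization): the Green's function of d^2/dr^2 - kappa^2 with u(0) = u(inf) = 0 is G(r, r') = -sinh(kappa*r<) * exp(-kappa*r>) / kappa, and repeated application of the Lippmann-Schwinger integral drives a crude trial function toward the standard radial wave u(r) = 2*r*exp(-r):

```python
import numpy as np

# Uniform radial grid (atomic units); 1s energy E = -1/2, so kappa = sqrt(-2E) = 1
n, rmax = 1500, 25.0
dr = rmax / n
r = dr * np.arange(1, n + 1)
kappa = 1.0

# Green's function of d^2/dr^2 - kappa^2 with u(0) = 0, u(inf) = 0
rl = np.minimum.outer(r, r)   # r<  (smaller of r, r')
rg = np.maximum.outer(r, r)   # r>  (larger of r, r')
G = -np.sinh(kappa * rl) * np.exp(-kappa * rg) / kappa

# Radial equation u'' - kappa^2 u = 2 V u with V(r) = -1/r, so the iteration is
# u_{k+1}(r) = integral of G(r, r') * 2 V(r') * u_k(r') dr'
W = -2.0 / r
K = G * dr                    # rectangle-rule quadrature folded into the kernel
u = r * np.exp(-2.0 * r)      # crude initial guess with the right boundary behavior
for _ in range(50):
    u = K @ (W * u)
    u /= np.sqrt(np.sum(u**2) * dr)   # renormalize each sweep

u_exact = 2.0 * r * np.exp(-r)        # normalized standard 1s radial wave
err = np.max(np.abs(u - u_exact))     # residual from discretization only
```

The iteration behaves like a power method on the integral kernel, whose dominant eigenvalue at E = -1/2 corresponds exactly to the 1s state, so the normalized iterate converges to the standard wave up to quadrature error.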

Keywords: Green’s function, hydrogen atom, Lippmann-Schwinger equation, radial wave

Procedia PDF Downloads 381
13237 Healthy Feeding and Drinking Troughs for Profitable Intensive Deep-Litter Poultry Farming

Authors: Godwin Ojochogu Adejo, Evelyn UnekwuOjo Adejo, Sunday UnenwOjo Adejo

Abstract:

The mainstream contemporary approach to controlling the impact of diseases among poultry birds relies largely on curative measures, administering drugs to infected birds. Often, as observed in the deep-litter poultry farming system, entire flocks, including uninfected birds, receive treatment they do not need. Such unguarded use of chemical drugs and antibiotics has led to wastage and to the accumulation of chemical residues in poultry products, with associated health hazards to humans. However, wanton and frequent drug usage in poultry is avoidable if feeding and drinking equipment is designed to curb infection transmission among birds. Using toxicological assays as a guide, and with efficiency and simplicity in view, two newly field-tested and recently patented devices, the 'healthy liquid drinking trough (HDT)' and the 'healthy feeding trough (HFT)', were designed to systematically eliminate contamination of the feeding and drinking channels, thereby curbing the widespread transmission of infections in the intensive deep-litter poultry farming system. Used together, they automatically and drastically reduced both the amount and the frequency of antibiotic use in poultry by over 50%. Additionally, they improved feed and water utilization and eliminated wastage by over 80%, reduced labour by over 70%, reduced production cost by about 15%, and reduced chemical residues in poultry meat and eggs by over 85%. These new and cheap technologies, which require no energy input, are likely to improve the safety of poultry products for consumers' health, increase marketability locally and for export, and increase output and profit, especially among poultry farmers and poor people who keep poultry or rely on poultry products in developing countries.

Keywords: healthy, trough, toxicological, assay-guided, poultry

Procedia PDF Downloads 138
13236 Analysis and Prediction of Fine Particulate Matter in the Air Environment for 2007-2020 in Bangkok Thailand

Authors: Phawichsak Prapassornpitaya, Wanida Jinsart

Abstract:

Daily PM₁₀ and PM₂.₅ monitoring data from 2007 to 2017 were analyzed to provide baseline data for predicting air pollution in Bangkok over the period 2018-2020. Statistical models based on the Autoregressive Integrated Moving Average (ARIMA) approach were used to evaluate pollution trends. The predicted concentrations were tested by the root mean square error (RMSE) and the index of agreement (IOA). The evolution of traffic-related PM₂.₅ and PM₁₀ was studied in association with regulatory controls and emission standard changes. The particulate matter emission factors of diesel vehicles decreased as higher Euro standards were applied, so ambient air pollution levels were expected to decrease. However, the Bangkok smog episode of February 2018, with a temperature inversion, caused high PM₂.₅ concentrations in the air environment of Bangkok. The impact of traffic pollutants depended upon the emission sources, temperature variations, and meteorological conditions.
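As a hedged illustration of the evaluation metrics, the sketch below fits a pure-NumPy AR(1) model standing in for the full ARIMA fit, on a synthetic seasonal series standing in for the Bangkok PM₁₀ record (all values hypothetical), and scores the forecasts with RMSE and Willmott's index of agreement:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic daily PM10-like series (hypothetical stand-in for the 2007-2017 data)
t = np.arange(3 * 365)
pm10 = 45 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 5, t.size)
train, test = pm10[:-90], pm10[-90:]

# Minimal AR(1) stand-in for the ARIMA fit: series mean plus lag-1 coefficient
mu = train.mean()
x = train - mu
phi = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])

# h-step-ahead forecasts decay geometrically from the last value toward the mean
h = np.arange(1, test.size + 1)
pred = mu + phi**h * x[-1]

# Root mean square error (RMSE) and Willmott's index of agreement (IOA)
rmse = np.sqrt(np.mean((test - pred) ** 2))
obar = test.mean()
ioa = 1 - np.sum((test - pred) ** 2) / np.sum(
    (np.abs(pred - obar) + np.abs(test - obar)) ** 2)
```

By the triangle inequality the IOA always lies in [0, 1], with 1 indicating perfect agreement, which is what makes it a convenient companion metric to the scale-dependent RMSE.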

Keywords: fine particulate matter, ARIMA, RMSE, Bangkok

Procedia PDF Downloads 259
13235 Transmission Line Protection Challenges under High Penetration of Renewable Energy Sources and Proposed Solutions: A Review

Authors: Melake Kuflom

Abstract:

European power networks use multiple overhead transmission lines to construct a highly duplicated system that delivers reliable and stable electrical energy to the distribution level. The transmission line protection schemes applied in the existing GB transmission network are normally independent unit differential and time-stepped distance protection, referred to as main-1 and main-2 respectively, with overcurrent protection as a backup. The increasing penetration of renewable energy sources, commonly referred to as 'weak sources', into the power network has resulted in a decline in fault levels. Traditionally, the fault level of the GB transmission network has been strong; hence the fault current contribution has been more than sufficient to ensure the correct operation of the protection schemes. However, numerous conventional coal and nuclear generators have been, or are about to be, shut down due to the societal requirement for CO2 emission reduction. This has reduced the fault level on some transmission lines, and therefore adaptive transmission line protection is required. Generally, greater utilization of renewable energy sources generated from wind or direct solar energy reduces CO2 emissions and can increase system security and reliability, but it reduces the fault level, which has an adverse effect on protection. Consequently, the effectiveness of conventional protection schemes under low fault levels needs to be reviewed, particularly for future GB transmission network operating scenarios. This paper evaluates the transmission line protection challenges under high penetration of renewable energy sources and provides alternative viable protection solutions based on the problems observed. The paper considers renewable energy sources (RES) based on fully rated converter technology. The DIgSILENT PowerFactory software tool is used to model the network.
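The fault-level concern can be shown with a minimal per-unit sketch (all impedance, converter-limit, and pickup values below are illustrative assumptions, not GB network data): replacing a strong synchronous source with a current-limited converter-interfaced source can leave the fault current below a backup overcurrent pickup, so the relay never operates.

```python
# Three-phase fault current seen by a relay, per-unit sketch
V = 1.0                        # pre-fault voltage, pu
Z_source, Z_line = 0.05, 0.15  # illustrative source and line impedances, pu

I_strong = V / (Z_source + Z_line)  # synchronous source: 5.0 pu fault current
I_weak = min(1.2, I_strong)         # converter source limited to ~1.2 pu

pickup = 2.0                        # illustrative overcurrent pickup, pu
trips_strong = I_strong > pickup    # backup overcurrent operates
trips_weak = I_weak > pickup        # relay fails to detect the fault
```

The same arithmetic explains why distance and differential elements, which do not rely solely on current magnitude, are preferred as the main protections, while purely magnitude-based backup settings need review as fault levels fall.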

Keywords: fault level, protection schemes, relay settings, relay coordination, renewable energy sources

Procedia PDF Downloads 187