Search results for: Comparison Analysis of Science & Technology Council (NSTC)
34635 Reaching New Levels: Using Systems Thinking to Analyse a Major Incident Investigation
Authors: Matthew J. I. Woolley, Gemma J. M. Read, Paul M. Salmon, Natassia Goode
Abstract:
The significance of high-consequence workplace failures in construction continues to resonate, with a combined average of 12 fatal incidents occurring daily across Australia, the United Kingdom, and the United States. Within the Australian construction domain, more than 35 serious, compensable injury incidents are reported daily. These alarming figures, together with the continued occurrence of fatal and serious occupational injury incidents globally, suggest that existing approaches to incident analysis may not be achieving the required injury prevention outcomes. One reason may be that incident analysis methods used in construction have not kept pace with advances in the field of safety science and are not uncovering the full range of system-wide contributory factors required to achieve optimal levels of construction safety performance. Another reason underpinning this global issue may be the absence of information surrounding the construction operating and project delivery system. For example, it is not clear who shares the responsibility for construction safety in different contexts. To respond to this issue, a first-of-its-kind (to the authors' best knowledge) control structure model of the construction industry is presented and then used to analyse a fatal construction incident. The model was developed by applying and extending the Systems Theoretic Accident Model and Processes (STAMP) method to hierarchically represent the actors, constraints, feedback mechanisms, and relationships involved in managing construction safety performance. The Causal Analysis based on Systems Theory (CAST) method was then used to identify the control and feedback failures involved in the fatal incident. The conclusions from the Coronial investigation into the event are compared with the findings stemming from the CAST analysis.
The CAST analysis highlighted additional issues across the construction system that were not identified in the coroner's recommendations, suggesting there is a potential benefit in applying a systems theory approach to incident analysis in construction. The findings demonstrate the utility of applying systems theory-based methods to the analysis of construction incidents. Specifically, this study shows the utility of the construction control structure and the potential benefits for project leaders, construction entities, regulators, and construction clients in controlling construction performance.
Keywords: construction project management, construction performance, incident analysis, systems thinking
Procedia PDF Downloads 131
34634 Obstacle Classification Method Based on 2D LIDAR Database
Authors: Moohyun Lee, Soojung Hur, Yongwan Park
Abstract:
This paper proposes a method that uses only a LIDAR system to classify an obstacle and determine its type by establishing a database of obstacle classes based on LIDAR data. The existing LIDAR system, in recognising obstructions for an autonomous vehicle, has advantages in terms of accuracy and a shorter recognition time. However, it was difficult to determine the type of obstacle, and therefore accurate path planning based on the obstacle type was not possible. To overcome this problem, a method of classifying the obstacle type based on the existing LIDAR and the width of the obstacle material was proposed. However, width measurement alone was not sufficient to improve accuracy. In this research, the width data were used for a first classification; a database of LIDAR intensity data for four major obstacle materials on the road was created; the LIDAR intensity data of actual obstacle materials were compared against this database; and the obstacle type was determined by finding the entry with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data quality declined in comparison to 3D LIDAR, it was possible to classify obstacle materials using 2D LIDAR.
Keywords: obstacle, classification, database, LIDAR, segmentation, intensity
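The two-stage classification described above, a coarse width gate followed by database matching of intensity profiles, can be sketched as follows. The material names, reference profiles, similarity measure (nearest profile by Euclidean distance), and width threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical intensity database: mean 2D-LIDAR intensity profiles for four
# road obstacle materials (names and values are illustrative placeholders).
INTENSITY_DB = {
    "metal":    np.array([0.90, 0.85, 0.88, 0.92]),
    "plastic":  np.array([0.40, 0.42, 0.38, 0.41]),
    "wood":     np.array([0.55, 0.60, 0.58, 0.57]),
    "concrete": np.array([0.70, 0.68, 0.72, 0.69]),
}

def classify_obstacle(width_m, intensity_profile, width_limit=3.0):
    """Stage 1: screen by measured width; stage 2: match the measured
    intensity profile to the closest database entry."""
    if width_m > width_limit:        # too wide to be a discrete obstacle
        return "wall_or_barrier"
    # Nearest reference profile by Euclidean distance (highest similarity)
    dists = {material: float(np.linalg.norm(intensity_profile - ref))
             for material, ref in INTENSITY_DB.items()}
    return min(dists, key=dists.get)
```

A measured profile close to the "metal" reference would then be labelled accordingly, while an over-wide return is rejected at the first stage.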
Procedia PDF Downloads 349
34633 The Relationship between Market Orientation, Human Resource Management, Adoption of Information Communication Technology, Performance of Small and Medium Enterprises and Mediating Cash Management
Authors: Azizah Hashim, Rohana Ngah
Abstract:
The national economic transformation agenda is aimed at making Malaysia a high-income developed nation with a knowledge-based economy by 2020. To achieve this national agenda, the country needs to further strengthen its economic development, growth, and well-being. Therefore, this study aspires to examine the relationships between market orientation, human resource management, adoption of information communication technology, and SME performance, with cash management as a mediator. The study employs quantitative approaches. Questionnaires will be distributed to managers and owners in the service sector. The data collected will be analyzed using SPSS and structural equation modelling. Resource-Based Theory (RBT) is adopted as an integral part of the management literature that explains the performance of organizations through building resources and implementing strategies.
Keywords: small medium enterprises (SMEs), market orientation, human resource management, adoption of information communication technology
Procedia PDF Downloads 277
34632 Engineering Design of a Chemical Launcher: An Interdisciplinary Design Activity
Authors: Mei Xuan Tan, Gim-Yang Maggie Pee, Mei Chee Tan
Abstract:
Academic performance, in the form of scoring high grades in enrolled subjects, is not the only significant trait in achieving success. Engineering graduates with experience in working on hands-on projects in a team setting are highly sought after in industry upon graduation. Such projects are typically real-world problems that require the integration and application of knowledge and skills from several disciplines. In a traditional university setting, subjects are taught in a silo manner with no cross-participation from other departments or disciplines. This may lead to knowledge compartmentalization, and students may be unable to understand and connect the relevance and applicability of the subject. University instructors thus see this integration across disciplines as a challenging task as they aim to better prepare students for understanding and solving problems in work or future studies. To improve students' academic performance and to cultivate skills such as critical thinking, there has been a gradual uptake of active learning approaches in introductory science and engineering courses, where lecturing is traditionally the main mode of instruction. This study discusses the implementation and experience of a hands-on, interdisciplinary project that involves all four core subjects taught during the term at the Singapore University of Technology and Design (SUTD). At SUTD, an interdisciplinary design activity, named 2D, is integrated into the curriculum to help students reinforce the concepts learnt. A student enrolled in SUTD experiences his or her first 2D in Term 1. This activity, which spans one week in Week 10 of Term 1, highlights the application of chemistry, physics, mathematics, and the humanities, arts and social sciences (HASS) in designing an engineering product solution. The activity theme for Term 1 2D revolved around "work and play".
Students, in teams of 4 or 5, used a scaled-down model of a chemical launcher to launch a projectile across the room. It involved a small chemical combustion reaction between ethanol (a highly volatile fuel) and oxygen. This reaction generated a sudden, large increase in gas pressure in a closed chamber, resulting in rapid gas expansion and ejection of the projectile out of the launcher. Students discussed and explored the meaning of play in their lives in the HASS class, while the engineering aspects of a combustion system launching an object, with the underlying principles of energy conversion and projectile motion, were revisited during the chemistry and physics classes, respectively. Numerical solutions for the distance travelled by the projectile launched by the chemical launcher, taking into account drag forces, were developed during the mathematics classes. At the end of the activity, students had developed skills in report writing, data collection, and analysis. Specific to this 2D activity, students gained an understanding and appreciation of the application and interdisciplinary nature of science, engineering, and HASS. More importantly, students were exposed to design and problem solving, where human interaction and discussion are important yet challenging in a team setting.
Keywords: active learning, collaborative learning, first year undergraduate, interdisciplinary, STEAM
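A minimal numerical solution of the kind developed in the mathematics classes, projectile range under quadratic air drag integrated by the explicit Euler method, could look like the sketch below. The projectile mass, drag coefficient, and cross-sectional area are hypothetical placeholders, not the activity's actual values:

```python
import math

def projectile_range(v0, angle_deg, mass=0.05, drag_coeff=0.47,
                     area=1.3e-3, rho=1.2, dt=1e-4):
    """Horizontal distance travelled by a projectile with quadratic drag,
    F_drag = 0.5 * rho * Cd * A * |v| * v, integrated with explicit Euler.
    All physical parameters are illustrative assumptions."""
    k = 0.5 * rho * drag_coeff * area
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:                      # integrate until the projectile lands
        v = math.hypot(vx, vy)
        ax = -(k / mass) * v * vx        # drag opposes velocity
        ay = -9.81 - (k / mass) * v * vy
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
    return x
```

Because drag always removes kinetic energy, the computed range stays below the drag-free value v0² sin(2θ)/g, which gives a quick sanity check on the integration.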
Procedia PDF Downloads 122
34631 Exploration of RFID in Healthcare: A Data Mining Approach
Authors: Shilpa Balan
Abstract:
Radio Frequency Identification, popularly known as RFID, is used to automatically identify and track tags attached to items. This study focuses on the application of RFID in healthcare. The adoption of RFID in healthcare is a crucial technology for patient safety and inventory management. Data from RFID tags are used to identify the locations of patients and inventory in real time. Medical errors are thought to be a prominent cause of loss of life and injury, and a major advantage of RFID application in the healthcare industry is the reduction of medical errors. The healthcare industry has generated huge amounts of data. By discovering patterns and trends within the data, big data analytics can help improve patient care and lower healthcare costs. The increasing number of research publications leading to innovations in RFID applications shows the importance of this technology. This study explores the current state of RFID research in healthcare using a text mining approach. No study has yet examined the current state of RFID research in healthcare using a data mining approach. In this study, related articles on RFID were collected from healthcare journals and news articles, spanning the years 2000 to 2015. Significant keywords on the topic of focus were identified and analyzed using open-source data analytics software such as RapidMiner. These analytical tools help extract pertinent information from massive volumes of data. The main benefits of adopting RFID technology in healthcare include tracking medicines and equipment, upholding patient safety, and improving security. The real-time tracking features of RFID allow for enhanced supply chain management. By using big data productively, healthcare organizations can gain significant benefits. Big data analytics in healthcare enables improved decisions by extracting insights from large volumes of data.
Keywords: RFID, data mining, data analysis, healthcare
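The keyword-identification step of such a text-mining workflow can be sketched with simple term-frequency counting, as below. The stopword list and sample sentences are illustrative only; a real study would use a richer pipeline such as the one RapidMiner provides:

```python
import re
from collections import Counter

# Tiny illustrative stopword list, not a production lexicon.
STOPWORDS = {"the", "and", "of", "in", "to", "a", "is", "for", "with", "on"}

def top_keywords(documents, n=5):
    """Return the n most frequent significant terms across a corpus,
    mirroring the frequency analysis behind keyword extraction."""
    counts = Counter()
    for doc in documents:
        for token in re.findall(r"[a-z]+", doc.lower()):
            if token not in STOPWORDS and len(token) > 2:
                counts[token] += 1
    return [word for word, _ in counts.most_common(n)]
```

On a handful of RFID-related snippets, the dominant terms ("rfid", "patient", "safety", and so on) surface immediately, which is the signal such studies aggregate over thousands of articles.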
Procedia PDF Downloads 233
34630 Bioelectrochemical System: An Alternative Technology for Metal Removal from Industrial Wastewater and Factors Affecting Its Efficiency
Authors: A. G. More
Abstract:
Bioelectrochemical system (BES) is an alternative technology for chromium Cr(VI) removal from industrial wastewater that overcomes the existing drawbacks of high chemical and energy consumption of conventional metal removal technologies. A well-developed anaerobic sludge was cultivated in the laboratory and used in a batch study of the BES at different Cr(VI) concentrations (10, 20, 50, and 50 mg/L) with different COD concentrations (500, 1000, 1500, and 2000 mg/L). Sodium acetate was used as the carbon source, whereas Cr(VI)-contaminated synthetic wastewater was prepared and added to the cathode chamber. Initially, the operating conditions for the BES experiments were optimized. An optimum cathode pH of 2 and an optimum HRT of 72 h were obtained. During the study, a cathode pH of 2 ± 0.1 showed a maximum chromium removal efficiency (CRE) of 88.36 ± 8.16%, compared to the other pH values (1-7) tested in the cathode chamber. The maximum CRE obtained was 85.93 ± 9.62% at 40°C within the temperature range of 25°C to 45°C. Conducting the BES experiments at the optimized operating conditions, CREs of 90.2%, 93.7%, 83.75%, and 74.6% were obtained at cathodic Cr concentrations of 10, 20, 50, and 50 mg/L, respectively. BES is a sustainable, energy-efficient technology that can be suitably used for metal removal from industrial wastewater.
Keywords: bioelectrochemical system, metal removal, microorganisms, pH and temperature, substrate
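The removal-efficiency figures above follow the standard definition, the relative drop from the initial to the final Cr(VI) concentration. A one-line sketch (with illustrative concentrations, not the study's raw data):

```python
def chromium_removal_efficiency(c_initial, c_final):
    """Chromium removal efficiency (CRE, %) from initial and final
    Cr(VI) concentrations in mg/L: CRE = (C0 - Ct) / C0 * 100."""
    return (c_initial - c_final) / c_initial * 100.0
```

For example, reducing 50 mg/L of Cr(VI) to 5 mg/L corresponds to a CRE of 90%, the same order as the values reported for the optimized conditions.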
Procedia PDF Downloads 135
34629 Field Tests and Numerical Simulation of Tunis Soft Soil Improvement Using Prefabricated Vertical Drains
Authors: Marwa Ben Khalifa, Zeineb Ben Salem, Wissem Frikha
Abstract:
This paper presents a case study of the "Radès la Goulette" bridge project using the technique of prefabricated vertical drains (PVDs) associated with step-by-step construction of preloading embankments with an average height of about 6 m. These embankments are founded on a highly compressible layer of Tunis soft soil. The construction steps included extensive soil instrumentation, such as piezometers and settlement plates, for monitoring the dissipation of excess pore water pressures and the settlement during consolidation of the Tunis soft soil. An axisymmetric numerical model using the 2D finite difference code FLAC was developed and calibrated against laboratory tests to predict the soil behavior and consolidation settlements. The impact of the constitutive model on simulating the soft soil behavior is investigated. The results show that the numerical analysis provided satisfactory predictions of the field performance during the construction of the Radès la Goulette embankment. The obtained results show the effectiveness of PVDs in accelerating consolidation. A comparison of the numerical results with a theoretical analysis is presented.
Keywords: Tunis soft soil, Radès bridge project, prefabricated vertical drains, FLAC, acceleration of consolidation
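The acceleration of consolidation by vertical drains is classically described by the Barron/Hansbo ideal-drain solution for radial drainage, which a theoretical comparison of this kind typically draws on. The sketch below uses that textbook formula with illustrative parameter values, not the project's measured soil data, and ignores smear and well resistance:

```python
import math

def radial_consolidation_degree(c_h, t, d_e, d_w):
    """Average degree of consolidation around an ideal vertical drain
    (Barron/Hansbo): U_h = 1 - exp(-8 * T_h / F(n)),
    with T_h = c_h * t / d_e^2 and the simplified F(n) = ln(n) - 3/4.
    c_h: horizontal consolidation coefficient (m^2/s), t: time (s),
    d_e: equivalent influence diameter (m), d_w: drain diameter (m)."""
    n = d_e / d_w                      # drain spacing ratio
    f_n = math.log(n) - 0.75           # simplified spacing factor
    t_h = c_h * t / d_e ** 2           # radial time factor
    return 1.0 - math.exp(-8.0 * t_h / f_n)
```

Evaluating this over increasing times shows the monotonic rise of the consolidation degree toward 1, the behaviour the field settlement plates are used to verify.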
Procedia PDF Downloads 123
34628 Identification of ω-3 Fatty Acids Using GC-MS Analysis in Extruded Spelt Product
Authors: Jelena Filipovic, Marija Bodroza-Solarov, Milenko Kosutic, Nebojsa Novkovic, Vladimir Filipovic, Vesna Vucurovic
Abstract:
Spelt wheat is a suitable raw material for extruded products such as pasta, special types of bread, and other products with altered nutritional characteristics compared to conventional wheat products. During extrusion, spelt is exposed to high temperature and high pressure, while the raw material is also mechanically treated by shear forces. Spelt wheat is grown without the use of pesticides in harsh ecological conditions and in marginal areas of cultivation, so it can be used for organic and health-safe food. Pasta is a highly popular foodstuff, and its consumption has been observed to rise. Pasta quality depends mainly on the properties of the flour raw materials, especially protein content and quality; starch properties are of lesser importance. Pasta is characterized by significant amounts of complex carbohydrates, low sodium and total fat, fiber, minerals, and essential fatty acids, and its nutritional value can be improved with additional functional components. Over the past few decades, wheat pasta has been successfully formulated using different ingredients to cater to health-conscious consumers who prefer a product rich in protein, healthy lipids, and other health benefits. Flaxseed flour is used in the production of bakery and pasta products that have the properties of functional foods. However, it should be ensured that food products retain their technological and sensory quality despite the added flaxseed. Flaxseed contains important substances such as vitamins and mineral elements, and it is also an excellent source of fiber and one of the best sources of ω-3 fatty acids and lignans. In this paper, the quality of an extruded spelt product with added flaxseed, which contributes positively to the nutritive and technological changes of the product, is investigated.
ω-3 fatty acids are polyunsaturated essential fatty acids and must be taken with food to satisfy the recommended daily intake. Flaxseed flour was added to farina in quantities of 10 g/100 g and 20 g/100 g of sample. It is shown that the presence of ω-3 fatty acids in pasta can be clearly distinguished from other fatty acids by gas chromatography with mass spectrometry. The addition of flaxseed flour influences the chemical content of the pasta. Adding flaxseed flour to spelt pasta at 20 g/100 g significantly increases the share of ω-3 fatty acids, which results in an improved ω-6/ω-3 ratio of 1:2.4 and completely satisfies the minimum daily needs of ω-3 essential fatty acids (3.8 g/100 g) recommended by the FDA. Flaxseed flour influenced the pasta quality by increasing hardness (2377.8 ± 13.3; 2874.5 ± 7.4; 3076.3 ± 5.9) and work of shear (102.6 ± 11.4; 150.8 ± 11.3; 165.0 ± 18.9) and decreasing adhesiveness (11.8 ± 20.6; 9.98 ± 0.12; 7.1 ± 12.5) of the final product. The presented data are good indicators of the technological quality of spelt pasta with flaxseed, and GC-MS analysis can be used in quality control for flaxseed identification. Acknowledgment: The research was financed by the Ministry of Education and Science of the Republic of Serbia (Project No. III 46005).
Keywords: GC-MS analysis, ω-3 fatty acids, flax seed, spelt wheat, daily needs
Procedia PDF Downloads 162
34627 Using the Technology Acceptance Model to Examine Seniors' Attitudes toward Facebook
Authors: Chien-Jen Liu, Shu Ching Yang
Abstract:
Using the technology acceptance model (TAM), this study examined the external variable of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, with a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.
Keywords: technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness
Procedia PDF Downloads 346
34626 Acceleration of Lagrangian and Eulerian Flow Solvers via Graphics Processing Units
Authors: Pooya Niksiar, Ali Ashrafizadeh, Mehrzad Shams, Amir Hossein Madani
Abstract:
There are many computationally demanding applications in science and engineering that need efficient algorithms implemented on high-performance computers. Recently, Graphics Processing Units (GPUs) have drawn much attention compared to traditional CPU-based hardware and have opened up new improvement venues in scientific computing. One particular application area is Computational Fluid Dynamics (CFD), in which mature CPU-based codes need to be converted to GPU-based algorithms to take advantage of this new technology. In this paper, numerical solutions of two classes of discrete fluid flow models on both CPU and GPU are discussed and compared. Test problems include an Eulerian model of a two-dimensional incompressible laminar flow case and a Lagrangian model of a two-phase flow field. The CUDA programming standard is used to employ an NVIDIA GPU with 480 cores, and a C++ serial code is run on a single core of an Intel quad-core CPU. Up to two orders of magnitude speed-up is observed on the GPU for a certain range of grid resolutions or particle numbers. As expected, the Lagrangian formulation is better suited for parallel computation on the GPU, although the Eulerian formulation exhibits a significant speed-up too.
Keywords: CFD, Eulerian formulation, graphics processing units, Lagrangian formulation
Procedia PDF Downloads 417
34625 Investor Sentiment and Satisfaction in Automated Investment: A Sentimental Analysis of Robo-Advisor Platforms
Authors: Vertika Goswami, Gargi Sharma
Abstract:
The rapid evolution of fintech has led to the rise of robo-advisor platforms that utilize artificial intelligence (AI) and machine learning to offer personalized investment solutions efficiently and cost-effectively. This research paper conducts a comprehensive sentiment analysis of investor experiences with these platforms, employing natural language processing (NLP) and sentiment classification techniques. The study investigates investor perceptions, engagement, and satisfaction, identifying key drivers of positive sentiment such as clear communication, low fees, consistent returns, and robust security. Conversely, negative sentiment is linked to issues like inconsistent performance, hidden fees, poor customer support, and a lack of transparency. The analysis reveals that addressing these pain points—through improved transparency, enhanced customer service, and ongoing technological advancements—can significantly boost investor trust and satisfaction. This paper contributes valuable insights into the fields of behavioral finance and fintech innovation, offering actionable recommendations for stakeholders, practitioners, and policymakers. Future research should explore the long-term impact of these factors on investor loyalty, the role of emerging technologies, and the effects of ethical investment choices and regulatory compliance on investor sentiment.
Keywords: artificial intelligence in finance, automated investment, financial technology, investor satisfaction, investor sentiment, robo-advisors, sentimental analysis
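A toy version of the sentiment-classification step might look like the following lexicon-based sketch. The two word lists are tiny illustrative assumptions keyed to the drivers named above; the study itself would rely on a trained NLP classifier rather than hand-picked lexicons:

```python
# Illustrative polarity lexicons echoing the drivers identified in the study.
POSITIVE = {"clear", "low", "consistent", "robust", "secure", "transparent", "helpful"}
NEGATIVE = {"hidden", "inconsistent", "poor", "opaque", "slow", "unreliable"}

def sentiment_label(review):
    """Label a review positive/negative/neutral by counting lexicon hits.
    A deliberately minimal stand-in for a trained sentiment classifier."""
    tokens = review.lower().replace(",", " ").split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Aggregating such labels over many platform reviews is what yields the per-driver sentiment breakdown the paper reports.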
Procedia PDF Downloads 18
34624 Statistical Comparison of Machine and Manual Translation: A Corpus-Based Study of Gone with the Wind
Authors: Yanmeng Liu
Abstract:
This article analyzes and compares the linguistic differences between machine translation and manual translation through a case study of the book Gone with the Wind. As an important carrier of human feeling and thinking, literary translation poses a huge difficulty for machine translation and is expected to expose distinct translation features compared with manual translation. In order to display linguistic features objectively, tentative uses of computerized and statistical evidence for the systematic investigation of large-scale translation corpora, using quantitative methods, have been deployed. This study compiles a bilingual corpus with four Chinese translations of Gone with the Wind: Piao by Chunhai Fan, Piao by Huairen Huang, and translations by Google Translate and Baidu Translate. After processing the corpus with software such as Stanford Segmenter, Stanford POS Tagger, and AntConc, the study analyzes the linguistic data and answers the following questions: 1. How does machine translation differ from manual translation linguistically? 2. Why do these deviations happen? This paper combines translation studies with corpus linguistics and concretizes divergent linguistic dimensions in translated-text analysis in order to present linguistic deviations in manual and machine translation. Consequently, this study provides a more accurate and fine-grained understanding of machine translation products, and it proposes several suggestions for future machine translation development.
Keywords: corpus-based analysis, linguistic deviances, machine translation, statistical evidence
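Two of the simplest corpus statistics used to contrast translations, lexical variety (type-token ratio) and mean sentence length, can be computed as in this sketch. It is illustrative only; the study itself relies on tools such as Stanford Segmenter and AntConc, and Chinese text would need word segmentation first:

```python
import re

def corpus_profile(text):
    """Type-token ratio and mean sentence length for a (tokenized) text,
    two indicators often compared between machine and human translations."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    tokens = re.findall(r"\w+", text.lower())
    return {
        "type_token_ratio": len(set(tokens)) / len(tokens),
        "mean_sentence_len": len(tokens) / len(sentences),
    }
```

Profiling each of the four translations this way would expose, for instance, whether the machine versions use a narrower vocabulary or more uniform sentence lengths than the human ones.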
Procedia PDF Downloads 145
34623 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —
Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno
Abstract:
STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induct if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for future development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducting true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with attribute values contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
Keywords: rule induction, decision table, missing data, noise
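The core idea of testing whether a candidate rule's accuracy deviates significantly from chance can be approximated with a one-proportion z-test, as sketched below. This is a simplified stand-in under stated assumptions, not STRIM's actual test statistic, and the counts are illustrative:

```python
from math import sqrt
from statistics import NormalDist

def rule_z_test(successes, trials, p0):
    """One-sided one-proportion z-test: is the observed rule accuracy
    (successes/trials) significantly above the chance level p0?
    Returns the z statistic and its p-value under a normal approximation."""
    p_hat = successes / trials
    z = (p_hat - p0) / sqrt(p0 * (1 - p0) / trials)
    p_value = 1.0 - NormalDist().cdf(z)
    return z, p_value
```

A rule matching 80 of 100 sampled objects against a chance level of 0.5 yields z = 6 and a vanishing p-value, so it would be retained; the required dataset size enters through the trials term in the denominator.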
Procedia PDF Downloads 396
34622 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery
Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong
Abstract:
The machine learning techniques based on convolutional neural networks (CNNs) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation, and object recognition. Classical visual information processing, ranging from low-level tasks to high-level ones, has been widely developed in the deep learning framework. It is generally considered a challenging problem to derive visual interpretation from high-dimensional imagery data. A CNN is a class of feed-forward artificial neural network that usually consists of deep layers whose connections are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation invariance characteristics. However, it is often computationally intractable to optimize the network, in particular with a large number of convolution layers, due to the large number of unknowns to be optimized with respect to a training set that generally needs to be large enough to effectively generalize the model under consideration. It is also necessary to limit the size of the convolution kernels due to the computational expense, despite the recent development of effective parallel processing machinery, which leads to the use of a constantly small convolution kernel size throughout the deep CNN architecture. However, it is often desirable to consider different scales in the analysis of visual features at different layers in the network. Thus, we propose a CNN model where different sizes of convolution kernels are applied at each layer based on random projection. We apply random filters of varying sizes and associate the filter responses with scalar weights that correspond to the standard deviation of the random filters. This allows the use of a large number of random filters at the cost of one scalar unknown for each filter.
The computational cost of the back-propagation procedure does not increase with larger filter sizes, even though additional computation is required for the convolutions in the feed-forward procedure. The use of random kernels with varying sizes allows image features to be analyzed effectively at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments in which well-known CNN architectures are quantitatively compared with our models, which simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with a smaller number of unknown weights. The proposed algorithm has high potential for a variety of visual tasks based on the CNN framework. Acknowledgement: This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by IITP, and NRF-2014R1A2A1A11051941, NRF-2017R1A2B4006023.
Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition
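The idea of fixed random kernels of several sizes combined through trainable scalar weights can be sketched in NumPy as follows. This is a naive direct convolution for illustration only; the kernel sizes, normalization, and image are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_filter_bank(sizes):
    """One fixed random kernel per requested (odd) size; only the scalar
    weight attached to each kernel would be trained."""
    return [rng.standard_normal((s, s)) / s for s in sizes]

def multi_scale_response(image, filters, weights):
    """Weighted sum of 'same'-padded correlation responses at several scales.
    Linearity in the weights is what makes the per-filter scalar cheap to train."""
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    for f, alpha in zip(filters, weights):
        s = f.shape[0]
        pad = s // 2
        padded = np.pad(image, pad)
        resp = np.zeros_like(out)
        for i in range(h):
            for j in range(w):
                resp[i, j] = np.sum(padded[i:i + s, j:j + s] * f)
        out += alpha * resp
    return out
```

Because the kernels stay fixed, the gradient with respect to each scalar weight is just the corresponding filter response, so back-propagation cost is independent of the filter sizes, matching the argument above.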
Procedia PDF Downloads 290
34621 The Effectiveness of Non-surgical Treatment for Androgenetic Alopecia in Men: A Systematic Review and Meta-Analysis
Authors: Monica Trifitriana, Rido Mulawarman
Abstract:
Introduction: Androgenetic alopecia (AGA) is a genetically predetermined disorder due to an excessive response to dihydrotestosterone (DHT). Currently, non-surgical treatment of androgenetic alopecia is in greater demand by patients. There are many non-surgical treatments, ranging from topical treatments and oral medications to procedural treatments. Objective: We aim to assess the latest evidence on the efficacy of non-surgical treatments of androgenetic alopecia in men, in comparison to placebo, for improving hair density, thickness, and growth. Method: We performed a comprehensive search of studies assessing non-surgical treatments of androgenetic alopecia in men from inception up until November 2021. Result: There were 24 studies with a total of 2438 patients, divided into five non-surgical treatment groups to assess effectiveness for hair growth, namely: minoxidil 2% (MD: 8.11 hairs/cm²), minoxidil 5% (MD: 12.02 hairs/cm²), low-level laser light therapy/LLLT (MD: 12.35 hairs/cm²), finasteride 1 mg (MD: 20.43 hairs/cm²), and Platelet-Rich Plasma/PRP with microneedling (MD: 26.33 hairs/cm²). All treatments had significant results for increasing hair growth, particularly in cases of androgenetic alopecia in men (P<0.00001). Conclusion: The five non-surgical treatment groups proved effective and significant for hair growth in cases of androgenetic alopecia in men. In order of effectiveness for hair growth, the treatments rank from PRP with microneedling, to finasteride 1 mg, LLLT, minoxidil 5%, and minoxidil 2%.
Keywords: androgenetic alopecia, non-surgical, men, meta-analysis, systematic review
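The per-group mean differences above are typically pooled across studies with an inverse-variance model; a minimal fixed-effect sketch is shown below. The mean differences and variances in the example are placeholders, since the abstract reports only the pooled MDs:

```python
def pooled_mean_difference(mds, variances):
    """Fixed-effect inverse-variance pooling of per-study mean differences
    (hairs/cm^2). Returns the pooled MD and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, se
```

With equal variances the pooled estimate reduces to the plain average of the study-level MDs, and the standard error shrinks as more studies (larger total weight) are added.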
Procedia PDF Downloads 160
34620 Euthanasia with Reference to Defective Newborns: An Analysis
Authors: Nibedita Priyadarsini
Abstract:
It is said that ethics has a wide range of applications, dealing mainly with human life and human behavior. All ethical decisions are ultimately concerned with life and death, and both life and death must be considered dignified. Medical ethics, with its many topics, mostly deals with concepts of life and death, among which euthanasia is one. Debates over euthanasia have continued for a long time. The question of putting an end to someone's life has aroused controversy in the legal sphere as well as the moral sphere; to permit or not to permit has remained an enigma the world over. Modern medicine is at the stage of transcending limits that cannot be set aside. The morality of allowing people to die without treatment has become more important as methods of treatment have become more sophisticated. Allowing someone to die reflects an essential recognition that there is some point in any terminal illness when further curative treatment has no purpose, and that the patient in such a situation should be allowed to die a natural death in comfort, peace, and dignity, without interference from medical science and technology. But taking a human life is, in a general sense, illogical in itself. It can be said that when we kill someone, we cause the death, whereas if we merely let someone die, then we are not responsible for the death. This point is often made in connection with euthanasia cases and is often debated. Euthanasia in the pediatric age group involves some important issues that differ from those in adults. The main distinction is that infants, newborns, and young children are not able to decide about their future as adults can. In certain cases, where a child is born with serious deformities and no hope of recovery, doctors may decide not to perform surgery to remove a blockage, and let the baby die.
Our aim in this paper is to examine whether it is ethically justified to withhold treatment or to apply euthanasia in the case of a defective infant. What should be done with severely defective infants if it is known from the earliest stage that they are not going to survive at all? The paper deals mostly with the ethics of deciding the relevant concerns in the practice of euthanasia for defective newborns. Some cases involving disabled infants and newborn babies are examined in order to show what can be done in the critical conditions that the patient and family members undergo, and under which conditions some, if not all, of these dilemmas could be resolved. The final choice must be made for the benefit of the patient.
Keywords: ethics, medical ethics, euthanasia, defective newborns
Procedia PDF Downloads 204
34619 Redox-labeled Electrochemical Aptasensor Array for Single-cell Detection
Authors: Shuo Li, Yannick Coffinier, Chann Lagadec, Fabrizio Cleri, Katsuhiko Nishiguchi, Akira Fujiwara, Soo Hyeon Kim, Nicolas Clément
Abstract:
The need for single-cell detection and analysis techniques has increased in recent decades because of the heterogeneity of individual living cells, which adds to the complexity of the pathogenesis of malignant tumors. In the search for early cancer detection and high-precision medicine and therapy, the technologies most used today for sensitive detection of target analytes and monitoring of their variation fall mainly into two types. One is based on the identification of molecular differences at the single-cell level, such as flow cytometry, fluorescence-activated cell sorting, next-generation proteomics, and lipidomic studies; the other is based on capturing or detecting single tumor cells from fresh or fixed primary tumors and metastatic tissues, and rare circulating tumor cells (CTCs) from blood or bone marrow, for example, the dielectrophoresis technique, microfluidic micropost-based chips, and electrochemical (EC) approaches. Compared to other methods, EC sensors have the merits of easy operation, high sensitivity, and portability. However, despite various demonstrations of low limits of detection (LOD), including with aptamer sensors, arrayed EC sensors for detecting single cells have not been demonstrated. In this work, a new technique is presented, based on a 20-nm-thick nanopillar array that supports cells and keeps them at the ideal recognition distance from redox-labeled aptamers grafted on the surface. The key advantages of this technology are not only to suppress the false-positive signal arising from the pressure that all (including non-target) cells exert on the aptamers by downward force, but also to stabilize the aptamer in the ideal hairpin configuration thanks to a confinement effect. With the first implementation of this technique, an LOD of 13 cells (with 5.4 μL of cell suspension) was estimated.
This nanosupported cell technology using redox-labeled aptasensors has since been pushed forward and fully integrated into a single-cell electrochemical aptasensor array. To reach this goal, the LOD was reduced by more than one order of magnitude by suppressing parasitic capacitive electrochemical signals, minimizing the sensor area, and localizing the cells. Statistical analysis at the single-cell level is demonstrated for the recognition of cancer cells. The future of this technology is discussed, and the potential for scaling over millions of electrodes, thus pushing integration further to the sub-cellular level, is highlighted. Despite several demonstrations of electrochemical devices with an LOD of 1 cell/mL, the implementation of single-cell bioelectrochemical sensor arrays has remained elusive due to their challenging implementation at large scale. Here, the introduced nanopillar array technology combined with redox-labeled aptamers targeting the epithelial cell adhesion molecule (EpCAM) is well suited for such implementation. By combining nanopillar arrays with microwells designed for single-cell trapping directly on the sensor surface, single target cells are successfully detected and analyzed. This first implementation of a single-cell electrochemical aptasensor array based on Brownian-fluctuating redox species opens new opportunities for large-scale implementation and statistical analysis of early cancer diagnosis and cancer therapy in clinical settings.
Keywords: bioelectrochemistry, aptasensors, single-cell, nanopillars
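The 13-cell LOD figure invites a note on how such limits are commonly estimated. The abstract does not give the calibration details; as a hedged illustration only (not the authors' actual procedure), the widely used 3σ criterion for an electrochemical sensor can be sketched as follows, with invented noise and sensitivity values:

```python
# Illustrative only: the 3-sigma limit-of-detection rule commonly applied to
# sensor calibrations. The numbers are invented for the example, not taken
# from the paper.

def limit_of_detection(sigma_blank, slope, k=3.0):
    """LOD = k * sigma_blank / slope, with k = 3 as the usual criterion."""
    return k * sigma_blank / slope

# Hypothetical calibration: 0.5 nA noise on the blank signal and a
# sensitivity of 0.12 nA per cell.
lod_cells = limit_of_detection(sigma_blank=0.5, slope=0.12)
print(round(lod_cells, 1))  # 12.5 cells, the same order as the reported 13
```

The same rule explains why shrinking the sensor area helps: reducing parasitic capacitive background lowers sigma_blank, which lowers the LOD proportionally.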
Procedia PDF Downloads 117
34618 Aiming at Optimization of Tracking Technology through Seasonally Tilted Sun Trackers: An Indian Perspective
Authors: Sanjoy Mukherjee
Abstract:
Discussions of Single Axis Tracker (SAT) concepts are becoming more and more apt for developing countries like India, not just as an advancement in racking technology but because of the utmost necessity of reaching the lowest Levelized Cost of Energy (LCOE) targets. With increasing competition and a significant fall in the feed-in tariffs of solar PV projects, developers are under constant pressure to secure investment for their projects and eventually earn profits from them. Moreover, as the second most populous country, India suffers from a scarcity of land because of its high average population density. To mitigate the risk of this double-edged sword, with the declining unit (kWh) cost on one side and land utilization on the other, tracking has emerged as the call of the hour. Therefore, the prime objectives of this paper are not only to showcase how Seasonally Tilted Tracker (STT) technology proves to be an effective mechanism for gaining more Global Incidence in the collector plane (Ginc) with respect to traditional mounting systems, but also to introduce STT as a possible option for high-latitude locations.
Keywords: tracking system, grid connected solar PV plant, CAPEX reduction, levelized cost of energy
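Since the abstract's argument rests on LCOE, a brief sketch of the standard definition may help: LCOE is the ratio of discounted lifetime costs to discounted lifetime energy yield. The figures below are invented for illustration, not the paper's data; they merely show how a tracker's extra capex can still lower LCOE when the yield gain is large enough.

```python
# Illustrative only: simplified levelized cost of energy (LCOE).
# LCOE = (discounted lifetime costs) / (discounted lifetime energy).
# All plant figures below are hypothetical, not taken from the paper.

def lcoe(capex, opex_per_year, energy_per_year, discount_rate, years):
    costs = capex + sum(opex_per_year / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    energy = sum(energy_per_year / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return costs / energy  # cost per kWh

# Hypothetical 1 MW plant: the tracker adds ~8% capex but ~15% yield.
fixed = lcoe(capex=700_000, opex_per_year=10_000,
             energy_per_year=1_600_000, discount_rate=0.08, years=25)
tracked = lcoe(capex=756_000, opex_per_year=12_000,
               energy_per_year=1_840_000, discount_rate=0.08, years=25)
print(fixed > tracked)  # True: the yield gain outweighs the added cost
```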
Procedia PDF Downloads 257
34617 Factors Affecting Students' Attitude to Adapt E-Learning: A Case from Iran How to Develop Virtual Universities in Iran: Using Technology Acceptance Model
Authors: Fatemeh Keivanifard
Abstract:
E-learning is becoming increasingly prominent in higher education, with universities increasing provision and more students signing up. This paper examines factors that predict students' attitudes to adopting e-learning in Khuzestan province, Iran. Understanding the nature of these factors may assist universities in promoting the use of information and communication technology in teaching and learning. The main focus of the paper is on university students, whose decisions support the effective implementation of e-learning. Data were collected through a survey of 300 postgraduate students at the universities of Dezful, Shooshtar, and Chamran in Khuzestan. The Technology Acceptance Model put forward by Davis is utilized in this study, with two more independent variables added to the original model, namely, the pressure to act and resource availability. The results show that there are five factors that can be used in modeling students' attitudes to adopting e-learning: intention toward e-learning, perceived usefulness of e-learning, perceived ease of e-learning use, pressure to use e-learning, and the availability of resources needed to use e-learning.
Keywords: e-learning, intention, ease of use, pressure to use, usefulness
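Extended-TAM studies of this kind typically test the factor structure with a multiple regression of intention on the model's constructs. As a hedged sketch of that analysis step (with synthetic data, since the study's survey data is not available), one might fit ordinary least squares like this:

```python
# Illustrative only: OLS regression of the sort used to test an extended TAM,
# predicting intention from usefulness, ease of use, pressure, and resources.
# The data is randomly generated, not the study's survey responses.
import numpy as np

rng = np.random.default_rng(0)
n = 300  # the study surveyed 300 postgraduate students

# Hypothetical standardized predictor scores for the four factors.
X = rng.normal(size=(n, 4))
beta_true = np.array([0.5, 0.3, 0.2, 0.1])
y = X @ beta_true + rng.normal(scale=0.5, size=n)  # intention to use

# Ordinary least squares with an intercept column.
X1 = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(np.allclose(beta_hat[1:], beta_true, atol=0.15))  # True
```

In practice, the estimated coefficients and their significance tests indicate which factors carry weight in the attitude model.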
Procedia PDF Downloads 368
34616 Software Quality Assurance in 5G Technology-Redefining Wireless Communication: A Comprehensive Survey
Authors: Sumbal Riaz, Sardar-un-Nisa, Mehreen Sirshar
Abstract:
5G, the fifth generation of mobile phone and data communication standards, is the next edge of innovation for the whole mobile industry. 5G is envisioned as a real wireless world system, providing wireless communication all over the world without limitations. 5G builds on many 4G technologies and is expected to hit the market in 2020. This research is a comprehensive survey of the quality parameters of 5G technology. 5G promises high performance, interoperability, easy roaming, fully converged services, a friendly interface, and scalability at low cost. To meet future traffic demands, fifth-generation wireless communication systems will include: i) higher densification of heterogeneous networks with massive deployment of small base stations supporting various Radio Access Technologies (RATs), ii) use of massive Multiple Input Multiple Output (MIMO) arrays, iii) use of the millimetre wave spectrum, where wider frequency bands are available, iv) direct device-to-device (D2D) communication, v) simultaneous transmission and reception, and vi) cognitive radio technology.
Keywords: 5G, 5th generation, innovation, standard, wireless communication
Procedia PDF Downloads 444
34615 The Role of Environmental Analysis in Managing Knowledge in Small and Medium Sized Enterprises
Authors: Liu Yao, B. T. Wan Maseri, Wan Mohd, B. T. Nurul Izzah, Mohd Shah, Wei Wei
Abstract:
Effectively managing knowledge has become a vital weapon for businesses to survive or succeed in an increasingly competitive market. But do they perform environmental analysis when managing knowledge? If so, at what level and with what significance? This paper establishes a conceptual framework covering the basic knowledge management activities (KMA) to examine their contribution to organizational performance (OP). Environmental analysis (EA) was then investigated from both internal and external aspects to identify its effects on that contribution. Data were collected from 400 Chinese SMEs by questionnaire, and Cronbach's α and factor analysis were conducted. Regression results show that external analysis presents a higher level than internal analysis. However, internal analysis mediates the effect of external analysis on the KMA-OP relation and plays a more significant role in that relation than external analysis. Thus, firms should improve environmental analysis, especially internal analysis, to enhance their KM practices.
Keywords: knowledge management, environmental analysis, performance, mediating, small sized enterprises, medium sized enterprises
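The abstract reports Cronbach's α as its reliability check. For readers unfamiliar with the statistic, a minimal sketch of its computation follows; the questionnaire items and scores are made up for illustration, not drawn from the study:

```python
# Illustrative only: Cronbach's alpha, the internal-consistency statistic
# the abstract reports, computed for a small made-up set of survey items.

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical 5-point Likert items answered by five respondents.
scores = [[4, 5, 3, 4, 2], [4, 4, 3, 5, 2], [5, 5, 2, 4, 3]]
print(round(cronbach_alpha(scores), 2))  # 0.89, above the usual 0.7 threshold
```

Values above roughly 0.7 are conventionally taken to indicate that the items measure the same underlying construct, which is the precondition for the factor and regression analyses the paper runs.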
Procedia PDF Downloads 615
34614 Analysis of the Omnichannel Delivery Network with Application to Last Mile Delivery
Authors: Colette Malyack, Pius Egbelu
Abstract:
Business-to-Customer (B2C) delivery options have improved to meet increased demand in recent years. The change in end users has forced logistics networks to focus on customer service and sentiment that would previously have been the priority of the company or organization of origin. This has led to increased pressure on logistics companies to extend traditional B2B networks into a B2C solution while accommodating additional costs, roadblocks, and customer sentiment; the result has been the creation of the omnichannel delivery network, encompassing a number of traditional and modern methods of package delivery. In this paper, the many solutions within the omnichannel delivery network are defined and discussed. This analysis shows that the omnichannel delivery network can be applied to reduce the complexity of package delivery and provide customers with more options. Applied correctly, the result is a reduction in cost to the logistics company over time, even with an initial increase in the cost of obtaining the technology.
Keywords: network planning, last mile delivery, omnichannel delivery network, omnichannel logistics
Procedia PDF Downloads 150
34613 Advanced Analytical Competency Is Necessary for Strategic Leadership to Achieve High-Quality Decision-Making
Authors: Amal Mohammed Alqahatni
Abstract:
This paper is a non-empirical analysis of the existing literature on digital leadership competency, data-driven organizations, and dealing with AI technology (big data). It provides insights into the importance of developing a leader's analytical skills and style to support high-quality decision-making in a data-driven organization and to achieve creativity during the organization's digital transformation. Despite the enormous potential of big data, there are not enough experts in the field, and many organizations face issues with leadership style that are considered obstacles to organizational improvement. The paper investigates the obstacles related to leadership style in this context and the challenges leaders face in coaching and development. Leaders' lack of analytical skill with AI technology, such as big data tools, was noted, as was a lack of understanding of the value of that data, resulting in poor communication with others, especially in meetings where decisions must be made. By acknowledging the different dynamics of work competency and of organizational structure and culture, organizations can make the necessary adjustments to best support their leaders. The paper reviews prior research studies and applies what is known to current obstacles, addressing how analytical leadership can help overcome challenges in a data-driven organization's work environment.
Keywords: digital leadership, big data, leadership style, digital leadership challenge
Procedia PDF Downloads 69
34612 A Method for Allocation of Smart Intersections Using Traffic Information
Authors: Sang-Tae Ji, Jeong-Woo Park, Jun-Ho Park, Kwang-Woo Nam
Abstract:
This study aims to suggest the basic factors to consider when prioritizing intersections in the diffusion of the smart intersection project. Busan Metropolitan City is conducting a smart intersection project for efficient traffic management. The project aims to achieve a breakthrough improvement in intersection congestion by optimizing the signal system using CCTV (closed-circuit television camera) image analysis technology. This study reviewed trends in existing research and analyzed three factors for selecting new intersections when expanding smart intersections: traffic volume, the road characteristics of the intersection, and whether it lies on a main arterial road. Using these factors, together with an analysis of the present situation, the priority of newly installed intersections is presented for Busan Metropolitan City, the main site of the smart intersection expansion project. The results of this study can be used as a consideration in the implementation of smart intersection projects.
Keywords: CCTV, GIS, ICT, Smart City, smart intersection
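The abstract names three selection factors but not how they are combined. One common way to turn such factors into a ranking is a weighted score; the sketch below is a hypothetical illustration of that idea (the weights, normalization, and candidate data are invented, not the study's method):

```python
# Illustrative only: a weighted priority score over the three factor types the
# study names (traffic volume, road characteristics, arterial status).
# Weights and candidate data are invented for the example.

def priority_score(traffic_volume, road_score, on_arterial,
                   weights=(0.5, 0.3, 0.2)):
    """Combine factors, each normalized to [0, 1], into one priority score."""
    w_t, w_r, w_a = weights
    return (w_t * traffic_volume + w_r * road_score
            + w_a * (1.0 if on_arterial else 0.0))

# Hypothetical candidate intersections: (traffic, road score, on arterial?).
candidates = {
    "A": (0.9, 0.6, True),
    "B": (0.7, 0.8, False),
    "C": (0.5, 0.9, True),
}
ranked = sorted(candidates, key=lambda k: priority_score(*candidates[k]),
                reverse=True)
print(ranked)  # ['A', 'C', 'B']
```

The actual weighting would need to be calibrated against the city's traffic data and policy priorities.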
Procedia PDF Downloads 386
34611 On the Effectiveness of Educational Technology on the Promotion of Exceptional Children or Children with Special Needs
Authors: Nasrin Badrkhani
Abstract:
The increasing use of educational technologies has created a tremendous transformation in all fields and, most importantly, in education and learning. In recent decades, traditional learning approaches have undergone fundamental changes with the emergence of new learning technologies. Research shows that suitable educational tools play an effective role in the transmission, comprehension, and impact of educational concepts. These tools provide a tangible basis for thinking and constructing concepts, resulting in increased interest in learning. They give students real experiences and convey educational meanings and concepts more quickly and clearly. It can be said that educational technology, as an active and modern teaching method, with capabilities such as engaging multiple senses in the educational process and involving the learner, makes the learning environment more flexible; it effectively develops the skills of children with special needs by addressing their specific requirements. Teachers are no longer the sole source of information, and students are not mere recipients of it: students are the main actors in education and learning. Education is one of the basic rights of every human being, and children with special needs face unique challenges and obstacles in education that can negatively affect their abilities and learning. One way to address these challenges is to use educational technologies for more diverse and effective learning. The use of educational technology for students with special needs has increasingly proven effective in boosting their self-confidence, helping them overcome learning challenges, and enhancing their learning outcomes.
Keywords: communication technology, students with special needs, self-confidence, raising the expectations and progress
Procedia PDF Downloads 14
34610 Students and Teachers Perceptions about Interactive Learning in Teaching Health Promotion Course: Implication for Nursing Education and Practice
Authors: Ahlam Alnatour
Abstract:
Background: To our knowledge, there is a lack of studies that describe the experience of studying health promotion courses using an interactive approach and that compare students' and teachers' perceptions of this method of teaching. The purpose of this study is to compare student and teacher experiences and perspectives on learning a health promotion course through interactive learning. Design: A descriptive qualitative design was used to provide an in-depth description and understanding of students' and teachers' experiences and perceptions of learning health promotion courses interactively. Study Participants: Fourteen students (seven male, seven female) and eight teachers at a governmental university in northern Jordan participated in this study. Data Analysis: A conventional content analysis approach was applied to participants' transcripts to gain an in-depth description of both students' and teachers' experiences. Results: The main themes that emerged from the data analysis describe the students' and teachers' perceptions of the interactive health promotion class: teachers' and students' positive experiences of adopting interactive learning, the advantages and benefits of interactive teaching, barriers to interactive teaching, and suggestions for improvement. Conclusion: Both teachers and students reflected positive attitudes toward interactive learning. Interactive learning helped students engage in the learning process physically and cognitively; it enhanced the learning process, promoted student attention, improved final performance, and satisfied teachers and students accordingly. The interactive learning approach should be adopted in teaching graduate and undergraduate courses using updated and contemporary strategies, and nursing scholars and educators should be motivated to integrate interactive learning into teaching different nursing courses.
Keywords: interactive learning, nursing, health promotion, qualitative study
Procedia PDF Downloads 250
34609 Agricultural Knowledge Management System Design, Use, and Consequence for Knowledge Sharing and Integration
Authors: Dejen Alemu, Murray E. Jennex, Temtim Assefa
Abstract:
This paper investigates the design, use, and consequences of a Knowledge Management System (KMS) for knowledge sharing and integration. A KMS for knowledge sharing and integration is designed to meet the challenges raised by knowledge management researchers and practitioners: technical, human, and social factors. An agricultural KMS involves various members coming from different Communities of Practice (CoPs), who possess their own knowledge of multiple practices that needs to be combined in system development. However, current development of the technology has ignored the indigenous knowledge of local communities, which is the key success factor for agriculture. This research employed a multi-methodological approach to KMS research from an action research perspective, consisting of four strategies: theory building, experimentation, observation, and system development. Using the KMS development practice of the Ethiopian Agricultural Transformation Agency as a case study, this research employed an interpretive analysis using primary qualitative data acquired through in-depth semi-structured interviews and participant observations. Orlikowski's structuration model of technology was used to understand the design, use, and consequences of the KMS. As a result, the research identified three basic components for the architecture of the shared KMS, namely, the people, the resources, and the implementation subsystems. The KMS was developed using Web 2.0 tools to promote knowledge sharing and integration among diverse groups of users in a distributed environment. The use of a shared KMS allows users to access diverse knowledge from a number of users in different participant groups, enhances the exchange of different forms of knowledge and experience, and creates high interaction and collaboration among participants.
The consequences of a shared KMS for the social system include the elimination of hierarchical structure; enhanced participation, collaboration, and negotiation among users from different CoPs with common interests; knowledge and skill development; the integration of diverse knowledge resources; and the requirement for policy and guidelines. The research contributes methodologically through the application of system development action research to a conceptual framework for KMS development and use. It also makes a theoretical contribution by extending the structuration model of technology to incorporate a variety of knowledge, and it has practical implications for management in developing strategies that exploit the potential of Web 2.0 tools for sharing and integrating indigenous knowledge.
Keywords: communities of practice, indigenous knowledge, participation, structuration model of technology, Web 2.0 tools
Procedia PDF Downloads 253
34608 Preliminary Analysis on Land Use-Land Cover Assessment of Post-Earthquake Geohazard: A Case Study in Kundasang, Sabah
Authors: Nur Afiqah Mohd Kamal, Khamarrul Azahari Razak
Abstract:
The aftermath of an earthquake is a major concern, especially in high-seismicity regions. In Kundasang, Sabah, the earthquake of 5th June 2015 resulted in several catastrophes: landslides, rockfalls, mudflows, and major slope failures, in addition to the series of aftershocks. The consequences of an earthquake generate and induce episodic disasters that are not only life-threatening but also affect infrastructure and economic development. Therefore, investigating the change in land use and land cover (LULC) caused by post-earthquake geohazards is essential for identifying the extent of the disastrous effects on development in Kundasang. With the advancement of remote sensing technology, post-earthquake geohazards (landslides, mudflows, rockfalls, debris flows) can be assessed by employing object-based image analysis to investigate LULC change across settlements, public infrastructure, and vegetation cover. This paper discusses preliminary results on the post-earthquake geohazard distribution in Kundasang and evaluates the LULC changes associated with geohazard occurrences. The results of this preliminary analysis provide an overview of the extent of geohazard impact on LULC. This research also provides beneficial input to the local authority in Kundasang about the risk of future structural development in the geohazard area.
Keywords: geohazard, land use land cover, object-based image analysis, remote sensing
Procedia PDF Downloads 245
34607 How Trust Functions in Fostering Innovation and Technology Development
Authors: Obidimma Ezezika
Abstract:
In light of the increasing importance of trust in development programs, the purpose of this study was to identify how trust functions as a key determinant of successful innovation and technology development programs. Using projects in the agricultural sector as case studies, we determined how the concept of trust is understood. Data collection relied on semi-structured, face-to-face interviews conducted as part of a larger study investigating the role of trust in development programs. Interview transcripts were analyzed to create a narrative on how trust is understood by the study's participants and how it functions in fostering innovation. We identified six themes showing how trust plays an important role in innovation: the practice of integrity and honesty; delivery of results in an accountable manner; capability and competency; sharing of the same objectives and interests; transparency about actions and intentions through clear communication; and the targeting of services toward the interests of the public. The results of this study can provide guidance on how to enhance implementation mechanisms and give organizations impetus to implement trust-building activities that foster effective innovation.
Keywords: trust, research, innovation, technology
Procedia PDF Downloads 482
34606 Dielectric Properties of PANI/h-BN Composites
Authors: Seyfullah Madakbas, Emrah Cakmakci
Abstract:
Polyaniline (PANI), the most studied member of the conductive polymers, has a wide range of uses, from electronic devices to various conductive high-technology applications. Boron nitride (BN) is a boron- and nitrogen-containing compound with superior chemical resistance, thermal resistance, and thermal conductivity. Even though several PANI composites have been prepared in the literature, the preparation of h-BN/PANI composites is rare. In this work, PANI was polymerized in the presence of different amounts of h-BN (1, 3, and 5% with respect to PANI) using a 0.1 M solution of NH4S2O8 in HCl as the oxidizing agent, and conductive composites were prepared. The composites were structurally characterized with FTIR spectroscopy and X-ray diffraction (XRD). Thermal properties of the conductive composites were determined by thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC). Dielectric measurements were performed in the frequency range of 10⁶–10⁸ Hz at room temperature. The bands at around 1593 and 1496 cm⁻¹ in the FTIR spectra of the composites, corresponding to the quinoid and benzenoid rings, proved the formation of polyaniline. Together with the FTIR spectra, the XRD analysis also revealed the existence of interactions between PANI and h-BN. The glass transition temperatures (Tg) of the composites increased with increasing amount of PANI (from 87 to 101). TGA revealed that the char yield of the composites increased as the amount of h-BN in the composites was increased. Finally, the dielectric permittivity of the composite containing 3 wt.% h-BN was measured and found to be approximately 17. This work was supported by the Marmara University Commission of Scientific Research Projects.
Keywords: dielectric permittivity, h-BN, PANI, thermal analysis
Procedia PDF Downloads 279