Search results for: Information and Communications Technology (ICT)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16546

11146 Recovery of Draw Solution in Forward Osmosis by Direct Contact Membrane Distillation

Authors: Su-Thing Ho, Shiao-Shing Chen, Hung-Te Hsu, Saikat Sinha Ray

Abstract:

Forward osmosis (FO) is an emerging technology for direct and indirect potable water reuse applications. However, successful implementation of FO is still hindered by the lack of an efficient draw solution recovery process. Membrane distillation (MD) is a thermal separation process in which a hydrophobic microporous membrane is sandwiched between a warm feed stream and a cold permeate stream. The driving force of MD is the temperature difference across the membrane, which gives rise to a partial vapor pressure difference. In this study, a direct contact membrane distillation (DCMD) system was used to recover the diluted draw solution of FO. Na3PO4 at pH 9 and EDTA-2Na at pH 8 were used as MD feed solutions, since they produce high water flux and minimal salt leakage in the FO process; at high pH, trivalent and tetravalent ions are more readily retained on the draw solution side. The results demonstrated that, with 0.1 M Na3PO4 as draw solute, PTFE with a pore size of 1 μm achieved the highest water flux (12.02 L/m2h), followed by PTFE 0.45 μm (10.05 L/m2h), PTFE 0.1 μm (7.38 L/m2h) and PP (7.17 L/m2h). The phosphate concentration and conductivity in the PTFE (0.45 μm) permeate were as low as 1.05 mg/L and 2.89 μS/cm, respectively. Although PTFE with a pore size of 1 μm obtained the highest water flux, the phosphate concentration in its permeate was higher than for the other MD membranes. All four tested MD membranes performed well, and PTFE with a pore size of 0.45 μm was the best of the tested membranes at combining high water flux with high phosphate rejection (99.99%) in the recovery of diluted draw solution. High water flux and high phosphate rejection were also obtained when operating at a cross-flow velocity of 0.103 m/s with a feed temperature of 60 °C and a distillate temperature of 20 °C. In addition, the results show that Na3PO4 is more suitable for recovery than EDTA-2Na, and that recovering diluted Na3PO4 yields permeate water of high purity. The overall performance indicates that DCMD is a promising technology for recovering the diluted draw solution in the FO process.
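The reported flux and rejection values follow from the standard definitions; a minimal sketch in Python (the 9,500 mg/L feed phosphate concentration is our back-of-envelope estimate for 0.1 M Na3PO4, not a figure from the study):

```python
def water_flux(volume_l, area_m2, hours):
    """Permeate water flux J = V / (A * t), in L/m^2 h."""
    return volume_l / (area_m2 * hours)

def rejection(feed_mg_l, permeate_mg_l):
    """Solute rejection R = (1 - Cp/Cf) * 100, in percent."""
    return (1.0 - permeate_mg_l / feed_mg_l) * 100.0

# 1 m^2 of PTFE (0.45 um) membrane passing 10.05 L in one hour:
flux = water_flux(10.05, 1.0, 1.0)   # 10.05 L/m^2 h
# ~9,500 mg/L phosphate in the feed (our estimate), 1.05 mg/L in permeate:
r = rejection(9500.0, 1.05)          # ~99.99 % rejection
```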

Keywords: membrane distillation, forward osmosis, draw solution, recovery

Procedia PDF Downloads 180
11145 Contrastive Learning for Unsupervised Object Segmentation in Sequential Images

Authors: Tian Zhang

Abstract:

Unsupervised object segmentation aims at segmenting objects in sequential images and obtaining the mask of each object without any manual intervention. It remains a challenging task due to the lack of prior knowledge about the objects. Previous methods often require manually specifying the action of each object, which is often difficult to obtain. Instead, the method in this paper needs no action information and automatically learns the actions and relations among objects from the structured environment. To obtain the object segmentation of sequential images, the relationships between objects and images are extracted to infer the actions and interactions of objects based on the multi-head attention mechanism. Three types of object relationships are proposed for the segmentation task: the relationship between objects in the same frame, the relationship between objects in two frames, and the relationship between objects and historical information. Based on these relationships, the proposed model (1) is effective in multi-object segmentation tasks, (2) needs only images as input, and (3) produces better segmentation results as more relationships are considered. Experimental results on multiple datasets show that the method achieves state-of-the-art performance. Quantitative and qualitative analyses of the results are conducted. The proposed method could easily be extended to other similar applications.
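As a rough illustration of the relationship-scoring step (not the authors' implementation, and omitting the contrastive objective), multi-head scaled dot-product attention over object feature vectors can be sketched as:

```python
import numpy as np

def multi_head_attention(Q, K, V, num_heads):
    """Q, K, V: (n, d) object-feature arrays; d must divide by num_heads."""
    n, d = Q.shape
    dh = d // num_heads
    out = np.empty_like(Q, dtype=float)
    for h in range(num_heads):
        s = slice(h * dh, (h + 1) * dh)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(dh)   # pairwise relations
        scores -= scores.max(axis=1, keepdims=True)  # numerical stability
        w = np.exp(scores)
        w /= w.sum(axis=1, keepdims=True)            # softmax over objects
        out[:, s] = w @ V[:, s]                      # relation-weighted mix
    return out
```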

Keywords: unsupervised object segmentation, attention mechanism, contrastive learning, structured environment

Procedia PDF Downloads 101
11144 Cross Site Scripting (XSS) Attack and Automatic Detection Technology Research

Authors: Tao Feng, Wei-Wei Zhang, Chang-Ming Ding

Abstract:

Cross-site scripting (XSS) is one of the most common web attack methods at present, and also one of the riskiest. Because of the popularity of JavaScript, the attack surface for cross-site scripting has gradually expanded. However, since web application developers tend to focus only on functional testing and lack awareness of XSS, many online web projects contain XSS vulnerabilities. In this paper, various XSS attack techniques are analyzed, and a method to detect them automatically is proposed. The vulnerability detection results are easy to inspect when the tool runs as a plug-in.
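A minimal signature-based sketch of automatic XSS detection (the pattern list is illustrative only; a real scanner, like the one the paper proposes, would use a far richer, context-aware rule set):

```python
import re

# Hypothetical signature list for common reflected-XSS payloads.
XSS_PATTERNS = [
    re.compile(r"<script\b", re.IGNORECASE),       # injected script tags
    re.compile(r"\bon\w+\s*=", re.IGNORECASE),     # inline event handlers
    re.compile(r"javascript\s*:", re.IGNORECASE),  # javascript: URLs
]

def looks_like_xss(payload: str) -> bool:
    """Flag an input string that matches any known XSS signature."""
    return any(p.search(payload) for p in XSS_PATTERNS)
```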

Keywords: XSS, no target attack platform, automatic detection, XSS detection

Procedia PDF Downloads 396
11143 Mesoporous Material Nanofibers by Electrospinning

Authors: Sh. Sohrabnezhad, A. Jafarzadeh

Abstract:

In this paper, MCM-41 mesoporous material nanofibers were synthesized by an electrospinning technique. The nanofibers were characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD), and nitrogen adsorption–desorption measurements. Tetraethyl orthosilicate (TEOS) and polyvinyl alcohol (PVA) were used as the silica source and the fiber-forming agent, respectively. TEM and SEM images showed the synthesis of MCM-41 nanofibers with a diameter of 200 nm. The pore diameter and surface area of the calcined MCM-41 nanofibers were 2.2 nm and 970 m2/g, respectively. The morphology of the MCM-41 nanofibers depended on the spinning voltage.

Keywords: electrospinning, electron microscopy, fiber technology, porous materials, X-ray techniques

Procedia PDF Downloads 243
11142 Residual Dipolar Couplings in NMR Spectroscopy Using Lanthanide Tags

Authors: Elias Akoury

Abstract:

Nuclear Magnetic Resonance (NMR) spectroscopy is an indispensable technique used in the structure determination of small molecules and macromolecules, the study of their physical properties, and the elucidation of characteristic interactions, dynamics and thermodynamic processes. Quantum mechanics provides the theoretical description of NMR spectroscopy and the treatment of the dynamics of nuclear spin systems. The phenomenon of residual dipolar couplings (RDCs) has become a routine tool for accurate structure determination by providing global orientation information on magnetic dipole-dipole interaction vectors within a common reference frame. This offers access to distance-independent angular information and insights into local relaxation. The measurement of RDCs requires an anisotropic orientation medium so that the molecules partially align along the magnetic field. This can be achieved by introducing liquid crystals or by attaching a paramagnetic center. Although anisotropic paramagnetic tags continue to mark achievements in the biomolecular NMR of large proteins, their application to small organic molecules remains uncommon. Here, we propose a strategy for the synthesis of a lanthanide tag and the measurement of RDCs in organic molecules using paramagnetic lanthanide complexes.
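The angular information RDCs provide stems from the orientational dependence of the dipolar interaction; a small sketch of that dependence (D_max, the coupling for a vector parallel to the alignment axis, is a placeholder value here):

```python
import math

def rdc(d_max, theta_deg):
    """Residual dipolar coupling for an internuclear vector at angle
    theta to the alignment axis: D = D_max * (3*cos^2(theta) - 1) / 2."""
    c = math.cos(math.radians(theta_deg))
    return d_max * (3.0 * c * c - 1.0) / 2.0

# The coupling vanishes at the magic angle (~54.74 deg) and changes
# sign beyond it, which is what encodes orientation:
parallel = rdc(10.0, 0.0)        # +10.0 (maximum)
magic = rdc(10.0, 54.7356)       # ~0
perpendicular = rdc(10.0, 90.0)  # -5.0
```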

Keywords: lanthanide tags, NMR spectroscopy, residual dipolar coupling, quantum mechanics of spin dynamics

Procedia PDF Downloads 186
11141 The Study of Cost Accounting in S Company Based on TDABC

Authors: Heng Ma

Abstract:

Third-party warehousing logistics plays an important role in the development of external logistics. At present, third-party logistics in our country is still a young industry, and its accounting system has not yet been established. The current financial accounting of third-party warehousing logistics mainly follows traditional thinking: it can only provide the total cost of the entire enterprise over the accounting period and cannot reflect indirect operating cost information. To address the distortion of cost information in the third-party logistics industry and improve the level of logistics cost management, this paper combines theoretical research with case analysis, building a third-party logistics costing model using Time-Driven Activity-Based Costing (TDABC), and takes S company as an example to account for and control warehousing logistics costs. Based on the idea that "products consume activities and activities consume resources", TDABC takes time as the main cost driver and uses time equations to assign resources to cost objects. In S company, the cost objects are three warehouses engaged in warehousing and transportation services (the second warehouse also serves as a transport point). Each warehouse comprises five departments (Business Unit, Production Unit, Settlement Center, Security Department and Equipment Division), and the activities in these departments are classified as in-out of storage forecasting, in-out of storage or transit, and safekeeping. By computing the capacity cost rate and building the time equations, the paper calculates the final operating cost so as to reveal the real cost. The numerical results show that TDABC can accurately reflect the cost allocation to service customers and reveal the spare capacity cost of the resource centers, verifying the feasibility and validity of TDABC for cost accounting in the third-party logistics industry. 
It encourages enterprises to focus on customer relationship management and to reduce idle cost, strengthening the cost management of third-party logistics enterprises.
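The two core TDABC quantities named above, the capacity cost rate and the time equation, can be sketched as follows (the numbers are hypothetical, not S company's):

```python
def capacity_cost_rate(total_resource_cost, practical_capacity_minutes):
    """Cost per minute of capacity supplied by a resource center."""
    return total_resource_cost / practical_capacity_minutes

def activity_cost(rate, base_time, *increments):
    """Time equation: t = base time + incremental times for each
    activity variation, costed at the capacity cost rate."""
    return rate * (base_time + sum(increments))

# A department supplying 48,000 min per quarter at a cost of 96,000:
rate = capacity_cost_rate(96_000, 48_000)  # 2.0 per minute
# An order taking 5 base minutes, +3 for transit, +2 for safekeeping:
cost = activity_cost(rate, 5, 3, 2)        # 20.0
```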

Keywords: third-party logistics enterprises, TDABC, cost management, S company

Procedia PDF Downloads 355
11140 A System Architecture for Hand Gesture Control of Robotic Technology: A Case Study Using a Myo™ Arm Band, DJI Spark™ Drone, and a Staubli™ Robotic Manipulator

Authors: Sebastian van Delden, Matthew Anuszkiewicz, Jayse White, Scott Stolarski

Abstract:

Industrial robotic manipulators have been commonplace in the manufacturing world since the early 1960s, while unmanned aerial vehicles (drones) have only begun to realize their full potential in the service industry and the military. The omnipresence of these technologies in their respective fields will only become more potent in coming years. While these technologies have greatly evolved over the years, the typical approach to human interaction with these robots has not. In the industrial robotics realm, a manipulator is typically jogged around using a teach pendant and programmed via a networked computer or the teach pendant itself using a proprietary software development platform. Drones are typically controlled using a two-handed controller equipped with throttles, buttons, and sticks, an app downloaded to one's mobile device, or a combination of both. This application-oriented work offers a novel approach to human interaction with both unmanned aerial vehicles and industrial robotic manipulators via hand gestures and movements. Two systems have been implemented, both of which use a Myo™ armband to control either a drone (DJI Spark™) or a robotic arm (Stäubli™ TX40). The methodologies developed in this work present a mapping of armband gestures (fist, finger spread, swing hand in, swing hand out, swing arm left/up/down/right, etc.) to either drone or robot arm movements. The findings of this study present the efficacy and limitations (precision and ergonomics) of hand gesture control of two distinct types of robotic technology. All source code associated with this project will be open sourced and placed on GitHub. In conclusion, this study offers a framework that maps hand and arm gestures to drone and robot arm control. The system has been implemented using currently ubiquitous technologies, and these software artifacts will be open sourced for future researchers and practitioners to use in their work.
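At its core, such a system reduces to a mapping from recognized gestures to motion commands; a hypothetical sketch (the names and commands below are illustrative, not the authors' open-sourced API):

```python
# Illustrative gesture-to-command table in the spirit of the paper's
# Myo-to-drone mapping; an equivalent table would drive the robot arm.
GESTURE_COMMANDS = {
    "fist":          "stop",
    "finger_spread": "takeoff",
    "wave_in":       "yaw_left",
    "wave_out":      "yaw_right",
    "arm_up":        "ascend",
    "arm_down":      "descend",
}

def dispatch(gesture: str) -> str:
    """Translate a recognized armband gesture into a motion command;
    unknown gestures default to holding position."""
    return GESTURE_COMMANDS.get(gesture, "hover")
```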

Keywords: human robot interaction, drones, gestures, robotics

Procedia PDF Downloads 149
11139 Improved Operating Strategies for the Optimization of Proton Exchange Membrane Fuel Cell System Performance

Authors: Guillaume Soubeyran, Fabrice Micoud, Benoit Morin, Jean-Philippe Poirot-Crouvezier, Magali Reytier

Abstract:

Proton Exchange Membrane Fuel Cell (PEMFC) technology is considered a solution for the reduction of CO2 emissions. However, this technology still faces several challenges on the way to high-scale industrialization. In this context, increased durability remains critical for the competitiveness of the technology. Fortunately, performance degradation under nominal operating conditions is partially reversible: if specific conditions are applied, a partial recovery of fuel cell performance can be achieved, while irreversible degradation can only be mitigated. It is therefore worth studying the optimal conditions for rejuvenating these reversible degradations and assessing the long-term impact of such procedures on cell performance. Reversible degradation consists mainly of poisoning of Pt active sites by carbon monoxide at the anode, heterogeneities in water management during use, and oxidation/deactivation of Pt active sites at the cathode. The latter is identified as a major source of reversible performance loss, caused by the presence of oxygen, high temperature and high cathode potential, which favor platinum oxidation, especially at high-efficiency operating points. Hence, we studied a recovery procedure aimed at reducing the platinum oxides by decreasing the cathode potential during operation. Indeed, applying a short air starvation phase leads to a drop in cathode potential, and cell performance is temporarily increased afterwards. Nevertheless, local temperature and current heterogeneities within the cells are favored and must be minimized. The fuel consumed during the recovery phase must also be considered when evaluating the global efficiency. Consequently, the purpose of this work is to find an optimal compromise between the recovery of reversible degradations by air starvation, the increase of global cell efficiency, and the mitigation of irreversible degradation effects. 
Different operating parameters were first studied, such as cell voltage, temperature and humidity, in a single-cell setup. Considering the global PEMFC system efficiency, the tests showed that reducing the duration of the recovery phase and reducing the cell voltage were key to an efficient recovery. The frequency of the recovery phases was a major factor as well, and a specific method was established to find the optimal frequency depending on the duration and voltage of the recovery phase. Long-term degradation was then studied by applying FC-DLC cycles based on NEDC cycles to a 4-cell short stack, alternating test sequences with and without recovery phases. Depending on recovery phase timing, cell efficiency during the cycle increased by up to 2%, thanks to a mean voltage increase of 10 mV during the test sequences with recovery phases. However, cyclic voltammetry results suggest that implementing recovery phases accelerates the loss of platinum active area, possibly due to the large potential variations applied to the cathode electrode during operation.
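The compromise between recovery benefit and fuel cost can be caricatured with a rough first-order estimate (our illustrative formula, not the paper's method; all numbers are hypothetical):

```python
def net_efficiency_gain(voltage_gain_v, mean_voltage_v, recovery_fuel_fraction):
    """Very rough trade-off: relative benefit of the recovered mean
    voltage minus the fraction of fuel the recovery phases consume."""
    return voltage_gain_v / mean_voltage_v - recovery_fuel_fraction

# A 10 mV gain on a hypothetical 0.65 V mean cell voltage, against a
# hypothetical 0.5% of fuel spent on air-starvation phases:
gain = net_efficiency_gain(0.010, 0.65, 0.005)  # ~1% net benefit
```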

Keywords: durability, PEMFC, recovery procedure, reversible degradation

Procedia PDF Downloads 127
11138 A Novel PWM/PFM Controller for PSR Fly-Back Converter Using a New Peak Sensing Technique

Authors: Sanguk Nam, Van Ha Nguyen, Hanjung Song

Abstract:

For low-power applications such as adapters for portable devices and USB chargers, the primary side regulation (PSR) fly-back converter is widely used in lieu of the conventional fly-back converter with an opto-coupler, because of its simpler structure and lower cost. In the literature, there have been studies focusing on the design of the PSR circuit; however, the conventional sensing method in PSR circuits, based on an RC delay, has lower accuracy than the conventional opto-coupler fly-back converter. In this paper, we propose a novel PWM/PFM controller using a new sensing technique for the PSR fly-back converter, which can regulate an accurate output voltage. A conventional PSR circuit senses the output voltage information from the auxiliary winding to regulate the duty cycle of the clock that controls the output voltage. The sensing signal waveform has two transient points, at the times the voltage equals Vout+VD and Vout, respectively. To sense the output voltage, the PSR circuit must detect the time at which the current of the output diode equals zero. In the conventional PSR fly-back converter, the sensing signal at this time has a non-sharp negative slope, which makes it difficult to detect the output voltage information: any delay of the sensing signal or switching clock can cause unstable operation of the PSR fly-back converter. In this paper, instead of detecting the output voltage on a non-sharp negative slope, a sharp positive slope is used to sense the proper output voltage information. The proposed PSR circuit consists of a saw-tooth generator, a summing circuit, a sample-and-hold circuit and a peak detector. There is also a start-up circuit that protects the chip from high surge currents when the converter is turned on. Additionally, to reduce standby power loss, a second, low-frequency operating mode is designed alongside the main high-frequency mode. 
The operation of the proposed PSR circuit can be summarized as follows. When the output information is sensed from the auxiliary winding, a saw-tooth signal is generated by the saw-tooth generator, and the two signals are summed in the summing circuit. After this process, the slope at the peak of the sensing signal, at the time the diode current is zero, becomes positive and sharp, making the peak easy to detect. The output of the summing circuit is then fed into a peak detector and the sample-and-hold circuit, so the output voltage can be properly sensed. In this way, more accurate output voltage information can be sensed, and the margin is extended even when the circuit is delayed or noise is present, using only a simple circuit structure compared with conventional circuits. Circuit verification was carried out using a 0.35 μm 700 V Magnachip process. The simulation results show a maximum sensing error of 5 mV under various load and line conditions, indicating stable operation of the converter. Compared to conventional circuits, a very small error was achieved using only analog circuits. In summary, a PWM/PFM controller using a simple and effective sensing method for the PSR fly-back converter has been presented. The circuit structure is simple compared with conventional designs, and the simulation results confirm the design concept.
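The summing trick can be illustrated numerically: a negative-going knee in the sensed winding voltage becomes a sharp, easy-to-find peak once a saw-tooth ramp is added (the waveform shapes below are illustrative only, not the circuit's actual signals):

```python
# Synthetic sensing waveform with a knee (diode current reaching zero)
# at t = 0.6, followed by a slow decay.
N = 1000
t = [i / (N - 1) for i in range(N)]
sense = [5.0 - 0.5 * x if x < 0.6 else 3.0 * (1.0 - x) for x in t]
saw = [4.0 * x for x in t]                       # saw-tooth ramp
summed = [a + b for a, b in zip(sense, saw)]     # summing circuit output
knee_index = summed.index(max(summed))           # peak now marks the knee
```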

Keywords: primary side regulation, PSR, sensing technique, peak detector, PWM/PFM control, fly-back converter

Procedia PDF Downloads 330
11137 Optimizing Foaming Agents by Air Compression to Unload a Liquid Loaded Gas Well

Authors: Mhenga Agneta, Li Zhaomin, Zhang Chao

Abstract:

When the gas velocity is high enough, gas can entrain liquid and carry it to the surface; but as time passes, the velocity drops below a critical point at which fluids start to hold up in the tubing, causing liquid loading that prevents gas production and may lead to the death of the well. Foam injection is widely used as one of the methods to unload liquid. Since wells have different characteristics, it is not guaranteed that foam can be applied in all of them with successful results. This research presents a technology for optimizing the efficiency of foam in unloading liquid by air compression. Two methods are used to explain the optimization: (i) mathematical formulas are used to show how density and critical velocity can be minimized when air is compressed into the foaming agents, and to relate flow rate to the pressure increase that boosts the bottom-hole pressure and raises the velocity available to lift liquid to the surface; (ii) experiments test foam carryover capacity and stability as a function of time and surfactant concentration, whereby three surfactants were probed: anionic sodium dodecyl sulfate (SDS), nonionic Triton 100, and cationic hexadecyltrimethylammonium bromide (HDTAB). The best foaming agents were injected to lift liquid loaded in a vertical well model consisting of steel tubing 2.5 cm in diameter and 390 cm high, covered by a transparent glass casing 5 cm in diameter and 450 cm high. The results show that, after injecting foaming agents, liquid unloading reached 75% efficiency; with the addition of compressed air at a ratio of 1:1, the efficiency of the foaming agents increased by a further 10%. Measured and calculated values differed by about ±3%, which is good agreement. 
The successful application of the technology indicates that engineers and stakeholders could bring water-flooded gas wells back to production with optimized results by first paying attention to the type of surfactants (foaming agents) used, the surfactant concentration, and the flow rate of the injected surfactants, and then compressing air into the foaming agents at a proper ratio.
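The abstract does not reproduce its formulas, but a commonly used critical-velocity correlation (Turner's) illustrates why lightening the liquid column with foam lowers the unloading threshold; the numbers below are illustrative, not the study's:

```python
def turner_critical_velocity(sigma_dyn_cm, rho_liquid, rho_gas):
    """Turner's critical gas velocity for continuous liquid removal
    (field units: surface tension in dyne/cm, densities in lbm/ft^3,
    result in ft/s)."""
    return 1.593 * (sigma_dyn_cm * (rho_liquid - rho_gas)) ** 0.25 \
        / rho_gas ** 0.5

# Foaming lowers the effective liquid density, so a lower gas velocity
# suffices to keep the well unloaded:
v_water = turner_critical_velocity(60.0, 65.0, 0.1)  # plain water column
v_foam = turner_critical_velocity(60.0, 20.0, 0.1)   # foamed column
```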

Keywords: air compression, foaming agents, gas well, liquid loading

Procedia PDF Downloads 129
11136 Recent Advances of Isolated Microspore Culture Response in Durum Wheat

Authors: Zelikha Labbani

Abstract:

Many biotechnology methods have been used in plant breeding programs, and in vitro isolated microspore culture is one of them. For durum wheat, the use of this technology has long been limited by the low number of embryos produced and by the fact that most regenerated plants are albino. The objective of this paper is to show that isolated microspore culture of durum wheat is now feasible, thanks to newly developed methods that apply new pretreatments to the microspores before their isolation and cultivation.

Keywords: isolated microspore culture, pretreatments, in vitro embryogenesis, plant breeding program

Procedia PDF Downloads 524
11135 The Application of Insects in Forensic Investigations

Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani

Abstract:

Forensic entomology is the science of studying and analyzing insect evidence to aid criminal investigation. Knowledge of the distribution, biology, ecology and behavior of the insects found at a crime scene can provide information about when, where and how the crime was committed. It has many applications in criminal investigations, its main use being the estimation of the minimum time since death in suspicious deaths. The close association between insects and corpses, and the use of insects in criminal investigations, is the subject of forensic entomology. Insects colonize a decomposing corpse and lay eggs on it from the initial stages of decomposition, so forensic scientists can estimate the minimum postmortem interval by studying the insect populations and the developing larval stages. In addition, toxicological and molecular studies of these insects can reveal the cause of death or even the identity of a victim, and can be used to detect drugs and poisons and to determine the location of an incident. Recent techniques make it possible for experts to gather robust entomological evidence. Such evidence can provide vital information about death, corpse movement or burial, submersion interval, time of decapitation, identification of specific sites of trauma, post-mortem artefacts on the body, use of drugs, linking a suspect to the scene of a crime, sexual molestation, and the identification of suspects.
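Minimum postmortem interval estimates of this kind are commonly based on accumulated degree days (ADD); a sketch (the developmental threshold and required ADD are species-specific, and the values used here are hypothetical):

```python
def accumulated_degree_days(daily_temps_c, base_temp_c):
    """Thermal units accumulated above a species' developmental threshold."""
    return sum(max(t - base_temp_c, 0.0) for t in daily_temps_c)

def min_pmi_days(daily_temps_c, base_temp_c, add_required):
    """Days of thermal history (counting back from discovery) needed for
    a larva to accumulate the degree-days its stage requires."""
    total = 0.0
    for day, t in enumerate(daily_temps_c, start=1):
        total += max(t - base_temp_c, 0.0)
        if total >= add_required:
            return day
    return None  # not enough thermal history supplied

# Hypothetical: 25 C days, 10 C threshold, stage requiring 30 ADD:
pmi = min_pmi_days([25.0, 25.0, 25.0], 10.0, 30.0)  # 2 days minimum
```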

Keywords: Forensic entomology, post mortem interval, insects, larvae

Procedia PDF Downloads 498
11134 Electronic Data Interchange (EDI) in the Supply Chain: Impact on Customer Satisfaction

Authors: Hicham Amine, Abdelouahab Mesnaoui

Abstract:

Electronic data interchange (EDI) is the computer-to-computer exchange of structured business information. This information typically takes the form of standardized electronic business documents, such as invoices, purchase orders, bills of lading, and so on. The purpose of this study is to identify the impact EDI might have on the supply chain, and particularly on customer satisfaction, keeping in mind the constraints the organization might face. This study included 139 subject matter experts (SMEs) who participated by responding to a distributed survey. Overall, 85% of respondents were strongly in favor of the implementation, while 10% were neutral and 5% were against it. In the quality assurance department, 75% of the clients agreed to move forward with the change, 10% stayed neutral and 15% were against it. In the legal department, 80% of the answers were in favor of the implementation, 10% of the participants stayed neutral and 10% were against it. The survey participants were 40% male and 60% female (sex ratio F/M = 1.5). The survey also distinguished three categories of technical background: 80% of participants had a technical background, 15% a nontechnical background, and 5% an average technical background. This study examines the impact of EDI on customer satisfaction, which is the primary hypothesis, and justifies the importance of an implementation that enhances customer satisfaction.

Keywords: electronic data interchange, supply chain, subject matter experts, customer satisfaction

Procedia PDF Downloads 332
11133 Leveraging Advanced Technologies and Data to Eliminate Abandoned, Lost, or Otherwise Discarded Fishing Gear and Derelict Fishing Gear

Authors: Grant Bifolchi

Abstract:

As global environmental problems continue to have highly adverse effects, finding long-term, sustainable solutions to combat ecological distress is of growing, paramount concern. Ghost Gear, also known as abandoned, lost or otherwise discarded fishing gear (ALDFG) and derelict fishing gear (DFG), represents one of the greatest threats to the world's oceans, posing a significant hazard to human health, livelihoods, and global food security. In fact, according to the UN Food and Agriculture Organization (FAO), abandoned, lost and discarded fishing gear represents approximately 10% of marine debris by volume. Around the world, many governments and governmental and non-profit organizations are doing their best to manage the reporting and retrieval of nets, lines, ropes, traps, floats and more from their respective bodies of water. However, these organizations' limited ability to effectively manage the files and documents describing the problem further complicates matters. In Ghost Gear monitoring and management, organizations face additional complexities. Whether it is data ingest, industry regulations and standards, garnering actionable insights into the location, security, and management of data, or the application of enforcement hampered by disparate data, all of these factors place massive strains on organizations struggling to save the planet from the dangers of Ghost Gear. In this 90-minute educational session, globally recognized Ghost Gear technology expert Grant Bifolchi CET, BBA, BCom, will provide real-world insight into how governments currently manage Ghost Gear and the technology that can accelerate success in combatting ALDFG and DFG. In this session, attendees will learn how to: • Identify specific technologies to solve the ingest and management of Ghost Gear data categories, including type, geo-location, size, ownership, regional assignment, collection and disposal. 
• Provide enhanced access to authorities, fisheries, independent fishing vessels, individuals, etc., while securely controlling confidential and privileged data to globally recognized standards. • Create and maintain processing accuracy to effectively track ALDFG/DFG reporting progress, including acknowledging receipt of each report and sharing it with all pertinent stakeholders to ensure approvals are secured. • Enable Business Intelligence (BI) and analytics to store and analyze data, optimize organizational performance, maintain anytime visibility of report status, support user accountability, scheduling and management, and foster governmental transparency. • Maintain compliance reporting through highly defined, detailed and automated reports, enabling all stakeholders to share critical insights with internal colleagues, regulatory agencies, and national and international partners.

Keywords: ghost gear, ALDFG, DFG, abandoned, lost or otherwise discarded fishing gear, data, technology

Procedia PDF Downloads 88
11132 Application of Vector Representation for Revealing the Richness of Meaning of Facial Expressions

Authors: Carmel Sofer, Dan Vilenchik, Ron Dotsch, Galia Avidan

Abstract:

Studies investigating emotional facial expressions typically reveal consensus among observers regarding the meaning of basic expressions, whose number ranges from 6 to 15 emotional states. Given this limited number of discrete expressions, how is it that the human vocabulary of emotional states is so rich? The present study argues that perceivers use sequences of these discrete expressions as the basis for a much richer vocabulary of emotional states. Mechanisms in which a relatively small number of basic components is expanded into a much larger number of possible combinations of meanings exist in other human communication modalities, such as spoken language and music. In these modalities, letters and notes, the basic components of spoken language and music respectively, are temporally linked, resulting in the richness of expression. In the current study, on each trial participants were presented with sequences of two facial-expression images, in combinations sampled from the eight static basic expressions (64 in total; 8×8). On each trial, participants were required to judge, using a single word, the 'state of mind' portrayed by the person whose face was presented. Utilizing word embedding methods (Global Vectors for Word Representation, GloVe) from the field of Natural Language Processing, and relying on machine learning methods, it was found that the perceived meanings of the sequences of facial expressions were a weighted average of the single expressions comprising them, resulting in 22 new emotional states in addition to the eight classic basic expressions. An interaction between the first and second expression in each sequence indicated that each facial expression modulated the effect of the other, leading to a different interpretation of the sequence as a whole. 
These findings suggest that the vocabulary of emotional states conveyed by facial expressions is not restricted to the small number of discrete facial expressions; rather, the vocabulary is rich, as it results from combinations of these expressions. In addition, the present research suggests that word embedding can be a powerful, accurate and efficient tool in social perception studies for capturing explicit and implicit perceptions and intentions. Acknowledgment: The study was supported by a grant from the Ministry of Defense in Israel to GA and CS. CS is also supported by the ABC initiative at Ben-Gurion University of the Negev.
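The weighted-average finding can be sketched with a toy embedding space (the study used pretrained GloVe vectors; the labels and coordinates below are purely illustrative):

```python
import math

# Toy 3-dimensional "embedding" for three basic expressions.
EMB = {
    "anger":    (0.9, 0.1, 0.0),
    "fear":     (0.1, 0.9, 0.2),
    "surprise": (0.2, 0.3, 0.9),
}

def sequence_meaning(first, second, w_first=0.5):
    """Perceived meaning of an expression pair, modeled as a weighted
    average of the two single-expression vectors."""
    a, b = EMB[first], EMB[second]
    return tuple(w_first * x + (1 - w_first) * y for x, y in zip(a, b))

def nearest(vec):
    """Closest labeled expression by cosine similarity."""
    def cos(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        return dot / (math.hypot(*u) * math.hypot(*v))
    return max(EMB, key=lambda k: cos(EMB[k], vec))
```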

Keywords: GloVe, face perception, facial expression perception, facial expression production, machine learning, word embedding, word2vec

Procedia PDF Downloads 173
11131 The Effect of Knowledge Management in Lean Organization

Authors: Mehrnoosh Askarizadeh

Abstract:

In an ever-changing, globalized world, new economic and global competitors competing for the same customers and resources increase the pressure on organizations' competitiveness. In addition, organizations face further challenges due to an ever-growing amount of data and the ever-bigger task of analyzing that data and keeping it secure. Successful companies are characterized by exploiting their intellectual capital in an efficient manner; thus, the most valuable asset an organization has today is its employees' knowledge. Knowledge management is a tool that supports easier handling of knowledge and optimizes its use. Based on the theoretical framework and a careful review, as well as analysis of interviews and observations, six essential areas were identified: structure, management, compensation, communication, trust and motivation. The analysis showed that the scientific articles and literature take different perspectives, use different definitions and build on different theories, but in essence they all arrive at the same result and conclusion, albeit from different viewpoints. Whether the focus is on management style, rewards or communication, they all focus on the individual. The conclusion is that organizational culture affects knowledge management and the dissemination of information because of its direct impact on the individual. The largest and most important underlying factor in why we choose to participate in improvement work or share knowledge is our motivation: motivation is both the reason for and the reason behind our actions.

Keywords: lean, lean production, knowledge management, information management, motivation

Procedia PDF Downloads 513
11130 Robotics and Embedded Systems Applied to the Buried Pipeline Inspection

Authors: Robson C. Santos, Julio C. P. Ribeiro, Iorran M. de Castro, Luan C. F. Rodrigues, Sandro R. L. Silva, Diego M. Quesada

Abstract:

This work aims to develop a robot, in the form of an autonomous vehicle, to detect, inspect and map underground pipelines using the Arduino platform with the ATmega328 microcontroller. The platform's open-source hardware prototyping, programmed in a language very similar to C/C++, facilitates its use in robotics and resembles the PLCs used in large industrial processes. The robot traverses the surface independently of direct human action, guided by electromagnetic induction, in order to automate the process of detecting buried pipes. The induction signal comes from detector coils and is sent to the Arduino microcontroller, which measures the difference in signal intensity, processes the information and then drives electrical components such as relays and motors, allowing the prototype to move over the surface and gather the necessary information. The robot was developed from electrical and electronic assemblies that allowed its application to be tested. The assembly is made up of metal detector coils, circuit boards and a microprocessor; these previously developed, interconnected circuits provide the detection, processing, control and mechanical actions for an autonomous vehicle that detects and maps buried pipelines.
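The compare-and-steer loop described above can be sketched as a small simulation: the controller reads left and right coil intensities, decides whether a pipe is present, and steers toward the stronger signal. The real firmware would run on the Arduino in C/C++; all thresholds and readings here are hypothetical:

```python
# Hypothetical simulation of the robot's decision logic: compare the
# induction signal from left and right detector coils and choose an
# action for the drive motors. Thresholds and readings are invented.
DETECT_THRESHOLD = 520   # ADC counts above which a pipe is assumed present
TURN_MARGIN = 30         # left/right difference that triggers a turn

def decide(left_coil, right_coil):
    """Map raw coil readings (0-1023, like a 10-bit Arduino ADC) to an action."""
    if max(left_coil, right_coil) < DETECT_THRESHOLD:
        return "search"        # no pipe signal: keep scanning the surface
    if left_coil - right_coil > TURN_MARGIN:
        return "turn_left"     # signal stronger on the left coil
    if right_coil - left_coil > TURN_MARGIN:
        return "turn_right"    # signal stronger on the right coil
    return "follow"            # balanced signal: track the pipe axis

# Example readings as the robot crosses a buried pipe at an angle.
for left, right in [(480, 470), (600, 540), (580, 575)]:
    print(decide(left, right))
```

On the actual hardware, the returned action would translate into relay and motor commands rather than a printed string.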

Keywords: robotic, metal detector, embedded system, pipeline inspection

Procedia PDF Downloads 609
11129 Spatio-Temporal Data Mining with Association Rules for Lake Van

Authors: Tolga Aydin, M. Fatih Alaeddinoğlu

Abstract:

Throughout history, people have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge at hand for strategic decisions, and different methods have been developed for this purpose. Data mining by association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly applied to spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make the Apriori algorithm a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease with changes in the amount of water it holds. In this study, the evaporation, humidity, lake altitude, rainfall and temperature parameters recorded in the Lake Van region over the years are processed by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying the possible causes of overflows and underflows can alert experts to take precautions and make the necessary investments.
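The idea of embedding time and space features as ordinary items, so that Apriori can mine them alongside the measured parameters, can be sketched as follows. The discretized items (e.g. "rain=high", "season=spring") and the minimum-support threshold are illustrative inventions, not the study's actual records:

```python
# Minimal Apriori sketch on discretized lake records. Each transaction
# mixes measurement items with an embedded temporal item ("season=...")
# and an outcome item ("level=..."). All values are invented.
from itertools import combinations

transactions = [
    {"rain=high", "evap=low", "season=spring", "level=overflow"},
    {"rain=high", "evap=low", "season=spring", "level=overflow"},
    {"rain=low", "evap=high", "season=summer", "level=underflow"},
    {"rain=high", "evap=low", "season=autumn", "level=overflow"},
    {"rain=low", "evap=high", "season=summer", "level=underflow"},
]

def apriori(transactions, min_support=0.4):
    """Return frequent itemsets (as frozensets) mapped to their support."""
    n = len(transactions)
    current = {frozenset([item]) for t in transactions for item in t}
    frequent = {}
    while current:
        counts = {s: sum(1 for t in transactions if s <= t) for s in current}
        level = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(level)
        # Candidate generation: join frequent k-sets into (k+1)-sets.
        keys = list(level)
        current = {a | b for a, b in combinations(keys, 2)
                   if len(a | b) == len(a) + 1}
    return frequent

freq = apriori(transactions)
pair = frozenset(["rain=high", "level=overflow"])
print(round(freq[pair], 2))  # support of the high-rain/overflow itemset
```

A frequent itemset like {rain=high, level=overflow} is the raw material for a rule such as "rain=high → level=overflow", whose confidence is the itemset's support divided by the antecedent's support.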

Keywords: apriori algorithm, association rules, data mining, spatio-temporal data

Procedia PDF Downloads 365
11128 Genetic Counseling for Severe Mental Disorders: Integrating Innovative Services and Prophylactic Interventions in an Online Platform - MENTALICA

Authors: Ramona Moldovan, Doina Cosman, Sebastian Moldovan, Radu Popp, Victor Pop

Abstract:

MENTALICA is a project aimed at developing and evaluating a platform that can assist individuals diagnosed with severe mental disorders, and their families, in managing the consequences associated with these disorders, recurrence risks, prevention strategies and treatment options. The platform is based on guidance issued by some of the most prominent scientific organizations in the world. In order to personalize the information provided, the program explores details of the personal and family history of mental disorders, summarizes the answers and gives respondents a personal assessment. This includes personalized information and support concerning schizophrenia, bipolar disorder and schizoaffective disorder. MENTALICA includes several modules: family history tools, risk assessment tools and risk factor sheets, practical guides for patients, practical guides for families, and guidelines for clinicians. Currently, no guidelines for genetic counseling for mental disorders are available. Respondents can print out their reports and discuss them with family members or their doctors. We briefly present the current status of MENTALICA and its implications for patients, professionals and the community.

Keywords: genetic counseling, mental disorders, platform

Procedia PDF Downloads 486
11127 SolarSPELL Case Study: Pedagogical Quality Indicators to Evaluate Digital Library Resources

Authors: Lorena Alemán de la Garza, Marcela Georgina Gómez-Zermeño

Abstract:

This paper presents the SolarSPELL case study, which aims to generate information on the use of indicators that help evaluate the pedagogical quality of digital library resources. SolarSPELL is a solar-powered digital library with WiFi connectivity. It offers a variety of open educational resources selected for their potential to support the digital transformation of educational practices and the achievement of the 2030 Agenda for Sustainable Development, adopted by all United Nations Member States. The case study employed a quantitative methodology, and the research instrument was applied to 55 teachers, directors and librarians. The results indicate that it is possible to strengthen the pedagogical quality of open educational resources through actions focused on improving temporal and technological parameters. They also reveal that users believe SolarSPELL improves teaching-learning processes and motivates teachers to develop professionally. This study provides valuable information on a tool that supports teaching-learning processes, provides connectivity powered by renewable energy, and strengthens teacher training in active methodologies for ecosystem learning.

Keywords: educational innovation, digital library, pedagogical quality, solar energy, teacher training, sustainable development

Procedia PDF Downloads 119
11126 Principles and Practice of Therapeutic Architecture

Authors: Umedov Mekhroz, Griaznova Svetlana

Abstract:

The quality of life and well-being of patients, staff and visitors are central to the delivery of health care, and architecture and design are becoming an integral part of the healing and recovery approach. The most significant point that can be implemented in hospital buildings is the therapeutic value of the built environment: the design and integration of plants to bring the natural world into the healthcare setting. The hospital environment should offer the comfort of home. The techniques that therapeutic architecture uses are inexpensive yet provide real benefit to patients, staff and visitors, demonstrating that the difference lies not in cost but in design quality. The best environment is not necessarily the more expensive one: it is a matter of the deliberate use of light and color, the rational use of materials and the flexibility of the premises. All of this shapes innovative concepts in modern hospital architecture, whether in new construction, renovation or expansion projects. The aim of the study is to identify the methods and principles of therapeutic architecture. The research methodology consists of studying and summarizing international experience from scientific research, literature, standards, methodological manuals and project materials on the research topic. The result of the research is a set of graphic-analytical tables based on systematic analysis of the processed information, together with 3D visualizations of hospital interiors based on that information.

Keywords: therapeutic architecture, healthcare interiors, sustainable design, materials, color scheme, lighting, environment

Procedia PDF Downloads 119
11125 Exploring Ways Early Childhood Teachers Integrate Information and Communication Technologies into Children's Play: Two Case Studies from the Australian Context

Authors: Caroline Labib

Abstract:

This paper reports on a qualitative study exploring the approaches teachers used to integrate computers and smart tablets into their program planning, with the aim of integrating ICT into children's play and thereby supporting children's learning and development. Data were collected in preschool settings in Melbourne in 2016; interviews with teachers, observations of teacher interactions with children and copies of teachers' planning and observation documents informed the study. The paper looks closely at findings from two early childhood settings and focuses on the differing approaches two EC teachers adopted when integrating iPads or computers into their settings. Data analysis revealed three key approaches, labelled free digital play, guided digital play and teacher-led digital use. Importantly, teacher decisions were influenced by the interplay between the opportunities the ICT tools offered, the teachers' prior knowledge of and experience with ICT, and children's learning needs and contexts. This paper is a snapshot of two early childhood settings; further research will encompass data from six more early childhood settings in Victoria, with the aim of exploring a wider range of factors motivating early childhood teachers trying to integrate ICT into their programs.

Keywords: early childhood education (ECE), digital play, information and communication technologies (ICT), play, teachers' interaction approaches

Procedia PDF Downloads 204
11124 Decommissioning of Nuclear Power Plants: The Current Position and Requirements

Authors: A. Stifi, S. Gentes

Abstract:

Undoubtedly, from a construction perspective, the use of explosives can remove a large facility such as a 40-storey building, which took 3 to 4 years to construct, in a few minutes. Usually, deconstruction or decommissioning, the last phase of any facility's life cycle, is considered the shortest. However, this proves wrong in the case of nuclear power plants. Statistics show that in the last 30 years, the construction of a nuclear power plant took an average of 6 years, whereas the decommissioning of such a plant is estimated to take a decade or more. This paper concerns the decommissioning phase of nuclear power plants, which needs more attention and encouragement from research institutes as well as from the nuclear industry. Currently, there are 437 nuclear power reactors in operation and 70 reactors under construction. Around 139 nuclear facilities have already been shut down and are in different decommissioning stages, and approximately 347 nuclear reactors will enter the decommissioning phase in the next 20 years (assuming an operating life of 40 years per reactor). These facts raise two questions: (1) How far are the nuclear and construction industries ready to face the challenges of decommissioning projects? (2) What is required for safe and reliable decommissioning project delivery? Decommissioning projects for nuclear facilities across the globe suffer severe time and budget overruns. The decommissioning processes are still largely executed by manual labour, and changes in regulations must be observed accordingly. In terms of research and development, some projects and activities are being carried out in this area, but far more is needed.
The near future of decommissioning can be improved through a sustainable development strategy in which all stakeholders agree to implement innovative technologies, especially for dismantling and decontamination processes, and to deliver reliable and safe decommissioning. The scope for technology transfer from other industries should be explored: for example, the remotely operated robotic technologies used in the automobile and manufacturing industries to reduce time and improve efficiency and safety could be tried here. However much innovative technologies are in demand, they are not enough on their own; the application of creative and innovative management methodologies should also be investigated. Lean management, with its central concept of eliminating waste within a process, is a suitable example. Thus, cooperation between international organisations and the related industries, together with knowledge sharing, may serve as a key factor in successful decommissioning projects.

Keywords: decommissioning of nuclear facilities, innovative technology, innovative management, sustainable development

Procedia PDF Downloads 464
11123 Developing Integrated Model for Building Design and Evacuation Planning

Authors: Hao-Hsi Tseng, Hsin-Yun Lee

Abstract:

In the building design process, designers have to complete the spatial design and consider evacuation performance at the same time. It is usually difficult to combine the two planning processes, which results in a gap between spatial design and evacuation performance, so designers cannot arrive at an integrated optimal design solution. In addition, the evacuation routing models proposed by previous researchers differ from the practical evacuation decisions made in the real field. On the other hand, more and more building design projects are executed with Building Information Modeling (BIM), in which the design content is organized in an object-oriented framework; the integration of BIM and evacuation simulation can therefore make a significant contribution for designers. This research establishes a model that integrates spatial design and evacuation planning. The proposed model supports spatial design modifications and optimizes the evacuation plan, allowing designers to complete the integrated design solution in BIM. In addition, the research improves the evacuation routing method to make the simulation results more practical. The proposed model will be applied to a building design project for evaluation and validation, where it will provide near-optimal design suggestions. By applying the proposed model, the integration and efficiency of the design process are improved, the evacuation plan becomes more useful, and the quality of the building's spatial design is better.
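As a sketch of the kind of routing computation such a model performs, a floor plan can be discretized into a grid of walkable cells and walls, and the shortest path to an exit found by breadth-first search. The layout below is invented; a BIM-driven model would extract walls and exits from the building objects and, as the abstract notes, would also account for practical evacuation behaviour rather than pure distance:

```python
# Illustrative evacuation-routing sketch: breadth-first search for the
# shortest path from a room cell to an exit cell on a grid-discretized
# floor plan. The plan is a hypothetical example.
from collections import deque

# 0 = walkable cell, 1 = wall
plan = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def shortest_path(plan, start, exit_cell):
    rows, cols = len(plan), len(plan[0])
    prev = {start: None}          # visited set doubling as back-pointers
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == exit_cell:   # reconstruct the route by walking back
            path, node = [], (r, c)
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and plan[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # exit unreachable: flag the design for modification

route = shortest_path(plan, start=(0, 0), exit_cell=(4, 4))
print(len(route) - 1)  # number of moves to the exit
```

In an integrated workflow, a `None` result or an overly long route would feed back into BIM as a prompt to modify the spatial design.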

Keywords: building information modeling, evacuation, design, floor plan

Procedia PDF Downloads 450
11122 Food Composition Tables Used as an Instrument to Estimate the Nutrient Ingest in Ecuador

Authors: Ortiz M. Rocío, Rocha G. Karina, Domenech A. Gloria

Abstract:

There are several tools for assessing the nutritional status of a population, and a main instrument commonly used to build them is the food composition table (FCT). Despite the importance of FCTs, many error sources and variability factors can arise in building these tables and can lead to under- or over-estimation of a population's nutrient intake. This work identified the different food composition tables used as instruments to estimate nutrient intake in Ecuador. Data for choosing FCTs were collected through key informants (self-completed questionnaires), supplemented with institutional web research. A questionnaire covering general variables (origin, year of edition, etc.) and methodological variables (method of elaboration, information in the table, etc.) was applied to the identified FCTs; these variables were defined on the basis of an extensive literature review, and a descriptive content analysis was performed. Ten printed tables and three databases were reported, all of which were treated indistinctly as food composition tables. We managed to obtain information for 69% of the references; several informants referred to printed documents that were not accessible, and internet searches were unsuccessful. Of the 9 final tables, 8 are from Latin America, and 5 of these were constructed by the indirect method (compilation of already published data), with a database from the United States Department of Agriculture (USDA) as the main source of information. One FCT was constructed by the direct method (bromatological analysis) and originates from Ecuador. All of the tables made a clear distinction between foods and their cooking methods, 88% expressed nutrient values per 100 g of edible portion, 77% gave precise additional information about the use of the table, and 55% presented all the macro- and micronutrients in detail. The most complete FCTs were those of INCAP (Central America) and the Composition of Foods (Mexico).
The most frequently referenced table was the Ecuadorian food composition table of 1965 (70%). The indirect method was used for most of the tables in this study. However, this method has the disadvantage of generating less reliable food composition tables, because foods vary in composition; a database therefore cannot accurately predict the composition of any isolated sample of a food product. In conclusion, weighing the pros and cons, and although it was elaborated by the indirect method, we consider it appropriate to work with the INCAP (Central America) FCT, given its proximity to our country and a food item list very similar to ours. It is also imperative to keep as a reference the Ecuadorian food composition table, which, although not updated, was constructed by the direct method using Ecuadorian foods. Hence, both tables will be used to elaborate a questionnaire for assessing the food consumption of the Ecuadorian population; where the two give disparate values, the INCAP values will be taken, because that table is updated.
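The per-100 g of edible portion convention mentioned above translates directly into intake estimation: each reported consumption amount is scaled against the table's per-100 g values and summed per nutrient. A minimal sketch, with invented placeholder values rather than entries from the INCAP or Ecuadorian tables:

```python
# Sketch of turning FCT values (per 100 g of edible portion) into
# nutrient intake estimates from a consumption questionnaire.
# The FCT entries below are invented placeholders.
fct = {  # nutrient content per 100 g of edible portion
    "rice":   {"energy_kcal": 130, "protein_g": 2.7},
    "banana": {"energy_kcal": 89,  "protein_g": 1.1},
}

def daily_intake(consumption_g, fct):
    """Sum nutrients over all reported foods, scaling per-100 g values."""
    totals = {}
    for food, grams in consumption_g.items():
        for nutrient, per_100g in fct[food].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + per_100g * grams / 100.0
    return totals

# Hypothetical one-day report: 250 g of rice and 120 g of banana.
intake = daily_intake({"rice": 250, "banana": 120}, fct)
print(round(intake["energy_kcal"], 1))
```

The variability problem the abstract describes lives entirely inside the `fct` dictionary: if the table's per-100 g values do not match the foods actually eaten, every downstream intake estimate inherits that error.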

Keywords: Ecuadorian food composition tables, FCT elaborated by direct method, nutrient intake of Ecuadorians, Latin America food composition tables

Procedia PDF Downloads 428
11121 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society

Authors: Irene Yi

Abstract:

Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only to find that over the course of human history, derogatory and sexist adjectives are used significantly more frequently to describe females than males in history and literature. This is extremely problematic, both as training data and as an outcome of natural language processing. As machines take on more responsibilities, it is crucial to ensure that they do not carry with them historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. The results are significant in showing the intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text so as to be more mindful and reflect gender equality. Further, this paper deals with non-binary gender pronouns and how machines can process these pronouns correctly given their semantic and syntactic context. It also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing: languages such as French and Spanish not only have rigid gendered grammar rules but also historically patriarchal societies.
The progression of a society goes hand in hand not only with its language but with how machines process that language. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.
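One common way to quantify the kind of bias discussed above is to project adjective vectors onto a he-she direction in embedding space: a positive projection means the word sits closer to 'he', a negative one closer to 'she'. The sketch below uses invented 3-D stand-ins, not trained embeddings:

```python
# Toy illustration of measuring gender association in word embeddings
# by projecting onto the (he - she) direction. All vectors are invented
# 3-D placeholders, not vectors from a trained model.
import math

emb = {
    "he":         (0.8, 0.1, 0.0),
    "she":        (-0.8, 0.1, 0.0),
    "brilliant":  (0.5, 0.6, 0.1),   # hypothetical male-skewed adjective
    "hysterical": (-0.6, 0.5, 0.2),  # hypothetical female-skewed adjective
}

def gender_score(word):
    """Positive = closer to 'he', negative = closer to 'she'."""
    he, she = emb["he"], emb["she"]
    direction = tuple(h - s for h, s in zip(he, she))  # the he - she axis
    v = emb[word]
    dot = sum(a * b for a, b in zip(v, direction))
    norm = (math.sqrt(sum(a * a for a in v))
            * math.sqrt(sum(d * d for d in direction)))
    return dot / norm

print(gender_score("brilliant") > 0, gender_score("hysterical") < 0)
```

Run over a real corpus-trained embedding and a large adjective list, systematic asymmetries in these scores are exactly the "patterns of misogyny" the computational analysis looks for.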

Keywords: computational analysis, gendered grammar, misogynistic language, neural networks

Procedia PDF Downloads 113
11120 A Study on Explicitation Strategies Employed in Persian Subtitling of English Crime Movies

Authors: Hossein Heidari Tabrizi, Azizeh Chalak, Hossein Enayat

Abstract:

The present study investigates the application of the expansion strategy in Persian subtitles of English crime movies. More precisely, it aims to classify the different types of expansion used in subtitles and to investigate the appropriateness or inappropriateness of each type. To this end, three English movies, namely The Net (1995), Contact (1997) and Mission: Impossible 2 (2000), available with Persian subtitles, were selected for the study. To collect the data, the movies were watched, and those parts of the Persian subtitles in which expansion had been used were identified and extracted along with their English dialogs. The extracted Persian subtitles were then classified according to the reason that led to expansion in each case, and the appropriateness of using expansion in each was descriptively investigated. Finally, an equivalent not containing any expansion was proposed for those cases in which the meaning could be fully transferred without this strategy. The findings indicate that the reasons range from explicitation (of visual, co-textual and contextual information) to mistranslation, paraphrasing and the preferences of subtitlers. Furthermore, the employment of the expansion strategy was found to be inappropriate in all cases except those motivated by explicitation of contextual information, since correct and shorter equivalents equally capable of conveying the intended meaning could be posited for the original dialogs.

Keywords: audiovisual translation, English crime movies, expansion strategies, Persian subtitles

Procedia PDF Downloads 463
11119 Modeling of Tsunami Propagation and Impact on West Vancouver Island, Canada

Authors: S. Chowdhury, A. Corlett

Abstract:

Large tsunamis strike the British Columbia coast every few hundred years. The Cascadia Subduction Zone, which extends along the Pacific coast from Vancouver Island to Northern California, is one of the most seismically active regions in Canada. Significant earthquakes have occurred in this region, including the 1700 Cascadia earthquake with an estimated magnitude of 9.2. Based on geological records, experts predict that a 'great earthquake' of similar magnitude in this region may happen at any time. Such an earthquake is expected to generate a large tsunami that could impact the coastal communities of Vancouver Island. Since many of these communities are in remote locations, they are more vulnerable, as post-earthquake relief efforts would be hampered by damage to critical road infrastructure. To assess the coastal vulnerability of these communities, a hydrodynamic model has been developed using MIKE-21 software. We considered a 500-year probabilistic earthquake design criterion, including subsidence, in this model. The bathymetry information was collected from the Canadian Hydrographic Service (CHS) and the National Oceanic and Atmospheric Administration (NOAA). An aerial survey of the communities was conducted using a Cessna 172 aircraft, and the information was converted into a topographic digital elevation map. Both data sets were incorporated into the model, whose domain measures about 1000 km x 1300 km. The model was calibrated against the tsunami that occurred off the west coast of Moresby Island on October 28, 2012: water levels from the model were compared with two tide gauge stations close to Vancouver Island, and the model output shows satisfactory agreement. For this study, the design water level was taken as the high water level plus the projected sea level rise for the year 2100.
Hourly wind speeds from eight directions were collected from different wind stations, and a 200-year return period wind speed was used in the model for storm events. The regional model was set up for a 12-hour simulation period, which takes more than 16 hours to complete on a dual Xeon E7 CPU machine with a K80 GPU. The boundary information for the local model was generated from the regional model. The local model was developed with a high-resolution mesh to estimate coastal flooding for the communities. This study shows that many communities would be affected by a Cascadia tsunami; inundation maps were developed for the communities, and the infrastructure inside the coastal inundation areas was identified. Coastal vulnerability planning and resilient design solutions will be implemented to significantly reduce the risk.
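A back-of-the-envelope check behind such propagation models: in the shallow-water (long-wave) limit that governs tsunami propagation, wave speed is c = sqrt(g·h), so depth alone sets the arrival timescale. The depth and distance below are illustrative round numbers, not the study's bathymetry:

```python
# Shallow-water tsunami speed and a rough arrival-time estimate.
# Depth and distance are hypothetical illustrations, not values from
# the CHS/NOAA bathymetry used in the study.
import math

g = 9.81             # gravitational acceleration, m/s^2
depth_m = 2500.0     # representative offshore depth
distance_km = 300.0  # hypothetical distance from source to coast

c = math.sqrt(g * depth_m)                # long-wave speed, m/s
travel_min = distance_km * 1000 / c / 60  # arrival time, minutes
print(round(c), round(travel_min))        # speed (m/s) and arrival (min)
```

Because c shrinks with depth, a full hydrodynamic model such as MIKE-21 integrates this relationship over the real, varying bathymetry, which is why accurate CHS/NOAA depth data matter so much to the inundation results.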

Keywords: tsunami, coastal flooding, coastal vulnerability, earthquake, Vancouver, wave propagation

Procedia PDF Downloads 128
11118 Cyber Aggression, Cyber Bullying and the Dark Triad: Effect on Workplace Behavior and Performance

Authors: Anishya Obhrai Madan

Abstract:

In an increasingly connected world, where the speed of communication attempts to match the speed of thought, and thus of intentions, conflict is acted on faster through media such as the internet and telecommunication technology. This has led to a new form of aggression: 'cyber bullying'. The present paper attempts to integrate existing theory on bullying and the dark triad personality traits in a work environment and extrapolate it to the cyber context.

Keywords: conflict at work, cyber bullying, dark triad of personality, toxic employee

Procedia PDF Downloads 224
11117 Online Bakery Management System Proposal

Authors: Alexander Musyoki, Collins Odour

Abstract:

Over the past few years, the bakery industry in Kenya has experienced significant growth, due in large part to the increased adoption of technology and automation, and more specifically of bakery management systems to help run bakeries. While such systems have been largely responsible for improved productivity and efficiency in bakeries, most of them are now outdated and pose more challenges than benefits. The online bakery management system proposed in this paper aims to address this by allowing bakery owners to track inventory, budgets, job progress, and analytics on each job. In doing so, it promotes Sustainable Development Goals 3 and 12, which aim to ensure healthy lives and promote responsible consumption and production. The proposed benefits of these features include scalability, easy accessibility, reduced acquisition costs, better reliability, and improved functionality that will allow bakeries to become more competitive, reduce waste and track inventory more efficiently. To better understand the challenges, a comprehensive study was performed to assess the traditional systems and to determine whether an online bakery management system would be advantageous to bakery owners. The study gathered feedback from bakery owners and employees in Nairobi County, Kenya, using an online survey with a response rate of about 86% of the target population. The responses cited complex and hard-to-use bakery management systems (59.7%), lack of portability from one device to another (58.1%) and high acquisition costs (51.6%) as the top challenges of traditional bakery management systems. On the other hand, the top benefits that most respondents expected from an online bakery management system were better reliability (58.1%) and reduced acquisition costs (58.1%).
Overall, the findings suggest that an online bakery management system has many advantages over traditional systems and is likely to be well received in the market. In conclusion, the proposed online bakery management system has the potential to improve the efficiency and competitiveness of small bakeries in Nairobi County. Further research is recommended to expand the sample size and diversity of respondents and to conduct more in-depth analyses of the data collected.

Keywords: ICT, technology and automation, bakery management systems, food innovation

Procedia PDF Downloads 71