Search results for: open source software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11293

8293 Discursive Psychology of Emotions in Mediation

Authors: Katarzyna Oberda

Abstract:

The aim of this paper is to conceptualize emotions in the process of mediation. Although human emotions have been approached from various disciplines and perspectives, e.g. philosophy, linguistics, psychology and neurology, this complex phenomenon still needs further investigation into its discursive character with an open mind and heart. To attain this aim, both theoretical and practical considerations are taken into account to contextualize the discursive psychology of emotions in mediation and to show how cognitive and linguistic activity expressed in language may lead to the emotional turn in the process of mediation. The dual directions of this research into the discursive psychology of emotions have been partially inspired by the evaluative components of mediation forms. In the conducted research, we apply the methodology of discursive psychology with discourse analysis as a tool. The practical data come from recorded online mediations. The major findings of the conducted research result in the reconstruction of the emotional transformation model in mediation.

Keywords: discourse analysis, discursive psychology, emotions, mediation

Procedia PDF Downloads 141
8292 Fatigue of Multiscale Nanoreinforced Composites: 3D Modelling

Authors: Leon Mishnaevsky Jr., Gaoming Dai

Abstract:

3D numerical simulations of fatigue damage of multiscale fiber-reinforced polymer composites with secondary nanoclay reinforcement are carried out. Macro-micro FE models of the multiscale composites are generated automatically using Python-based software. The effect of the nanoclay reinforcement (localized in the fiber/matrix interface (fiber sizing) or distributed throughout the matrix) on the crack path, damage mechanisms and fatigue behavior is investigated in numerical experiments.
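
The automatic model generation could resemble a scripted parameter sweep; everything below (function names, parameter values) is a hypothetical sketch, not the authors' actual Python tooling.

```python
from itertools import product

def generate_fe_cases(fiber_fractions, nanoclay_layouts, load_cycles):
    """Enumerate macro-micro FE model configurations for a parameter sweep.

    Each case pairs a fiber volume fraction with a nanoclay placement
    ("sizing" = fiber/matrix interface, "matrix" = dispersed) and a
    fatigue load level, ready to feed to an FE preprocessor.
    """
    return [
        {
            "fiber_volume_fraction": vf,
            "nanoclay_placement": layout,
            "load_cycles": n,
        }
        for vf, layout, n in product(fiber_fractions, nanoclay_layouts, load_cycles)
    ]

cases = generate_fe_cases([0.3, 0.5], ["sizing", "matrix"], [1e4, 1e6])
print(len(cases))  # 2 * 2 * 2 = 8 model definitions
```

Scripting the enumeration this way keeps every combination of reinforcement placement and load level reproducible from one call.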

Keywords: computational mechanics, fatigue, nanocomposites, composites

Procedia PDF Downloads 589
8291 Enhancing Efficiency of Building through Translucent Concrete

Authors: Humaira Athar, Brajeshwar Singh

Abstract:

Generally, the brightness of the indoor environment of buildings is maintained entirely by artificial lighting, which consumes a large amount of resources. It is reported that lighting consumes about 19% of total generated electricity, which accounts for about 30-40% of total energy consumption. One possible way to reduce lighting energy is to exploit sunlight, either through suitable devices or through energy-efficient materials like translucent concrete. Translucent concrete is an architectural concrete which allows the passage of natural as well as artificial light through it. Several attempts have been made on different aspects of translucent concrete, such as light-guiding materials (glass fibers, plastic fibers, cylinders etc.), concrete mix design and manufacturing methods for use as building elements. Concerns are, however, raised on related issues such as poor compatibility between the optical fibers and cement paste, unaesthetic appearance due to disturbance of the fiber arrangement during vibration, and high shrinkage in flowable concrete due to its high water/cement ratio. There is a need to develop translucent concrete that meets the structural safety requirements of OPC concrete while maximizing the energy saved on illumination and thermal load in buildings. Translucent concrete was produced using pre-treated plastic optical fibers (POF, 2 mm dia.) and high-slump white concrete. The concrete mix was proportioned in the ratio of 1:1.9:2.1 with a w/c ratio of 0.40. The POF content was varied from 0.8-9 vol.%. The mechanical properties and light transmission of this concrete were determined. Thermal conductivity of samples was measured by a transient plane source technique. Daylight illumination was measured by a lux grid method as per BIS:SP-41. It was found that the compressive strength of translucent concrete increased with decreasing optical fiber content.
An increase of ~28% in the compressive strength of concrete was noticed when the fiber was pre-treated. FE-SEM images showed little debonding between the fibers and cement paste, which was well supported by pull-out bond strength test results (~187% improvement over untreated fibers). The light transmission of the concrete was in the range of 3-7%, depending on fiber spacing (5-20 mm). The average daylight illuminance (~75 lux) was nearly equivalent to the criterion specified for circulation lighting (80 lux). The thermal conductivity of translucent concrete was reduced by 28-40% with respect to plain concrete. The thermal load calculated by the heat conduction equation was ~16% more than that of plain concrete. Based on DesignBuilder software, the total annual illumination energy load of a room using translucent concrete on one side was 162.36 kW, compared with 249.75 kW for a room without it. The calculated energy saving on account of the power of illumination was ~25%. A marginal improvement in thermal comfort was also noticed. It is concluded that translucent concrete retains the load-bearing advantages of existing concrete while adding translucency and insulation characteristics. It saves a significant amount of energy by providing natural daylight in place of artificial illumination.
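
The two energy estimates in the abstract rest on elementary relations: steady-state Fourier conduction for the thermal load, and a simple avoided-energy fraction for the lighting saving. A minimal sketch follows; all numeric values are illustrative assumptions, not the paper's measured data.

```python
def conduction_load_w(k_w_mk, area_m2, delta_t_k, thickness_m):
    """Steady-state Fourier conduction through a wall: q = k * A * dT / L (watts)."""
    return k_w_mk * area_m2 * delta_t_k / thickness_m

def lighting_saving_fraction(baseline_load, daylit_load):
    """Fraction of annual illumination energy avoided by daylighting."""
    return (baseline_load - daylit_load) / baseline_load

# Illustrative values only: a 10 m^2 wall, 15 K indoor-outdoor difference,
# 150 mm thickness, with translucent concrete assumed ~35% lower k.
q_plain = conduction_load_w(1.4, 10.0, 15.0, 0.15)
q_trans = conduction_load_w(0.9, 10.0, 15.0, 0.15)
print(round(q_plain), round(q_trans))  # 1400 900

print(lighting_saving_fraction(200.0, 150.0))  # 0.25
```

The linearity of q in k is why a percentage reduction in conductivity translates directly into the same percentage reduction in conductive load, all else equal.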

Keywords: energy saving, light transmission, microstructure, plastic optical fibers, translucent concrete

Procedia PDF Downloads 110
8290 Discovering Word-Class Deficits in Persons with Aphasia

Authors: Yashaswini Channabasavegowda, Hema Nagaraj

Abstract:

Aim: The current study aims at discovering word-class deficits concerning the noun-verb ratio in confrontation naming, picture description, and picture-word matching tasks. A total of ten persons with aphasia (PWA) and ten age-matched neurotypical individuals (NTI) were recruited for the study. The research includes both behavioural and objective measures to assess the word-class deficits in PWA. Objective: The main objective of the research is to identify word-class deficits seen in persons with aphasia using various speech-eliciting tasks. Method: The study was conducted in the participants' first language (L1), Kannada. The Kannada adaptations of the Action Naming Test and the Boston Naming Test were administered to the participants, and a picture description task was carried out. The picture-word matching task was carried out using E-Prime software (version 2) to measure accuracy and reaction time for the identification of verbs and nouns. The stimuli were presented through auditory and visual modes. Data were analysed to identify errors in the naming of nouns versus verbs on the Boston Naming Test and the Action Naming Test, and in the usage of nouns and verbs in the picture description task. Reaction time and accuracy for picture-word matching were extracted from the software. Results: PWA showed a significant difference in sentence structure compared to age-matched NTI. PWA also showed impairment in syntactic measures in the picture description task, with fewer correct grammatical sentences and fewer correct usages of verbs and nouns, and they produced a greater proportion of nouns compared to verbs. PWA had poorer accuracy and slower reaction times in the picture-word matching task compared to NTI, and accuracy was higher for nouns than for verbs in PWA. The deficits were noticed irrespective of the cause leading to aphasia.
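
The two core measures, noun-verb ratio from tagged responses and accuracy from matching trials, reduce to short helpers. The token/trial formats below are hypothetical illustrations, not the E-Prime output schema.

```python
def noun_verb_ratio(responses):
    """Noun-to-verb ratio from (token, tag) pairs in a picture description."""
    nouns = sum(1 for _, tag in responses if tag == "NOUN")
    verbs = sum(1 for _, tag in responses if tag == "VERB")
    return nouns / verbs if verbs else float("inf")

def accuracy(trials):
    """Proportion of correct picture-word matching trials (correct = 1/0)."""
    return sum(t["correct"] for t in trials) / len(trials)

sample = [("man", "NOUN"), ("dog", "NOUN"), ("run", "VERB"), ("ball", "NOUN")]
print(noun_verb_ratio(sample))  # 3 nouns / 1 verb -> 3.0
print(accuracy([{"correct": 1}, {"correct": 0}]))  # 0.5
```

A ratio well above 1.0 for PWA relative to NTI is the kind of noun-bias the study reports.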

Keywords: nouns, verbs, aphasia, naming, description

Procedia PDF Downloads 90
8289 Defining Unconventional Hydrocarbon Parameter Using Shale Play Concept

Authors: Rudi Ryacudu, Edi Artono, Gema Wahyudi Purnama

Abstract:

Oil and gas consumption in Indonesia is currently on the rise due to the nation's economic improvement. Unfortunately, Indonesia's domestic oil production cannot meet its own consumption, and Indonesia has lost its status as an oil and gas exporter. Even worse, its conventional oil and gas reserves are declining. Unwilling to give up, the government of Indonesia has taken measures to invite investors to invest in domestic oil and gas exploration to find new potential reserves and ultimately increase production. Yet, these measures have not borne fruit. Indonesia is now taking steps to explore new unconventional oil and gas plays, including shale gas, shale oil and tight sands, to increase domestic production. These new plays require definite parameters to differentiate each concept. The purpose of this paper is to provide ways of defining unconventional hydrocarbon reservoir parameters in shale gas, shale oil and tight sands. The parameters would serve as an initial baseline for users performing analysis of unconventional hydrocarbon plays. Some of the ongoing concerns or questions to be answered in regard to unconventional hydrocarbon plays include: 1. What is the TOC value? 2. Has the rock been well "cooked" to generate hydrocarbons? 3. What are the permeability and porosity values? 4. Does it need stimulation? 5. Does it have pores? 6. Does it have sufficient thickness? In contrast with the common conventional oil and gas play, the Shale Play concept assumes that hydrocarbon is retained and trapped in areas with very low permeability. In most places in Indonesia, hydrocarbon migrates from source rock to reservoir. From this, we can infer that the kitchen and source rock are located right below the reservoir. This is the starting point for users or engineers to construct the basin definition in relation to the tectonic play and depositional environment.
The Shale Play concept requires definition of reservoir characteristics, description and identification to discover reservoirs that are technically and economically feasible to develop. These are the steps users and engineers have to take to perform a Shale Play analysis: a. Calculate TOC and perform mineralogy analysis using water saturation and porosity values. b. Reconstruct the basin that accumulates hydrocarbon. c. Calculate the Brittleness Index from petrophysical data and distribute it based on seismic multi-attributes. d. Perform integrated natural fracture analysis. e. Select the best location to place a well.
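
Step (c), the Brittleness Index, is often computed from mineralogy as a first pass. The formula below is one common proxy (quartz over total quartz, carbonate and clay), which may differ from the authors' exact petrophysical definition.

```python
def brittleness_index(quartz, carbonate, clay):
    """Mineralogical brittleness proxy: quartz / (quartz + carbonate + clay).

    Inputs are weight fractions or percentages of each mineral group; higher
    values indicate more brittle (more fraccable) rock. This is one common
    formulation; elastic-property-based definitions also exist.
    """
    total = quartz + carbonate + clay
    return quartz / total

# 40% quartz, 20% carbonate, 40% clay -> moderately brittle
print(brittleness_index(40, 20, 40))  # 0.4
```

Mapping this index across a basin via seismic attributes is what turns a single-well calculation into the spatial distribution step (c) describes.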

Keywords: unconventional hydrocarbon, shale gas, shale oil, tight sand, reservoir parameters, shale play

Procedia PDF Downloads 387
8288 Iraqi Media Entrepreneurs across Social Media: Factors and Challenges

Authors: Ahmed Omar Bali, Sherko Jabar, Hazhar Jalal, Mahdi Sofi-Karim

Abstract:

For a long while in Iraq, media organizations were owned by political parties, particularly the ruling parties, because traditional media organizations required large capital and human resources. This paper examines the dynamics of the Iraqi media market's transformation, with emphasis on the factors that have helped media entrepreneurs and digital media firms emerge and target audiences on social media. A qualitative method was adopted in this study, using open, in-depth interviews with 19 media entrepreneurs and three managers of media firms. The study revealed that relative freedom and advanced communication technologies have encouraged media entrepreneurs to drive the new media by producing short videos and broadcasting them on social media, which has become popular among media consumers.

Keywords: media entrepreneur, Iraq, journalists, media technicians, digital media firms, media market

Procedia PDF Downloads 274
8287 AI for Efficient Geothermal Exploration and Utilization

Authors: Velimir "monty" Vesselinov, Trais Kliplhuis, Hope Jasperson

Abstract:

Artificial intelligence (AI) is a powerful tool in the geothermal energy sector, aiding in both exploration and utilization. Identifying promising geothermal sites can be challenging due to limited surface indicators and the need for expensive drilling to confirm subsurface resources. Geothermal reservoirs can be located deep underground and exhibit complex geological structures, making traditional exploration methods time-consuming and imprecise. AI algorithms can analyze vast datasets of geological, geophysical, and remote sensing data, including satellite imagery, seismic surveys, geochemistry, and geology. Machine learning algorithms can identify subtle patterns and relationships within this data, potentially revealing hidden geothermal potential in areas previously overlooked. To address these challenges, a SIML (Science-Informed Machine Learning) technology has been developed. SIML methods differ from traditional ML techniques. In both cases, the ML models are trained to predict the spatial distribution of an output (e.g., pressure, temperature, heat flux) based on a series of inputs (e.g., permeability, porosity, etc.). Traditional ML relies on deep and wide neural networks (NNs) based on simple algebraic mappings to represent complex processes. In contrast, the SIML neurons incorporate complex mappings (including constitutive relationships and physics/chemistry models). This results in ML models that have a physical meaning and satisfy physics laws and constraints. The prototype of the developed software, called GeoTGO, is accessible through the cloud. Our software prototype demonstrates how different data sources can be made available for processing, executes demonstrative SIML analyses, and presents the results in tabular and graphic form.
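
The contrast between a generic algebraic neuron and a SIML "neuron" can be sketched by embedding a constitutive law as the unit's mapping. Darcy's law below is an assumed example of such a law, not necessarily one of GeoTGO's actual mappings.

```python
import numpy as np

def darcy_flux(permeability, viscosity, pressure_gradient):
    """Constitutive 'neuron': Darcy's law q = -(k/mu) * dP/dx.

    Unlike a generic activation, this mapping has direct physical meaning
    and automatically respects the governing flow law.
    """
    return -(permeability / viscosity) * pressure_gradient

def siml_layer(perm_field, viscosity, dp_dx, weights):
    """Toy science-informed layer: a learned weighting over physically
    meaningful fluxes, rather than over raw algebraic features."""
    fluxes = np.array([darcy_flux(k, viscosity, dp_dx) for k in perm_field])
    return float(weights @ fluxes)

# Two permeability zones, unit viscosity, unit pressure drop:
print(siml_layer([1.0, 2.0], 1.0, -1.0, np.array([0.5, 0.5])))  # 1.5
```

The learned parameters (here, `weights`) then only adjust how physically admissible responses are combined, which is what keeps SIML predictions within physics constraints.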

Keywords: science-informed machine learning, artificial intelligence, exploration, utilization, hidden geothermal

Procedia PDF Downloads 27
8286 Intrapreneurship Discovery: Standard Strategy to Boost Innovation inside Companies

Authors: Chiara Mansanta, Daniela Sani

Abstract:

This paper studies the concept of intrapreneurship discovery for innovation and technology development in the manufacturing industries of the Marche Region, in central Italy. The study underlines the key drivers of the innovation process and the main factors that influence innovation. Starting from a literature study on open innovation, this paper examines the role of human capital in supporting a company's development. The empirical part of the study is based on a survey of 151 manufacturing companies, representing 34% of that universe at the regional level. The survey identified the main KPIs that influence companies in their decision processes; tools for these decision processes are then presented.

Keywords: business model, decision making, intrapreneurship discovery, standard methodology

Procedia PDF Downloads 164
8285 Geological Characteristics of the Beni Snouss District

Authors: N. Hadj Mohamed, A. Boutaleb

Abstract:

The Beni Snouss area is characterized by horst and graben structures, and it comprises deformed Palaeozoic sedimentary and magmatic rocks overlain by Mesozoic sediments. Two structural units are distinguished: a Palaeozoic basement and a Mesozoic cover. The study area is densely faulted; major faults strike N110° to N140° and dip vertically. The mineralized fault zones are readily distinguishable by their argillic wall-rock alteration. The fault zones that are filled with mineralization, aplites, microgranites and quartz run roughly parallel to each other and apparently belong to the same fault system. The Palaeozoic basement rocks contain mineralization occurring as veins, veinlets and disseminations. The Liassic carbonate platform sequence contains Ba (Pb-Zn) sulphide deposits occurring mainly as strata-bound and open-space fillings.

Keywords: Algeria, basement, Beni Snouss, cover

Procedia PDF Downloads 269
8284 Assessment of Nigerian Newspapers' Reportage of Violence against Children: Case Study of Daily Sun and Punch National Newspapers

Authors: Adline Nkwam-Uwaoma, Mishack Ndukwu

Abstract:

Traditionally, child rearing in Nigeria closely reflects the 'spare the rod and spoil the child' maxim, and as such spanking, flogging, slapping, beating and even starving a child as a form of punishment for wrongdoing and as a method of behaviour modification are common. These are not necessarily considered maltreatment or abuse of the child. Despite the adoption and implementation of the Child Rights Act in Nigeria, violence against children seems to be on a steady increase. Stories abound of sexual molestation, rape, child labour, infliction of physical injuries and use of children for rituals by parents, guardians or other members of society. Violence against children is considered to be those acts by other persons, especially adults, that undermine and threaten the healthy life and existence of children or that violate their rights as humans. In Nigeria, newspapers are a major source of news, second only to radio and television in coverage, currency and content. National dailies are newspapers with daily publications and national spread or coverage. This study analysed the frequency, length, prominence level, direction and sources of information reported on violence against children in the selected national daily newspapers. It then provided information on the role of newspapers in Nigeria in the fight against child violence, public awareness of the impact of violence against children on the development of the nation, and attempts to curtail such violence. The composite week sampling technique, in which the four weeks of the month are reduced to one and a sample is randomly selected from each day of the week, was used. As such, 168 editions of the Daily Sun and Punch newspapers published from January to December of 2016 were selected. Data were collected using a code sheet and analysed via content analysis. The result showed that the frequency of the newspapers' reportage of violence against children in Nigeria was low.
Again, it was found that the length or space given to reports on violence against children was inadequate; the direction of the few reports on violence against children was in favour of the cause or fight against child violence; and these newspapers gave no prominence to reports on violence against children. Finally, it was found that a major source of news about violence against children was journalism itself; government and individual sources provided only minimal information.
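
The composite week sampling described above, collapsing the weeks of a period into one by drawing a single edition per weekday, can be sketched as follows; the date grouping shown is an illustrative assumption about how the editions might be organized.

```python
import random

def composite_week(editions_by_weekday, seed=None):
    """Composite-week sample: for each weekday, pick one edition at random
    from all editions falling on that weekday, collapsing several calendar
    weeks into a single representative week."""
    rng = random.Random(seed)
    return {day: rng.choice(dates) for day, dates in editions_by_weekday.items()}

month = {
    "Mon": ["Jan 2", "Jan 9", "Jan 16", "Jan 23"],
    "Tue": ["Jan 3", "Jan 10", "Jan 17", "Jan 24"],
}
sample = composite_week(month, seed=1)
print(len(sample))  # one edition per weekday -> 2
```

Repeating this draw per month yields the kind of year-long systematic sample (168 editions across two papers) the study used.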

Keywords: children, newspapers' reportage, Nigeria, violence

Procedia PDF Downloads 141
8283 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network

Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour

Abstract:

Software-defined networking has in recent years come to the attention of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes engage together within a single device in the network infrastructure, such as switches and routers, the two planes are kept separate in software-defined networks (SDNs). All critical decisions about packet routing are made on the network controller, and the data-plane devices forward the packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side and ultimately leads to the inaccessibility of the controller and the lack of network service for legitimate users. Thus, the protection of this novel network architecture against denial-of-service attacks is essential. In the world of cybersecurity, attacks and new threats emerge every day. It is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. These tools should provide a comprehensive solution to automatically detect, predict and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations deal with on a regular basis. On the other hand, big data concerns not only the volume of the data but also how data-driven information can be used to enhance decision-making processes, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution to handle the illegitimate traffic burden placed on the SDN network by numerous DDoS attacks.
The framework entails an efficient defence and monitoring mechanism against DDoS attacks by employing state-of-the-art machine learning techniques.
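
One simple detection signal such a framework might compute over traffic windows is source-IP entropy, since flood traffic skews the source distribution away from the baseline. This is a generic illustration, not the paper's Spark/Kafka pipeline or its specific ML models.

```python
from collections import Counter
from math import log2

def source_ip_entropy(packets):
    """Shannon entropy of the source-IP distribution in a traffic window.

    A flood from one source collapses entropy toward 0; a flood of many
    spoofed sources pushes it up. Either deviation is suspicious.
    """
    counts = Counter(p["src"] for p in packets)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def is_anomalous(window, baseline_entropy, threshold=1.0):
    """Flag a window whose entropy deviates from the learned baseline."""
    return abs(source_ip_entropy(window) - baseline_entropy) > threshold

uniform = [{"src": s} for s in ("a", "b", "c", "d")]
print(source_ip_entropy(uniform))  # 4 equally likely sources -> 2.0
```

In a streaming deployment, this statistic would be computed per window on the ingestion layer and fed to the classifier alongside richer flow features.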

Keywords: apache spark, apache kafka, big data, DDoS attack, machine learning, SDN network

Procedia PDF Downloads 156
8282 Characterization of InP Semiconductor Quantum Dot Laser Diode after Am-Be Neutron Irradiation

Authors: Abdulmalek Marwan Rajkhan, M. S. Al Ghamdi, Mohammed Damoum, Essam Banoqitah

Abstract:

This paper reports on the Am-Be neutron source irradiation of an InP quantum dot laser diode (QD LD). A QD LD was irradiated for 24 hours and 48 hours. The laser underwent IV characterization experiments before and after the first and second irradiations. A computer simulation using GAMOS helped to analyze the results from the IV curves. The results showed an improvement in the QD LD series resistance, current density, and overall ideality factor at all measured temperatures. This is explained by the activation of the QD LD Indium composition to Strontium, ionization of the compound QD LD materials, and the energy deposited in the QD LD.
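
The ideality factor tracked across irradiations can be extracted from two forward-bias points of the diode equation. This is a minimal sketch assuming the textbook model I = I0·exp(V/(n·VT)) with series resistance neglected, not the paper's full fitting procedure.

```python
from math import exp, log

VT = 0.02585  # thermal voltage kT/q at ~300 K, in volts

def ideality_factor(v1, i1, v2, i2, vt=VT):
    """Diode ideality factor n from two forward-bias IV points, assuming
    I = I0 * exp(V / (n * Vt)) and negligible series resistance."""
    return (v2 - v1) / (vt * log(i2 / i1))

# Synthetic check: generate an IV pair for a diode with n = 2, recover n.
n_true, i0 = 2.0, 1e-12
v1, v2 = 0.6, 0.7
i1 = i0 * exp(v1 / (n_true * VT))
i2 = i0 * exp(v2 / (n_true * VT))
print(round(ideality_factor(v1, i1, v2, i2), 3))  # 2.0
```

Comparing n extracted this way before and after each irradiation dose is what quantifies the reported "improvement in overall ideality factor".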

Keywords: quantum dot laser diode irradiation, effect of radiation on QD LD, Am-Be irradiation effect on SC QD LD

Procedia PDF Downloads 45
8281 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays

Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín

Abstract:

Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Currently, efficient hardware alternatives are being used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems except the force reconstruction process, a stage in which they have been less applied. This work presents a hardware implementation of a model-driven method reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. From the analysis of a software implementation of this model, the present implementation proposes the parallelization of tasks that facilitate the execution of matrix operations and a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate techniques for algorithm parallelization, using as a guide the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters.
The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulation by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform that allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48×48 taxels that use various transduction technologies. The proposed implementation demonstrates a reduction in estimation time to roughly 1/180 of that of software implementations. Despite the relatively high values of the estimation errors, the information provided by this implementation on the tangential and normal tractions and the triaxial reconstruction of forces makes it possible to adequately reconstruct the tactile properties of the touched object, which are similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts.
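
The force reconstruction stage can be illustrated in miniature as a linear least-squares inversion from stress readings to a force vector. The transfer matrix below is synthetic; the paper's actual model and two-dimensional optimization are more elaborate than this sketch.

```python
import numpy as np

def reconstruct_forces(transfer_matrix, stress_readings):
    """Least-squares contact-force reconstruction.

    Given a linear model stress = A @ forces for a flat, rigid array,
    recover the per-taxel force vector from measured stresses. This is
    the kind of matrix operation the FPGA parallelizes per taxel.
    """
    forces, *_ = np.linalg.lstsq(transfer_matrix, stress_readings, rcond=None)
    return forces

# Synthetic 3-taxel check: build stresses from known forces, then recover.
A = np.array([[1.0, 0.2, 0.0],
              [0.2, 1.0, 0.2],
              [0.0, 0.2, 1.0]])   # coupling between neighboring taxels
f_true = np.array([0.5, 1.0, 0.0])
f_est = reconstruct_forces(A, A @ f_true)
print(np.allclose(f_est, f_true))  # True
```

Because each taxel's solve is independent once the system is set up, the inversion maps naturally onto the parallel FPGA fabric described in the abstract.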

Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation

Procedia PDF Downloads 181
8280 Lipid from Activated Sludge as a Feedstock for the Production of Biodiesel

Authors: Ifeanyichukwu Edeh, Tim Overton, Steve Bowra

Abstract:

There is increasing interest in utilising low-grade or waste biomass for the production of renewable bioenergy vectors, i.e. waste to energy. In this study we have chosen to assess activated sludge, a microbial biomass generated during the second stage of wastewater treatment, as a source of lipid for biodiesel production. To date, a significant proportion of biodiesel is produced from used cooking oil and animal fats. It was reasoned that if activated sludge proved a viable feedstock, it has the potential to support increased biodiesel production capacity. Activated sludge was obtained at different times of the year and from two different sewage treatment works in the UK. The biomass within the activated sludge slurry was recovered by filtration, and the total weight of material was calculated by combining the dry weights of the total suspended solid (TSS) and total dissolved solid (TDS) fractions. Total lipids were extracted from the TSS and TDS using solvent extraction (the Folch method). The classes of lipids within the total lipid extract were characterised using high-performance thin-layer chromatography (HPTLC) by reference to known standards. The fatty acid profile and content of the lipid extract were determined using acid-mediated methanolysis to obtain fatty acid methyl esters (FAMEs), which were analysed by gas chromatography and HPTLC. The results showed that there were differences in the total biomass content of the activated sludge collected from the different sewage works. Lipid yields from TSS obtained from both sewage treatment works differed according to the time of year (between 3.0 and 7.4 wt.%). The lipid yield varied slightly within the same source of biomass but more widely between the two sewage treatment works. The neutral lipid classes identified were acylglycerols, free fatty acids, sterols and wax esters, while the phospholipid classes included phosphatidylcholine, lysophosphatidylcholine, phosphatidylethanolamine and phosphatidylinositol.
The fatty acid profile revealed the presence of palmitic, palmitoleic, linoleic, oleic and stearic acids, with unsaturated fatty acids the most abundant. Following optimisation, the FAME yield was greater than 10 wt.%, the level required for an economic advantage in biodiesel production.

Keywords: activated sludge, biodiesel, lipid, methanolysis

Procedia PDF Downloads 459
8279 Environmental Engineering Case Study of Wastewater Treatment

Authors: Harold Jideofor

Abstract:

Wastewater treatment consists of applying known technology to improve or upgrade the quality of wastewater. Usually, wastewater treatment involves collecting the wastewater in a central, segregated location (the wastewater treatment plant) and subjecting it to various treatment processes. Most often, since large volumes of wastewater are involved, treatment processes are carried out on continuously flowing wastewaters (continuous-flow or "open" systems) rather than as a series of periodic treatment processes in which treatment is carried out on parcels or "batches" of wastewater. While most wastewater treatment processes are continuous-flow, certain operations, such as vacuum filtration, involving storage of sludge, the addition of chemicals, filtration and removal or disposal of the treated sludge, are routinely handled as periodic batch operations.

Keywords: wastewater treatment, environmental engineering, wastewater

Procedia PDF Downloads 563
8278 Experimental Network Synchronization of Chua's Circuits in Different Topologies

Authors: Manuel Meranza-Castillon, Rolando Diaz-Castillo, Adrian Arellano-Delgado, Cesar Cruz-Hernandez, Rosa Martha Lopez-Gutierrez

Abstract:

In this work, we deal with the experimental network synchronization of chaotic nodes in different topologies. Our approach is based on complex systems theory, and we use a master-slave configuration to couple the nodes in the networks. In particular, we design and implement electronically complex dynamical networks composed of nine coupled chaotic Chua's circuits with nearest-neighbor, small-world, open ring, star, and global topologies. Network synchronization is also evaluated according to a particular coupling strength for each topology. This study is important for its possible applications to the private transmission of information in a chaotic communication network with multiple users.
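
A minimal numerical sketch of the master-slave coupling follows: two Chua circuits with full-state diffusive coupling. The paper couples nine physical circuits in several topologies; the single pair, Euler integration, and coupling strength here are all illustrative assumptions.

```python
def chua_rhs(x, y, z, alpha=9.0, beta=14.286, m0=-8/7, m1=-5/7):
    """Chua's circuit vector field with the standard piecewise-linear diode."""
    fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
    return alpha * (y - x - fx), x - y + z, -beta * y

def master_slave_error(steps=30000, dt=0.001, c=10.0):
    """Euler-integrate a master Chua circuit driving a slave through
    full-state diffusive coupling of strength c; return the final
    synchronization error ||master - slave||."""
    mx, my, mz = 0.7, 0.0, 0.0     # master initial state
    sx, sy, sz = -0.5, 0.2, 0.1    # slave starts elsewhere
    for _ in range(steps):
        dmx, dmy, dmz = chua_rhs(mx, my, mz)
        dsx, dsy, dsz = chua_rhs(sx, sy, sz)
        # slave feels the master through diffusive coupling on every state
        sx += dt * (dsx + c * (mx - sx))
        sy += dt * (dsy + c * (my - sy))
        sz += dt * (dsz + c * (mz - sz))
        mx, my, mz = mx + dt * dmx, my + dt * dmy, mz + dt * dmz
    return ((mx - sx) ** 2 + (my - sy) ** 2 + (mz - sz) ** 2) ** 0.5

print(master_slave_error() < 1e-3)  # slave locks onto the master -> True
```

In a network setting, each slave node would receive the same diffusive term from its neighbors as defined by the topology's coupling matrix, which is what the star, ring and global configurations vary.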

Keywords: complex networks, Chua's circuit, experimental synchronization, multiple users

Procedia PDF Downloads 328
8277 Comparison of On-Site Stormwater Detention Policies in Australian and Brazilian Cities

Authors: Pedro P. Drumond, James E. Ball, Priscilla M. Moura, Márcia M. L. P. Coelho

Abstract:

In recent decades, On-site Stormwater Detention (OSD) systems have been implemented in many cities around the world. In Brazil, urban drainage source control policies were created in the 1990s and were mainly based on OSD. The concept of this technique is to promote the detention of the additional stormwater runoff caused by impervious areas, in order to maintain pre-urbanization peak flow levels. In Australia, OSD was first adopted in the early 1980s by Ku-ring-gai Council in Sydney's northern suburbs and by Wollongong City Council. Many papers on the topic were published at that time. However, source control techniques related to stormwater quality have since come to the forefront, and OSD has been relegated to the background. In order to evaluate the effectiveness of current OSD regulations, existing policies were compared in Australian cities, in a country considered experienced in the use of this technique, and in Brazilian cities, where OSD adoption has been increasing. The cities selected for analysis were Wollongong and Belo Horizonte, the first municipalities to adopt OSD in their respective countries, and Sydney and Porto Alegre, cities whose policies are local references. The Australian and Brazilian cities are located in the Southern Hemisphere, and similar rainfall intensities can be observed, especially in storm bursts greater than 15 minutes. Regarding technical criteria, the Brazilian cities have a site-based approach, analyzing only on-site system drainage. This approach is criticized for not evaluating impacts on urban drainage systems and in rare cases may cause an increase in peak flows downstream. The city of Wollongong and most of the Sydney councils adopted a catchment-based approach, requiring the use of Permissible Site Discharge (PSD) and Site Storage Requirement (SSR) values based on analysis of entire catchments via hydrograph-producing computer models.
Based on the premise that OSD should be designed to dampen storms of 100-year Average Recurrence Interval (ARI), the values of PSD and SSR in these four municipalities were compared. In general, the Brazilian cities presented low values of PSD and high values of SSR. This can be explained by the site-based approach and the low runoff coefficient value adopted for pre-development conditions. The results clearly show the differences between the approaches and methodologies adopted in OSD design in Brazilian and Australian municipalities, especially with regard to PSD values, which lie at opposite ends of the scale. However, the lack of research regarding the real performance of constructed OSD does not allow a determination of which approach is best. It is necessary to investigate OSD performance in real conditions, assessing the damping provided throughout its useful life, maintenance issues, debris blockage problems, and the parameters related to rainfall-runoff methods. Acknowledgments: The authors wish to thank CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico (Chamada Universal – MCTI/CNPq Nº 14/2014), FAPEMIG - Fundação de Amparo à Pesquisa do Estado de Minas Gerais, and CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior for their financial support.
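The site-based reasoning discussed above can be illustrated with a minimal rational-method sketch. All of the runoff coefficients, rainfall intensity, catchment area, and storm duration below are hypothetical illustrative values, not figures from the study:

```python
# Illustrative sketch only: a simplified rational-method estimate of the
# extra runoff volume an OSD tank must detain. All input values below are
# hypothetical, not taken from the study.

def peak_flow(c_runoff, intensity_mm_h, area_m2):
    """Rational method: Q = C * i * A, with i converted from mm/h to m/s."""
    i_m_s = intensity_mm_h / 1000.0 / 3600.0
    return c_runoff * i_m_s * area_m2  # m^3/s

def storage_estimate(c_pre, c_post, intensity_mm_h, area_m2, duration_min):
    """Crude site storage requirement: excess peak flow held for the storm duration."""
    q_excess = (peak_flow(c_post, intensity_mm_h, area_m2)
                - peak_flow(c_pre, intensity_mm_h, area_m2))
    return q_excess * duration_min * 60.0  # m^3

# Hypothetical 1000 m^2 lot, 100 mm/h design burst, 30 min duration
q_pre = peak_flow(0.3, 100.0, 1000.0)
ssr = storage_estimate(0.3, 0.8, 100.0, 1000.0, 30.0)
print(f"pre-development peak: {q_pre:.4f} m^3/s, storage estimate: {ssr:.1f} m^3")
```

With these assumed inputs, a lower pre-development runoff coefficient directly enlarges the required storage, consistent with the high SSR values the abstract reports for the Brazilian cities.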

Keywords: on-site stormwater detention, source control, stormwater, urban drainage

Procedia PDF Downloads 166
8276 Conflicts and Complexities: A Study of Hong Kong's Bilingual Street Signs from a Functional Perspective on Translation

Authors: Ge Song

Abstract:

Hong Kong’s bilingual street signs declare a kind of correspondence, equivalence and thus translation between the English and Chinese languages. This study finds four translation phenomena among the street signs: domestication with positive connotation, foreignization with negative connotation, bilingual incompatibilities, and cross-street complexities. The interplay of, and the tension between, the four features open up a space where the local and the foreign, the vulgar and the elegant, alternate and experiment with each other, creating a kaleidoscope of methods for expressing and domesticating foreign otherness by virtue of translation. An analysis of the phenomena from the functional perspective reveals how translation has been emancipated to inform a variety of dimensions. This study also renews our understanding of translation as both a concept and a practice.

Keywords: street signs, linguistic landscape, cultural hybridity, Hong Kong

Procedia PDF Downloads 192
8275 Different Cathode Buffer Layers in Organic Solar Cells

Authors: Radia Kamel

Abstract:

Considerable progress has been made in the development of bulk-heterojunction organic solar cells (OSCs) based on a blend of p-type and n-type organic semiconductors. To optimize the interfacial properties between the active layer and the electrode, a cathode buffer layer (CBL) is introduced. This layer can reduce the leakage current and increase the open-circuit voltage and fill factor, while improving OSC stability. In this work, the performance of a PM6:Y6 OSC with 1-chloronaphthalene as an additive is examined. To accomplish this, three CBLs, PNDIT-F3N-Br, ZrAcac, and PDINO, are compared using the conventional configuration. The device with PNDIT-F3N-Br as CBL exhibits the highest power conversion efficiency of 16.04%. The results demonstrate that modifying the cathode buffer layer is crucial for achieving high-performance OSCs.
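The reported efficiency follows the standard relation PCE = (Voc × Jsc × FF) / Pin. A minimal sketch, with hypothetical device parameters chosen only to land near the 16% range reported here:

```python
# Sketch of the standard power-conversion-efficiency relation for a solar
# cell. The device parameters below are hypothetical illustrative values,
# not measurements from this work.

def pce(voc_v, jsc_ma_cm2, ff, p_in_mw_cm2=100.0):
    """Power conversion efficiency (%) under standard AM1.5G illumination (100 mW/cm^2)."""
    p_out = voc_v * jsc_ma_cm2 * ff  # mW/cm^2
    return 100.0 * p_out / p_in_mw_cm2

# Hypothetical: Voc = 0.85 V, Jsc = 25.2 mA/cm^2, FF = 0.75
print(f"PCE = {pce(0.85, 25.2, 0.75):.3f} %")
```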

Keywords: bulk heterojunction, cathode buffer layer, efficiency, organic solar cells

Procedia PDF Downloads 150
8274 The Digital Transformation of Life Insurance Sales in Iran with the Emergence of Personal Financial Planning Robots: Opportunities and Challenges

Authors: Pedram Saadati, Zahra Nazari

Abstract:

Anticipating and identifying the future opportunities and challenges that the emergence of personal financial planning knowledge and technologies poses for industry actors, and providing practical solutions, are among the goals of this research. For this purpose, a futures research tool based on the opinions of the main players in the insurance industry was used. The research method consisted of four stages: 1) a survey of the specialist life insurance sales force in order to identify the variables; 2) ranking of the variables by selected experts via a researcher-made questionnaire; 3) a panel of experts aimed at understanding the mutual effects of the variables; and 4) statistical analysis of the cross-impact matrix in MICMAC software. The integrated analysis of the influencing variables was carried out with the Structural Analysis method, one of the efficient and innovative methods of futures research. A list of opportunities and challenges was identified through a survey of best-selling life insurance representatives, who were selected by snowball sampling. In order to prioritize and identify the most important issues, all the issues raised were sent, through a researcher-made questionnaire, to experts selected by theoretical sampling. The respondents scored the importance of 36 variables so that the opportunity and challenge variables could be prioritized. Eight of the variables identified in the first stage were removed by the selected experts, leaving 28 variables for examination in the third stage. To facilitate the examination, these were divided into six categories: organization and management (11 variables), marketing and sales (7), social and cultural (6), technological (2), rebranding (1), and insurance (1).
The reliability of the researcher-made questionnaire was confirmed by a Cronbach's alpha of 0.96. In the third stage, a panel of five insurance industry experts reached a consensus on the influence of the factors on each other, and their rankings were entered into a matrix of the interrelationships of the 28 variables, which was investigated using the structural analysis method. Analysis of the matrix with MICMAC software indicates that "correct training in the use of the software", "the weakness of insurance companies' technology in personalizing products", "adopting a customer-equipping approach", and "honesty in telling customers when they do not need insurance" are the most important influencing challenges, while "a sales force-equipping approach", "product personalization based on customer needs assessment", "a pleasant customer experience of being advised by consulting robots", "business improvement of the insurance company due to the use of these tools", and "increased efficiency of the issuance process and optimal customer purchase" were identified as the most important influencing opportunities.
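The core of the MICMAC-style structural analysis can be sketched as row and column sums of a cross-impact matrix: M[i][j] scores how strongly variable i influences variable j, so a variable's direct influence is its row sum and its dependence is its column sum. The 4×4 matrix below is hypothetical, not the study's 28-variable matrix:

```python
# Hedged sketch of the direct influence/dependence calculation underlying
# MICMAC structural analysis. The matrix is a hypothetical 4-variable example.

def influence_dependence(matrix):
    """Return (influence, dependence): row sums and column sums of the cross-impact matrix."""
    n = len(matrix)
    influence = [sum(row) for row in matrix]
    dependence = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return influence, dependence

m = [
    [0, 3, 2, 1],
    [1, 0, 3, 0],
    [0, 2, 0, 2],
    [3, 1, 1, 0],
]
inf, dep = influence_dependence(m)
print("influence:", inf)   # [6, 4, 4, 5]
print("dependence:", dep)  # [4, 6, 6, 3]
```

Variables with high influence and low dependence would then be read as the system's driving factors.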

Keywords: personal financial planning, wealth management, advisor robots, life insurance, digital transformation

Procedia PDF Downloads 32
8273 An Improved Transmission Scheme in Cooperative Communication System

Authors: Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song

Abstract:

The recently developed cooperative diversity scheme enables a terminal to obtain transmit diversity through the support of other terminals. However, most of the cooperative schemes introduced so far share a common drawback of decreased transmission rate, because the destination must receive decodable compositions of symbols from both the source and the relay. In order to achieve a high data rate, we propose a cooperative scheme that employs hierarchical modulation. This scheme is free from the rate loss and allows seamless cooperative communication.
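The idea of hierarchical modulation can be sketched as a base QPSK quadrant plus a smaller enhancement offset inside that quadrant, yielding a 16-point constellation. The amplitude ratio below is a hypothetical design parameter, not a value from the paper:

```python
# Illustrative sketch of hierarchical modulation: a high-priority (base) bit
# pair selects a QPSK quadrant, and a low-priority (enhancement) bit pair
# adds a smaller offset within that quadrant. Amplitudes are hypothetical.
from itertools import product

def hier_symbol(base_bits, enh_bits, d_base=2.0, d_enh=0.5):
    """Map two bit pairs to one hierarchical 16-QAM symbol."""
    to_level = lambda b: 1.0 if b else -1.0
    i = d_base * to_level(base_bits[0]) + d_enh * to_level(enh_bits[0])
    q = d_base * to_level(base_bits[1]) + d_enh * to_level(enh_bits[1])
    return complex(i, q)

# The mapping produces 16 distinct constellation points
points = {hier_symbol(b, e) for b in product((0, 1), repeat=2)
                            for e in product((0, 1), repeat=2)}
print(len(points))  # 16
```

A weak receiver can demodulate only the coarse quadrant (the base layer), while a strong receiver recovers both layers, which is what lets the relay path add information without sacrificing the base rate.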

Keywords: cooperative communication, hierarchical modulation, high data rate, transmission scheme

Procedia PDF Downloads 413
8272 Diversifying Nigeria's Economy Using Tourism as a Richer Alternative to Oil

Authors: Aly Audu Fada

Abstract:

The mono-economic structure of the Nigerian economy has made it dependent on oil for many years. Apart from the negative effects of its exploitation, relying solely on oil as the major source of revenue for steering the ship of development is myopic. The crumbling oil price in the world market is one proof of the dangers of this over-dependence. This paper highlights the consequences of the oil-driven economy and explores, through a contextual analysis, the various opportunities that tourism offers. It is recommended that those at the helm of affairs initiate collaboration between the public and private sectors to explore and harness the rich tourism resources naturally dispersed across the country, in order to achieve the objectives of the Federal Government's economic transformation agenda.

Keywords: diversifying, economic, tourism, oil

Procedia PDF Downloads 377
8271 Characteristics of Successful Sales Interaction in B2B Sales Meetings

Authors: Ari Alamäki, Timo Kaski

Abstract:

The value of co-creation has gained much attention in sales research, but less is known about how salespeople and customers interact in authentic business-to-business (B2B) sales meetings. The study presented in this paper contributes empirically to existing research by analyzing authentic B2B sales meetings that were video recorded and examined using observation and qualitative content analysis methods. This paper aims to identify the key elements of successful sales interactions between salespeople and customers/buyers. The study points out that salespeople sell value rather than the products or services themselves, which are only enablers in realizing business benefits. Therefore, our findings suggest that promoting and easing open discourse is an essential part of a successful sales encounter. A better understanding of how salespeople and customers successfully interact would help salespeople develop their interpersonal sales skills.

Keywords: personal selling, relationship, sales management, value co-creation

Procedia PDF Downloads 374
8270 Numerical Modeling of the Depth-Averaged Flow over a Hill

Authors: Anna Avramenko, Heikki Haario

Abstract:

This paper reports the development and application of a 2D depth-averaged model. The main goal of this contribution is to apply the depth-averaged equations to a wind park model in which the geometry is introduced into the mathematical model through mass and momentum source terms. The depth-averaged model will be used in future work to find the optimal positions of wind turbines in the wind park. The k-ε and 2D LES turbulence models were considered in this article. A 2D CFD simulation of a single hill was performed to check the depth-averaged model in practice.
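For reference, depth-averaged (shallow-water) equations with mass and momentum source terms take the standard form below, where h is the flow depth, (u, v) the depth-averaged velocity components, z_b the bed elevation, and S_m, S_u the mass and momentum sources through which the geometry is represented; the exact source terms used by the authors are not specified in the abstract:

```latex
\frac{\partial h}{\partial t}
  + \frac{\partial (hu)}{\partial x}
  + \frac{\partial (hv)}{\partial y} = S_m,
\qquad
\frac{\partial (hu)}{\partial t}
  + \frac{\partial (hu^2)}{\partial x}
  + \frac{\partial (huv)}{\partial y}
  = -gh\,\frac{\partial (h + z_b)}{\partial x} + S_u
```

An analogous equation holds for the y-momentum component hv.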

Keywords: depth-averaged equations, numerical modeling, CFD, wind park model

Procedia PDF Downloads 588
8269 Exceptional Cost and Time Optimization with Successful Leak Repair and Restoration of Oil Production: West Kuwait Case Study

Authors: Nasser Al-Azmi, Al-Sabea Salem, Abu-Eida Abdullah, Milan Patra, Mohamed Elyas, Daniel Freile, Larisa Tagarieva

Abstract:

Well intervention, along with Production Logging Tool (PLT) surveys, was carried out to detect the sources of water and to check well integrity for two West Kuwait oil wells that had started to produce 100% water. For the first well, a PLT survey was run across the perforations to locate the source of water: no production was observed from the bottom two perforation intervals, and an intake of water was observed at the topmost perforation. A decision was then taken to extend the PLT survey from the tag depth to the Y-tool. For the second well, the aim was to detect the source of water and to determine whether there was a leak in the 7'' liner in front of the upper zones. Data could not be recorded under flowing conditions due to casing deformation at almost 8,300 ft. For the first well, interpretation of the PLT and well integrity data showed a hole in the 9 5/8'' casing from 8,468 ft to 8,494 ft producing the majority of the water, 2,478 bbl/d, while the upper perforation from 10,812 ft to 10,854 ft was taking 534 stb/d. For the second well, there was a hole in the 7'' liner from 8,303 ft MD to 8,324 ft MD producing 8,334.0 stb/d of water, with an intake zone from 10,322.9 to 10,380.8 ft MD taking the whole fluid. To restore oil production, a workover (W/O) rig was mobilized to prevent dump flooding, and during the workover the leaking interval was confirmed for both wells. The leakage was cement squeezed and tested at 900 psi positive pressure and 500 psi drawdown pressure; the cement squeeze job was successful. After the workover, the wells were kept producing for clean-up, and the water cut eventually fell to 0%. Regular PLT and well integrity logs are required to monitor well performance and well integrity issues; proper cement behind casing is essential to well longevity and integrity; and the presence of a Y-tool is essential for monitoring well parameters and the ESP, and for facilitating well intervention tasks. Cost and time optimization in oil and gas, and especially during rig operations, is crucial.
The PLT data quality and the accuracy of the interpretations contributed greatly to identifying the leakage intervals accurately and, in turn, saved a lot of time and reduced the repair cost by almost 35 to 45%. The added value here was related mainly to the cost reduction and to effective, quick, and proper decision-making based on the economic environment.

Keywords: leak, water shut-off, cement, water leak

Procedia PDF Downloads 104
8268 Clinicians’ Experiences with IT Systems in a UK District General Hospital: A Qualitative Analysis

Authors: Sunny Deo, Eve Barnes, Peter Arnold-Smith

Abstract:

Introduction: Healthcare technology is a rapidly expanding field, with enthusiasts suggesting a revolution in the quality and efficiency of healthcare delivery based on better e-healthcare, including the move to paperless healthcare. The role and use of computers and programmes in healthcare have been increasing over the past 50 years. Despite this, there is no standardised method of assessing the quality of the hardware and software utilised by frontline healthcare workers. Methods and subjects: Based on standard Patient Reported Outcome Measures, a questionnaire was devised with the aim of providing quantitative and qualitative data on clinicians' perspectives of their hospital's Information Technology (IT). The survey was distributed via the institution's intranet to all contracted doctors, and its qualitative results were analysed. Qualitative opinions were grouped as positive, neutral, or negative and further sub-grouped into speed/usability, software/hardware, integration, IT staffing, clinical risk, and wellbeing. Analysis was undertaken by doctor seniority and by specialty. Results: There were 196 responses, 51% from senior doctors (consultant grades) and the rest from junior grades, with the largest group of respondents (52%) coming from medical specialties. Differences in the proportions of the principal groups and sub-groups were noted by seniority and specialty. Negative themes were by far the commonest opinion type, occurring in almost two-thirds of responses (63%), while positive comments occurred in fewer than 1 in 10 (8%). Conclusions: This survey confirms strongly negative attitudes to the current state of electronic documentation and IT in a large single-centre cohort of hospital-based frontline physicians after two decades of so-called progress towards a paperless healthcare system.
Wider use of such surveys would provide further insights and could help focus development and delivery to improve the quality and effectiveness of IT for clinicians and their patients.

Keywords: information technology, electronic patient records, digitisation, paperless healthcare

Procedia PDF Downloads 69
8267 Numerical Simulation of Production of Microspheres from Polymer Emulsion in Microfluidic Device toward Using in Drug Delivery Systems

Authors: Nizar Jawad Hadi, Sajad Abd Alabbas

Abstract:

Because of their ability to encapsulate and release drugs in a controlled manner, microspheres fabricated from polymer emulsions using microfluidic devices have shown promise for drug delivery applications. In this study, the effects of velocity, density, viscosity, surface tension, and channel diameter on microsphere generation were investigated using Ansys Fluent software. The software was programmed with the physical properties of the polymer emulsion, such as density, viscosity, and surface tension. Simulations were then performed to predict fluid flow and microsphere production and to improve the design of drug delivery applications based on changes in these parameters. The effects of the capillary and Weber numbers were also studied. The results showed that the size of the microspheres can be controlled by adjusting the flow velocity and the channel diameter. Narrower channel widths and higher flow rates produced smaller microspheres, which could improve drug delivery efficiency, and lower interfacial surface tension likewise resulted in smaller microspheres. The viscosity and density of the polymer emulsion significantly affected microsphere size, with higher viscosities and densities producing smaller microspheres. The loading and drug release properties of the microspheres created with the microfluidic technique were also predicted. The results showed that the microspheres can efficiently encapsulate drugs and release them in a controlled manner over a period of time, owing to the high surface-area-to-volume ratio of the microspheres, which allows efficient drug diffusion. The ability to tune the manufacturing process using factors such as velocity, density, viscosity, channel diameter, and surface tension offers a potential route to designing drug delivery systems with greater efficiency and fewer side effects.
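The two dimensionless groups mentioned above have standard definitions: the capillary number Ca = μv/σ compares viscous forces to interfacial tension, and the Weber number We = ρv²d/σ compares inertial forces to interfacial tension. A minimal sketch with hypothetical fluid properties for a polymer emulsion in a 100-micron channel (illustrative values, not the study's simulation inputs):

```python
# Sketch of the capillary and Weber numbers governing droplet formation.
# All fluid properties and geometry below are hypothetical examples.

def capillary_number(mu, v, sigma):
    """Ca = viscous forces / interfacial tension = mu * v / sigma."""
    return mu * v / sigma

def weber_number(rho, v, d, sigma):
    """We = inertial forces / interfacial tension = rho * v^2 * d / sigma."""
    return rho * v * v * d / sigma

mu, rho, sigma = 0.01, 1000.0, 0.05   # Pa.s, kg/m^3, N/m (hypothetical)
v, d = 0.1, 100e-6                    # m/s, m (hypothetical)
print(f"Ca = {capillary_number(mu, v, sigma):.4f}")  # 0.0200
print(f"We = {weber_number(rho, v, d, sigma):.4f}")  # 0.0200
```

Low Ca and We values of this order correspond to the dripping regime, in which droplets, and hence microspheres, pinch off at a uniform size.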

Keywords: polymer emulsion, microspheres, numerical simulation, microfluidic device

Procedia PDF Downloads 53
8266 Optimal Beam for Accelerator Driven Systems

Authors: M. Paraipan, V. M. Javadova, S. I. Tyutyunnikov

Abstract:

The concept of an energy amplifier or accelerator driven system (ADS) involves the use of a particle accelerator coupled with a nuclear reactor. The accelerated particle beam generates a supplementary source of neutrons, which allows subcritical operation of the reactor and, consequently, safe exploitation. The harder neutron spectrum realized ensures better incineration of the actinides. The widely held opinion is that the optimal beam for ADS is protons with an energy of around 1 GeV (gigaelectronvolt). In the present work, a systematic analysis of the energy gain is performed for proton beams with energies from 0.5 to 3 GeV and ion beams from deuteron to neon with energies between 0.25 and 2 AGeV. The target is an assembly of metallic U-Pu-Zr fuel rods, 150 cm in length, in a bath of lead-bismuth eutectic coolant. A beryllium converter of length 110 cm is used in order to maximize the energy released in the target. The case of a linear accelerator is considered, with a beam intensity of 1.25‧10¹⁶ p/s and a total accelerator efficiency of 0.18 for the proton beam, values planned to be achieved in the European Spallation Source project. The energy gain G is calculated as the ratio of the energy released in the target to the energy spent to accelerate the beam. The energy released is obtained through simulation with the code Geant4. The energy spent is calculated by scaling from the accelerator efficiency data for the reference particle (proton). The analysis concerns the G values, the net power produced, the accelerator length, and the period between refuelings. The optimal energy for protons is 1.5 GeV. At this energy, G reaches a plateau around a value of 8, with a net power production of 120 MW (megawatt). From alpha particles upward, ion beams have a higher G than 1.5 GeV protons.
A beam of 0.25 AGeV (gigaelectronvolt per nucleon) ⁷Li achieves the same net power production as 1.5 GeV protons, has a G of 15, and needs an accelerator 2.6 times shorter than that for protons, making it the best solution for ADS. Beams of ¹⁶O or ²⁰Ne with an energy of 0.75 AGeV, accelerated in an accelerator of the same length as for 1.5 GeV protons, produce approximately 900 MW of net power with a gain of 23-25. The study of the evolution of the isotope composition during irradiation shows that an increase in power production shortens the period between refuelings. For a net power of 120 MW, the target can be irradiated for approximately 5,000 days without refueling, but only 600 days when the net power reaches 1 GW (gigawatt).
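The energy-gain bookkeeping described above can be reproduced approximately from the quoted figures (1.5 GeV protons, 1.25‧10¹⁶ p/s, total accelerator efficiency 0.18, G ≈ 8); the conversion from released energy to net output is deliberately simplified here:

```python
# Back-of-the-envelope sketch of the ADS energy-gain bookkeeping, using the
# figures quoted in the abstract. Thermal-to-electric conversion is ignored.

EV_TO_J = 1.602e-19

def accelerator_power(energy_gev, intensity_p_s, efficiency):
    """Wall-plug power (W) needed to accelerate the beam at the given total efficiency."""
    beam_power = energy_gev * 1e9 * EV_TO_J * intensity_p_s
    return beam_power / efficiency

def net_power(gain, wall_plug_w):
    """G = released / spent, so the net output is (G - 1) times the power spent."""
    return (gain - 1.0) * wall_plug_w

p_wall = accelerator_power(1.5, 1.25e16, 0.18)
print(f"wall-plug: {p_wall / 1e6:.1f} MW, net: {net_power(8, p_wall) / 1e6:.0f} MW")
```

With these inputs, the beam carries about 3 MW, the accelerator draws about 17 MW, and G ≈ 8 yields a net output near the 120 MW quoted in the abstract.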

Keywords: accelerator driven system, ion beam, electrical power, energy gain

Procedia PDF Downloads 125
8265 A Simple User Administration View of Computing Clusters

Authors: Valeria M. Bastos, Myrian A. Costa, Matheus Ambrozio, Nelson F. F. Ebecken

Abstract:

In this paper, a very simple and effective user administration view of computing cluster systems is implemented in order to provide, in a user-friendly way, the configuration and monitoring of distributed application executions. The user view, the administrator view, and an internal control module create an illusionary management environment for better system usability. The architecture, properties, and performance of the system, and a comparison with other cluster management software, are briefly discussed.

Keywords: big data, computing clusters, administration view, user view

Procedia PDF Downloads 311
8264 Deep Learning for SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for monitoring the Earth's surface. SAR systems are often designed to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both the transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings rotational invariance to the geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area, or swath, of fully polarimetric images compared to that of dual or hybrid polarimetric images. The search for methods to augment dual polarimetric data to fully polarimetric data is therefore motivated by the full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images from hybrid polarimetric data can be found in the literature.
Although the improvements achieved by the newly investigated and tested reconstruction techniques are undeniable, the existing methods are mostly based on model assumptions (especially the assumption of reflection symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem, focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by the combination of different terms in the cost, or loss, function. The proposed method is experimentally validated with real data sets and compared with a well-known, standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to that of conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
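The idea of combining several property-specific terms into one training loss can be sketched generically. The term names and weights below are hypothetical; the abstract does not specify the actual cost function used:

```python
# Hedged sketch of a composite training loss: several per-property terms,
# each combined with a scalar weight, L = sum_k w_k * L_k. The specific
# terms and weights here are hypothetical illustrations.

def mse(pred, target):
    """Mean squared error between two equal-length sequences."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def composite_loss(pred, target, terms):
    """Weighted sum of loss terms: terms is a list of (weight, loss_fn) pairs."""
    return sum(w * fn(pred, target) for w, fn in terms)

# Hypothetical terms: overall amplitude fit plus a span (total power) term
span_loss = lambda p, t: abs(sum(x * x for x in p) - sum(x * x for x in t))
terms = [(1.0, mse), (0.1, span_loss)]

loss = composite_loss([1.0, 2.0], [1.0, 1.0], terms)
print(round(loss, 3))  # mse = 0.5, span = |5 - 2| = 3, so 0.5 + 0.3 = 0.8
```

Tuning the weights shifts how strongly the network is penalized for violating each polarimetric property during training.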

Keywords: SAR image, polarimetric SAR image, convolutional neural network, deep learning, deep neural network

Procedia PDF Downloads 57