Search results for: geographic information system (GIS)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25169

18659 Development of Deep Neural Network-Based Strain Values Prediction Models for Full-Scale Reinforced Concrete Frames Using Highly Flexible Sensing Sheets

Authors: Hui Zhang, Sherif Beskhyroun

Abstract:

Structural health monitoring (SHM) systems are commonly used to identify and assess structural damage. For damage detection, SHM systems periodically collect data from sensors placed in the structure as damage-sensitive features, capturing abnormal changes in the strain field and abnormal symptoms of the structure, such as damage and deterioration. Deploying sensors on a large scale in a building structure remains a challenge. In this study, highly stretchable strain sensing sheets are used to collect data sets of strain generated on the surface of full-size reinforced concrete (RC) frames under extreme cyclic loading. The sensing sheet can be freely switched between two configurations, measuring either bending strain or axial strain. On this basis, deep neural network prediction models for the frame beam and frame column are established. The training results show that the method accurately predicts strain values and generalizes well. The two deep neural network prediction models will also be deployed in the SHM system in the future as part of an intelligent strain sensor system.
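
The regression idea described above can be sketched in miniature: a small fully connected network trained by stochastic gradient descent to map a scalar input (here a load-like quantity) to a strain-like output. The architecture, synthetic data, and hyperparameters below are illustrative assumptions, not the paper's actual model or measurements.

```python
import random, math

# Hypothetical sketch: one hidden layer of tanh units, scalar output,
# trained by per-sample gradient descent on invented load->strain pairs.
random.seed(0)

def train(samples, hidden=8, lr=0.05, epochs=500):
    w1 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, y in samples:
            h = [math.tanh(w1[i] * x + b1[i]) for i in range(hidden)]
            pred = sum(w2[i] * h[i] for i in range(hidden)) + b2
            err = pred - y
            for i in range(hidden):
                # hidden-layer gradient uses the pre-update output weight
                gh = err * w2[i] * (1 - h[i] ** 2)
                w2[i] -= lr * err * h[i]
                w1[i] -= lr * gh * x
                b1[i] -= lr * gh
            b2 -= lr * err
    return w1, b1, w2, b2

def predict(params, x):
    w1, b1, w2, b2 = params
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(len(w1))]
    return sum(w2[i] * h[i] for i in range(len(w1))) + b2

# synthetic normalized-load -> strain pairs (a smooth saturating curve)
data = [(x / 10, math.tanh(x / 10)) for x in range(-10, 11)]
params = train(data)
```

A real deployment would replace the synthetic pairs with sensing-sheet recordings and use a deep-learning framework, but the training loop shape is the same.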

Keywords: strain sensing sheets, deep neural networks, strain measurement, SHM system, RC frames

Procedia PDF Downloads 77
18658 Females’ Usage Patterns of Information and Communication Technologies (ICTs) in the Vhembe District, South Africa

Authors: Fulufhelo Oscar Maphiri-Makananise

Abstract:

The main purpose of this paper is to explore and provide substantiated evidence on the usage patterns of Information and Communication Technologies (ICTs) by females in the Vhembe District of Limpopo Province, South Africa. The study presents a broader picture and understanding of ICT usage from females' perspective. Its significance stems from the need to discover the role, relevance and usage patterns of ICTs such as smartphones, computers, laptops, iPods, the internet and social networking sites among females, following the trends of new media technologies in society. The main objective was to investigate the usability and accessibility of ICTs as a means of empowering females in the Vhembe District. The study used a quantitative research method, together with elements of qualitative research, to determine the main ideas, perceptions and usage patterns of ICTs by females in the district. Data were collected through a structured, self-administered questionnaire with both closed-ended and open-ended questions. Two groups of respondents participated. Female Media Studies students (n=50) at the University of Venda provided their ideas and perceptions about the usefulness and usage patterns of ICTs such as smartphones, the internet and computers at the university level, while the second group, learners at Makhado Comprehensive School (n=50), provided their perceptions and ideas about the use of ICTs at the high school level. The study thereby provides balanced, accurate and rational results on the pertinent issues concerning the use of ICTs by females in the Vhembe District. The researcher also believes that the findings are useful as a guideline and model for ICT interventions that empower women in South Africa.
The study showed that females mainly used ICTs to search for information for writing assignments, conduct research, date, exchange ideas, and network with friends and relatives who are also members of social networking sites, as well as to maintain existing friendships in real life. It further revealed that most females used ICTs for social purposes and accessing the internet rather than for entertainment. The findings also indicated high proportions of females using ICTs for e-learning (62%) and social purposes (85%). Overall, the study centres on providing insightful information on females' usage patterns and perceptions of ICTs in the Vhembe District of Limpopo Province.

Keywords: female users, information and communication technologies, internet, usage patterns

Procedia PDF Downloads 201
18657 A Model of Teacher Leadership in History Instruction

Authors: Poramatdha Chutimant

Abstract:

The objective of the research was to propose a model of teacher leadership in history instruction for practical utilization. Everett M. Rogers' Diffusion of Innovations theory was applied as the theoretical framework. A qualitative method was used, with an interview protocol as the instrument to collect primary data from best-practice teachers recognized by the Office of the National Education Commission (ONEC). Open-ended questions were used in the interview protocol to gather varied data. Information on the international context of history instruction served as secondary data to support the summarizing process (content analysis). A dendrogram was the key to interpreting and synthesizing the primary data, with the secondary data supporting explanation and elaboration. In-depth interviews were conducted with seven experts in the educational field. The focal point was, finally, to validate a draft model in terms of future utilization.

Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership

Procedia PDF Downloads 266
18656 The Presence of Anglicisms in Italian Fashion Magazines and Fashion Blogs

Authors: Vivian Orsi

Abstract:

The present research investigates the lexicon of a fashion magazine, a universe very receptive to lexical loans, especially those from English, called Anglicisms. Specifically, we discuss the presence of English items and expressions in the fashion magazine Vogue Italia. In addition, we study the Anglicisms used in an Italian fashion blog called The Blonde Salad. Within the discussion of fashion blogs and their contributions to scientific studies, we adopt theories of lexicology/lexicography to define Anglicism (BIDERMAN, 2001) and observations of its prestige in the Italian language (ROGATO, 2008; BISETTO, 2003). On this theoretical basis, we present a brief analysis of the Anglicisms collected from posts published during the blog's first year of existence, with emphasis on the keywords that encapsulate the content of a text and allow the reader to retrieve information from a blog post. Regarding the use of English in Italian magazines and blogs, it seems to represent sophistication, assuming the value of a prerequisite for participating in the fashion centers of the world. Moreover, we believe, as Barthes says (1990, p. 215), that “Fashion does not evolve, it changes: its lexicon is new each year, like that of a language which always keeps the same system but suddenly and regularly ‘changes’ the currency of its words”. Fashion is a mode of communication, present in man's interaction with the world, which means that this lexical universe is represented according to the particularities of each culture.

Keywords: anglicism, lexicology, magazines, blogs, fashion

Procedia PDF Downloads 318
18655 A Geometric Based Hybrid Approach for Facial Feature Localization

Authors: Priya Saha, Sourav Dey Roy Jr., Debotosh Bhattacharjee, Mita Nasipuri, Barin Kumar De, Mrinal Kanti Bhowmik

Abstract:

Biometric face recognition technology (FRT) has gained much attention due to its extensive variety of applications from both security and non-security perspectives. It has emerged as a secure solution for the identification and verification of personal identity. Although other biometric methods such as fingerprint and iris scans are available, FRT is an efficient technology thanks to its user-friendliness and contactless operation. Accurate facial feature localization plays an important role in many facial analysis applications, including biometrics and emotion recognition, but certain factors make it a challenging task. On the human face, expressions arise from subtle movements of facial muscles influenced by internal emotional states. These non-rigid facial movements cause noticeable alterations in the locations and usual shapes of facial landmarks and sometimes create occlusions in facial feature areas, making face recognition a difficult problem. The paper proposes a new hybrid technique for automatic landmark detection in both neutral and expressive frontal and near-frontal face images. The method uses thresholding, sequential searching and other image processing techniques to locate the landmark points on the face. In addition, Graphical User Interface (GUI) based software is designed that automatically detects 16 landmark points around the eyes, nose and mouth, the regions most affected by changes in facial muscles. The proposed system has been tested on the widely used JAFFE and Cohn-Kanade databases, as well as on the DeitY-TU face database, created in the Biometrics Laboratory of Tripura University under a research project funded by the Department of Electronics & Information Technology, Govt. of India. The performance of the proposed method is evaluated in terms of error measure and accuracy.
The method has a detection rate of 98.82% on the JAFFE database, 91.27% on the Cohn-Kanade database and 93.05% on the DeitY-TU database. We also present a comparative study of the proposed method against techniques developed by other researchers. In future work, the located features will support emotion-oriented systems through action unit (AU) detection.
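
The thresholding-plus-sequential-search idea can be illustrated on a toy grayscale image: scan for pixels darker than a threshold and take the centroid of that region as a crude candidate for a dark landmark (e.g. a pupil). The 8x8 "image", the blob position, and the threshold are invented for illustration; the paper's actual pipeline is more elaborate.

```python
# Minimal sketch of threshold-based landmark candidate localization.
def find_dark_region_centroid(image, threshold):
    """Sequentially scan the image for pixels darker than the threshold
    and return the (row, col) centroid of that region, or None."""
    points = [(r, c)
              for r, row in enumerate(image)
              for c, value in enumerate(row)
              if value < threshold]
    if not points:
        return None
    rows = sum(r for r, _ in points) / len(points)
    cols = sum(c for _, c in points) / len(points)
    return rows, cols

# bright background (200) with a dark 2x2 blob around rows 3-4, cols 4-5
img = [[200] * 8 for _ in range(8)]
for r, c in [(3, 4), (3, 5), (4, 4), (4, 5)]:
    img[r][c] = 30

centroid = find_dark_region_centroid(img, threshold=100)  # -> (3.5, 4.5)
```

A real detector would restrict the scan to a region of interest per landmark and combine it with geometric constraints, as the hybrid method describes.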

Keywords: biometrics, face recognition, facial landmarks, image processing

Procedia PDF Downloads 397
18654 Addressing the Silent Killer: The Shift in Local Governance to Combat Air Pollution

Authors: Jayati Das

Abstract:

Kolkata, one of the fastest-growing metropolises in India, has suffered from air pollution for many decades, a problem fuelled by government mismanagement and the growth in automobile numbers. The study aims to portray air quality along with the influence of traffic flow and vehicular growth and the effects on human health. It further examines the correlation between pollution emissions on weekdays and weekends with the help of a scatter diagram and trend line. An assessment of Kolkata's air quality is made in which the annual average concentrations of the listed pollutants (RPM, SPM, NO2, and SO2) are classified into four categories. The observed association between childhood acute respiratory disorder and early-life exposure to traffic-related air pollutants is biologically plausible. The in utero period and the first year of life are critical in the development of the immune and respiratory systems, and potentially harmful effects of toxic pollutants during this period might result in a long-lasting impaired capacity to fight infections and an increased risk of allergic manifestations. Up-to-date knowledge about the seasonal and spatial variation of asthma is compiled, and the air quality of the area is studied through a Geographic Information System (GIS). Steps taken by the government to control air pollution include alternative public transport, such as the metro, and compulsory periodic vehicle certification, which tests for carbon monoxide.
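
The weekday/weekend comparison described above boils down to a Pearson correlation and a least-squares trend line over paired readings. The pollutant values below are invented for illustration; the paper's monitoring data are not reproduced here.

```python
# Hedged sketch of the scatter-plus-trend-line analysis.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def trend_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

weekday = [110, 95, 130, 150, 120]   # invented weekday RPM averages (ug/m3)
weekend = [90, 80, 105, 118, 98]     # paired weekend averages

r = pearson_r(weekday, weekend)
slope, intercept = trend_line(weekday, weekend)
```

A high r with a slope below 1 would indicate that weekend levels track weekday levels but sit systematically lower, which is the kind of relationship the scatter diagram is meant to reveal.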

Keywords: air pollution, asthma, GIS, hotspots, governance

Procedia PDF Downloads 54
18653 Designing Product-Service-System Applied to Reusable Packaging Solutions: A Strategic Design Tool

Authors: Yuan Long, Fabrizio Ceschin, David Harrison

Abstract:

Environmental sustainability is threatened by excessive single-use plastic packaging waste, and current waste management fails to address this issue. This has prompted the identification of alternatives that can curb packaging waste without reducing social needs. Reusable packaging represents a circular approach to closing the loop of consumption, in which packaging stays longer in the system to satisfy social needs. However, the implementation of reusable packaging is fragmented and lacks systematic approaches. The product-service system (PSS) is widely regarded as a sustainable business model innovation for embracing circular consumption; applying PSS to reusable packaging solutions is therefore a promising way to address the packaging waste issue. This paper aims to fill the knowledge gap in applying PSS to reusable packaging solutions and to provide a strategic design tool that supports packaging professionals in designing such solutions. The methodology combines case studies and workshops. The respondents are packaging professionals: packaging consultants, NGO professionals, and entrepreneurs. Fifty-seven collected cases show that 15 archetypal models operate in the market. Subsequently, a polarity diagram was developed to embrace those 15 archetypal models, and 24 experts were invited to a workshop to evaluate the design tool. This research finally provides a strategic design tool to support packaging professionals in designing reusable packaging solutions. The tool supports understanding reusable packaging solutions, analyzing markets, identifying new opportunities, and generating new business models. The implication of this research is to provide insights for academics and businesses in tackling single-use packaging waste and to build a foundation for further development of the reusable packaging solution tool.

Keywords: environmental sustainability, product-service system, reusable packaging, design tool

Procedia PDF Downloads 138
18652 Screening of Factors Affecting the Enzymatic Hydrolysis of Empty Fruit Bunches in Aqueous Ionic Liquid and Locally Produced Cellulase System

Authors: Md. Z. Alam, Amal A. Elgharbawy, Muhammad Moniruzzaman, Nassereldeen A. Kabbashi, Parveen Jamal

Abstract:

The enzymatic hydrolysis of lignocellulosic biomass is one of the obstacles in sugar production, due to the presence of lignin, which protects the cellulose molecules against cellulases. Pretreatment of lignocellulose in ionic liquid (IL) systems has received much interest; however, it requires IL removal with an anti-solvent before enzymatic hydrolysis can proceed. At this point, introducing an IL-compatible cellulase enzyme seems more efficient. A cellulase produced by Trichoderma reesei on palm kernel cake (PKC) exhibited promising stability in several ILs. The enzyme, called PKC-Cel, was tested for its optimum pH and temperature as well as its molecular weight. One of the evaluated ILs, 1,3-diethylimidazolium dimethyl phosphate ([DEMIM] DMP), was applied in this study. Six factors were evaluated in a definitive screening design (Stat-Ease Design-Expert V.9): IL/buffer ratio, temperature, hydrolysis retention time, biomass loading, cellulase loading and empty fruit bunch (EFB) particle size. According to the obtained data, the IL-enzyme system gives the highest sugar concentration at 70 °C, 27 hours, 10% IL-buffer, 35% biomass loading, 60 Units/g cellulase and 200 μm particle size. PKC-Cel was thus not only stable in the presence of the IL but also stable at a temperature above its optimum. The reducing sugar obtained was 53.468±4.58 g/L, equivalent to 0.3055 g reducing sugar/g EFB. This approach opens an insight for further studies to understand the actual effect of ILs on cellulases and their interactions in aqueous systems. It could also benefit the efficient production of bioethanol from lignocellulosic biomass.

Keywords: cellulase, hydrolysis, lignocellulose, pretreatment

Procedia PDF Downloads 352
18651 Fabrication of Coatable Polarizer by Guest-Host System for Flexible Display Applications

Authors: Rui He, Seung-Eun Baik, Min-Jae Lee, Myong-Hoon Lee

Abstract:

The polarizer is one of the most essential optical elements in LCDs. Currently, the most widely used polarizers for LCDs are derivatives of the H-sheet polarizer. There is a need for coatable polarizers that are much thinner and more stable than H-sheet polarizers. One possible approach to obtaining thin, stable, coatable polarizers is the use of a highly ordered guest-host system. In our research, we aimed to fabricate a coatable polarizer based on a highly ordered liquid crystalline monomer and dichroic dye 'guest-host' system, in which anisotropic absorption of light is achieved by aligning a dichroic dye (guest) through the cooperative motion of the ordered liquid crystal (host) molecules. First, we designed and synthesized a new reactive liquid crystalline monomer containing polymerizable acrylate groups as the 'host' material. The structure was confirmed by 1H-NMR and IR spectroscopy. The liquid crystalline behavior was studied by differential scanning calorimetry (DSC) and polarized optical microscopy (POM), confirming that the monomers possess a highly ordered smectic phase at relatively low temperature. The photocurable 'guest-host' system was then prepared by mixing the liquid crystalline monomer, dichroic dye and photoinitiator. Coatable polarizers were fabricated by spin-coating this mixture on a substrate with an alignment layer. In-situ photopolymerization was carried out at room temperature under UV irradiation, forming a crosslinked structure that stabilized the aligned dichroic dye molecules. Finally, the dichroic ratio (DR), order parameter (S) and polarization efficiency (PE) were determined by polarized UV/Vis spectroscopy. We prepared coatable polarizers using different types of dichroic dyes to meet the requirements of display applications.
The results reveal that the coatable polarizers, at a thickness of 8 μm, exhibited DR = 12-17 and relatively high PE (>96%), with the highest PE of 99.3%, showing their potential for LCD or flexible display applications.
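
The figures of merit quoted above are related by standard expressions for dichroic guest-host systems: the order parameter is commonly taken as S = (DR - 1)/(DR + 2), and one common definition of polarizing efficiency uses the transmittances of parallel and crossed polarizer pairs. The transmittance values below are illustrative, not measured data from the paper; the DR values 12 and 17 come from the abstract.

```python
# Hedged illustration of the polarizer figures of merit.
def order_parameter(dr):
    """S = (DR - 1) / (DR + 2), the usual expression for dichroic dyes,
    with DR the ratio of parallel to perpendicular absorbance."""
    return (dr - 1.0) / (dr + 2.0)

def polarizing_efficiency(t_parallel, t_crossed):
    """PE = (T_par - T_cross) / (T_par + T_cross), as a fraction."""
    return (t_parallel - t_crossed) / (t_parallel + t_crossed)

s_low = order_parameter(12)   # lower end of the reported DR range
s_high = order_parameter(17)  # upper end; higher DR -> higher order

pe = polarizing_efficiency(0.35, 0.001)  # illustrative transmittances
```

The monotone link between DR and S is why a more ordered smectic host directly improves the polarizer's performance.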

Keywords: coatable polarizer, display, guest-host, liquid crystal

Procedia PDF Downloads 243
18650 Design of a Virtual Instrument (VI) System for Earth Resistivity Survey

Authors: Henry Okoh, Obaro Verisa Omayuli, Gladys A. Osagie

Abstract:

One of the challenges of developing nations is the dearth of measurement devices. Those that are available are often old or obsolete, and new ones are very expensive. In this situation, researchers must design alternative systems to meet the needs of academia. This paper presents the design of a cost-effective, multi-disciplinary virtual instrument system for scientific research. The design is based on the NI USB-6255 multifunction DAQ, which was used for earth resistivity measurement in a Schlumberger array; the results compared closely with those of a conventional ABEM Terrameter. The instrument design provided hands-on experience of full-waveform signal acquisition in the field.
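
The quantity such a system ultimately derives is the Schlumberger-array apparent resistivity, obtained from the measured voltage and injected current via the array's geometric factor. The electrode spacings and readings below are made-up values for illustration, not data from the paper.

```python
import math

# Sketch of the Schlumberger apparent-resistivity reduction step.
def schlumberger_rho_a(ab_half, mn_half, delta_v, current):
    """Apparent resistivity (ohm-m): rho_a = K * dV / I, with geometric
    factor K = pi * (L^2 - l^2) / (2 * l), where L = AB/2 and l = MN/2
    are the current- and potential-electrode half-spacings in meters."""
    k = math.pi * (ab_half ** 2 - mn_half ** 2) / (2.0 * mn_half)
    return k * delta_v / current

rho = schlumberger_rho_a(ab_half=10.0, mn_half=1.0,
                         delta_v=0.05, current=0.1)
```

In a VI implementation, `delta_v` and `current` would come from the DAQ's full-waveform samples (e.g. averaged over stacked cycles) rather than single readings.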

Keywords: cost-effective, data acquisition (DAQ), full-waveform, multi-disciplinary, Schlumberger array, virtual instrumentation (VI)

Procedia PDF Downloads 453
18649 The Methodology of Out-Migration in Georgia

Authors: Shorena Tsiklauri

Abstract:

Out-migration is an important issue for Georgia, which has lost one-fifth of its population to emigration since independence. During the Soviet era, out-migration from the USSR was almost impossible, and one of the most important instruments regulating population movement within the Soviet Union was the system of compulsory residential registration, the so-called "propiska". Since independence, there has been no such regulation of migration from Georgia. The majority of Georgian migrants go abroad on tourist visas and then overstay, becoming irregular labor migrants. The official migration statistics published for this period, based on the administrative system of population registration, were insignificant in numbers and did not represent the real scope of these migration movements. This paper discusses the data quality and methodology of migration statistics in Georgia and addresses the question: what is the real reason for the increase in immigration flows in the official numbers since the 2000s?

Keywords: data quality, Georgia, methodology, migration

Procedia PDF Downloads 400
18648 Sliding Mode Control of an Internet Teleoperated PUMA 600 Robot

Authors: Abdallah Ghoul, Bachir Ouamri, Ismail Khalil Bousserhane

Abstract:

In this paper, we develop a sliding mode controller for the PUMA 600 manipulator robot and a teleoperation system to control the robot remotely. The system includes two sites, local and remote, with the sliding mode controller installed at the remote site. The client requests a position through an interface and receives the actual positions after the remote robot executes the task. The two sites are interconnected via the Internet. To verify the effectiveness of the sliding mode controller, it is compared with a classic PID controller. The developed approach is tested on a virtual robot, and the results confirm its high performance.
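
The core of a sliding mode controller can be sketched on a single joint modeled as a double integrator, a strong simplification of the PUMA 600 dynamics: define a sliding surface s = v + lambda*x and apply the switching law u = -K*sign(s). The gains, time step, and plant model below are illustrative assumptions, not the paper's design.

```python
# Minimal single-axis sliding mode control simulation.
def simulate_smc(x0, v0, lam=1.0, gain=5.0, dt=0.001, steps=5000):
    """Drive the position error x -> 0 with u = -K * sign(s),
    where s = v + lam * x, on double-integrator dynamics x'' = u."""
    x, v = x0, v0
    for _ in range(steps):
        s = v + lam * x
        u = -gain * (1 if s > 0 else -1 if s < 0 else 0)
        v += u * dt   # integrate acceleration
        x += v * dt   # integrate velocity
    return x, v

x_end, v_end = simulate_smc(x0=1.0, v0=0.0)
```

Once the state reaches s = 0, the error decays exponentially at rate lambda regardless of the (bounded) disturbances, which is the robustness property that motivates sliding mode control over PID in teleoperation with network delays; in practice the `sign` function is usually smoothed to limit chattering.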

Keywords: internet, manipulator robot, PID controller, remote control, sliding mode, teleoperation

Procedia PDF Downloads 309
18647 Urban Rail Transit CBTC Computer Interlocking Subsystem Relying on Multi-Template Pen Point Tracking Algorithm

Authors: Xinli Chen, Xue Su

Abstract:

In the urban rail transit CBTC system, interlocking is considered one of the most basic subsystems, characterized by logical complexity and high safety requirements. The development and verification of traditional interlocking subsystems are entirely manual processes that rely heavily on the designer, which often conceals many uncertain factors. To solve this problem, this article applies a multi-template pen-point tracking algorithm to model construction and verification, capturing the main safety attributes and using SCADE for formal verification. Experimental results show that this method helps to improve the quality and efficiency of interlocking software.

Keywords: computer interlocking subsystem, pen-point tracking, communication-based train control system, multi-template pen-point tracking

Procedia PDF Downloads 143
18646 Personalized Infectious Disease Risk Prediction System: A Knowledge Model

Authors: Retno A. Vinarti, Lucy M. Hederman

Abstract:

This research describes a knowledge model for a system that gives personalized alerts to users about infectious disease risks in the context of weather, location and time. The knowledge model is based on established epidemiological concepts augmented by information gleaned from infection-related data repositories. Existing disease risk prediction research has focused more on utilizing raw historical data to yield seasonal patterns of infectious disease risk emergence. This research incorporates both data and epidemiological concepts gathered from the Atlas of Human Infectious Disease (AHID) and the Centers for Disease Control and Prevention (CDC) as the basis for reasoning about infectious disease risk. Using the CommonKADS methodology, the disease risk prediction task is treated as an assignment synthesis task, proceeding from knowledge identification through specification and refinement to implementation. First, knowledge was gathered from AHID, primarily from the epidemiology and risk group chapters for each infectious disease. The result of this stage is five major elements (Person, Infectious Disease, Weather, Location and Time) and their properties. At the knowledge specification stage, the initial tree model of each element and their detailed relationships were produced. This research also includes a validation step as part of knowledge refinement: on the basis that the best model is formed using the most common features, Frequency-based Selection (FBS) is applied. The portion of the infectious disease risk model relating to Person comes out strongest, with Location next and Weather weaker. Among the Person attributes, Age is strongest, Activity and Habits are moderate, and Blood Type is weakest. For the Location attribute, the General category (e.g. continent, region, country, and island) comes out much stronger than the Specific category (i.e. terrain features). For the Weather attribute, the Less Precise category (i.e. season) comes out stronger than the Precise category (i.e. exact temperature or humidity intervals).
However, given that some infectious diseases are significantly more serious than others, a frequency-based metric may not be appropriate. Future work will incorporate epidemiological measurements of disease seriousness (e.g. odds ratio, hazard ratio and fatality rate) into the validation metrics. This research is limited to modelling existing knowledge about epidemiology and chain-of-infection concepts. A further step, verification in the knowledge refinement stage, might cause minor changes to the shape of the tree.
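
The Frequency-based Selection step can be sketched as counting how often each candidate attribute appears across per-disease knowledge entries and keeping those above a frequency threshold. The attribute lists and the 50% cutoff below are invented stand-ins for the AHID-derived entries, purely for illustration.

```python
from collections import Counter

# Hedged sketch of Frequency-based Selection (FBS) over disease entries.
def frequency_based_selection(entries, min_fraction):
    """Keep attributes that occur in at least min_fraction of entries."""
    counts = Counter(attr for attrs in entries for attr in set(attrs))
    cutoff = min_fraction * len(entries)
    return sorted(a for a, n in counts.items() if n >= cutoff)

# each inner list: attributes mentioned for one disease (invented)
disease_entries = [
    ["age", "location.general", "season", "activity"],
    ["age", "location.general", "season"],
    ["age", "location.general", "habits"],
    ["age", "blood_type"],
]

selected = frequency_based_selection(disease_entries, min_fraction=0.5)
```

This reproduces the qualitative pattern reported above: frequently cited attributes such as age and general location survive the cutoff, while rarely cited ones such as blood type drop out.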

Keywords: epidemiology, knowledge modelling, infectious disease, prediction, risk

Procedia PDF Downloads 225
18645 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data

Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah

Abstract:

At the level of national statistical institutes, there is a large volume of data, generally held in formats that constrain how the information they contain can be published. Each household or business data collection project includes a dissemination platform for its implementation. The dissemination methods previously used do not promote rapid access to information and, in particular, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data so that they can be published in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and to offer the option of linking them with external data sources. The approach is applied to data from major national surveys, such as those on employment, poverty and child labor, and the general population census of Senegal.
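
Publishing a survey indicator as linked data comes down to emitting RDF triples; one pattern commonly used for statistical data is the W3C RDF Data Cube vocabulary, serialized below as simple N-Triples strings. The namespace URIs, observation identifier, and indicator value are invented for illustration; they are not from the Senegalese surveys.

```python
# Hedged sketch of serializing one statistical observation as N-Triples.
QB = "http://purl.org/linked-data/cube#"
BASE = "http://example.org/stats/"   # hypothetical dataset namespace

def triple(s, p, o):
    """Format one N-Triples statement; o may be a URI or a plain literal."""
    obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
    return f"<{s}> <{p}> {obj} ."

observation = BASE + "obs/employment-2019"
triples = [
    triple(observation,
           "http://www.w3.org/1999/02/22-rdf-syntax-ns#type",
           QB + "Observation"),
    triple(observation, QB + "dataSet", BASE + "dataset/employment"),
    triple(observation, BASE + "measure/unemploymentRate", "16.9"),
]
```

In practice an RDF library and typed literals would be used, but the point stands: once indicators are expressed as triples with shared URIs, observations from different surveys become linkable.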

Keywords: Semantic Web, linked open data, database, statistic

Procedia PDF Downloads 166
18644 ECG Based Reliable User Identification Using Deep Learning

Authors: R. N. Begum, Ambalika Sharma, G. K. Singh

Abstract:

Identity theft has serious ramifications beyond data and personal information loss, which necessitates robust and efficient user identification systems. Automatic biometric recognition systems are therefore the need of the hour, and ECG-based systems are an excellent choice due to their appealing inherent characteristics. CNNs are the recent state-of-the-art technique for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grow. This study therefore proposes a highly accurate and resilient ECG-based person identification system using the densely connected CNN (DenseNet) framework. The research explicitly explores the calibre of dense CNNs in the field of ECG-based human recognition. The study tests four different dense CNN configurations, trained on a dataset of recordings collected from eight popular ECG databases. With a highest FAR of 0.04% and a highest FRR of 5%, the best-performing network achieved an identification accuracy of 99.94%. The best network is also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient; thus, they might also be implemented in real-time ECG-based human recognition systems.
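
The FAR and FRR figures quoted above are standard biometric error rates computed from match scores at a fixed decision threshold. The score lists and threshold below are toy values for illustration, not the paper's ECG results.

```python
# Hedged sketch of computing FAR and FRR from match scores.
def far_frr(genuine_scores, impostor_scores, threshold):
    """FAR: fraction of impostor scores accepted (>= threshold).
    FRR: fraction of genuine scores rejected (< threshold)."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.91, 0.88, 0.95, 0.73, 0.97]   # same-person comparison scores
impostor = [0.12, 0.35, 0.81, 0.22, 0.05]  # different-person scores

far, frr = far_frr(genuine, impostor, threshold=0.8)
```

Sweeping the threshold trades FAR against FRR, which is why a system is usually reported at an operating point (as here, FAR 0.04% and FRR 5%) rather than by a single number.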

Keywords: biometrics, dense networks, identification rate, train/test split ratio

Procedia PDF Downloads 150
18643 Teachers’ Experiences regarding Use of Information and Communication Technology for Visually Impaired Students

Authors: Zikra Faiz, Zaheer Asghar, Nisar Abid

Abstract:

Information and Communication Technologies (ICTs) include computers, the Internet, and electronic delivery systems such as televisions, radios, multimedia, and overhead projectors. In the modern world, ICTs are considered an essential element of the teaching-learning process. This study aimed to discover the usage of ICTs in special education institutions for visually impaired students in Lahore, Pakistan. Its objectives were to explore the problems faced by teachers while using ICT in the classroom. The study was phenomenological in nature; a qualitative survey method was used through a semi-structured interview protocol developed by the researchers. The sample comprised eighty faculty members selected through a purposive sampling technique. Data were analyzed through a thematic analysis technique with the help of open coding. The findings revealed that multimedia, projectors, computers, laptops and LEDs are used in special education institutes to enhance the teaching-learning process. Teachers believed that ICTs can enhance the knowledge of visually impaired students and that every student should use these technologies in the classroom. It was concluded that multimedia, projectors and laptops are used in the classroom by teachers and students, and that ICT use can be promoted effectively through training. It was suggested that the government take steps to enhance ICTs in teacher training and other institutions through pre-service and in-service training of teachers.

Keywords: information and communication technologies, in-services teachers, special education institutions

Procedia PDF Downloads 110
18642 Dynamic Test for Stability of Columns in Sway Mode

Authors: Elia Efraim, Boris Blostotsky

Abstract:

Testing of columns in sway mode is performed in order to determine the maximal allowable load, limited by plastic deformation of the columns or their end connections, and the critical load, limited by column stability. The motivation for determining an accurate value of the critical force comes from its uses: - the critical load is the maximal allowable load for a given column configuration and can serve as a criterion of perfection; - it is used in calculations prescribed by standards for the design of structural elements under combined compression and bending; - it is used for verification of theoretical stability analyses at various column end conditions. In the present work, a new non-destructive method for determining the critical buckling load of columns in sway mode is proposed. The method allows measurements to be performed during tests under loads that exceed the column's critical load without loss of stability. The possibility of such loading is achieved by the structure of the loading system: a frame with a rigid girder, in which one column is the tested column and the other is an additional two-hinged strut. The frame is loaded by a flexible traction element attached to the girder. By choosing the parameters of the traction element and the additional strut, the load applied to the tested column can exceed the critical load. The lateral stiffness of the system and the critical load of the column are obtained by the dynamic method. Experiment planning and the comparison between experimental and theoretical values were based on the developed dependence of the system's lateral stiffness on the vertical load, taking into account the semi-rigid connections of the column's ends. Agreement between the obtained results was established. The method can be used for testing real full-size columns in industrial conditions.
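
The data-reduction step behind such a dynamic method can be sketched as follows: lateral stiffness k, measured at several vertical loads P, is fitted with a straight line, and the critical load is taken where the line extrapolates to k = 0 (the idealized relation k = k0 * (1 - P/Pcr) for a sway frame). The measurements below are synthetic values chosen to follow that relation exactly; real test data would scatter around it.

```python
# Hedged sketch of extrapolating the critical load from stiffness data.
def critical_load(loads, stiffnesses):
    """Least-squares line k = a*P + b; return the root k = 0, i.e. -b/a."""
    n = len(loads)
    mp = sum(loads) / n
    mk = sum(stiffnesses) / n
    a = (sum((p - mp) * (k - mk) for p, k in zip(loads, stiffnesses))
         / sum((p - mp) ** 2 for p in loads))
    b = mk - a * mp
    return -b / a

# synthetic data following k = 100 * (1 - P / 50), so Pcr = 50 (kN)
P = [0.0, 10.0, 20.0, 30.0]
k = [100.0, 80.0, 60.0, 40.0]

p_cr = critical_load(P, k)
```

Because every measurement is taken below (or, with this loading system, safely above) the buckling load, the column itself is never driven to instability, which is what makes the method non-destructive.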

Keywords: buckling, columns, dynamic method, end-fixity factor, sway mode

Procedia PDF Downloads 340
18641 Building and Development of the Stock Market Institutional Infrastructure in Russia

Authors: Irina Bondarenko, Olga Vandina

Abstract:

The theory of evolutionary economics forms the basis for the preparation and application of methods for developing a stock market infrastructure concept. The authors believe that the basis for the formation and development of the stock market infrastructure model in Russia is the theory of large systems. This theory considers the financial market infrastructure as a whole, on the basis of a macroeconomic approach, with subsequent definition of its aims and objectives. Evaluating the prospects for interaction among securities market institutions will make it possible to identify the problems associated with the development of this system. The interaction of the elements of the stock market infrastructure reduces the cost and time of transactions, thereby freeing up resources of market participants for more efficient operation. Thus, the methodology of transaction analysis allows the financial infrastructure to be defined as a set of specialized institutions that form a modern quasi-stable system. The financial infrastructure, based on international standards, should include trading systems, regulatory and supervisory bodies, rating agencies, and settlement, clearing and depository organizations. The distribution of financial assets, the reduction of transaction costs, and increased market transparency are promising tasks in raising the level and quality of the services provided by the institutions of the securities market's financial infrastructure. In order to improve the efficiency of the regulatory system, it is necessary to provide "standards" for all market participants. Developing clear regulation of barriers to stock market entry and exit, providing conditions for the development and implementation of new laws regulating the activities of securities market participants, and formulating proposals aimed at minimizing risks and costs will make positive results achievable.
The latter will be manifested in an increased level of market participant security and, accordingly, in the attractiveness of this market for investors and issuers.

Keywords: institutional infrastructure, financial assets, regulatory system, stock market, transparency of the market

Procedia PDF Downloads 122
18640 Information in Public Domain: How Far It Measures Government's Accountability

Authors: Sandip Mitra

Abstract:

Studies on governance and accountability have often stressed the need to release data into the public domain to increase transparency, since such data otherwise serve as evidence of performance. However, inefficient handling, lack of capacity and the dynamics of transfers (especially fund transfers) are important issues that need appropriate attention. E-governance alone cannot serve as a measure of transparency unless comprehensive planning is instituted. Studies on governance and public exposure have often triggered public opinion in favour of, or against, a government. The root of the problem (especially in local governments) lies in the management of governance. The participation of the people in the functioning of local government, the networks within and outside the locality, and the synergy among the various layers of government are crucial to understanding the activities of any government. Unfortunately, data on such issues are not released into the public domain. If they are released at all, the extraction of information is often hindered by complicated designs. A study has been undertaken with a few local governments in India, and the data have been analysed to substantiate these views.

Keywords: accountability, e-governance, transparency, local government

Procedia PDF Downloads 418
18639 Correlation Analysis between Sensory Processing Sensitivity (SPS), Meares-Irlen Syndrome (MIS) and Dyslexia

Authors: Kaaryn M. Cater

Abstract:

Students with sensory processing sensitivity (SPS), Meares-Irlen Syndrome (MIS) and dyslexia can become overwhelmed and struggle to thrive in traditional tertiary learning environments. An estimated 50% of tertiary students who disclose learning-related issues are dyslexic. This study explores the relationship between SPS, MIS and dyslexia. Baseline measures will be analysed to establish any correlation between these three minority methods of information processing. SPS is an innate sensitivity trait found in 15-20% of the population and has been identified in over 100 species of animals. Humans with SPS are referred to as Highly Sensitive People (HSP), and HSP status is measured with the 27-item Highly Sensitive Person Scale (HSPS). A 2016 study conducted by the author established baseline data for HSP students in a tertiary institution in New Zealand. The results showed that all participating HSP students believed the knowledge of SPS to be life-changing and useful in managing life and study; in addition, they believed that all tutors and incoming students should be given information on SPS. MIS is a visual processing and perception disorder found in approximately 10% of the population, with a variety of symptoms including visual fatigue, headaches and nausea. One way to ease some of these symptoms is through the use of colored lenses or overlays. Dyslexia is a complex, phonologically based information processing variation present in approximately 10% of the population. An estimated 50% of dyslexics are thought to have MIS. The study exploring possible correlations between these minority forms of information processing is due to begin in February 2017. An invitation will be extended to all first-year students enrolled in degree programmes across all faculties and schools within the institution. An estimated 900 students will be eligible to participate in the study.
Participants will be asked to complete a battery of online questionnaires, including the Highly Sensitive Person Scale, the International Dyslexia Association adult self-assessment and the adapted Irlen indicator. All three scales have been used extensively in the literature and have been validated in many populations. All participants whose scores on any (or some) of the three questionnaires suggest a minority method of information processing will receive an invitation to meet with a learning advisor and will be given access to counselling services if they choose. Meeting with a learning advisor is not mandatory, and some participants may choose not to receive help. Data will be collected on the QuestionPro platform, and baseline data will be analysed using correlation and regression analysis to identify relationships and predictors between SPS, MIS and dyslexia. This study forms part of a larger three-year longitudinal study; participants will be required to complete questionnaires at annual intervals in subsequent years until completion of (or withdrawal from) their degree. At these data collection points, participants will be questioned on any additional support received relating to their minority method(s) of information processing. Data from this study will be available by April 2017.
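
The planned pairwise correlation analysis between the three instruments could be sketched as below. The scores are synthetic stand-ins (the study's data had not yet been collected), and the coupling coefficients are arbitrary illustrative assumptions:

```python
import numpy as np

# Hypothetical questionnaire scores for n participants. The variable names
# echo the study's instruments (HSPS, IDA adult self-assessment, adapted
# Irlen indicator), but the numbers are purely illustrative.
rng = np.random.default_rng(0)
n = 200
hsps = rng.normal(80, 15, n)                   # SPS score
irlen = 0.4 * hsps + rng.normal(20, 10, n)     # MIS indicator
dyslexia = 0.3 * irlen + rng.normal(10, 8, n)  # dyslexia self-assessment

# 3x3 Pearson correlation matrix across the three instruments
scores = np.vstack([hsps, irlen, dyslexia])
r = np.corrcoef(scores)
print(np.round(r, 2))
```

A regression step (e.g., predicting the dyslexia score from the other two) would follow the same pattern with `np.polyfit` or a statistics package.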

Keywords: dyslexia, highly sensitive person (HSP), Meares-Irlen Syndrome (MIS), minority forms of information processing, sensory processing sensitivity (SPS)

Procedia PDF Downloads 218
18638 Study of Mixing Conditions for Different Endothelial Dysfunction in Arteriosclerosis

Authors: Sara Segura, Diego Nuñez, Miryam Villamil

Abstract:

In this work, we studied the microscale interaction of foreign substances with blood inside an artificial transparent artery system representing medium and small muscular arteries. This artery system had channels ranging from 75 μm to 930 μm and was fabricated from glass and transparent polymer blends, such as phenylbis(2,4,6-trimethylbenzoyl)phosphine oxide, poly(ethylene glycol) and PDMS, so that it could be monitored in real time. The setup comprised a computer-controlled precision micropump and a high-resolution optical microscope capable of tracking fluids at high capture rates. Observation and analysis were performed with real-time software that reconstructs the fluid dynamics, determining flow velocity, injection dependency, turbulence and rheology. All experiments were carried out with fully computer-controlled equipment. Interactions between substances such as water, serum (0.9% sodium chloride and electrolyte at a ratio of 4 ppm) and blood cells were studied at resolutions as fine as 400 nm, and the analysis was performed using frame-by-frame observation and HD video capture. These observations allow us to understand the flow and mixing behavior of the substance of interest in the blood stream and shed light on the use of implantable devices for drug delivery in arteries with different endothelial dysfunctions. Several substances were tested in the artificial artery system. Initially, Milli-Q water was used as a control substance to study the basic fluid dynamics of the system. Serum and other low-viscosity substances were then pumped into the system in the presence of other liquids to study mixing profiles and behaviors. Finally, mammal blood was used for the final test while serum was injected. Different flow conditions, pumping rates and time rates were evaluated to determine the optimal mixing conditions.
Our results suggest the use of very finely controlled microinjection, at an approximate rate of 135,000 μm³/s, for better mixing profiles in the administration of drugs inside arteries.

Keywords: artificial artery, drug delivery, microfluidics dynamics, arteriosclerosis

Procedia PDF Downloads 268
18637 Explaining Irregularity in Music by Entropy and Information Content

Authors: Lorena Mihelac, Janez Povh

Abstract:

In 2017, we conducted a research study using data consisting of 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of the harmony, we were interested in unigrams (individual chords in the harmonic progression) and bigrams (connections of two adjacent chords). In that study, 53 of the 160 musical excerpts were evaluated by participants as very complex, although the entropy of the harmonic progression (unigrams and bigrams) was calculated as low. We explained this by particularities of the chord progression that influence the listener's feeling of complexity and acceptability. We evaluated the same data again with new participants in 2018, and a third time with the original participants in 2019. These three evaluations showed that the same 53 musical excerpts found to be difficult and complex in the 2017 study again produced a strong feeling of complexity. We proposed that the content of these musical excerpts, defined as "irregular," does not meet the listener's expectations or basic perceptual principles, creating a stronger feeling of difficulty and complexity. As the "irregularities" in these 53 excerpts seem to be perceived by the participants without their being aware of it, affecting pleasantness and the feeling of complexity, they were defined as "subliminal irregularities" and the 53 musical excerpts as "irregular." In our recent study (2019) of the same data, we proposed a new measure of the complexity of harmony, "regularity," based on the irregularities in the harmonic progression and other plausible particularities of the musical structure found in previous studies. In that study, we also proposed a list of 10 particularities that we assumed influence the participants' perception of complexity in harmony.
These ten particularities are tested in this paper by extending the analysis of our 53 irregular musical excerpts from harmony to melody. In examining the melody, we used the computational model "Information Dynamics of Music" (IDyOM) and two information-theoretic measures: entropy, the uncertainty of the prediction before the next event is heard, and information content, the unexpectedness of an event in a sequence. To describe the melodic features of these musical examples, we used four viewpoints: pitch, interval, duration, and scale degree. The results show that the texture of the melody (e.g., multiple voices, homorhythmic structure) and the structure of the melody (e.g., large interval leaps, syncopated rhythm, implied harmony in compound melodies) influence the participants' perception of complexity. High information content values were found in compound melodies in which implied harmonies seem to have suggested additional harmonies, affecting the participants' perception of the chord progression by creating a sense of an ambiguous musical structure.
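
The two information-theoretic measures can be illustrated with a toy first-order transition model; the chord sequence and the model itself are deliberately simplified stand-ins for IDyOM's much richer multiple-viewpoint predictions:

```python
import math
from collections import Counter

def information_content(prob):
    """Unexpectedness of an event: IC = -log2 p(event)."""
    return -math.log2(prob)

def entropy(distribution):
    """Uncertainty before the next event: H = -sum p * log2 p."""
    return -sum(p * math.log2(p) for p in distribution.values() if p > 0)

# Toy first-order model of chord transitions estimated from one sequence
# (the chord symbols are illustrative, not from the study's corpus).
sequence = ["C", "G", "C", "F", "C", "G", "Am", "F", "C"]
counts = Counter(zip(sequence, sequence[1:]))   # bigram counts
contexts = Counter(sequence[:-1])               # unigram (context) counts

def p(next_sym, context):
    """Conditional probability p(next | context) from the counts."""
    return counts[(context, next_sym)] / contexts[context]

# Distribution over continuations after hearing the chord "C"
dist = {s: p(s, "C") for s in set(sequence) if counts[("C", s)]}
print(entropy(dist))                      # ≈ 0.918 bits of uncertainty
print(information_content(p("G", "C")))   # ≈ 0.585 bits of surprise
```

A frequent continuation ("G" after "C") carries low information content, while the entropy of the whole distribution captures how predictable the next chord is at all; this is exactly the distinction the abstract draws between uncertainty and unexpectedness.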

Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM

Procedia PDF Downloads 116
18636 Digitalize or Die: Responsible Innovations in the Healthcare and Welfare Sectors

Authors: T. Iakovleva

Abstract:

The present paper suggests a theoretical model that describes the process of developing responsible innovations at the firm level in the health and welfare sectors. New firm strategies are needed in these sectors, and this paper proposes applying the concept of responsible innovation, originally developed at the societal level, to the new area of firm strategy. The rapid global diffusion of information and communication technologies has greatly improved access to knowledge. At the same time, communication is cheap, information is a commodity, and global trade increases technological diffusion. As a result, firms and users, including those outside industrialized nations, gain early exposure to the latest technologies and information. General-purpose technologies such as mobile phones and 3D printers enable individuals to solve local needs and customize products. The combined effect of these changes is having a profound impact on the innovation landscape. Meanwhile, the healthcare sector is facing unprecedented challenges, magnified by budgetary constraints, an aging population and the desire to provide care for all. Patients themselves are also changing: they are savvier about their diseases, they expect their relationship with healthcare professionals to be open and interactive, and above all they want to be part of the decision process. All of this reflects what is already happening in other industries, where customers have access to large amounts of information and have become educated buyers. This article addresses the question of how ICT research and innovation may contribute to developing solutions to grand societal challenges in a responsible way. A broad definition of the concept of responsibility in the context of innovation is adopted in this paper; responsibility is thus seen as a collective, uncertain and future-oriented activity.
This opens the questions of how responsibilities are perceived and distributed, and how innovation and science can be governed and stewarded towards socially desirable and acceptable ends. This is a central question confronting politicians, business leaders, and regional planners.

Keywords: responsible innovation, ICT, healthcare, welfare sector

Procedia PDF Downloads 184
18635 Outdoor Visible Light Communication Channel Modeling under Fog and Smoke Conditions

Authors: Véronique Georlette, Sebastien Bette, Sylvain Brohez, Nicolas Point, Veronique Moeyaert

Abstract:

Visible light communication (VLC) is a communication technology belonging to the optical wireless communication (OWC) family. It uses the visible and infrared spectra to send data. Until now, this technology has mainly been studied for indoor use cases, but it is sufficiently mature nowadays for its outdoor potential to be considered. The main outdoor challenges are meteorological conditions and the presence of smoke due to fire or pollutants in urban areas. This paper proposes a methodology to assess the robustness of an outdoor VLC system under given outdoor conditions. The methodology is put into practice in two realistic scenarios: a VLC bus stop and a VLC streetlight. It consists of computing the power margin available in the system, given the characteristics of the VLC system and its surroundings. This is done with an outdoor VLC communication channel simulator developed in Python. The simulator quantifies the effects of fog and smoke, using models taken from the environmental and fire engineering scientific literature, and computes the optical power reaching the receiver. These two phenomena degrade the communication by increasing the total attenuation of the medium. The main conclusion of this paper is that the levels of attenuation due to fog and smoke are of the same order of magnitude, with fog attenuation being the higher of the two at a visibility of 1 km. This is a promising prospect for the deployment of outdoor VLC use cases in the near future.
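
The fog part of such a power-margin computation can be sketched in Python with the widely used Kim visibility model; the link-budget numbers below are illustrative assumptions, not values from the paper's simulator:

```python
import math

def kim_q(visibility_km):
    """Size-distribution exponent q of the Kim model, by visibility range."""
    v = visibility_km
    if v > 50:
        return 1.6
    if v > 6:
        return 1.3
    if v > 1:
        return 0.16 * v + 0.34
    if v > 0.5:
        return v - 0.5
    return 0.0

def fog_attenuation_db_per_km(visibility_km, wavelength_nm=550.0):
    """Specific fog attenuation from visibility (Kim model):
    beta = (3.91 / V) * (lambda / 550 nm)^(-q), converted to dB/km."""
    beta = (3.91 / visibility_km) * (wavelength_nm / 550.0) ** (-kim_q(visibility_km))
    return 10.0 * beta * math.log10(math.e)  # 1/km -> dB/km

def link_margin_db(tx_power_dbm, rx_sensitivity_dbm, geometric_loss_db,
                   distance_km, visibility_km):
    """Power margin left once fog attenuation over the path is subtracted.
    All link-budget inputs here are illustrative assumptions."""
    fog_loss = fog_attenuation_db_per_km(visibility_km) * distance_km
    return tx_power_dbm - rx_sensitivity_dbm - geometric_loss_db - fog_loss

# A 50 m outdoor link (e.g. streetlight to receiver) in 1 km visibility fog
print(fog_attenuation_db_per_km(1.0))          # ≈ 17 dB/km
print(link_margin_db(20, -30, 30, 0.05, 1.0))  # remaining margin in dB
```

A positive margin means the link survives the fog; smoke would add a second attenuation term of comparable magnitude, per the paper's conclusion.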

Keywords: channel modeling, fog modeling, meteorological conditions, optical wireless communication, smoke modeling, visible light communication

Procedia PDF Downloads 133
18634 Formation of Convergence Culture in the Framework of Conventional Media and New Media

Authors: Berkay Buluş, Aytekin İşman, Kübra Yüzüncüyıl

Abstract:

Developments in media and communication technologies have changed the way we use media, and the importance of convergence culture has been increasing day by day within the framework of these developments. With new media, it can be said that social networks are the most powerful platforms integrated into this digitalization process. Although social networks seem to be places where people socialize, they can also be utilized as places of production. On the other hand, the audience has become users in the transformation from national to global broadcasting. User-generated content makes conventional media and new media collide. In this study, these communication platforms are examined not as platforms that replace one another but as media that complement one another. In this light, the information produced by users on new media platforms, together with all new media use practices, is called convergence culture; in other words, convergence culture denotes the intersection of conventional and new media. In this study, examples of convergence culture are analyzed in detail.

Keywords: new media, convergence culture, convergence, use of new media, user generated content

Procedia PDF Downloads 250
18633 FloodNet: Classification of Post-Flood Scenes with a High-Resolution Aerial Imagery Dataset

Authors: Molakala Mourya Vardhan Reddy, Kandimala Revanth, Koduru Sumanth, Beena B. M.

Abstract:

Emergency response and recovery operations are severely hampered by natural catastrophes, especially floods. Understanding post-flood scenes is essential to disaster management because it facilitates quick evaluation and decision-making. To this end, we introduce FloodNet, a new high-resolution aerial image dataset created specifically for understanding post-flood scenes. FloodNet consists of a varied collection of high-quality aerial photographs taken during and after flood events, offering comprehensive representations of flooded landscapes, damaged infrastructure, and altered topographies. Covering a variety of environmental conditions and geographic regions, the dataset provides a thorough resource for training and assessing computer vision models designed to handle the complexity of post-flood scenes. The images in FloodNet are labeled with pixel-level semantic segmentation masks, allowing detailed examination of flood-related features, including debris, water bodies, and damaged structures. Furthermore, temporal and positional metadata improve the dataset's usefulness for longitudinal research and spatiotemporal analysis. For tasks such as flood extent mapping, damage assessment, and infrastructure recovery projection, we provide baselines and evaluation metrics to promote research and development in post-flood scene understanding. Integrating FloodNet into machine learning pipelines will ease the creation of reliable algorithms that help policymakers, urban planners, and first responders make decisions both before and after floods. The FloodNet dataset is intended to support advances in computer vision, remote sensing, and disaster response technologies by providing a useful resource for researchers, and to contribute to creative solutions for boosting communities' resilience in the face of natural catastrophes by tackling the particular problems posed by post-flood situations.
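
Models trained on pixel-level masks like FloodNet's are commonly scored with per-class intersection-over-union (IoU); a minimal sketch follows, with toy masks and hypothetical class ids (the dataset's actual label scheme is not specified here):

```python
import numpy as np

def per_class_iou(pred, target, num_classes):
    """Intersection-over-Union per class for semantic segmentation masks.
    Returns NaN for a class absent from both prediction and target."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        ious.append(inter / union if union else float("nan"))
    return ious

# Toy 4x4 masks with three hypothetical classes:
# 0 = background, 1 = water, 2 = damaged structure
target = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [2, 2, 1, 1],
                   [2, 2, 0, 0]])
pred = np.array([[0, 0, 1, 1],
                 [0, 1, 1, 1],
                 [2, 2, 1, 1],
                 [2, 0, 0, 0]])
print(per_class_iou(pred, target, 3))  # per-class IoU, e.g. water ≈ 0.86
```

Averaging the per-class values gives the mean IoU typically reported as a segmentation baseline.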

Keywords: image classification, segmentation, computer vision, natural disaster, unmanned aerial vehicle (UAV), machine learning

Procedia PDF Downloads 55
18632 An Investigation of Interdisciplinary Techniques for Assessment of Water Quality in an Industrial Area

Authors: Priti Saha, Biswajit Paul

Abstract:

Rapid urbanization and industrialization have increased the demand for groundwater. However, the present era has witnessed an enormous level of groundwater pollution, and water quality assessment is therefore of paramount importance in evaluating suitability for drinking, irrigation and industrial use. This study evaluates the groundwater quality of an industrial city in eastern India through interdisciplinary techniques. A multi-purpose Water Quality Index (WQI) was used to assess the suitability of forty sampling locations for drinking and irrigation: 2.5% and 15% of the sampling locations have excellent water quality (WQI: 0-25), and 15% and 40% have good quality (WQI: 25-50), for drinking and irrigation respectively. Industrial water quality was assessed through the Ryznar Stability Index (RSI), which showed that only 2.5% of the sampling locations have water that is neither corrosive nor scale-forming (RSI: 6.2-6.8). Integrating these techniques with a geographical information system (GIS) for spatial assessment confirmed their effectiveness in identifying the regions where water bodies are suitable for drinking, irrigation and industrial activities. Further, the sources of the contaminants were identified through factor analysis (FA), which revealed that both geogenic and anthropogenic sources are responsible for the groundwater pollution. This research demonstrates the effectiveness of statistical and GIS techniques for the analysis of environmental contaminants.
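
The RSI classification can be sketched as follows, using the common Langelier approximation for the saturation pH; the sample values are illustrative assumptions, not the study's measurements:

```python
import math

def saturation_ph(tds_mg_l, temp_c, ca_hardness_mg_l, alkalinity_mg_l):
    """Saturation pH (pHs) via the common Langelier approximation.
    Calcium hardness and alkalinity are expressed as mg/L CaCO3."""
    a = (math.log10(tds_mg_l) - 1.0) / 10.0
    b = -13.12 * math.log10(temp_c + 273.0) + 34.55
    c = math.log10(ca_hardness_mg_l) - 0.4
    d = math.log10(alkalinity_mg_l)
    return (9.3 + a + b) - (c + d)

def ryznar_stability_index(ph, phs):
    """RSI = 2*pHs - pH; roughly 6-7 is stable water, lower values
    indicate scale-forming water, higher values corrosive water."""
    return 2.0 * phs - ph

# Illustrative groundwater sample (values are assumptions)
phs = saturation_ph(tds_mg_l=400, temp_c=25,
                    ca_hardness_mg_l=120, alkalinity_mg_l=150)
rsi = ryznar_stability_index(ph=7.4, phs=phs)
print(round(phs, 2), round(rsi, 2))  # this sample falls in the corrosive range
```

Computing the RSI per sampling location and interpolating the values in a GIS layer is what enables the spatial assessment the abstract describes.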

Keywords: groundwater, water quality analysis, water quality index, WQI, factor analysis, FA, spatial assessment

Procedia PDF Downloads 181
18631 Electronically Controlled Motorized Steering System (E-Mo Steer)

Authors: M. Prasanth, V. Nithin, R. Keerthana, S. Kalyani

Abstract:

In the current scenario, the steering system in automobiles transfers motion from the steering wheel to the driving wheels by mechanical linkages. In this paper, we propose a steering mechanism that uses servomotors to turn the wheels instead of linkages. In this method, a steering angle sensor senses the turn angle of the steering wheel, and its output is processed by an electronic control module (ECM). The ECM compares the angle value with a standard value from a look-up database and then supplies the appropriate input power and turning duration to the motors. Correspondingly, the motors turn the wheels by means of bevel gears welded to the motor output shafts and the wheel hubs. Thus, the wheels are turned without a complicated framework of linkages, reducing the driver's effort and fatigue considerably.
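
The look-up-and-command step of the ECM can be sketched as below; the table values, the proportional gain, and the function names are illustrative assumptions, not the authors' calibration:

```python
from bisect import bisect_left

# Hypothetical look-up table mapping steering-wheel angle (deg) to the
# commanded road-wheel angle (deg); a real ECM table would be calibrated
# for the specific vehicle.
TABLE = [(-450, -35.0), (-225, -18.0), (0, 0.0), (225, 18.0), (450, 35.0)]

def target_wheel_angle(steering_deg):
    """Linearly interpolate the commanded road-wheel angle from the table."""
    xs = [x for x, _ in TABLE]
    ys = [y for _, y in TABLE]
    if steering_deg <= xs[0]:
        return ys[0]
    if steering_deg >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, steering_deg)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (steering_deg - x0) / (x1 - x0)

def motor_command(sensor_deg, current_wheel_deg, kp=2.0):
    """Proportional command to the servomotor: sign gives direction,
    magnitude gives drive effort. The gain kp is an assumption."""
    return kp * (target_wheel_angle(sensor_deg) - current_wheel_deg)

print(target_wheel_angle(112.5))  # 9.0 deg commanded wheel angle
print(motor_command(112.5, 4.0))  # positive command: turn further right
```

In a real controller this loop would run at a fixed rate, with the wheel-angle feedback closing the loop on each servomotor.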

Keywords: electronic control unit, linkage-less steering, servomotors, E-Mo Steer

Procedia PDF Downloads 249
18630 Numerical Solution to Coupled Heat and Moisture Diffusion in Bio-Sourced Composite Materials

Authors: Mnasri Faiza, El Ganaoui Mohammed, Khelifa Mourad, Gabsi Slimane

Abstract:

The main objective of this paper is to describe the hygrothermal behavior of porous construction materials under a temperature gradient. The proposed construction is a bi-layer structure composed of two different materials: the first is a bio-sourced panel named IBS-AKU (inertia system building), and the second is the Neopor material. This system (IBS-AKU/Neopor) has been developed by a Belgian company, Isohabitat. The study considers a one-dimensional multi-layer model of the IBS-AKU panel. A numerical method is then proposed, using the finite element method with a mesh refined in the areas of strong gradients. The evolution of the temperature field and of the moisture content has been computed.
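
A toy stand-in for such a coupled problem can be written with an explicit finite-difference scheme instead of the paper's finite elements; the governing form (moisture diffusion driven partly by the temperature field) and all coefficients below are illustrative, not the IBS-AKU/Neopor material data:

```python
import numpy as np

def coupled_diffusion_1d(T0, w0, D_T, D_w, coupling, dx, dt, steps):
    """Explicit finite-difference sketch of 1D coupled heat and moisture
    diffusion: dT/dt = D_T * T_xx and dw/dt = D_w * w_xx + coupling * T_xx.
    Boundary values are held fixed (Dirichlet). Illustrative only."""
    T, w = T0.copy(), w0.copy()
    for _ in range(steps):
        T_xx = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
        w_xx = (np.roll(w, -1) - 2 * w + np.roll(w, 1)) / dx**2
        T_new = T + dt * D_T * T_xx
        w_new = w + dt * (D_w * w_xx + coupling * T_xx)
        # Reset the two faces to their initial (boundary) values
        T_new[[0, -1]], w_new[[0, -1]] = T[[0, -1]], w[[0, -1]]
        T, w = T_new, w_new
    return T, w

n = 51
x = np.linspace(0.0, 0.1, n)             # 10 cm wall section
T0 = np.where(x < 0.05, 293.0, 283.0)    # warm inner layer, cold outer [K]
w0 = np.full(n, 0.10)                    # uniform moisture content [kg/kg]
T, w = coupled_diffusion_1d(T0, w0, D_T=1e-7, D_w=1e-9, coupling=1e-10,
                            dx=x[1] - x[0], dt=1.0, steps=5000)
```

The explicit scheme is stable here because dt * D_T / dx² is well below 1/2; a finite element solver with local mesh refinement, as in the paper, handles the sharp gradient at the layer interface far more efficiently.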

Keywords: heat transfer, moisture diffusion, porous media, composite IBS-AKU, simulation

Procedia PDF Downloads 494