Search results for: traditional products
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8946

486 A New Model to Perform Preliminary Evaluations of Complex Systems for the Production of Energy for Buildings: Case Study

Authors: Roberto de Lieto Vollaro, Emanuele de Lieto Vollaro, Gianluca Coltrinari

Abstract:

The building sector is responsible, in many industrialized countries, for about 40% of total energy requirements, so it seems necessary to devote some effort to this area in order to achieve a significant reduction in energy consumption and greenhouse gas emissions. The paper presents a study aiming to provide a design methodology able to identify the best configuration of the building/plant system from a technical, economic, and environmental point of view. Normally, the classical approach involves an analysis of the building's energy loads under steady-state conditions, followed by the selection of measures aimed at improving energy performance, based on the previous experience of the architects and engineers in the design team. Instead, the proposed approach uses a sequence of two well-known, scientifically validated calculation methods (TRNSYS and RETScreen) that allow quite a detailed feasibility analysis. To assess the validity of the calculation model, an existing historical building in Central Italy, which will be the object of restoration and preservative redevelopment, was selected as a case study. The building consists of a basement and three floors, with a total floor area of about 3,000 square meters. The first step was the determination of the heating and cooling energy loads of the building in a dynamic regime by means of TRNSYS, which makes it possible to simulate the real energy needs of the building as a function of its use. Traditional methodologies, based as they are on steady-state conditions, cannot faithfully reproduce the effects of varying climatic conditions and of the inertial properties of the structure. With TRNSYS it is possible to obtain quite accurate and reliable results that allow effective building-HVAC system combinations to be identified. The second step consisted of using the output data obtained with TRNSYS as input to the calculation model RETScreen, which makes it possible to compare different system configurations from the energy, environmental, and financial points of view, with an analysis of investment and of operation and maintenance costs, thus allowing the economic benefit of possible interventions to be determined. The classical methodology often leads to the choice of conventional plant systems, while RETScreen provides a financial-economic assessment of innovative energy systems with low environmental impact. Computational analysis can help in the design phase, particularly in the case of complex structures with centralized plant systems, by comparing the data returned by the calculation model RETScreen for different design options. For example, the analysis performed on the building taken as a case study found that the most suitable plant solution, taking into account technical, economic, and environmental aspects, is one based on a CCHP system (Combined Cooling, Heating, and Power) using an internal combustion engine.
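
As an illustration of the second step, the sketch below shows how annual load and cost data exported from a dynamic simulation could feed a RETScreen-style financial comparison of plant options, here reduced to a simple net present value of life-cycle cost. All figures, option names, and rates are illustrative placeholders, not results from the paper's case study.

```python
# Illustrative comparison of plant options from annual load and cost data.
# All figures are placeholders, not results from the paper's case study.

def npv(cash_flows, rate):
    """Net present value of a list of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Hypothetical plant options: investment, yearly O&M, and yearly energy cost (EUR),
# the yearly energy cost being derived from simulated heating/cooling/electric loads.
options = {
    "conventional boiler + chiller": {"capex": 300_000, "o_and_m": 15_000, "energy_cost": 52_000},
    "CCHP (internal combustion engine)": {"capex": 650_000, "o_and_m": 28_000, "energy_cost": 21_000},
}

LIFETIME_YEARS = 20
DISCOUNT_RATE = 0.05

for name, opt in options.items():
    yearly_cost = opt["o_and_m"] + opt["energy_cost"]
    cash_flows = [-opt["capex"]] + [-yearly_cost] * LIFETIME_YEARS
    print(f"{name}: NPV of life-cycle cost = {npv(cash_flows, DISCOUNT_RATE):,.0f} EUR")
```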

Keywords: energy, system, building, cooling, electrical

Procedia PDF Downloads 573
485 Status of Vocational Education and Training in India: Policies and Practices

Authors: Vineeta Sirohi

Abstract:

The development of critical skills and competencies has become imperative for young people to cope with the unpredictable challenges of the time and to prepare for work and life. Recognizing that education has a critical role in reaching sustainability goals, as emphasized by the 2030 Agenda for Sustainable Development, educating youth in global competence, meta-cognitive competencies, and skills from the initial stages of formal education is vital. Further, educating for global competence would help in developing work readiness and boost employability. Vocational education and training in India, as envisaged in various policy documents, remains marginalized in practice compared to general education. The country is still far away from the national policy goal of placing 25% of secondary students in grades eleven and twelve in the vocational stream. In recent years, the importance of skill development has been recognized in the present context of globalization and change in the demographic structure of the Indian population. As a result, it has become a national policy priority and has been taken up with renewed focus by the government, which has set the target of skilling 500 million people by 2022. This paper provides an overview of the policies, practices, and current status of vocational education and training in India, supported by statistics from the National Sample Survey, the official statistics of India. The national policy documents and annual reports of the organizations actively involved in vocational education and training have also been examined to capture relevant data and information. The paper also highlights major initiatives taken by the government to promote skill development. The data indicate that in the age group 15-59 years, only 2.2 percent reported having received formal vocational training and 8.6 percent non-formal vocational training, whereas 88.3 percent did not receive any vocational training. At present, the coverage of vocational education is abysmal, as less than 5 percent of students are covered by the vocational education programme. Besides launching various schemes to address the mismatch of skills supply and demand, the government, through its National Policy on Skill Development and Entrepreneurship 2015, proposes to bring about inclusivity by bridging the gender, social, and sectoral divides, ensuring that the skilling needs of socially disadvantaged and marginalized groups are appropriately addressed. It is fundamental that the curriculum be aligned with the demands of the labor market, incorporating more entrepreneurial skills. Creating non-farm employment opportunities for educated youth will be a challenge for the country in the near future. Hence, there is a need to formulate specific skill development programs for this sector and also programs for upgrading skills to enhance employability. There is a need to promote female participation in work and in non-traditional courses. Moreover, rigorous research and the development of a robust information base for skills are required to inform policy decisions on vocational education and training.

Keywords: policy, skill, training, vocational education

Procedia PDF Downloads 153
484 Impact of Customer Experience Quality on Loyalty of Mobile and Fixed Broadband Services: Case Study of Telecom Egypt Group

Authors: Nawal Alawad, Passent Ibrahim Tantawi, Mohamed Abdel Salam Ragheb

Abstract:

Providing customers with quality experiences has been confirmed to be a sustainable competitive advantage with a distinct financial impact for companies. The success of service providers now relies on their ability to provide customer-centric services. The importance of perceived service quality and customer experience is widely recognized. The focus of this research is the area of mobile and fixed broadband services. This study is of dual importance, both academically and practically. Academically, this research applies a new model investigating the impact of customer experience quality on loyalty, based on modifying the multiple-item scale for measuring customers' service experience in a new area rather than depending on traditional models. The integrated scale embraces four dimensions: service experience, outcome focus, moments of truth, and peace of mind. In addition, it gives a scientific explanation for this relationship, filling a gap in the literature: no previous work has correlated or explained these relations using such an integrated model, and this is the first time such a modified and integrated model has been applied in the telecom field. Practically, this research gives marketers and practitioners insights into improving customer loyalty by developing the experience quality of broadband customers, which translates into the suggested outcomes of purchase, commitment, repeat purchase, and word-of-mouth; this approach is one of the emerging topics in service marketing. Data were collected through 412 questionnaires and analyzed using structural equation modeling. Findings revealed that both outcome focus and moments of truth have a significant impact on loyalty, while both service experience and peace of mind have an insignificant impact on loyalty. In addition, it was found that 72% of the variation in loyalty is explained by the model. The researcher also measured the Net Promoter Score and gave an explanation of the results, and assessed customers' priorities for broadband services. The researcher recommends that the findings of this research be considered in the future plans of Telecom Egypt Group, and that they be applied in the same industry, especially in developing countries that share similar circumstances and service settings. This research is a positive contribution to service marketing, particularly in the telecom industry, for making marketing more reliable, as managers can relate investments in service experience directly to the performance measures closest to income, for instance repurchasing behavior, positive word of mouth, and commitment. Finally, the researcher recommends that future studies consider this model to explain significant marketing outcomes such as share of wallet and, ultimately, profitability.
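
For reference, the Net Promoter Score mentioned above is conventionally computed from 0-10 likelihood-to-recommend ratings as the percentage of promoters (9-10) minus the percentage of detractors (0-6). The sketch below uses made-up ratings, not the survey data of this study.

```python
# Conventional Net Promoter Score calculation (illustrative ratings only).
ratings = [10, 9, 8, 7, 10, 3, 6, 9, 2, 10, 8, 5]  # 0-10 likelihood-to-recommend answers

promoters = sum(1 for r in ratings if r >= 9)
detractors = sum(1 for r in ratings if r <= 6)
nps = 100 * (promoters - detractors) / len(ratings)

print(f"NPS = {nps:.1f}")  # ranges from -100 to +100
```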

Keywords: broadband services, customer experience quality, loyalty, net promoter score

Procedia PDF Downloads 267
483 Multilocal Youth and the Berlin Digital Industry: Productive Leisure as a Key Factor in European Migration

Authors: Stefano Pelaggi

Abstract:

The research is focused on youth labor and mobility in Berlin. Mobility has become a common denominator in our daily lives, but it is not primarily driven by monetary incentives. Labor, knowledge, and leisure overlap on this point, as cities try to attract people who could participate in the production of innovations while the new migrants experience the lifestyle of the host cities. The research presents an empirical study of Italian workers in the digital industry in Berlin, trying to underline the connection between pleasure and leisure and the choice of a life abroad. Berlin has become the epicenter of the European Internet start-up scene, but people suited to work in digital industries are not moving to Berlin to make a career; most of them are attracted to the city for different reasons. This point makes a clear exception to traditional migration flows, which have always originated from a specific search for employment opportunities or from strong ties, usually families, in a place that could guarantee success in finding a job. Even skilled migration has always originated from a specific need: finding the right path for a successful professional life. In a society where a lack of free time in our calendars seems to be something to be ashamed of, the actors of youth mobility incorporate some categories of experiential tourism within their own life paths. The professional aspirations and lifestyle choices of the protagonists of youth mobility are geared towards meeting the desires and aspirations that define leisure. While most creative workplaces, in particular digital industries, use the category of fun as a primary element of corporate policy, virtually extending working time to the whole day, more and more people around the world are deciding their path in life and their career choices on the basis of indicators linked to the realization of the self, which may include factors like a warm climate or cultural environment, all indicators that are usually excluded from the hegemonic approach to labor. The interpretative framework commonly used seems to be mostly focused on a dualism between Florida's theories and those who highlight the absence of conflict in his studies. While the flexibility of the new creative industries is minimizing leisure, incorporating elements of leisure itself in work activities, more people choose their own path of life by placing great importance on basic needs, through a gaze on pleasure that is only partially driven by consumption. Multi-localism is the co-existence of different identities and cultures that do not conflict because they reject being bound to a territory. The local loses its strength of opposition to the global, with an attenuation of the whole concept of citizenship, territory, and even integration. A similar perspective could be useful in the search for a new approach to studies dedicated to the gentrification process, while studying the new migration flows.

Keywords: brain drain, digital industry, leisure and gentrification, multi localism

Procedia PDF Downloads 243
482 Innovation Mechanism in Developing Cultural and Creative Industries

Authors: Liou Shyhnan, Chia Han Yang

Abstract:

The study aims to investigate the promotion of innovation in the development of cultural and creative industries (CCI) and to apply research on culture and creativity to this promotion. Using the research perspectives of culture and creativity as starting points, this study examines the challenges, trends, and opportunities that have emerged from the development of the CCI until the present. It is found that a definite context of cause and effect exists between them, and that a homologous theoretical basis can be used to understand and interpret them. Based on the characteristics of the aforementioned challenges and trends, this study has compiled two main theoretical systems for conducting research on culture and creativity: (i) the reciprocal process between creativity and culture, and (ii) a mechanism for innovation involving multicultural convergence. Both theoretical systems were then used as the foundation to arrive at possible research propositions relating to the two developmental systems. This was done, respectively, through identification of the theoretical context via a literature review, and through interviews and observations of actual case studies within Taiwan's CCI. In so doing, the critical factors that can address the aforementioned challenges and trends were discovered. Our results indicate that, for the reciprocal process between creativity and culture, culture serves as a creative resource in the cultural and creative industries. According to shared consensus, culture provides symbolic meanings and emotional attachment for the products and experiences offered by the CCI. Moreover, cultures vary in their effects on creative processes and standards, thus engendering distinctive preferences for and evaluations of the creative expressions and experiences of CCIs. In addition, we find that creativity serves as the engine driving the continuation and rebirth of cultures: with culture at the core, digital technology, design thinking, and business models are critical constituents of the innovation mechanism that promotes cultural continuity. Regarding the preservation and regeneration of local spaces and folk customs, we argue that these must embody the interactive experiences of present-day life; cultural spaces and folk customs regenerate through interaction and experience in modern life. Regarding the innovation mechanism for multicultural convergence, we propose that innovation stakeholders from different disciplines in CCIs (e.g., creators, designers, engineers, and marketers) rely on the establishment of a co-creation mechanism to promote interdisciplinary interaction, and that CCI development therefore needs such a co-creation mechanism to enhance interdisciplinary collaboration among innovation stakeholders. We further argue that multicultural mixing enhances innovation in developing CCI, and that assuming an open and mutually enlightening attitude to enrich one another's cultures in the multicultural exchanges under globalization will create diversity in otherwise homogeneous CCIs.
Finally, for promoting innovation in the development of cultural and creative industries, we propose a model for joint knowledge creation that can be established to enhance the mutual reinforcement of theoretical and practical research on culture and creativity.

Keywords: culture and creativity, innovation, cultural and creative industries, cultural mixing

Procedia PDF Downloads 325
481 Understanding the Impact of Out-of-Sequence Thrust Dynamics on Earthquake Mitigation: Implications for Hazard Assessment and Disaster Planning

Authors: Rajkumar Ghosh

Abstract:

Earthquakes pose significant risks to human life and infrastructure, highlighting the importance of effective earthquake mitigation strategies. Traditional earthquake modelling and mitigation efforts have largely focused on the primary fault segments and their slip behaviour. However, earthquakes can exhibit complex rupture dynamics, including out-of-sequence thrust (OOST) events, which occur on secondary or subsidiary faults. This study examines the impact of OOST dynamics on earthquake mitigation strategies and their implications for hazard assessment and disaster planning. OOST events challenge conventional seismic hazard assessments by introducing additional fault segments and potential rupture scenarios that were previously unrecognized or underestimated. Consequently, these events may increase the overall seismic hazard in affected regions. The study reviews recent case studies and research findings that illustrate the occurrence and characteristics of OOST events. It explores the factors contributing to OOST dynamics, such as stress interactions between fault segments, fault geometry, and the mechanical properties of fault materials. Moreover, it investigates the potential triggers and precursory signals associated with OOST events to enhance early warning systems and emergency response preparedness. The study also highlights the significance of incorporating OOST dynamics into seismic hazard assessment methodologies. It discusses the challenges associated with accurately modelling OOST events, including the need for an improved understanding of fault interactions, stress transfer mechanisms, and rupture propagation patterns. Additionally, it explores the potential of advanced geophysical techniques, such as high-resolution imaging and seismic monitoring networks, to detect and characterize OOST events. Furthermore, it emphasizes the practical implications of OOST dynamics for earthquake mitigation strategies and urban planning. It addresses the need to revise building codes, land-use regulations, and infrastructure designs to account for the increased seismic hazard associated with OOST events. It also underscores the importance of public awareness campaigns to educate communities about the potential risks and safety measures specific to OOST-induced earthquakes. This work sheds light on the impact of out-of-sequence thrust dynamics on earthquake mitigation. By recognizing and understanding OOST events, researchers, engineers, and policymakers can improve hazard assessment methodologies, enhance early warning systems, and implement effective mitigation measures. By integrating knowledge of OOST dynamics into urban planning and infrastructure development, societies can strive for greater resilience in the face of earthquakes, ultimately minimizing the potential for loss of life and infrastructure damage.

Keywords: earthquake mitigation, out-of-sequence thrust, seismic, satellite imagery

Procedia PDF Downloads 88
480 Edible Active Antimicrobial Coatings onto Plastic-Based Laminates and Its Performance Assessment on the Shelf Life of Vacuum Packaged Beef Steaks

Authors: Andrey A. Tyuftin, David Clarke, Malco C. Cruz-Romero, Declan Bolton, Seamus Fanning, Shashi K. Pankaj, Carmen Bueno-Ferrer, Patrick J. Cullen, Joe P. Kerry

Abstract:

Prolonging shelf-life is essential in order to address issues such as supplier demands across continents, economic profit, customer satisfaction, and the reduction of food wastage. Smart packaging solutions presented in the form of naturally derived antimicrobially active packaging may be an answer to these and other issues. A gelatin film-forming solution with the addition of naturally sourced antimicrobials is a promising tool for active smart packaging. The objective of this study was to coat conventional hydrophobic plastic packaging material with a hydrophilic antimicrobially active beef gelatin coating and to conduct shelf-life trials on beef sub-primal cuts. The minimum inhibitory concentrations (MIC) of caprylic acid sodium salt (SO) and commercially available Auranta FV (AFV) (a bitter orange extract with a mixture of nutritive organic acids) were found to be 1% and 1.5%, respectively, against the bacterial strains Bacillus cereus, Pseudomonas fluorescens, Escherichia coli, Staphylococcus aureus and aerobic and anaerobic beef microflora. Therefore, SO or AFV was incorporated into the beef gelatin film-forming solution at twice the MIC, and the solution was coated onto a conventional LDPE/PA film on the inner, cold-plasma-treated polyethylene surface. Beef samples were vacuum packed in this material, stored under chilled conditions, and sampled at weekly intervals during the 42-day shelf-life study. No significant differences (p < 0.05) in cook loss were observed among the different treatments compared to control samples until day 29; only for the AFV-coated beef samples was cook loss 3% higher (37.3%) than the control (34.4%) on day 36. It was found that the antimicrobial films did not protect the beef against discoloration. SO-containing packages significantly (p < 0.05) reduced total viable bacterial counts (TVC) compared to the control and AFV samples until day 35. No significant reduction in TVC was observed between SO and AFV films on day 42, but a significant difference was observed compared to control samples, with a 1.40 log reduction in bacteria on day 42. AFV films significantly (p < 0.05) reduced TVC compared to control samples from day 14 until day 42. Control samples reached the set value of 7 log CFU/g on day 27 of testing, whereas AFV films did not reach this limit until day 35 and SO films until day 42. The antimicrobial AFV- and SO-coated films thus significantly prolonged the shelf-life of the beef steaks by 33% or 55% (7 and 14 days, respectively) compared to control film samples. It is concluded that antimicrobial coated films were successfully developed by coating the inner polyethylene layer of conventional LDPE/PA laminated films after plasma surface treatment. The results indicated that the use of antimicrobial active packaging coated with SO or AFV significantly (p < 0.05) increased the shelf life of the beef sub-primals. Overall, AFV- or SO-containing gelatin coatings have the potential to be used as effective antimicrobials in active packaging applications for muscle-based food products.

Keywords: active packaging, antimicrobials, edible coatings, food packaging, gelatin films, meat science

Procedia PDF Downloads 303
479 European Electromagnetic Compatibility Directive Applied to Astronomical Observatories

Authors: Oibar Martinez, Clara Oliver

Abstract:

The Cherenkov Telescope Array (CTA) project aims to build two observatories of Cherenkov telescopes, located at Cerro del Paranal, Chile, and La Palma, Spain. These facilities are used in this paper as a case study to investigate how to apply the standard Directive on Electromagnetic Compatibility to astronomical observatories. Cherenkov telescopes are able to provide valuable information on both Galactic and extragalactic sources by measuring Cherenkov radiation, which is produced by particles that travel faster than light does in the atmosphere. The construction requirements demand compliance with the European Electromagnetic Compatibility Directive. The largest telescopes of these observatories, the Large-Sized Telescopes (LSTs), are high-precision instruments with advanced photomultipliers able to detect the faint sub-nanosecond blue light pulses produced by Cherenkov radiation. They have a 23-meter parabolic reflective surface. This surface focuses the radiation onto a camera composed of an array of high-speed photosensors which are highly sensitive to radio spectrum pollution. The camera has a field of view of about 4.5 degrees and has been designed for maximum compactness and the lowest weight, cost, and power consumption. Each pixel incorporates a photosensor able to discriminate single photons and the corresponding readout electronics. The first LST is already commissioned and is intended to be operated as a service to the scientific community. Because of this, it must comply with a series of reliability and functional requirements and must carry a Conformité Européenne (CE) marking. This demands compliance with Directive 2014/30/EU on electromagnetic compatibility. The main difficulty in accomplishing this goal resides in the fact that CE marking setups and procedures were implemented for industrial products, whereas no clear protocols have been defined for scientific installations. In this paper, we aim to answer the question of how the directive should be applied to our installation to guarantee the fulfillment of all the requirements and the proper functioning of the telescope itself. Experts in both optics and electromagnetism were needed to make these kinds of decisions and to adapt tests, originally designed for equipment of limited dimensions, to large scientific plants. An analysis of the elements and configurations most likely to be affected by external interference, and of those most likely to cause the greatest disturbances, was also performed. Obtaining the CE mark requires knowing what the harmonized standards are and how the elaboration of the specific requirements is defined. For this type of large installation, one needs to adapt and develop the tests to be carried out. In addition, throughout this process, certification entities and notified bodies play a key role in preparing and agreeing on the required technical documentation. We have focused our attention mostly on the technical aspects of each point. We believe that this contribution will be of interest to other scientists involved in applying industrial quality assurance standards to large scientific plants.

Keywords: CE marking, electromagnetic compatibility, european directive, scientific installations

Procedia PDF Downloads 110
478 Predicting Loss of Containment in Surface Pipeline using Computational Fluid Dynamics and Supervised Machine Learning Model to Improve Process Safety in Oil and Gas Operations

Authors: Muhammmad Riandhy Anindika Yudhy, Harry Patria, Ramadhani Santoso

Abstract:

Loss of containment is the primary hazard that process safety management is concerned with in the oil and gas industry. Escalation to more serious consequences begins with the loss of containment, starting with an oil and gas release from leakage or spillage from primary containment, resulting in pool fires, jet fires, and even explosions when the release meets various ignition sources in operations. Therefore, the heart of process safety management is avoiding loss of containment and mitigating its impact through the implementation of safeguards. The most effective safeguard in this case is an early detection system that alerts Operations to take action prior to a potential loss of containment. The value of a detection system increases when it is applied to a long surface pipeline, which is naturally difficult to monitor at all times and is exposed to multiple causes of loss of containment, from natural corrosion to illegal tapping. Based on prior research and studies, detecting loss of containment accurately in a surface pipeline is difficult. The trade-off between cost-effectiveness and high accuracy has been the main issue when selecting a traditional detection method. The current best-performing method, the Real-Time Transient Model (RTTM), requires analysis of closely positioned pressure, flow and temperature (PVT) points in the pipeline to be accurate. Having multiple adjacent PVT sensors along the pipeline is expensive and hence generally not a viable alternative from an economic standpoint. A conceptual approach combining mathematical modeling using computational fluid dynamics with a supervised machine learning model has shown promising results in predicting leakage in the pipeline. Mathematical modeling is used to generate simulation data, and these data are used to train the leak detection and localization models. Mathematical models and simulation software have also been shown to provide results comparable with experimental data with very high levels of accuracy. While a supervised machine learning model requires a large training dataset for the development of accurate models, mathematical modeling has been shown to be able to generate the required datasets, justifying the application of data analytics for the development of model-based leak detection systems for petroleum pipelines. This paper presents a review of key leak detection strategies for oil and gas pipelines, with a specific focus on crude oil applications, and presents the opportunities for the use of data analytics tools and mathematical modeling in the development of a robust real-time leak detection and localization system for surface pipelines. A case study is also presented.
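
A minimal sketch of the proposed combination, assuming the CFD model has already produced labelled pressure/flow/temperature (PVT) records for leak and no-leak scenarios. The synthetic data generator, feature choices, and classifier below are illustrative stand-ins, not the authors' model or dataset.

```python
# Sketch: train a supervised leak classifier on simulated PVT records.
# The random data below stands in for CFD-generated scenarios; it is not the paper's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 2000

# Features per record: inlet pressure (bar), outlet pressure (bar), flow rate (m3/h), temperature (degC).
no_leak = np.column_stack([
    rng.normal(50, 1, n), rng.normal(48, 1, n), rng.normal(120, 3, n), rng.normal(40, 1, n)])
leak = np.column_stack([
    rng.normal(50, 1, n), rng.normal(45, 1.5, n), rng.normal(112, 4, n), rng.normal(40, 1, n)])

X = np.vstack([no_leak, leak])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 1 = loss of containment

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test), target_names=["no leak", "leak"]))
```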

Keywords: pipeline, leakage, detection, AI

Procedia PDF Downloads 191
477 An Algebraic Geometric Imaging Approach for Automatic Dairy Cow Body Condition Scoring System

Authors: Thi Thi Zin, Pyke Tin, Ikuo Kobayashi, Yoichiro Horii

Abstract:

Today, dairy farm experts and farmers well recognize the importance of the dairy cow Body Condition Score (BCS), since these scores can be used to optimize milk production and feeding management, serve as an indicator of health abnormalities, and even help manage healthy calving times and processes. Traditionally, BCS measurements are made by animal experts or trained technicians based on visual observations focusing on the pin bones, pin, thurl and hook areas, tail head shape, hook angles, and short and long ribs. Since the traditional technique is manual and subjective, the results can vary between scorers, and the process is not cost-effective. Thus, this paper proposes an algebraic geometric imaging approach for an automatic dairy cow BCS system. The proposed system consists of three functional modules. In the first module, significant landmarks or anatomical points are automatically extracted from the cow image region using image processing techniques. Specifically, there are 23 anatomical points in the regions of the ribs, hook bones, pin bone, thurl, and tail head. These points are extracted using block-region-based vertical and horizontal histogram methods. According to animal experts, the body condition score depends mainly on the shape structure of these regions. Therefore, the second module investigates algebraic and geometric properties of the extracted anatomical points. Specifically, a second-order polynomial regression is fitted to a subset of anatomical points to produce regression coefficients, which are utilized as part of the feature vector in the scoring process. In addition, the angles at the thurl, pin, tail head, and hook bone areas are computed to extend the feature vector. Finally, in the third module, the extracted feature vectors are used to train a Markov classification process that assigns a BCS to individual cows. The assigned BCS values are then revised using a multiple regression method to produce the final BCS for each dairy cow. In order to confirm the validity of the proposed method, a monitoring video camera was set up at the rotary milking parlor to take top-view images of the cows. The proposed method extracts the key anatomical points and the corresponding feature vectors for each individual cow, and the multiple regression calculator and Markov chain classification process are then utilized to produce the estimated body condition score for each cow. The experimental results, obtained on 100 dairy cows from a self-collected dataset and a public benchmark dataset, are very promising, with an accuracy of 98%.
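
As a sketch of the second module, assuming the anatomical points have already been extracted as (x, y) pixel coordinates, the second-order polynomial fit and one angle computation could look as follows. The coordinates and the choice of points are invented for illustration, not taken from the study.

```python
# Sketch: build part of a BCS feature vector from extracted anatomical points.
# The (x, y) coordinates below are invented; in the system they come from image processing.
import numpy as np

# Subset of anatomical points along a hook-thurl-pin region (pixel coordinates).
points = np.array([[100, 240], [130, 225], [160, 218], [190, 221], [220, 235], [250, 258]])
x, y = points[:, 0], points[:, 1]

# Second-order polynomial regression: y = c2*x^2 + c1*x + c0.
# The coefficients describe the region's shape and form part of the feature vector.
coeffs = np.polyfit(x, y, deg=2)

def angle_at(p_prev, p_vertex, p_next):
    """Angle (degrees) formed at p_vertex by the two neighbouring points."""
    v1, v2 = p_prev - p_vertex, p_next - p_vertex
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: an angle at the middle point, appended to the shape coefficients.
thurl_angle = angle_at(points[1], points[2], points[3])
feature_vector = np.concatenate([coeffs, [thurl_angle]])
print(feature_vector)
```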

Keywords: algebraic geometric imaging approach, body condition score, Markov classification, polynomial regression

Procedia PDF Downloads 158
476 Development of an Improved Paradigm for the Tourism Sector in the Department of Huila, Colombia: A Theoretical and Empirical Approach

Authors: Laura N. Bolivar T.

Abstract:

The importance of tourism for regional development is mainly highlighted by the collaborative, cooperative, and competitive relationships of the agents involved. The fostering of associativity processes, in particular the cluster approach, emphasizes the beneficial outcomes of the concentration of enterprises, where innovation and entrepreneurship flourish and shape the dynamics for tourism empowerment. The department of Huila is located in the south-west of Colombia and holds the biggest coffee production in the country, although it barely contributes to the national GDP. Hence, its economic development strategy is looking for more dynamism, and Huila could be consolidated as a leading destination for cultural, ecological, and heritage tourism if, at least, the public policy-making processes for the tourism management of La Tatacoa Desert, San Agustin Park, and Bambuco's National Festival were implemented in a more efficient manner. Along these lines, this study attempts to address the potential restrictions and beneficial factors for the consolidation of the tourism sector of Huila, Colombia, as a cluster, and how this could impact its regional development. Therefore, a set of theoretical frameworks, such as the Tourism Routes Approach, the Tourism Breeding Environment and the Community-Based Tourism Method, among others, together with a collection of international experiences describing tourism clustering processes and their most outstanding problems, is analyzed to draw up learning points, structures of proceedings, and success-driven factors to be contrasted with the local characteristics of Huila as the region under study. This characterization involves primary and secondary information collection methods and comprises the South American and Colombian context together with the identification of the actors involved and their roles, the main interactions among them, the major tourism products and their infrastructure, the visitors' perspective on the situation, and a recap of the related needs and benefits regarding the host community. Considering the umbrella concepts, the theoretical and empirical approaches, and their comparison with the local specificities of the tourism sector in Huila, an array of shortcomings is analytically constructed, and a series of guidelines is proposed as a way to overcome them and, simultaneously, raise economic development and positively impact Huila's well-being. This non-exhaustive bundle of guidelines is focused on fostering cooperative linkages in the actors' network, dealing with Information and Communication Technology innovations, reinforcing the supporting infrastructure, promoting the destinations while considering the lesser-known places as well, designing an information system enabling the tourism network to assess the situation based on reliable data, increasing competitiveness, developing participative public policy-making processes, and raising the host community's awareness of its touristic richness. Accordingly, cluster dynamics would drive the tourism sector towards articulation and joint effort, and the agents involved and local particularities would be adequately assisted to cope with the current changing environment of globalization and competition.

Keywords: innovative strategy, local development, network of tourism actors, tourism cluster

Procedia PDF Downloads 141
475 Physico-Mechanical Behavior of Indian Oil Shales

Authors: K. S. Rao, Ankesh Kumar

Abstract:

The search for alternative energy sources to petroleum has intensified because of increasing demand and the depletion of petroleum reserves. The importance of oil shales as an economically viable substitute has therefore increased manifold in the last 20 years. Technologies like hydro-fracturing have opened the field of oil extraction from these unconventional rocks. Oil shale is a compact, laminated rock of sedimentary origin containing organic matter known as kerogen, which yields oil when distilled. Oil shales are formed from the contemporaneous deposition of fine-grained mineral debris and organic degradation products derived from the breakdown of biota. Conditions required for the formation of oil shales include abundant organic productivity, early development of anaerobic conditions, and a lack of destructive organisms. These rocks have not gone through high-temperature and high-pressure conditions in nature. The most common approach to oil extraction is drastically breaking the bonds of the organic matter, which involves a retorting process. The two approaches to retorting are surface retorting and in-situ processing. The most environmentally friendly approach to extraction is in-situ processing. The three steps involved in this process are fracturing, injection to achieve communication, and fluid migration at the underground location. Upon heating (retorting) oil shale at temperatures in the range of 300 to 400°C, the kerogen decomposes into oil, gas, and residual carbon in a process referred to as pyrolysis. Therefore, it is very important to understand the physico-mechanical behavior of such rocks in order to improve the technology for in-situ extraction. It is clear from past research and from physical observations that these rocks behave as anisotropic rocks, so it is very important to understand their mechanical behavior under high pressure at different orientation angles for the economical use of these resources. Knowing the engineering behavior under the above conditions will allow us to simulate deep-ground retorting conditions numerically and experimentally. Many researchers have investigated the effect of organic content on the engineering behavior of oil shale, but the coupled effect of the organic and inorganic matrix is yet to be analyzed. The favourable characteristics of Assam coal for conversion to liquid fuels have been known for a long time. Studies have indicated that these coals and carbonaceous shales constitute the principal source rocks that have generated the hydrocarbons produced from the region. Rock cores of representative samples are collected by on-site drilling, as coring in the laboratory is very difficult due to the rock's highly anisotropic nature. Different tests are performed to understand the petrology of these samples; further, chemical analyses are carried out to quantify the organic content in these rocks exactly. The mechanical properties of these rocks are investigated by considering different anisotropy angles. The results obtained from petrology and chemical analysis are then correlated with the mechanical properties. These properties and correlations will further help in increasing the producibility of these rocks. It is well established that organic content is negatively correlated with tensile strength, compressive strength, and modulus of elasticity.

Keywords: oil shale, producibility, hydro-fracturing, kerogen, petrology, mechanical behavior

Procedia PDF Downloads 347
474 Is Materiality Determination the Key to Integrating Corporate Sustainability and Maximising Value?

Authors: Ruth Hegarty, Noel Connaughton

Abstract:

Sustainability reporting has become a priority for many global multinational companies. This is associated with ever-increasing expectations from key stakeholders for companies to be transparent about their strategies, activities, and management with regard to sustainability issues. The Global Reporting Initiative (GRI) encourages reporters to provide information only on the issues that are really critical in order to achieve the organisation's goals for sustainability and manage its impact on the environment and society. A key challenge for most reporting organisations is how to identify relevant issues for sustainability reporting and prioritise those material issues in accordance with company and stakeholder needs. A recent study indicates that most of the largest companies listed on the world's stock exchanges are failing to provide data on key sustainability indicators such as employee turnover, energy, greenhouse gas emissions (GHGs), injury rate, pay equity, waste, and water. This paper takes an in-depth look at the approaches used by a select number of international corporate sustainability leaders to identify key sustainability issues. The research methodology involves a detailed analysis of the sustainability report content of up to 50 companies listed on the 2014 Dow Jones Sustainability Indices (DJSI). The most recent sustainability report content found on the GRI Sustainability Disclosure Database is then compared with 91 GRI Specific Standard Disclosures and a small number of GRI Standard Disclosures. Preliminary research indicates significant gaps between the information disclosed in corporate sustainability reports and the indicator content specified in the GRI Content Index. The following outlines some of the key findings to date. Most companies made a partial disclosure with regard to the economic indicators of climate change risks and infrastructure investments, but did not focus on the associated negative impacts. The top environmental indicators disclosed were energy consumption and reductions, GHG emissions, water withdrawals, waste, and compliance. The lowest rates of indicator disclosure included biodiversity, water discharge, mitigation of the environmental impacts of products and services, transport, environmental investments, screening of new suppliers, and supply chain impacts. The top social indicators disclosed were new employee hires, rates of injury, freedom of association in operations, child labour, and forced labour. Lower disclosure rates were reported for employee training, composition of governance bodies and employees, political contributions, corruption, and fines for non-compliance. Reporting on most other social indicators was found to be poor. In addition, most companies give only a brief explanation of how material issues are defined, identified, and ranked. Data on the identification of key stakeholders and on the degree and nature of engagement for determining issues and their weightings are also lacking. Generally, little to no data is provided on the algorithms used to score an issue. The research indicates that most companies lack a rigorous and thorough methodology to systematically determine the material issues of sustainability reporting in accordance with company and stakeholder needs.

Keywords: identification of key stakeholders, material issues, sustainability reporting, transparency

Procedia PDF Downloads 306
473 Case Study on Innovative Aquatic-Based Bioeconomy for Chlorella sorokiniana

Authors: Iryna Atamaniuk, Hannah Boysen, Nils Wieczorek, Natalia Politaeva, Iuliia Bazarnova, Kerstin Kuchta

Abstract:

Over the last decade, due to climate change and a strategy of natural resource preservation, interest in aquatic biomass has dramatically increased. Along with mitigating environmental pressure and connecting waste streams (including CO2 and heat emissions), the microalgae bioeconomy can supply the food, feed, pharmaceutical, and power industries with a number of value-added products. Furthermore, in comparison to conventional biomass, microalgae can be cultivated over a wide range of conditions without compromising food and feed production, thus addressing issues associated with negative social and environmental impacts. This paper presents the state-of-the-art technology for the microalgae bioeconomy, from the cultivation process to the production of valuable components and by-streams. The microalga Chlorella sorokiniana was cultivated in a pilot-scale innovation concept in Hamburg (Germany) using different systems, namely a raceway pond (5,000 L) and flat-panel reactors (8 × 180 L). In order to achieve optimum growth conditions along with a cellular composition suitable for the subsequent extraction of the value-added components, process parameters such as light intensity, temperature, and pH are continuously monitored. Metabolic nutrient needs are met by the addition of micro- and macronutrients to the medium to ensure autotrophic growth conditions for the microalgae. Cultivation is followed by downstream processing and the extraction of lipids, proteins, and saccharides. Lipid extraction is conducted in repeated-batch, semi-automatic mode using the hot extraction method according to Randall. Hexane and ethanol are used as solvents at ratios of 9:1 and 1:9, respectively. Depending on the cell disruption method and the solvent ratio, the total lipid content showed significant variation between 8.1% and 13.9%. The highest percentage of extracted biomass was reached with a sample pretreated with microwave digestion using 90% hexane and 10% ethanol as solvents. Protein content in the microalgae was determined by two different methods, namely Total Kjeldahl Nitrogen (TKN), which was further converted to protein content, and the Bradford method using Brilliant Blue G-250 dye. The results showed a good correlation between the two methods, with protein content in the range of 39.8–47.1%. Characterization of neutral and acid saccharides from the microalgae was conducted by the phenol-sulfuric acid method at two wavelengths, 480 nm and 490 nm. The average concentrations of neutral and acid saccharides under the optimal cultivation conditions were 19.5% and 26.1%, respectively. Subsequently, biomass residues are used as a substrate for anaerobic digestion at the laboratory scale. The methane concentration, measured on a daily basis, showed some variation between samples after the different extraction steps but remained in the range of 48% to 55%. The CO2 formed during the fermentation process and after combustion in the Combined Heat and Power unit can potentially be used within the cultivation process as a carbon source for the photoautotrophic synthesis of biomass.
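
For the TKN route, conversion to crude protein is conventionally done by multiplying total Kjeldahl nitrogen by a nitrogen-to-protein factor. The sketch below uses the generic factor of 6.25, which is an assumption here: the abstract does not state which factor was applied, and the sample value is illustrative.

```python
# Sketch: convert Total Kjeldahl Nitrogen (TKN) to crude protein content.
# The 6.25 nitrogen-to-protein factor is the generic convention, assumed here;
# the TKN value is illustrative, not a measurement from this study.
N_TO_PROTEIN = 6.25

tkn_percent_dw = 7.0  # % nitrogen of dry weight (illustrative)
crude_protein_percent = tkn_percent_dw * N_TO_PROTEIN
print(f"Crude protein = {crude_protein_percent:.1f} % of dry weight")
```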

Keywords: bioeconomy, lipids, microalgae, proteins, saccharides

Procedia PDF Downloads 245
472 Developing a GIS-Based Tool for the Management of Fats, Oils, and Grease (FOG): A Case Study of Thames Water Wastewater Catchment

Authors: Thomas D. Collin, Rachel Cunningham, Bruce Jefferson, Raffaella Villa

Abstract:

Fats, oils and grease (FOG) are by-products of food preparation and cooking processes. FOG enters wastewater systems from a variety of sources such as households, food service establishments, and industrial food facilities. Over time, if no source control is in place, FOG builds up on pipe walls, leading to blockages and potentially to sewer overflows, which are a major risk to the environment and human health. UK water utilities spend millions of pounds annually trying to control FOG. Although UK legislation specifies that discharge of such material is against the law, it is often complicated for water companies to identify and prosecute offenders. This leads to uncertainty regarding the approach to take to FOG management. Research is needed to seize the full potential of implementing current practices. The aim of this research was to undertake a comprehensive study to document the extent of FOG problems in sewer lines and reinforce existing knowledge. Data were collected to develop a model estimating the quantities of FOG available for recovery within Thames Water wastewater catchments. Geographical Information System (GIS) software was used in conjunction with the model to integrate data with a geographical component. FOG was responsible for at least one third of sewer blockages in the Thames Water waste area. A waste-based approach was developed through an extensive review to estimate the potential for FOG collection and recovery. Three main sources were identified: residential, commercial, and industrial. Commercial properties were identified as one of the major FOG producers. The total potential FOG generated was estimated for the 354 wastewater catchments. Additionally, raw and settled sewage were sampled and analysed for FOG (as hexane extractable material) monthly at 20 sewage treatment works (STW) for three years. A good correlation was found between the sampled FOG and the population equivalent (PE). On average, a difference of 43.03% was found between the estimated FOG (waste-based approach) and the sampled FOG (raw sewage sampling). It was suggested that the approach undertaken could overestimate the FOG available, that the sampling could capture only a fraction of the FOG arriving at the STW, and/or that the difference could be accounted for by FOG accumulating in sewer lines. Furthermore, it was estimated that, on average, FOG could contribute up to 12.99% of the primary sludge removed. The model was further used to investigate the relationship between estimated FOG and the number of blockages: the higher the FOG potential, the higher the number of FOG-related blockages. The GIS-based tool was used to identify critical areas (i.e., areas with high FOG potential and a high number of FOG blockages). As reported in the literature, FOG was one of the main causes of sewer blockages. By identifying these critical areas, the model further explored the potential for source control in terms of 'sewer relief' and waste recovery, thereby helping to target where the benefits of implementing management strategies could be highest. However, FOG is still likely to persist throughout the networks, and further research is needed to assess downstream impacts (i.e., at the STW).
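
A minimal sketch of the waste-based estimate and its comparison with blockage records, assuming per-source FOG generation factors and source counts per catchment are available. All counts, factors, and blockage numbers below are invented placeholders, not Thames Water data.

```python
# Sketch: waste-based FOG potential per catchment and its correlation with blockages.
# Source counts, per-source FOG factors, and blockage numbers are invented placeholders.
import numpy as np

catchments = ["A", "B", "C", "D"]
households = np.array([12000, 45000, 8000, 30000])
food_service = np.array([60, 310, 25, 180])   # commercial food establishments
industrial = np.array([2, 9, 1, 5])           # industrial food facilities

# Assumed FOG generation factors (tonnes/year per source), illustrative only.
FOG_PER_HOUSEHOLD, FOG_PER_FSE, FOG_PER_INDUSTRY = 0.004, 1.5, 40.0

fog_potential = (households * FOG_PER_HOUSEHOLD
                 + food_service * FOG_PER_FSE
                 + industrial * FOG_PER_INDUSTRY)

blockages = np.array([95, 520, 40, 310])      # recorded FOG-related blockages

r = np.corrcoef(fog_potential, blockages)[0, 1]
for name, fog, b in zip(catchments, fog_potential, blockages):
    print(f"Catchment {name}: {fog:7.1f} t/yr FOG potential, {b} blockages")
print(f"Correlation between FOG potential and blockages: r = {r:.2f}")
```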

Keywords: fat, FOG, GIS, grease, oil, sewer blockages, sewer networks

Procedia PDF Downloads 209
471 Detection and Quantification of Viable but Not Culturable Vibrio Parahaemolyticus in Frozen Bivalve Molluscs

Authors: Eleonora Di Salvo, Antonio Panebianco, Graziella Ziino

Abstract:

Background: Vibrio parahaemolyticus is a human pathogen that is widely distributed in marine environments. It is frequently isolated from raw seafood, particularly shellfish. Consumption of raw or undercooked seafood contaminated with V. parahaemolyticus may lead to acute gastroenteritis. Vibrio spp. have excellent resistance to low temperatures, so they can be found in frozen products for a long time. Recently, the viable but non-culturable (VBNC) state of bacteria has attracted great attention, and more than 85 species of bacteria have been demonstrated to be capable of entering this state. VBNC cells cannot grow on conventional culture media but are viable and maintain metabolic activity, which may constitute an unrecognized source of food contamination and infection. V. parahaemolyticus can also exist in the VBNC state under nutrient starvation or low-temperature conditions. Aim: The aim of the present study was to optimize methods to detect V. parahaemolyticus VBNC cells and to investigate their presence in regularly marketed frozen bivalve molluscs. Materials and Methods: Propidium monoazide (PMA) was integrated with real-time polymerase chain reaction (qPCR) targeting the tl gene to detect and quantify V. parahaemolyticus in the VBNC state. PMA-qPCR proved highly specific to V. parahaemolyticus, with a limit of detection (LOD) of 10-1 log CFU/mL in pure bacterial culture. A standard curve for V. parahaemolyticus cell concentrations was established, with a correlation coefficient of 0.9999 over the linear range of 1.0 to 8.0 log CFU/mL. A total of 77 samples of frozen bivalve molluscs (35 mussels; 42 clams) were subsequently subjected to qualitative (in alkaline phosphate buffer solution) and quantitative detection of V. parahaemolyticus on thiosulfate-citrate-bile salts-sucrose (TCBS) agar (DIFCO) with 2.5% NaCl and incubation at 30°C for 24-48 hours. Real-time PCR was conducted on homogenate samples, in duplicate, with and without propidium monoazide (PMA) dye, exposed for 45 min under halogen lights (650 W). Total DNA was extracted from the cell suspension of the homogenate samples according to a boiling protocol. The real-time PCR was conducted with species-specific primers for V. parahaemolyticus and was performed in a final volume of 20 µL containing 10 µL of SYBR Green Mixture (Applied Biosystems), 2 µL of template DNA, 2 µL of each primer (final concentration 0.6 mM), and 4 µL of H2O. The qPCR was carried out on a CFX96 TouchTM (Bio-Rad, USA). Results: All samples were negative for both quantitative and qualitative detection of V. parahaemolyticus by the classical culturing technique. PMA-qPCR allowed us to identify VBNC V. parahaemolyticus in 20.78% of the samples evaluated, with values between Log 10-1 and Log 10-3 CFU/g. Only clam samples were positive by PMA-qPCR detection. Conclusion: The present research is the first to evaluate a PMA-qPCR assay for the detection of VBNC V. parahaemolyticus in bivalve mollusc samples, and the method used is applicable to the rapid control of marketed bivalve molluscs. We strongly recommend the use of PMA-qPCR in order to identify VBNC forms, which are undetectable by classical microbiological methods. Precise knowledge of V. parahaemolyticus in the VBNC form is fundamental for correct risk assessment, not only in bivalve molluscs but also in other seafood.
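
A minimal sketch of quantification from a qPCR standard curve, assuming Ct values measured on serial dilutions of known concentration; the Ct numbers and the resulting regression parameters below are illustrative, not the calibration data of this study.

```python
# Sketch: quantify a target from a qPCR standard curve (Ct vs log CFU/mL).
# The Ct values below are illustrative, not the calibration data of this study.
import numpy as np

log_cfu_standards = np.array([1, 2, 3, 4, 5, 6, 7, 8])            # log CFU/mL of standards
ct_standards = np.array([35.1, 31.8, 28.4, 25.0, 21.7, 18.3, 15.0, 11.6])

# Linear regression: Ct = slope * log(CFU) + intercept
slope, intercept = np.polyfit(log_cfu_standards, ct_standards, deg=1)
r = np.corrcoef(log_cfu_standards, ct_standards)[0, 1]
efficiency = 10 ** (-1 / slope) - 1   # amplification efficiency derived from the slope

def log_cfu_from_ct(ct):
    """Estimate log CFU/mL of an unknown sample from its Ct value."""
    return (ct - intercept) / slope

print(f"r = {r:.4f}, efficiency = {efficiency:.1%}")
print(f"Sample with Ct 26.2 -> {log_cfu_from_ct(26.2):.2f} log CFU/mL")
```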

Keywords: food safety, frozen bivalve molluscs, PMA dye, Real-time PCR, VBNC state, Vibrio parahaemolyticus

Procedia PDF Downloads 139
470 The Role of Building Information Modeling as a Design Teaching Method in Architecture, Engineering and Construction Schools in Brazil

Authors: Aline V. Arroteia, Gustavo G. Do Amaral, Simone Z. Kikuti, Norberto C. S. Moura, Silvio B. Melhado

Abstract:

Despite the significant advances made by the construction industry in recent years, the persistent absence of integration between the design and construction phases is still an evident and costly problem in building construction. Globally, the construction industry has sought to adopt collaborative practices through new technologies to mitigate the impacts of this fragmented process and to optimize its production. In this new technological business environment, professionals are required to develop new methodologies based on the notion of collaboration and the integration of information throughout the building lifecycle. This scenario also represents the industry's reality in developing nations, and the increasing need for overall efficiency has demanded new educational alternatives at the undergraduate and post-graduate levels. In countries like Brazil, it is commonly understood that architecture, engineering, and building construction educational programs are being required to review their traditional design pedagogical processes in order to promote a comprehensive notion of integration and simultaneity between the phases of a project. In this context, the coherent inclusion of computational design in all segments of the educational programs of construction-related professionals represents a significant research topic that can, in fact, affect industry practice. Thus, the main objective of the present study was to comparatively measure the effectiveness of the Building Information Modeling courses offered by the University of Sao Paulo, the most important academic institution in Brazil, at the Schools of Architecture and Civil Engineering, and the courses offered at well-recognized BIM research institutions, such as the School of Design in the College of Architecture of the Georgia Institute of Technology, USA, in order to evaluate the dissemination of BIM knowledge among students at the postgraduate level. The qualitative research methodology was developed based on the analysis of the programs and activities proposed by two BIM courses offered at each of the above-mentioned institutions, which were used as case studies. The data collection instruments were a student questionnaire, semi-structured interviews, participatory evaluation, and pedagogical practices. The results revealed broad heterogeneity among the students regarding their professional experience, hours dedicated to training, and, especially, their general knowledge of BIM technology and its applications. The research observed that BIM is mostly understood as an operational tool and not as a methodological project development approach relevant to the whole building life cycle. In its conclusion, the present research offers an assessment of the importance of incorporating BIM, efficiently and in its totality, as a teaching method in undergraduate and graduate courses in Brazilian architecture, engineering, and building construction schools.

Keywords: building information modeling (BIM), BIM education, BIM process, design teaching

Procedia PDF Downloads 154
469 Cultural Adaptation of an Appropriate Intervention Tool for Mental Health among the Mohawk in Quebec

Authors: Liliana Gomez Cardona, Mary McComber, Kristyn Brown, Arlene Laliberté, Outi Linnaranta

Abstract:

The history of colonialism and more contemporary political issues have exposed the Kanien'kehá:ka:non (Kanien'kehá:ka of Kahnawake) to challenging and even traumatic experiences. Colonization, religious missions, residential schools, as well as economic and political marginalization, are factors that have challenged the wellbeing and mental health of these populations. In psychiatry, screening for mental illness is often done using questionnaires in which the patient is expected to report how often he or she experiences certain symptoms. However, the Indigenous view of mental wellbeing may not fit well with this approach. Moreover, biomedical treatments do not always meet the needs of Indigenous people because they do not account for the culture and traditional healing methods that persist in many communities. The objectives were to assess whether the symptom questionnaires commonly used in psychiatry are appropriate and culturally safe for the Mohawk in Quebec, to identify the most appropriate tool to assess and promote wellbeing, and to follow the process necessary to improve its cultural sensitivity and safety for the Mohawk population. This was a qualitative, collaborative, participatory action research project respecting First Nations protocols and the principles of ownership, control, access, and possession (OCAP). Data were collected in five focus groups with stakeholders working with these populations and with members of Indigenous communities. A thematic analysis of the data was conducted, and an advisory group led a revision of the content, use, and cultural and conceptual relevance of the instruments. The questionnaires measuring psychiatric symptoms face significant limitations in the local Indigenous context, and we present the factors that make these tools less relevant among Mohawks. Although the Growth and Empowerment Measure (GEM) was originally developed among Indigenous peoples in Australia, the Mohawk in Quebec found that this tool captures critical aspects of their mental health and wellbeing more respectfully and accurately than questionnaires focused on measuring symptoms. We document the process of cultural adaptation of this tool, supported by community members, to create a culturally safe instrument that helps foster growth and empowerment. The cultural adaptation of the GEM provides valuable information about the factors affecting wellbeing and contributes to mental health promotion. This process improves mental health services by giving health care providers useful information about the Mohawk population and their clients. We believe that integrating this tool into interventions can help bridge communication between the Indigenous cultural perspective of the patient and the biomedical view of health care providers. Further work is needed to confirm the clinical utility of this tool in psychological and psychiatric intervention, along with social and community services.

Keywords: cultural adaptation, cultural safety, empowerment, Mohawks, mental health, Quebec

Procedia PDF Downloads 153
468 Using Convolutional Neural Networks to Distinguish Different Sign Language Alphanumerics

Authors: Stephen L. Green, Alexander N. Gorban, Ivan Y. Tyukin

Abstract:

Within the past decade, using Convolutional Neural Networks (CNNs) to create deep learning systems capable of translating sign language into text has been a breakthrough in breaking down the communication barrier for deaf-mute people. Conventional research on this subject has been concerned with training a network to recognize the fingerspelling gestures of a given language and produce their corresponding alphanumerics. One problem with the current technology is that images are scarce, with little variation in the gestures presented to the recognition program, often skewed towards single skin tones and hand sizes, which makes a percentage of the population's fingerspelling harder to detect. In addition, current gesture detection programs are only trained on one fingerspelling language, despite there being one hundred and forty-two known variants so far. All of this limits the traditional exploitation of current technologies such as CNNs, due to their large number of required parameters. This work presents a technology that aims to resolve this issue by combining a pretrained legacy AI system for a generic object recognition task with a corrector method to uptrain the legacy network. This is a computationally efficient procedure that does not require large volumes of data, even when covering a broad range of sign languages such as American Sign Language, British Sign Language and Chinese Sign Language (Pinyin). Implementing recent results on measure concentration, namely the stochastic separation theorem, the AI system is treated as an operator mapping an input from the set of images u ∈ U to an output in a set of predicted class labels q ∈ Q, identifying the alphanumeric that q represents and the language it comes from. These inputs and outputs, along with internal variables z ∈ Z, represent the system's current state, which implies a mapping that assigns an element x ∈ ℝⁿ to the triple (u, z, q). As all xᵢ are i.i.d. vectors drawn from a product measure distribution, over a period of time the AI generates a large set of measurements xᵢ, called S, which is grouped into two categories: the correct predictions M and the incorrect predictions Y. Once the network has made its predictions, a corrector can be applied by centering S and Y, subtracting their means. The data are then regularized by applying the Kaiser rule to the resulting eigenmatrix, and whitened before being split into pairwise, positively correlated clusters. Each of these clusters produces a unique hyperplane, and if any element x falls outside the region bounded by these hyperplanes it is reported as an error. As a result of this methodology, a self-correcting recognition process is created that can identify fingerspelling from a variety of sign languages and successfully determine the corresponding alphanumeric and the language the gesture originates from, which no other neural network has been able to replicate.
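
The corrector pipeline described above (centering, a Kaiser-type eigenvalue rule, whitening and a separating hyperplane between correct and incorrect predictions) can be sketched as follows. This is one possible reading under stated assumptions, not the authors' implementation: the feature arrays, the single-hyperplane simplification and the function names are hypothetical.

```python
# Hedged sketch of a corrector in the spirit of the description above:
# center the measurement set S, reduce it by keeping eigen-directions whose
# eigenvalue exceeds the mean (a Kaiser-type rule), whiten, and fit one linear
# hyperplane separating incorrect predictions Y from correct ones M.
# X_correct / X_wrong are assumed feature vectors of the legacy network.
import numpy as np

def build_corrector(X_correct: np.ndarray, X_wrong: np.ndarray):
    S = np.vstack([X_correct, X_wrong])
    mean = S.mean(axis=0)
    Sc = S - mean                                   # centering
    cov = np.cov(Sc, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    keep = eigval > eigval.mean()                   # Kaiser-type rule
    W = eigvec[:, keep] / np.sqrt(eigval[keep])     # projection + whitening
    Zc = (X_correct - mean) @ W
    Zw = (X_wrong - mean) @ W
    w = Zw.mean(axis=0) - Zc.mean(axis=0)           # direction pointing towards errors
    w /= np.linalg.norm(w)
    thr = 0.5 * (Zc @ w).mean() + 0.5 * (Zw @ w).mean()  # midpoint threshold

    def flag_error(x: np.ndarray) -> bool:
        """Report the prediction for x as a likely error if it crosses the hyperplane."""
        return float(((x - mean) @ W) @ w) > thr

    return flag_error

# Usage with illustrative random features only:
rng = np.random.default_rng(0)
flag = build_corrector(rng.normal(0, 1, (200, 64)), rng.normal(1, 1, (20, 64)))
print(flag(rng.normal(1, 1, 64)))
```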

Keywords: convolutional neural networks, deep learning, shallow correctors, sign language

Procedia PDF Downloads 100
467 Business Intelligent to a Decision Support Tool for Green Entrepreneurship: Meso and Macro Regions

Authors: Anishur Rahman, Maria Areias, Diogo Simões, Ana Figeuiredo, Filipa Figueiredo, João Nunes

Abstract:

The circular economy (CE) has gained increased awareness among academics, businesses, and decision-makers, as it stimulates resource circularity in production and consumption systems. A large body of studies has explored the principles of CE, but scant attention has been paid to analysing how CE is evaluated, consented to, and enforced using economic metabolism data and a business intelligence framework. Economic metabolism involves the ongoing exchange of materials and energy within and across socio-economic systems and requires the assessment of vast amounts of data to provide quantitative analysis related to effective resource management. To address this gap, the present work focuses on regional flows in a pilot region of Portugal. This study aims to promote eco-innovation and sustainability in the regions of the Intermunicipal Communities Região de Coimbra, Viseu Dão Lafões and Beiras e Serra da Estrela, using these data to find precise synergies in terms of material flows and to give companies a competitive advantage in the form of valuable waste destinations, access to new resources and new markets, cost reduction and risk-sharing benefits. In our work, emphasis is placed on applying artificial intelligence (AI) and, more specifically, on implementing state-of-the-art deep learning algorithms, contributing to the construction of a business intelligence approach. With the emergence of new approaches generally grouped under the headings of AI and machine learning (ML), the methods for statistical analysis of complex and uncertain production systems are facing significant changes. Therefore, various definitions of AI and its differences from traditional statistics are presented, ML is introduced to identify its place in data science, and differences in topics such as big data analytics and production problems addressed with AI and ML are identified. A lifecycle-based approach is then taken to analyse the use of different methods in each phase, in order to identify the most useful technologies and the unifying attributes of AI in manufacturing. Most macroeconomic metabolism models are mainly directed at the context of large metropolises, neglecting rural territories; within this project, therefore, a dynamic decision support model coupled with artificial intelligence tools and information platforms will be developed, focused on the reality of these transition zones between the rural and the urban. Thus, a real decision support tool is under development, which will go beyond the scientific developments carried out to date and will make it possible to overcome limitations related to the availability and reliability of data.

Keywords: circular economy, artificial intelligence, economic metabolisms, machine learning

Procedia PDF Downloads 73
466 Gender Stereotypes in the Media Content as an Obstacle for Elimination of Discrimination against Women in the Republic of Serbia

Authors: Mirjana Dokmanovic

Abstract:

The main topic of this paper is the analysis of the presence of gender stereotypes in media content in the Republic of Serbia, with respect to the state's commitments to eliminate discrimination against women. The research methodology included an analysis of the media content of six daily newspapers and two magazines on 28 December 2015, and a gender-perspective analysis of reality TV show programs broadcast in 2015. The research methods also included desk research and a qualitative analysis of the available data, statistics, policy papers, studies, and reports produced by the government, the Ministry of Culture and Information, the Regulatory Body for Electronic Media, the Press Council, associations of media professionals, independent human rights bodies, and civil society organizations (CSOs). As a State Signatory to the Convention on the Elimination of All Forms of Discrimination against Women, the Republic of Serbia has adopted numerous measures in this field, including the Law on Equality between Sexes and national gender equality strategies. Special attention has been paid to eliminating gender stereotypes and prejudices in media content and in the portrayal of women. This practice is forbidden by the Law on Electronic Media, the Law on Public Information and Media, the Law on Public Service Broadcasting and the Bylaw on the Protection of Human Rights in the Provision of Media Services. Despite these commitments, little progress has been achieved in eliminating gender stereotypes in media content. The research indicates that the media perpetuate traditional gender roles and patriarchal patterns. Female politicians, entrepreneurs, academics, scientists, and engineers are very rarely portrayed in the media; instead, the media focus on women as celebrities, singers, and actresses. Women are underrepresented on pages related to politics and the economy, while they are mostly present in cover stories related to show business, health care, family and household matters. Women are three times more likely than men to be identified on the basis of their family status, as mothers, wives, daughters, etc. Hate speech, misogyny, and violence against women are often present in reality TV shows. The abuse of women and their bodies in advertising is still widespread. Cases of domestic violence are still presented with sensationalism, although progress has been achieved in portraying victims of domestic violence with respect and dignity. Issues related to gender equality and the position of vulnerable groups of women, such as Roma women or rural women, are not visible in the media. This research, as well as warnings by women's CSOs and independent human rights bodies, indicates the necessity of implementing legal and policy measures in this field consistently and with due diligence. The aim of the paper is to contribute to eliminating gender stereotypes in media content and to advancing gender equality.

Keywords: discrimination against women, gender roles, gender stereotypes, media, misogyny, portraying women in the media, prejudices against women, Republic of Serbia

Procedia PDF Downloads 204
465 Anti-Hyperglycemic Effects and Chemical Analysis of Allium sativum Bulbs Growing in Sudan

Authors: Ikram Mohamed Eltayeb Elsiddig, Yacouba Amina Djamila, Amna El Hassan Hamad

Abstract:

Hyperglycemia and diabetes have long been treated with medicinal plants, which may cause fewer side effects than synthetic drugs. The search for more effective and safer anti-diabetic agents derived from plants has therefore become an area of active research. A. sativum, belonging to the Liliaceae family, is well known for its medicinal uses in African traditional medicine, where it is used to treat many human diseases, mainly diabetes, high cholesterol, and high blood pressure. The present study was carried out to investigate the anti-hyperglycemic effect of extracts of A. sativum bulbs growing in Sudan on glucose-loaded Wistar albino rats. A. sativum bulbs were collected fresh from a local vegetable market in Khartoum, Sudan, identified and authenticated by a taxonomist, then dried and extracted with solvents of increasing polarity (petroleum ether, chloroform, ethyl acetate and methanol) using a Soxhlet apparatus. The effect of the extracts on glucose uptake was evaluated using isolated rat hemidiaphragms after loading fasting rats with glucose, and the anti-hyperglycemic effect was investigated in glucose-loaded Wistar albino rats. The effects were compared with those in control rats administered the vehicle and in a standard group administered metformin. The most active extract was analyzed chemically by GC-MS, with spectra compared against the NIST library. The results showed a significant anti-diabetic effect of the extracts of A. sativum bulbs growing in Sudan. In addition, the hypoglycemic activity of the A. sativum extracts was found to decrease with increasing polarity of the extraction solvent, suggesting that the substances responsible for the activity are of low polarity and that their concentration decreases as polarity increases. The petroleum ether extract possessed anti-hyperglycemic activity more significant than that of the other extracts and the metformin standard drug, with a p-value of 0.000** at 400 mg/kg at 1, 2 and 4 hours, and p-values of 0.019*, 0.015* and 0.010* at 200 mg/kg at 1, 2 and 4 hours, respectively. GC-MS analysis of the petroleum ether extract, which showed the highest anti-diabetic activity, revealed the presence of methyl linolate (42.75%), hexadecanoic acid methyl ester (10.54%), methyl α-linolenate (8.36%), dotriacontane (6.83%), tetrapentacontane (6.33%), methyl 18-methylnonadecanoate (4.8%), phenol,2,2'-methylenebis[6-(1,1-dimethylethyl)-4-methyl] (3.25%), methyl 20-methyl-heneicosanoate (2.70%), pentatriacontane (2.13%) and many other minor compounds. Most of these compounds are well known for their anti-diabetic activity. The study concluded that A. sativum bulb extracts enhance the uptake of glucose in the isolated rat hemidiaphragm and have an anti-hyperglycemic effect when evaluated in glucose-loaded albino rats, with the petroleum ether extract showing activity more significant than that of the metformin standard drug.

Keywords: Allium, anti-hyperglycemic, bulbs, sativum

Procedia PDF Downloads 169
464 Interdisciplinary Method Development - A Way to Realize the Full Potential of Textile Resources

Authors: Nynne Nørup, Julie Helles Eriksen, Rikke M. Moalem, Else Skjold

Abstract:

Despite a growing focus on the high environmental impact of textiles, textile waste has only recently been considered part of the waste field. Consequently, there is a general lack of knowledge and data within this field. In particular, the lack of a common perception of textiles generates several problems, for example in recognizing the full material potential that the fraction contains, which is crucial if textiles are to enter the circular economy. This study aims to qualify a method for making the resources in textile waste visible in a way that allows them to be moved as high up the waste hierarchy as possible. Textiles are complex and cover many different types of products, fibres, fibre combinations and production methods. In garments alone there is great variety, even when narrowing the scope to undergarments. However, textile waste is often reduced to a single fraction, assessed solely by quantity, and compared to the quantities of other waste fractions. Disregarding this complexity and reducing textiles to one fraction that covers everything made of textiles increases the risk of neglecting the value of the materials, both with regard to their properties and economically. Instead of trying to fit textile waste into the current, primarily linear waste system, where volume is a key part of the business models, this study focused on integrating textile waste as a resource in the design and production phase. The study combined methods for determining replacement rates, as used in Life Cycle Assessment and Mass Flow Analysis, with the designer's toolbox, thereby activating the properties of textile waste in a way that can unleash its potential optimally. It was hypothesized that by activating Denmark's tradition of design and its high level of craftsmanship, it is possible to find solutions that can be used today and to create circular resource models that reduce the use of virgin fibres. Through waste samples, case studies, and the testing of various design approaches, this study explored how to operationalize the method so that the product is kept as a material after end-use and only then processed at the fibre level, to obtain the best environmental utilization. The study showed that the designers' ability to decode the properties of the materials and their understanding of craftsmanship were decisive for how well the materials could be utilized today. The later in the life cycle textiles appeared as waste, the more demanding it became to describe the materials sufficiently, especially in order to achieve the best possible use of the resources and thus a higher replacement rate. In addition, adaptation of current production was required because the materials often varied more. The study found good indications that part of the solution is to use geodata, i.e., where in the life cycle the materials were discarded. An important conclusion is that a fully developed method can help support better utilization of textile resources. However, it still requires a better understanding of materials by designers, as well as structural changes in business and society.
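
As an illustration of how a replacement rate can enter a simple mass-flow account of recovered textiles, the sketch below scales each recovery route by an assumed replacement rate to estimate avoided virgin fibre; all routes, masses and rates are illustrative assumptions, not figures from the study.

```python
# Hedged sketch: a minimal mass-flow account in which a replacement rate
# (kg of virgin fibre actually displaced per kg of recovered textile) scales
# the credit of each recovery route. All figures are illustrative assumptions.
flows_kg = {                     # recovered textile mass per route
    "reuse_as_garment": 120.0,
    "remanufacture_panels": 80.0,
    "fibre_recycling": 200.0,
}
replacement_rate = {             # assumed kg virgin fibre displaced per kg recovered
    "reuse_as_garment": 0.9,
    "remanufacture_panels": 0.6,
    "fibre_recycling": 0.35,
}

avoided_virgin = {route: m * replacement_rate[route] for route, m in flows_kg.items()}
total_recovered = sum(flows_kg.values())
total_avoided = sum(avoided_virgin.values())

for route, kg in avoided_virgin.items():
    print(f"{route}: {kg:.1f} kg virgin fibre avoided")
print(f"overall replacement rate: {total_avoided / total_recovered:.2f}")
```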

Keywords: circular economy, development of sustainable processes, environmental impacts, environmental management of textiles, environmental sustainability through textile recycling, interdisciplinary method development, resource optimization, recycled textile materials and the evaluation of recycling, sustainability and recycling opportunities in the textile and apparel sector

Procedia PDF Downloads 95
463 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker

Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán

Abstract:

The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, some encoding methods, such as one-hot encoding or k-mers, have been explored. This work proposes additional approaches for encoding DNA sequences in order to compare them with existing techniques and determine whether they can provide improvements or whether current methods offer superior results. Data from the 16S rRNA gene, a universal marker, were used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a parsing step to selectively extract relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes. Specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods. The performance of these models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies vary between encoding methods by up to approximately 15%, with the Fourier transform obtaining the best results across the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
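
A minimal sketch of three of the encodings mentioned above (one-hot, k-mer counting and a simple Fourier-based representation) is given below; the numeric base mapping, the example fragment and the helper names are assumptions for illustration, not the encoding code used in the study.

```python
# Hedged sketch of DNA sequence encodings: one-hot, k-mer frequencies, and a
# Fourier magnitude spectrum over a simple numeric mapping. Example data only.
import numpy as np
from itertools import product
from collections import Counter

BASES = "ACGT"

def one_hot(seq: str) -> np.ndarray:
    """Return a (len(seq), 4) binary matrix; unknown bases (e.g. N) stay all-zero."""
    idx = {b: i for i, b in enumerate(BASES)}
    mat = np.zeros((len(seq), 4), dtype=np.int8)
    for pos, base in enumerate(seq.upper()):
        if base in idx:
            mat[pos, idx[base]] = 1
    return mat

def kmer_vector(seq: str, k: int = 3) -> np.ndarray:
    """Return a fixed-length 4**k vector of k-mer frequencies."""
    seq = seq.upper()
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    kmers = ["".join(p) for p in product(BASES, repeat=k)]
    total = max(sum(counts[m] for m in kmers), 1)
    return np.array([counts[m] / total for m in kmers])

def fourier_features(seq: str, n: int = 64) -> np.ndarray:
    """Magnitude spectrum of an assumed numeric mapping (A=1, C=2, G=3, T=4), padded/truncated to n."""
    mapping = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}
    signal = np.array([mapping.get(b, 0.0) for b in seq.upper()])[:n]
    signal = np.pad(signal, (0, max(0, n - len(signal))))
    return np.abs(np.fft.rfft(signal))

fragment = "ACGTTGCAACGTAGGCT"              # placeholder 16S fragment
print(one_hot(fragment).shape)               # (17, 4)
print(kmer_vector(fragment, k=3).shape)      # (64,)
print(fourier_features(fragment).shape)      # (33,)
```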

Keywords: DNA encoding, machine learning, Fourier transform

Procedia PDF Downloads 23
462 Monitoring of Formaldehyde over Punjab Pakistan Using Car Max-Doas and Satellite Observation

Authors: Waqas Ahmed Khan, Faheem Khokhaar

Abstract:

Air pollution is one of the main drivers of climate change. Greenhouse gases cause melting of glaciers, changes in temperature and heavy rainfall. Many gases, like formaldehyde, are not direct precursors that damage ozone in the way CO2 or methane do, but formaldehyde (HCHO) is linked to glyoxal (CHOCHO), which has an effect on ozone. Countries around the globe have their own air quality monitoring protocols to describe local air pollution. Formaldehyde is a colorless, flammable, strong-smelling chemical that is used in building materials and in the production of many household products and medical preservatives. Formaldehyde also occurs naturally in the environment: it is produced in small amounts by most living organisms as part of normal metabolic processes. Pakistan lacks large-scale monitoring facilities to measure atmospheric gases on a regular basis. Formaldehyde is formed from glyoxal and affects mountain biodiversity and livelihoods, so its monitoring is necessary in order to maintain and preserve biodiversity. Objective: The present study aimed to measure atmospheric HCHO vertical column densities (VCDs) obtained from the ground and to compute HCHO data for Punjab and elevated areas (Rawalpindi and Islamabad) by satellite observation over the period 2014-2015. Methodology: In order to explore the spatial distribution of HCHO, various field campaigns involving international scientists were carried out using car MAX-DOAS. The major focus was on cities along national highways and the industrial region of Punjab, Pakistan. Level-2 data products of the satellite instrument OMI, retrieved by the differential optical absorption spectroscopy (DOAS) technique, were used. The spatio-temporal distribution of HCHO column densities over the main cities and regions of Pakistan is discussed. Results: The results show high HCHO column densities, exceeding the permissible limit, over the main cities of Pakistan, particularly areas with rapid urbanization and enhanced economic growth. The VCD values over elevated areas of Pakistan, such as Islamabad and Rawalpindi, range from about 1.0×10¹⁶ to 34.01×10¹⁶ molecules/cm², while Punjab has values revolving around 34.01×10¹⁶ molecules/cm². Similarly, areas with major industrial activity showed high HCHO concentrations. Tropospheric glyoxal VCDs were found to be 4.75×10¹⁵ molecules/cm². Conclusion: The results show that the monitoring site surrounded by the Margalla Hills (Islamabad) has higher concentrations of formaldehyde. Wind data show that industrial areas and areas with high economic growth have high values, as they provide pathways for the transport of HCHO. The results obtained from this study would help the EPA, WHO and air protection departments to monitor air quality and support the preservation and restoration of mountain biodiversity.
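
A common first-order step in car MAX-DOAS work is converting a differential slant column density from the DOAS fit into a vertical column density using a geometric air mass factor; the sketch below illustrates that conversion with assumed dSCD values and elevation angles, not measurements from this campaign.

```python
# Hedged sketch: converting a differential slant column density (dSCD) into a
# vertical column density (VCD) with a geometric air mass factor, a common
# first-order approximation in MAX-DOAS analyses. Values are illustrative only.
import math

def geometric_amf(elevation_deg: float) -> float:
    """Geometric differential air mass factor relative to a zenith reference."""
    return 1.0 / math.sin(math.radians(elevation_deg)) - 1.0

def vcd(dscd_molec_cm2: float, elevation_deg: float) -> float:
    """Vertical column density from a measured dSCD at a given elevation angle."""
    return dscd_molec_cm2 / geometric_amf(elevation_deg)

dscd = 2.4e17        # assumed HCHO dSCD in molecules/cm^2
for angle in (5, 10, 30):
    print(f"{angle:>2} deg elevation -> VCD = {vcd(dscd, angle):.2e} molecules/cm^2")
```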

Keywords: air quality, formaldehyde, MAX-DOAS, vertical column densities (VCDs), satellite instrument, climate change

Procedia PDF Downloads 212
461 Nuclear Powered UAV for Surveillances and Aerial Photography

Authors: Rajasekar Elangopandian, Anand Shanmugam

Abstract:

Nowadays, unmanned aerial vehicles (UAVs) play a vital role in surveillance. Beyond surveillance, UAVs are used extensively for aerial photography, disaster management and the observation of earth behavior. Nuclear power offers strong support for reducing maintenance and fuel requirements. Design considerations are very important for the UAV manufacturing industry and for research and development agencies. The resulting design features a pentagon-shaped fuselage with a black rubber-coated paint in order to evade enemy radar and other detection. The pentagon-shaped fuselage provides ample space to house the mini nuclear reactor, and the structure is a carbon-carbon fibre composite designed with the software COMSOL and HyperMesh 14.2, so the weight considerations produce a positive result for productivity. The walls of the fuselage are coated with lead and a protective shield. A double layer of W/Bi sheet is proposed for radiation protection in the energy range of 70 keV to 90 keV; the designed W/Bi sheet is only 0.14 mm thick and 36% lighter. The properties of the fillers were determined from zeta potential and particle size measurements. Radiation exposure can be attenuated in three ways: minimizing exposure time, maximizing distance from the radiation source, and shielding the whole vehicle. The onboard reactor is switched on when the UAV starts its cruise, and the moderators and control rods can be inserted automatically by newly developed software. The heat generated by the reactor is used to run a mini turbine fixed inside the UAV, with a natural rubber composite shaft radiation shield. The cooling system operates in two modes, liquid-cooled and air-cooled; the liquid coolants for heat regeneration are ordinary water, liquid sodium and helium, and the walls are made of regenerative and radiation-protective material. Other components, such as the camera and the arms bay, are located at the bottom of the UAV and are specially made to be protected from the radiation; they are coated with lead (Pb) and a natural rubber composite material. This technique provides long range and endurance for an effectively unlimited flight mission, until parts or products need to be replaced. The UAV has the special advantage of 'landing on a string': it can land on an electric line to charge its automated electronics. The fuel is low-enriched uranium (<5% U-235) contained in hundreds of fuel pins. This approach provides continuous duty for surveillance and aerial photography. Landing the vehicle is easy, and takeoff is likewise easier than with other mechanisms available today. This UAV thus offers a compelling technology for surveillance, target detection and target engagement.
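
The shielding option mentioned above can be illustrated with a simple narrow-beam exponential attenuation estimate through a layered W/Bi sheet; the attenuation coefficients and layer thicknesses below are rough assumed values for photons of roughly 80 keV, not figures from the proposed design.

```python
# Hedged sketch: narrow-beam photon attenuation through a layered shield,
# I = I0 * exp(-sum(mu_i * x_i)). The linear attenuation coefficients are
# rough assumed values around 80 keV, not figures from the study.
import math

layers = [            # (material, assumed linear attenuation coefficient mu [1/cm], thickness [cm])
    ("tungsten", 150.0, 0.007),
    ("bismuth",  55.0,  0.007),
]

def transmission(layers) -> float:
    """Transmitted fraction of the incident beam after all layers."""
    exponent = sum(mu * x for _, mu, x in layers)
    return math.exp(-exponent)

t = transmission(layers)
print(f"transmitted fraction: {t:.3f}  (attenuation: {1 - t:.1%})")
```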

Keywords: mini turbine, liquid coolant for heat regeneration, radiation shielding, eternal flight mission, landing on electric lines

Procedia PDF Downloads 410
460 Effect of Spermidine on Physicochemical Properties of Protein Based Films

Authors: Mohammed Sabbah, Prospero Di Pierro, Raffaele Porta

Abstract:

Protein-based edible films and coatings have attracted increasing interest in recent years, since they might be used to protect pharmaceuticals or improve the shelf life of different food products. Among the possible sources, several plant proteins represent an abundant, inexpensive and renewable raw material. These natural biopolymers are used as film-forming agents, being able to form intermolecular linkages through various interactions. However, without the addition of a plasticizing agent, many biomaterials are brittle and, consequently, very difficult to manipulate. Plasticizers are generally small, non-volatile organic additives used to increase film extensibility and reduce its crystallinity, brittleness and water vapor permeability. Plasticizers normally act by decreasing the intermolecular forces along the polymer chains, thus reducing the relative number of polymer-polymer contacts, which decreases cohesion and tensile strength and thereby increases film flexibility, allowing deformation without rupture. The most commonly studied plasticizers are polyols, like glycerol (GLY), and some mono- or oligosaccharides. In particular, GLY not only increases film extensibility but also migrates inside the film network, often causing the loss of desirable mechanical properties of the material. Therefore, replacing GLY with a different plasticizer might help to improve film characteristics, allowing potential industrial applications. To improve film properties, it seemed of interest to test as plasticizers some small cationic molecules, namely polyamines (PAs). Putrescine, spermidine (SPD), and spermine are PAs widely distributed in nature and of particular interest for their biological activities, which may have beneficial health effects. Since PAs contain amino instead of hydroxyl functional groups, they are able to establish ionic interactions with negatively charged proteins. Bitter vetch (Vicia ervilia; BV) is an ancient grain legume crop that originated in the Mediterranean region and is found today in many countries around the world. This annual Vicia species shows several favorable features, its seeds being a cheap and abundant protein source. The main objectives of this study were to investigate the effect of different concentrations of SPD on the mechanical and permeability properties of films prepared with native or heat-denatured BV proteins in the presence of different concentrations of SPD and/or GLY. Therefore, a BV seed protein concentrate (BVPC), containing about 77% protein, was used to prepare film-forming solutions (FFSs), whereas GLY and SPD were added as film plasticizers, either singly or in combination, at various concentrations. Since a primary plasticizer is generally defined as a molecule that, when added to a material, makes it softer, more flexible and easier to process, our findings lead us to consider SPD as a possible primary plasticizer for protein-based films. In fact, the addition of millimolar concentrations of SPD to BVPC FFSs made it possible to obtain handleable biomaterials with improved properties. Moreover, SPD can also be considered a secondary plasticizer, namely an 'extender', because of its ability to enhance the plasticizing performance of GLY. In conclusion, our studies indicate that innovative edible protein-based films and coatings can be obtained by using PAs as new plasticizers.

Keywords: edible films, glycerol, plasticizers, polyamines, spermidine

Procedia PDF Downloads 197
459 Disposal Behavior of Extreme Poor People Living in Guatemala at the Base of the Pyramid

Authors: Katharina Raab, Ralf Wagner

Abstract:

With the decrease of poverty, the focus of the solid waste challenge shifts away from affluent, mostly Westernized consumers to the base of the pyramid. The relevance of considering the disposal behavior of impoverished people arises from improved welfare, which leads to an increase in consumption opportunities and, consequently, in waste production. In combination with the world's growing population, the relevance of the topic increases, because solid waste management has global impacts on consumers' welfare. Current annual municipal solid waste generation is estimated at 1.9 billion tonnes, of which 30% remains uncollected. Of the collected waste, 70% goes to landfills and dumps, 19% is recycled or recovered, and 11% is sent to energy recovery facilities. The aim, therefore, is to contribute first insights into poor people's disposal behaviors, including the framing of their rationalities, emotions and cognitions. The study provides novel empirical results obtained from qualitative, semi-structured, in-depth interviews conducted near Guatemala City. In the study's framework, consumers choose from three options when deciding what to do with their obsolete possessions. Keeping the product: the main reason for this is the respondent's emotional attachment to a product. Further, there is a willingness to use the same product for a different purpose when it loses its functionality; respondents recycle their belongings in a customized and sustainable way. Permanently disposing of the product: the study reveals two dominant disposal methods, burning in front of their homes and throwing away into the physical environment. Respondents clearly recognized the disadvantages of burning toxic durables, such as electronics. Giving a product away as a gift supports the integration of individuals into their peer networks of family and friends. Temporarily disposing of the product: this was not mentioned; specifically, renting or lending a product to someone else was out of the question. Against this background, the extent to which poor people are aware of the consequences of their disposal decisions, and how they feel about and rationalize their actions, was quite unexpected. Respondents reported that they are worried about future consequences with impacts they cannot anticipate now; they are aware that their behaviors harm their health and the environment. Additionally, they expressed concern about the impact this disposal behavior would have on others' wellbeing and are therefore sensitive to the waste that surrounds them. In conclusion, both life at the base of the pyramid and Westernized consumption fit a circular economy pattern, but the nature of recycling and disposal separates these two societal groups. Both contexts have a solid waste management system, but people living in slum-type districts and rural areas of poor countries are less interested in connecting to the system, primarily because they are afraid of the costs. Further, it can be said that a consumer's perceived effectiveness is distinct from environmental concerns but contributes to forecasting certain pro-ecological behaviors. Considering the rationales underlying disposal decisions, thoughtfulness is a well-established determinant of disposition behavior. The precipitating events, emotions and decisions associated with the act of disposing of products are important because these decisions can trigger different outcomes for the disposal process.

Keywords: base of the pyramid, disposal behavior, poor consumers, solid waste

Procedia PDF Downloads 171
458 Investigation of Ground Disturbance Caused by Pile Driving: Case Study

Authors: Thayalan Nall, Harry Poulos

Abstract:

Piling is the most widely used foundation method for heavy structures in poor soil conditions. The geotechnical engineer can choose among a variety of piling methods, but in most cases, driving piles with an impact hammer is the most cost-effective alternative. Under unfavourable conditions, pile driving can cause environmental problems, such as noise, ground movements and vibrations, with the risk of ground disturbance leading to potential damage to proposed structures. At one of the project sites in which the authors were involved, three offshore container terminals, namely CT1, CT2 and CT3, were constructed over thick compressible marine mud. The seabed was around 6 m deep and the soft clay thickness within the project site varied between 9 m and 20 m. CT2 and CT3 were connected together, rectangular in shape, and 2600 m x 800 m in size. CT1 was 400 m x 800 m in size and was located to the south of CT2, opposite its eastern end. CT1 was constructed first and, due to time and environmental limitations, it was supported on a 'forest' of large-diameter driven piles. CT2 and CT3 are now under construction and are being carried out using a traditional dredging and reclamation approach, with ground improvement by surcharging with vertical drains. A few months after the installation of the CT1 piles, a 2600 m long sand bund rising to 2 m above mean sea level was constructed along the southern perimeter of CT2 and CT3 to contain the dredged mud that was expected to be pumped. The sand bund was constructed by sand spraying and pumping from a dredging vessel. About 2000 m of the sand bund in the western section was constructed without any major stability issues or noticeable distress. However, as the sand bund approached the section parallel to CT1, it underwent a series of deep-seated failures, causing the displaced soft clay to heave above the standing water level. The crest of the sand bund was about 100 m away from the last row of piles. There were no plausible geological reasons to conclude that the marine mud across the CT1 region alone was weaker than over the rest of the site. Hence, it was suspected that pile driving by impact hammer may have caused ground movements and vibrations, leading to the generation of excess pore pressures and cyclic softening of the marine mud. This paper investigates the probable cause of failure by reviewing: (1) all ground investigation data within the region; (2) soil displacement caused by pile driving, using theories similar to spherical cavity expansion; (3) the transfer of stresses and vibrations through the entire system, including vibrations transmitted from the hammer to the pile, and the dynamic properties of the soil; and (4) the generation of excess pore pressure due to ground vibration and the resulting cyclic softening. The evidence suggests that the problems encountered at the site were primarily caused by the 'side effects' of the pile driving operations.
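
Item (2) above can be illustrated with a first-order estimate in which pile installation is idealized as the expansion of a cylindrical (rather than spherical) cavity in an incompressible soil, so a point initially at radius r moves outward to sqrt(r² + R²); the pile radius and offsets in the sketch are assumptions, not values from the case study.

```python
# Hedged sketch: first-order radial ground displacement around a single driven
# pile, treating installation as expansion of a cylindrical cavity from zero
# radius to the pile radius R in an incompressible soil, so a soil point at
# initial radius r moves to sqrt(r**2 + R**2). Illustrative values only.
import math

def radial_displacement(r: float, pile_radius: float) -> float:
    """Outward movement (m) of a soil point initially at radius r (m)."""
    return math.sqrt(r**2 + pile_radius**2) - r

R = 0.5   # assumed pile radius in metres
for r in (1.0, 5.0, 20.0, 100.0):
    print(f"r = {r:>6.1f} m -> displacement ~ {radial_displacement(r, R) * 1000:.1f} mm")
```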

Keywords: pile driving, ground vibration, excess pore pressure, cyclic softening

Procedia PDF Downloads 236
457 New Media and the Personal Vote in General Elections: A Comparison of Constituency Level Candidates in the United Kingdom and Japan

Authors: Sean Vincent

Abstract:

Within the academic community, there is a consensus that political parties in established liberal democracies face a myriad of organisational challenges as a result of falling membership, weakening links to grass-roots support and rising voter apathy. During this same period of party decline and growing public disengagement, political parties have become increasingly professionalised. The professionalisation of political parties owes much to changes in technology, with television becoming the dominant medium for political communication. In recent years, however, it has become clear that a new medium of communication is being utilised by political parties and candidates: New Media. New Media, a term hard to define but related to internet-based communication, offers a potential revolution in political communication. It can be utilised by anyone with access to the internet, and its most widely used communication platforms, such as Facebook and Twitter, are free to use. The advent of Web 2.0 has dramatically changed what can be done with the internet. Websites now allow candidates at the constituency level to fundraise, organise and set out personalised policies, while social media allows them to communicate with supporters and potential voters at practically no cost. As such, candidates' dependence on the national party for resources and image is now open to debate. Arguing that greater candidate independence may be a natural next step in light of the contemporary challenges faced by parties, this paper examines how New Media is being used by candidates at the constituency level to increase their personal vote. The paper presents findings from research carried out during two elections: the Japanese Lower House election of 2014 and the UK general election of 2015. For these elections, a sample of 150 candidates from the three biggest parties in each country was selected, and their new media output, specifically candidate websites, Twitter and Facebook, was subjected to content analysis. The analysis examines how candidates are using new media to become more functionally independent of the national party, through fundraising and volunteer mobilisation, and more politically independent, through the promotion of personal and local policies. In order to validate the results of the content analysis, this paper also presents evidence from interviews carried out with 17 candidates who stood in the 2014 Japanese Lower House election or the 2015 UK general election. With this combination of statistical analysis and interviews, several conclusions can be drawn about the use of New Media at the constituency level. The findings show not just a clear difference in the way candidates from each country use New Media, but also differences within countries based on the particular circumstances of each constituency. While it has not yet replaced traditional methods of fundraising and activist mobilisation, New Media is becoming increasingly important in campaign organisation, and the general consensus among candidates is that its importance will continue to grow as politics in both countries becomes more diffuse.

Keywords: political campaigns, elections, new media, political communication

Procedia PDF Downloads 226