Search results for: user defined HEMS
4175 Revisiting Corporate Social Responsibility in the Lens of Board Accountability
Authors: Jingchen Zhao
Abstract:
Corporate social responsibility (CSR), a major contemporary focus for companies, governments, NGOs and communities, is discussed from a multi-disciplinary perspective. The term is introduced and defined as achieving a combination of economic, social, environmental and philanthropic goals, and its adoption in company law legislation in a few jurisdictions is discussed. Despite its positive social and environmental impacts, the notion has been widely criticised for being ill-defined and fundamentally flawed in the domain of corporate law. The value and effectiveness of CSR have been interrogated for many, often inter-related, reasons. This article aims to consider and address some of these problems and assess how CSR could be sharpened and made more effective through the lens of accountability, focussing on the rationale behind and the means of regulating CSR. The article aims to achieve two interrelated goals. First, it examines the function of accountability in the arguments in favour of CSR by investigating the extent to which the notion of accountability could be used as a criterion for regulating CSR, so that companies may be held accountable for corporate decisions affecting their stakeholders. Second, this article will examine the scope and goals of CSR and board accountability, creating the possibility of a more comprehensive understanding of the two notions from an interactive perspective. In order to link CSR and accountability closely and to generate a definition of CSR that could be applied more appropriately and effectively in corporate law, the concept of corporate social accountability (CSA) will be evaluated, with the aim of broadening its latitude beyond disclosure. This will involve a rigorous assessment of the process of fulfilling directors’ duties via questioning from stakeholder groups during meetings or committees, together with explanations and justifications from the board. This will be followed by discussions on enforcement measures in relation to the concept of CSA.
Keywords: corporate governance, CSR, board accountability, corporate law
Procedia PDF Downloads 308
4174 Maintenance Work Order Management Tool (Desktop & Mobile Solution)
Authors: Haitham Al Rawahi
Abstract:
Oman Electricity Transmission Company (OETC) has implemented a Computerized Maintenance Management System (CMMS) based on the Oracle enterprise asset management model (e-AM), in cooperation with Nama Shared Services (NSS). CMMS is mainly used to create maintenance work orders with a preconfigured workflow of defined maintenance schedules/plans, required resources and materials, obtaining shutdown approvals, completing maintenance activities, and closing the work orders. Furthermore, CMMS is also configured with asset failure classifications, asset hierarchy, asset maintenance activities, integration with spare inventories, etc. Since 2017, site engineers have worked with CMMS by manually filling in all related maintenance and inspection records on paper forms and then scanning and attaching them in CMMS for further analysis. A site engineer finalizes all paperwork at the site and then returns to the office to scan and attach it to the work order in CMMS. This creates additional sub-tasks for the site engineer and makes the process difficult and lengthy. There is also a significant risk of missing or illegible fields on the paper forms, since they are filled in by pen, and a site engineer may spend days working outside the office. OETC has therefore decided to digitize these inspection and maintenance forms on one platform in CMMS that can be used both online and offline. ArcGIS product formats and web-enabled solutions, accessible from mobile and desktop devices via ArcMap modules, will be used as well. The purpose of the interlinking is to connect the maintenance and inspection forms to the work orders in e-AM, with which the site engineer interacts daily. This ArcGIS environment is designed to link with e-AM, so that when a site engineer opens the application on site, a window opens the maintenance forms and shows the required fields to fill in and save through the mobile application. After the work is saved, with or without network availability, a notification is triggered to the line manager to review it and take further action (approve/reject/request more information). In this function, users can see the work orders assigned to their departments as well as a chart of all work orders with their status. The approver is also able to see statistics of all work.
Keywords: e-AM, GIS, CMMS, integration
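Since the abstract describes a save-then-notify workflow with online/offline support, the following minimal Python sketch illustrates that logic. All names (submit_form, the queue file, the field names) are hypothetical; the actual tool is built on ArcGIS and e-AM, which are not shown here.

```python
import json
import os
from datetime import datetime

QUEUE_FILE = "pending_forms.json"  # hypothetical local store for offline saves

def network_available():
    # Placeholder connectivity check; the real tool would rely on the
    # ArcGIS runtime's own online/offline state.
    return False

def notify_line_manager(form):
    # Stand-in for the notification that e-AM/CMMS would trigger so the
    # manager can approve, reject, or request more information.
    print(f"Notify manager: work order {form['work_order_id']} awaits review")

def submit_form(form):
    """Save an inspection form; notify now if online, queue locally if not."""
    form["saved_at"] = datetime.now().isoformat()
    if network_available():
        notify_line_manager(form)
    else:
        pending = []
        if os.path.exists(QUEUE_FILE):
            with open(QUEUE_FILE) as f:
                pending = json.load(f)
        pending.append(form)
        with open(QUEUE_FILE, "w") as f:
            json.dump(pending, f)
        # A background sync job would flush this queue and send the
        # notification once the connection is back.

submit_form({"work_order_id": "WO-1001", "asset": "132kV CB-07", "status": "completed"})
```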
Procedia PDF Downloads 97
4173 The Development of User Behavior in Urban Regeneration Areas by Utilizing the Floating Population Data
Authors: Jung-Hun Cho, Tae-Heon Moon, Sun-Young Heo
Abstract:
A lot of urban problems caused by urbanization and industrialization have occurred around the world. In particular, the creation of satellite towns, driven by the rapid expansion of cities, has led to traffic problems and the hollowing-out of old towns, raising the necessity of urban regeneration in old towns along with the aging of existing urban infrastructure. To select urban regeneration priority regions for the strategic execution of urban regeneration in Korea, population size, the number of businesses, and the degree of deterioration were chosen as standards. These existing standards are limited in their ability to address urban problems fundamentally and to keep pace with rapidly changing conditions. It was therefore necessary to add new indicators that can reflect the decline of the relevant cities and their actual conditions. In this regard, this study selected Busan Metropolitan City, Korea as the target area, a leading city where urban regeneration has been activated around its international port, much like Yokohama, Japan. Before setting the urban regeneration priority regions, actual conditions should be reflected, because uniform and uncharacterized projects have been implemented without any quantitative analysis of population behavior within the regions. For this reason, this study conducted a characterization analysis and type classification based on user behaviors, using floating population data, a representative form of big data and a hot issue across society in recent days. Whereas the 23 regions of the existing Busan Metropolitan City urban regeneration priority regions were classified into three types, the type classification based on user behaviors divided the same 23 regions into four types: type (Ⅰ), young people, morning type; type (Ⅱ), the old and middle-aged, general type with a sharp change in floating population; type (Ⅲ), the old and middle-aged, 24-hour type; and type (Ⅳ), the old and middle-aged with little floating population. Each region of the four types showed distinct characteristics, and the results based on user behaviors differed from those of the existing urban regeneration priority regions. According to the results, in type (Ⅰ), young people were the majority around the existing old built-up area, where the floating population at dawn is four times larger than in other areas. In type (Ⅱ), there were many old and middle-aged people around the existing built-up area and general neighborhoods, where the average floating population was larger than in other areas due to commuting, while in type (Ⅲ), the floating population did not change throughout the 24 hours, although many old and middle-aged people lived around the existing general neighborhoods. Type (Ⅳ) includes the existing economy-based type, central built-up area type, and general neighborhood type, where old and middle-aged people were the majority, as a general commuting type with little floating population.
Unlike the existing urban regeneration priority regions, these regions were sub-divided by type, and in this study approach methods and basic orientations of urban regeneration were set for each type, reflecting reality to a certain degree, including indicators of effective floating population, to identify the dynamic activity of urban areas and existing regeneration priority areas in connection with urban regeneration projects by region. It is therefore possible to make effective urban plans by offering a substantial basis built on scientific and quantitative data. To induce more realistic and effective regeneration projects, projects tailored to present local conditions should be developed by reflecting those conditions in the formulation of urban regeneration strategic plans.
Keywords: floating population, big data, urban regeneration, urban regeneration priority region, type classification
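The type classification described above, which groups the 23 priority regions into four types from their floating-population profiles, could be approached with a standard clustering step. Below is a minimal sketch, assuming each region is represented by a 24-value hourly floating-population profile; the data shapes and values are illustrative, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical input: one row per region, 24 hourly floating-population counts.
rng = np.random.default_rng(0)
profiles = rng.random((23, 24)) * 1000  # placeholder for real mobile-carrier data

# Normalize so clusters reflect the *shape* of daily activity, not raw volume.
scaled = StandardScaler().fit_transform(profiles)

# Four clusters, matching the four behavioral types found in the study.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(scaled)

for region_id, label in enumerate(labels):
    print(f"region {region_id}: type {label + 1}")
```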
Procedia PDF Downloads 213
4172 Performance Evaluation of Routing Protocol in Cognitive Radio with Multi Technological Environment
Authors: M. Yosra, A. Mohamed, T. Sami
Abstract:
Over the past few years, mobile communication technologies have evolved significantly. This fact has promoted the implementation of many systems in multi-technological settings, and from one system to the next, the Quality of Service (QoS) provided to mobile consumers improves. The growing number of normalized standards extends the services available to each consumer; moreover, most of the available radio frequencies have already been allocated to systems such as 3G, Wi-Fi, WiMAX, and LTE. A study by the Federal Communications Commission (FCC) found that certain frequency bands are only partially occupied in particular locations and at particular times. The idea of Cognitive Radio (CR) is therefore to share the spectrum between a primary user (PU) and a secondary user (SU). The main objective of this spectrum management is to maximize the exploitation of the radio spectrum. In general, CR can greatly improve QoS and the reliability of the link. The problem resides in proposing a technique that improves the reliability of the wireless link by combining CR with routing protocols, since links that users experience as unreliable are incompatible with QoS requirements. In our case, we choose the QoS parameter "bandwidth" to perform a supervised classification. In this paper, we propose a comparative study of several routing protocols, taking into account the variation of the different technologies over the existing spectral bandwidth, namely 3G, Wi-Fi, WiMAX, and LTE. The simulation results show that LTE has significantly higher available bandwidth than the other technologies. In LTE, the proactive OLSR protocol performs better than the on-demand routing protocols (DSR, AODV) and the table-driven DSDV, owing to better packet reception, less packet drop, and higher throughput. Numerous simulations of the routing protocols were made using simulators such as NS3.
Keywords: cognitive radio, multi technology, network simulator (NS3), routing protocol
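A comparison like the one described, ranking routing protocols by packet delivery and throughput per technology, reduces to aggregating trace counters after the NS3 runs. A hedged sketch of that post-processing step follows; the counter values are placeholders, not the paper's results.

```python
# Post-processing sketch: compute packet delivery ratio (PDR) and throughput,
# then rank protocols for one technology (e.g., LTE). Numbers are placeholders.
runs = {
    "OLSR": {"sent": 1000, "received": 968, "bytes": 968_000, "seconds": 60},
    "AODV": {"sent": 1000, "received": 921, "bytes": 921_000, "seconds": 60},
    "DSR":  {"sent": 1000, "received": 903, "bytes": 903_000, "seconds": 60},
    "DSDV": {"sent": 1000, "received": 887, "bytes": 887_000, "seconds": 60},
}

def metrics(counters):
    pdr = counters["received"] / counters["sent"]            # packet delivery ratio
    throughput = counters["bytes"] * 8 / counters["seconds"] / 1e3  # kbit/s
    return pdr, throughput

for proto, counters in sorted(runs.items(), key=lambda kv: -metrics(kv[1])[0]):
    pdr, thr = metrics(counters)
    print(f"{proto}: PDR={pdr:.1%}, throughput={thr:.1f} kbit/s")
```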
Procedia PDF Downloads 63
4171 Monitoring of Sustainability of Extruded Soya Product TRADKON SPC-TEX in Order to Define Expiration Date
Authors: Radovan Čobanović, Milica Rankov Šicar
Abstract:
New attitudes about nutrition impose new styles, and therefore a new kind of food. The goal of our work was to define the shelf life of a new extruded soya product with a minimum of 65% protein, based on the analyses. According to the plan, a certain quantity of the same batch of the new product (soybean flakes), with a predicted shelf life of 2 years, had to be stored for 24 months and analyzed at the beginning and end of the sustainability plan by instrumental analyses (heavy metals, pesticides, and mycotoxins), every month by sensory analyses (odor, taste, color, consistency), microbiological analyses (Salmonella spp., Escherichia coli, Enterobacteriaceae, sulfite-reducing clostridia, Listeria monocytogenes), and chemical analyses (protein, ash, fat, crude cellulose, granulation), and at the beginning by GMO analyses. All analyses were performed according to: sensory analyses ISO 6658, Salmonella spp. ISO 6579, Escherichia coli ISO 16649-2, Enterobacteriaceae ISO 21528-2, sulfite-reducing clostridia ISO 15213, Listeria monocytogenes ISO 11290-2, chemical and instrumental analyses the Serbian ordinance on the methods of physico-chemical analyses, and GMO analyses the JRC Compendium. The results obtained during the 24 months indicate that there were no changes in the product in either the sensory or the chemical analyses.
As far as the microbiological results are concerned, Salmonella spp. was not detected, and all other quantitative analyses showed values <10 cfu/g. The other food safety parameters (heavy metals, pesticides, and mycotoxins) were not present in the analyzed samples, and all analyzed samples were negative in the genetic testing. On the basis of monitoring the sample under defined storage conditions, with quality control, GMO, and food safety analyses during the two-year shelf life, the results showed that all parameters of the sample remained in accordance with Serbian regulations throughout the defined period, indicating that the predicted shelf life can be adopted.
Keywords: extruded soya product, food safety analyses, GMO analyses, shelf life
Procedia PDF Downloads 296
4170 User Requirements Study in Order to Improve the Quality of Social Robots for Dementia Patients
Authors: Konrad Rejdak
Abstract:
Introduction: Neurodegenerative diseases are frequently accompanied by loss of, and unwanted change in, functional independence, social relationships, and economic circumstances. The achievements of social robots to date are projected to improve the multidimensional quality of life of people with cognitive impairment and others. Objectives: Identification of particular human needs in the context of the changes occurring in the course of neurodegenerative diseases. Methods: Based on 110 surveys of medical staff, patients, and caregivers performed at the Medical University of Lublin, we prioritized the users' needs as high, medium, or low. The issues included in the surveys concerned four aspects: user acceptance, functional requirements, the design of the robotic assistant, and preferred types of human-robot interaction. Results: We received 110 completed questionnaires: 50 from medical staff, 30 from caregivers, and 30 from potential users. Above 90% of the respondents in each of the three groups accepted a robotic assistant as a potential caregiver. A high-priority functional capability of the assistive technology was handling emergencies in a private home, such as recognizing life-threatening situations and reminding about medication intake. With reference to the design of the robotic assistant, the majority of the respondents would like it to have an anthropomorphic appearance with a positive, emotionally expressive face. The preferred types of human-robot interaction were a voice-operated system and a touchscreen. Conclusion: The results of our study might contribute to a better understanding of the system and user requirements for the development of a service robot intended to support patients with dementia.
Keywords: assistant robot, dementia, long term care, patients
Procedia PDF Downloads 154
4169 From Avatars to Humans: A Hybrid World Theory and Human Computer Interaction Experimentations with Virtual Reality Technologies
Authors: Juan Pablo Bertuzzi, Mauro Chiarella
Abstract:
Employing a communication studies perspective and a socio-technological approach, this paper introduces a theoretical framework for understanding the concept of the hybrid world, the avatarization phenomenon, and the communicational archetype of co-hybridization. This analysis intends to contribute to the future design of experimental virtual reality applications. Ultimately, this paper presents an ongoing research project that proposes the study of human-avatar interactions in digital educational environments, as well as an innovative reflection on inner digital communication. The aforementioned project analyzes human-avatar interactions through the development of an interactive experience in virtual reality. The goal is to generate an innovative communicational dimension that could reinforce the hypotheses presented throughout this paper. Being conceived for initial application in educational environments, the analysis and results of this research depend on, and have been prepared with regard to, meticulous planning of: the conception of a 3D digital platform; the interactive game objects; the AI or computer avatars; the human representation as hybrid avatars; and, lastly, the potential for immersion, ergonomics, and control diversity that the chosen virtual reality system and game engine can provide. The project is divided into two main axes. The first part is the structural one, as it is mandatory for the construction of an original prototype. The 3D model is inspired by the physical space of an academic institution. The incorporation of smart objects, avatars, game mechanics, game objects, and a dialogue system will be part of the prototype. All of these elements have the objective of gamifying the educational environment. To generate continuous participation and a large number of interactions, the digital world will be navigable both on a conventional device and in a virtual reality system. This decision is made, practically, to facilitate communication between students and teachers, and, strategically, because it will help populate the digital environment faster. The second part concentrates on content production and further data analysis. The challenge is to offer a diversity of scenarios that compels users to interact and to question their digital embodiment. The multipath narrative content being applied is focused on the subjects covered in this paper. Furthermore, the experience with virtual reality devices invites users to experiment in a mixture of a seemingly infinite digital world and a small physical area of movement. This combination will lead the narrative content, and it will be crucial in order to restrict users' interactions. The main point is to stimulate and grow in the user the need for the hybrid avatar's help. By building an inner communication between the user's physicality and the user's digital extension, the interactions will serve as a self-guide through the gameworld. This is the first attempt to make the avatarization phenomenon explicit and to further analyze the communicational archetype of co-hybridization. The challenge of the upcoming years will be to take advantage of these forms of generalized avatarization in order to create awareness and establish innovative forms of hybridization.
Keywords: avatar, hybrid worlds, socio-technology, virtual reality
Procedia PDF Downloads 142
4168 Discussion of Blackness in Wrestling
Authors: Jason Michael Crozier
Abstract:
The wrestling territories of the mid-twentieth century in the United States are widely considered the birthplace of modern professional wrestling and, by many professional wrestlers, a beacon of hope for the easing of racial tensions during the civil rights era and beyond. The performers writing on this period speak of racial equality but fail to acknowledge the exploitation of black athletes as a racialized capital commodity who suffered the challenges of systemic racism, codified by a false narrative of aspirational exceptionalism and equality measured by audience diversity. The promoters' ability to equate racial and capital exploitation with equality leads to a broader discussion of the history of Muscular Christianity in the United States and the exploitation of black bodies. Narratives of racial erasure that dominate the historical discourse on athleticism and exceptionalism redefined how blackness existed and how physicality and race are conceived of in sport and entertainment spaces. When discussing the implications of race and professional wrestling, it is important to examine the role of promotions as 'imagined communities' where the social agency of wrestlers is defined and quantified based on their 'desired elements' as performers. The intentionally vague nature of this language masks a deep history of racialization that has been perpetuated by promoters and never fully examined by scholars. Sympathetic racism and the omission of cultural identity are also key factors in the limitations and racial barriers placed upon black athletes in the squared circle. The use of sympathetic racism within professional wrestling during the twentieth century sorted black athletes into two distinct categorizations, the 'black savage' or the 'black minstrel'. Black wrestlers of the twentieth century were defined by their strength as a capital commodity and their physicality rather than their knowledge of the business and in-ring skill. These performers had little agency in shaping their own character development inside and outside the ring. Promoters would often create personas that heavily racialized the performer by tying them to a regional past or memory, such as that of slavery in the deep South, using dog collar matches and adorning black characters in chains. Promoters softened cultural memory by satirizing the historical legacy of slavery and black identity.
Keywords: sympathetic racism, social agency, racial commodification, stereotyping
Procedia PDF Downloads 135
4167 A Geo DataBase to Investigate the Maximum Distance Error in Quality of Life Studies
Authors: Paolino Di Felice
Abstract:
The background and significance of this study come from papers already published in the literature which measured the impact of public services (e.g., hospitals, schools, ...) on citizens' needs satisfaction (one of the dimensions of QOL studies) by calculating the distance between the place where citizens live and the location of the services on the territory. Those studies assume that a citizen's dwelling coincides with the centroid of the polygon that expresses the boundary of the administrative district, within the city, they belong to. Such an assumption "introduces a maximum measurement error equal to the greatest distance between the centroid and the border of the administrative district." The case study this abstract reports on investigates the implications of adopting such an approach at geographical scales greater than the urban one, namely at the three levels of nesting of the Italian administrative units: the (20) regions, the (110) provinces, and the 8,094 municipalities. To carry out this study, it had to be decided: a) how to store the huge amount of (spatial and descriptive) input data and b) how to process them. The latter aspect involves: b.1) the design of algorithms to investigate the geometry of the boundaries of the Italian administrative units; b.2) their coding in a programming language; b.3) their execution and, eventually, b.4) archiving the results in permanent storage. The IT solution we implemented is centered around a (PostgreSQL/PostGIS) Geo DataBase structured in terms of three tables that fit the hierarchy of nesting of the Italian administrative units:
municipality(id, name, provinceId, istatCode, regionId, geometry)
province(id, name, regionId, geometry)
region(id, name, geometry)
The adoption of DBMS technology allows us to implement steps "a)" and "b)" easily. In particular, step "b)" is simplified dramatically by calling spatial operators and spatial built-in User Defined Functions within SQL queries against the Geo DB. The major findings coming from our experiments can be summarized as follows. The approximation that, on average, descends from assimilating the residence of the citizens to the centroid of the administrative unit of reference is of a few kilometers (4.9) at the municipality level, while it becomes conspicuous at the other two levels (28.9 and 36.1, respectively). Therefore, studies such as those mentioned above can be extended up to the municipal level without affecting the correctness of the interpretation of the results, but not further. The IT framework implemented to carry out the experiments can be replicated for studies referring to the territory of other countries all over the world.
Keywords: quality of life, distance measurement error, Italian administrative units, spatial database
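Given the schema above, the per-level error figures can be obtained with PostGIS spatial functions directly from SQL. A minimal sketch follows, assuming the geometries are stored in a projected (metric) CRS; the connection string is a placeholder.

```python
import psycopg2

# ST_MaxDistance(centroid, boundary) is exactly the maximum measurement error
# discussed above; averaging it over all units gives the per-level figure.
# The same query runs against the province and region tables as well.
SQL = """
SELECT AVG(ST_MaxDistance(ST_Centroid(geometry), ST_Boundary(geometry))) / 1000.0
FROM municipality;
"""

conn = psycopg2.connect("dbname=italy_admin user=postgres")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute(SQL)
    avg_error_km, = cur.fetchone()
    print(f"average max centroid-boundary distance: {avg_error_km:.1f} km")
```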
Procedia PDF Downloads 371
4166 Drawing Building Blocks in Existing Neighborhoods: An Automated Pilot Tool for an Initial Approach Using GIS and Python
Authors: Konstantinos Pikos, Dimitrios Kaimaris
Abstract:
Although designing building blocks is a procedure used by many planners around the world, there isn't an automated tool that helps planners and designers achieve their goals with less effort. The difficulty of the subject lies in the repetitive process of manually drawing lines, since it is mandatory not only to maintain the desirable offset but also to minimize the impact on the existing building stock. In this paper, using Geographical Information Systems (GIS) and the Python programming language, an automated tool integrated into ArcGIS Pro is presented. Despite its simplistic environment and the lack of specialized building legislation due to the complex state of the field, a planner who is aware of such technical information can use the tool to draw an initial approach to the final building blocks in an area with pre-existing buildings, in an attempt to organize the usually sprawling suburbs of a city or any continuously developing area. The tool uses ESRI's ArcPy library to handle the spatial data, while interaction with the user is handled through Tkinter. The main process consists of a modification of the coordinates of building edges, using the NumPy library, in an effort to draw the line of best fit, so the user can get optimal results for each side of a block. Finally, after the tool runs successfully, a table of primary planning information is shown, such as the area of the building block and its coverage rate. Notwithstanding the early stage of the tool's development, it is a solid base in which planners with programming skills could invest, so they can adapt the tool to their individual needs. An example of the entire procedure in a test area is provided, highlighting both the strengths and weaknesses of the final results.
Keywords: arcPy, GIS, python, building blocks
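The core numerical step the tool relies on, fitting a line of best fit through building edge coordinates with NumPy, can be sketched as follows. This is a standalone illustration, not the ArcPy-integrated code, and the sample coordinates are invented.

```python
import numpy as np

def best_fit_line(xs, ys):
    """Least-squares line through building edge vertices.

    Fits x as a function of y instead when the edge is near-vertical,
    where a y = a*x + b fit would be ill-conditioned.
    """
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    if np.ptp(xs) >= np.ptp(ys):
        slope, intercept = np.polyfit(xs, ys, 1)  # y = slope*x + intercept
        return ("y_of_x", slope, intercept)
    slope, intercept = np.polyfit(ys, xs, 1)      # x = slope*y + intercept
    return ("x_of_y", slope, intercept)

# Vertices of building corners along one side of a block (illustrative).
side_x = [10.0, 14.9, 20.1, 24.8, 30.2]
side_y = [5.1, 5.0, 4.8, 5.2, 5.0]
print(best_fit_line(side_x, side_y))
```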
Procedia PDF Downloads 179
4165 Book Exchange System with a Hybrid Recommendation Engine
Authors: Nilki Upathissa, Torin Wirasinghe
Abstract:
This solution addresses the challenges faced by traditional bookstores and the limitations of digital media, striking a balance between the tactile experience of printed books and the convenience of modern technology. The book exchange system offers a sustainable alternative, empowering users to access a diverse range of books while promoting community engagement. The user-friendly interfaces incorporated into the book exchange system ensure a seamless and enjoyable experience for users. Intuitive features for book management, search, and messaging facilitate effortless exchanges and interactions between users. By streamlining the process, the system encourages readers to explore new books aligned with their interests, enhancing the overall reading experience. Central to the system's success is the hybrid recommendation engine, which leverages advanced technologies such as Long Short-Term Memory (LSTM) models. By analyzing user input, the engine accurately predicts genre preferences, enabling personalized book recommendations. The hybrid approach integrates multiple technologies, including user interfaces, machine learning models, and recommendation algorithms, to ensure the accuracy and diversity of the recommendations. The evaluation of the book exchange system with the hybrid recommendation engine demonstrated exceptional performance across key metrics. The high accuracy score of 0.97 highlights the system's ability to provide relevant recommendations, enhancing users' chances of discovering books that resonate with their interests. The commendable precision, recall, and F1 scores further validate the system's efficacy in offering appropriate book suggestions. Additionally, the curve classifications substantiate the system's effectiveness in distinguishing positive and negative recommendations. This metric provides confidence in the system's ability to navigate the vast landscape of book choices and deliver recommendations that align with users' preferences. Furthermore, the implementation of this book exchange system with a hybrid recommendation engine has the potential to revolutionize the way readers interact with printed books. By facilitating book exchanges and providing personalized recommendations, the system encourages a sense of community and exploration within the reading community. Moreover, the emphasis on sustainability aligns with the growing global consciousness towards eco-friendly practices. With its robust technical approach and promising evaluation results, this solution paves the way for a more inclusive, accessible, and enjoyable reading experience for book lovers worldwide. In conclusion, the developed book exchange system with a hybrid recommendation engine represents a progressive solution to the challenges faced by traditional bookstores and the limitations of digital media. By promoting sustainability, widening access to printed books, and fostering engagement with reading, this system addresses the evolving needs of book enthusiasts. The integration of user-friendly interfaces, advanced machine learning models, and recommendation algorithms ensures accurate and diverse book recommendations, enriching the reading experience for users.
Keywords: recommendation systems, hybrid recommendation systems, machine learning, data science, long short-term memory, recurrent neural network
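As an illustration of the LSTM component described above, a minimal genre-preference classifier could look like the following Keras sketch. The vocabulary size, sequence length, and genre count are placeholder assumptions; the production engine combines such a model with other recommendation algorithms.

```python
import tensorflow as tf

VOCAB_SIZE = 10_000   # assumed size of the tokenized user-input vocabulary
SEQ_LEN = 50          # assumed padded length of a user's input sequence
NUM_GENRES = 12       # assumed number of genre classes

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,)),             # padded token-id sequences
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),    # learn token embeddings
    tf.keras.layers.LSTM(64),                     # summarize the sequence
    tf.keras.layers.Dense(NUM_GENRES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(token_sequences, genre_labels, epochs=5)  # trained on labeled data
```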
Procedia PDF Downloads 94
4164 Enhancement of Primary User Detection in Cognitive Radio by Scattering Transform
Authors: A. Moawad, K. C. Yao, A. Mansour, R. Gautier
Abstract:
Detecting an occupied frequency band is a major issue in cognitive radio systems. The detection process becomes difficult if the signal occupying the band of interest has faded amplitude due to multipath effects. These effects make it hard for an occupying user to be detected. This work mitigates the missed-detection problem in the context of cognitive radio in a frequency-selective fading channel by proposing a blind channel estimation method based on the scattering transform. By initially applying conventional energy detection, the missed-detection probability is evaluated, and if it is greater than or equal to 50%, channel estimation is applied to the received signal, followed by channel equalization to reduce the channel effects. In the proposed channel estimator, we modify the Morlet wavelet by using its first derivative for better frequency resolution. A mathematical description of the modified function and its frequency resolution is formulated in this work. The improved frequency resolution is required to follow the spectral variation of the channel. The channel estimation error is evaluated in the mean-square sense for different channel settings, and energy detection is applied to the equalized received signal. The simulation results show an improvement in reducing the missed-detection probability as compared to detection based on principal component analysis. This improvement is achieved at the expense of increased estimator complexity, which depends on the number of wavelet filters as related to the channel taps. Also, the detection performance shows an improvement in detection probability for low signal-to-noise scenarios over principal component analysis-based energy detection.
Keywords: channel estimation, cognitive radio, scattering transform, spectrum sensing
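The wavelet modification described, taking the first derivative of the Morlet wavelet to sharpen frequency resolution, is easy to reproduce numerically. A sketch follows; the center frequency and grid are arbitrary choices for illustration.

```python
import numpy as np

w0 = 5.0                              # Morlet center frequency (illustrative)
t = np.linspace(-8.0, 8.0, 4096)
dt = t[1] - t[0]

# Classical (complex) Morlet wavelet and its analytic first derivative:
# d/dt [exp(j*w0*t) * exp(-t**2/2)] = (j*w0 - t) * psi(t)
psi = np.exp(1j * w0 * t) * np.exp(-t**2 / 2)
dpsi = (1j * w0 - t) * psi

# Differentiation weights the spectrum by omega, reshaping the response
# around the center frequency; the paper exploits this for finer resolution.
omega = 2 * np.pi * np.fft.fftfreq(t.size, d=dt)
spec = np.abs(np.fft.fft(psi))
dspec = np.abs(np.fft.fft(dpsi))
print("spectral peak:", abs(omega[np.argmax(spec)]),
      "-> derivative peak:", abs(omega[np.argmax(dspec)]))
```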
Procedia PDF Downloads 196
4163 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine
Authors: D. Madhushanka, Y. Liu, H. C. Fernando
Abstract:
Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes, and atmospheric effects that affect people's lives and properties. Generally, fire severity is calculated from the Normalized Burn Ratio (NBR) index. This is performed manually by comparing two images, one obtained before and one after the fire. The dNBR is then calculated as the bitemporal difference of the preprocessed satellite images. The burnt area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using classification levels proposed by the USGS and comprises seven classes. This procedure generates a burn severity report for an area chosen by the user manually. This study is carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing, with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI is chosen for regular processing related to burnt area severity mapping using a medium spatial resolution sensor (10-20 m). This tool uses machine learning classification techniques to identify burnt areas using NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool. The tool includes a Graphical User Interface (GUI) to make it user-friendly. Its advantage is the ability to assess burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate the performance of this tool. The Blue Mountains National Park forest, affected by the Australian fire season between 2019 and 2020, is used to describe the workflow of the WWSAT. At this site, more than 7809 km2 of burnt area was detected using Sentinel-2 data, giving an error below 6.5% when compared with the area detected in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km2 had burned out, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These spots can also be detected through the visual inspection made possible by the cloud-free images generated by WWSAT. This tool is cost-effective in calculating the burnt area since the satellite images are free and the cost of field surveys is avoided.
Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2
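The NBR/dNBR core of the workflow can be expressed compactly in the GEE Python API. The sketch below is a hedged approximation: the region, dates, and cloud threshold are placeholders, and the severity breaks follow commonly cited USGS dNBR levels rather than the tool's exact configuration.

```python
import ee
ee.Initialize()

aoi = ee.Geometry.Rectangle([150.0, -34.0, 151.0, -33.0])  # placeholder extent

def median_nbr(start, end):
    """Cloud-filtered Sentinel-2 median composite reduced to an NBR band."""
    col = (ee.ImageCollection("COPERNICUS/S2_SR")
           .filterBounds(aoi)
           .filterDate(start, end)
           .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20)))
    # NBR = (NIR - SWIR) / (NIR + SWIR), i.e. Sentinel-2 bands B8 and B12.
    return col.median().normalizedDifference(["B8", "B12"]).rename("NBR")

# dNBR = pre-fire NBR minus post-fire NBR, as in the abstract.
dnbr = (median_nbr("2019-10-01", "2019-10-31")
        .subtract(median_nbr("2020-02-01", "2020-03-01"))
        .rename("dNBR"))

# Widely used USGS-style severity breaks (illustrative thresholds).
severity = (ee.Image(0)
            .where(dnbr.gte(0.10), 1)   # low severity
            .where(dnbr.gte(0.27), 2)   # moderate-low severity
            .where(dnbr.gte(0.44), 3)   # moderate-high severity
            .where(dnbr.gte(0.66), 4)   # high severity
            .clip(aoi))
```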
Procedia PDF Downloads 235
4162 Delay in the Diagnosis of Tuberculosis and Initiation of TB Treatment in the Private and Public Health Sectors, Udaipur District, Rajasthan, India, Nov 2013
Authors: Yogita Tulsian, R. S. Gupta, K. F. Laserson
Abstract:
Background: Delays in the diagnosis and treatment of TB facilitate disease transmission in the community, so we conducted a study to evaluate the burden of and risk factors for delay in TB diagnosis and initiation of TB treatment among patients in the private and public sectors in Udaipur district, Rajasthan, India. Methods: A retrospective cohort study was conducted among 100 new sputum-positive TB patients. Patients were interviewed during the intensive phase of treatment, September 2013-November 2013. Long total diagnosis delay (TDD) was defined as a time interval between first symptom and confirmed diagnosis of > 30 days. Long health treatment delay (HTD) was defined as a time interval between confirmed diagnosis and treatment initiation of > 7 days. Results: We observed a median TDD of 55 days (range: 7-136 days) in the public sector and of 92 days (range: 11-380 days) in the private sector. Long TDD in the private sector was significantly associated with middle-higher socio-economic status (Risk Ratio (RR): 2; 95% CI: 1.3-3). The reasons reported from the private sector for long TDD were suspect TB patients not being advised to undergo sputum examination (RR: 42; 95% CI: 2.6-660), the practice of self-medication (RR: 17.4; 95% CI: 1.1-267), or lack of awareness (RR: 9.7; 95% CI: 0.6-145). The median HTD was 3 days (range: 0-14 days) in the public sector and 2 days (range: 0-11 days) in the private sector (a non-significant difference). Conclusions: Long TDD in the private sector may be improved through sputum referral for all suspect TB cases and better education for all regarding TB.
Keywords: diagnosis delay, treatment delay, private sector, public sector
Procedia PDF Downloads 427
4161 Performance Evaluation of Fingerprint, Auto-Pin and Password-Based Security Systems in Cloud Computing Environment
Authors: Emmanuel Ogala
Abstract:
Cloud computing has been envisioned as the next-generation architecture of the Information Technology (IT) enterprise. In contrast to traditional solutions, where IT services are under physical, logical, and personnel controls, cloud computing moves the application software and databases to large data centres, where the management of the data and services may not be fully trustworthy. This is because the systems are open to the whole world, and as legitimate people try to access the system, many others are trying day in, day out to gain unauthorized access to it. This research contributes to the improvement of cloud computing security for better operation. The work is motivated by two problems: first, the observed ease of access to cloud computing resources and the complexity of attacks on the vital cloud computing data system NIC require that dynamic security mechanisms evolve to stay capable of preventing illegitimate access. Second, there is a lack of a good methodology for the performance testing and evaluation of biometric security algorithms for securing records in a cloud computing environment. The aim of this research was to evaluate the performance of an integrated security system (ISS) for securing exam records in a cloud computing environment. In this research, we designed and implemented an ISS that combines three security mechanisms (fingerprint biometrics, auto-PIN, and password) into one stream of access control, used for securing examination records at Kogi State University, Anyigba. Conclusively, the system we built has been able to overcome the guessing abilities of hackers who guess people's passwords or PINs. We are certain about this because the added security mechanism (fingerprint) requires the presence of the user before login access can be granted: the user must place a finger on the fingerprint biometric scanner for capture and verification to confirm the user's authenticity. The study adopted a quantitative design together with an object-oriented design methodology. In the analysis and design, PHP, HTML5, CSS, JavaScript (in Visual Studio), and Web 2.0 technologies were used to implement the model of the ISS for a cloud computing environment. PHP, HTML5, and CSS were used in conjunction with Visual Studio front-end design tools, MySQL + Access 7.0 was used for the back-end engine, and JavaScript was used for object arrangement and for validation of user input as a security check. Finally, the performance of the developed framework was evaluated by comparing it with two other existing security systems (auto-PIN and password) within the school, and the results showed that the developed approach (fingerprint) overcomes the two main weaknesses of the existing systems and will work perfectly well if fully implemented.
Keywords: performance evaluation, fingerprint, auto-pin, password-based, security systems, cloud computing environment
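A minimal sketch of the layered check the ISS performs (password, then auto-PIN, then fingerprint match) is shown below. The hashing scheme and the fingerprint-template comparison stub are illustrative assumptions; the actual system was built in PHP with a hardware scanner.

```python
import hashlib
import hmac
import os

def hash_secret(secret: str, salt: bytes) -> bytes:
    # PBKDF2 is an illustrative choice of password/PIN hashing scheme.
    return hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)

def fingerprint_matches(live_template: bytes, enrolled_template: bytes) -> bool:
    # Stand-in for the scanner SDK's matcher; real biometric matching is
    # fuzzy, not an exact byte comparison.
    return hmac.compare_digest(live_template, enrolled_template)

def login(user, password, pin, live_template):
    if not hmac.compare_digest(hash_secret(password, user["salt"]), user["pw_hash"]):
        return False  # layer 1: password
    if not hmac.compare_digest(hash_secret(pin, user["salt"]), user["pin_hash"]):
        return False  # layer 2: auto-PIN
    return fingerprint_matches(live_template, user["fp_template"])  # layer 3

salt = os.urandom(16)
user = {"salt": salt,
        "pw_hash": hash_secret("s3cret", salt),
        "pin_hash": hash_secret("4711", salt),
        "fp_template": b"enrolled-template-bytes"}
print(login(user, "s3cret", "4711", b"enrolled-template-bytes"))  # True
```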
Procedia PDF Downloads 140
4160 A Varicella Outbreak in a Highly Vaccinated School Population in Voluntary 2-Dose Era in Beijing, China
Authors: Chengbin Wang, Li Lu, Luodan Suo, Qinghai Wang, Fan Yang, Xu Wang, Mona Marin
Abstract:
Background: Two-dose varicella vaccination has been recommended in Beijing since November 2012. We investigated a varicella outbreak in a highly vaccinated elementary school population to examine transmission patterns and risk factors for vaccine failure. Methods: A varicella case was defined as an acute generalized maculopapulovesicular rash without other apparent cause in a student attending the school from March 28 to May 17, 2015. Breakthrough varicella was defined as varicella >42 days after last vaccine dose. Vaccination information was collected from immunization records. Information on prior disease and clinical presentation was collected via survey of students’ parents. Results: Of the 1056 school students, 1028 (97.3%) reported no varicella history, of whom 364 (35.4%) had received 1-dose and 650 (63.2%) had received 2-dose varicella vaccine, for 98.6% school-wide vaccination coverage with ≥ 1 dose before the outbreak. A total of 20 cases were identified for an overall attack rate of 1.9%. The index case was in a 2-dose vaccinated student who was not isolated. The majority of cases were breakthrough (19/20, 95%) with attack rates of 7.1% (1/14), 1.6% (6/364) and 2.0% (13/650) among unvaccinated, 1-dose, and 2-dose students, respectively. Most cases had < 50 lesions (18/20, 90%). No difference was found between 1-dose and 2-dose breakthrough cases in disease severity or sociodemographic factors. Conclusion: Moderate 2-dose varicella vaccine coverage was insufficient to prevent a varicella outbreak. Two-dose breakthrough varicella is still contagious. High 2-dose varicella vaccine coverage and timely isolation of ill persons might be needed for varicella outbreak control in the 2-dose era.
Keywords: varicella, outbreak, breakthrough varicella, vaccination
Procedia PDF Downloads 335
4159 Evaluation of DNA Microarray System in the Identification of Microorganisms Isolated from Blood
Authors: Merih Şimşek, Recep Keşli, Özgül Çetinkaya, Cengiz Demir, Adem Aslan
Abstract:
Bacteremia is a clinical entity with high morbidity and mortality rates when immediate diagnosis or treatment cannot be achieved. Microorganisms which can cause sepsis or bacteremia are easily isolated from blood cultures. Fifty-five positive blood cultures were included in this study. Microorganisms in the 55 blood cultures were isolated by conventional microbiological methods; afterwards, the microorganisms were identified phenotypically by the Vitek-2 system. The same microorganisms in all blood culture samples were then identified genotypically by the Multiplex-PCR DNA Low-Density Microarray system. At the end of the identification process, the DNA microarray system's success in identification was evaluated against the Vitek-2 system. The Vitek-2 system and the DNA microarray system identified the same microorganisms in 53 samples; in the other 2 blood cultures, different microorganisms were identified by the DNA microarray system. The microorganisms identified by the Vitek-2 system were identical for 96.4% of the microorganisms identified by the DNA microarray system. In addition to the bacteria identified by Vitek-2, the presence of a second bacterium was detected in 5 blood cultures by the DNA microarray system. Both the Vitek-2 and the DNA microarray system identified 18 of the 55 positive blood cultures as E. coli strains. The corresponding identification numbers were 6 and 8 for Acinetobacter baumannii, 10 and 10 for K. pneumoniae, 5 and 5 for S. aureus, 7 and 11 for Enterococcus spp., 5 and 5 for P. aeruginosa, and 2 and 2 for C. albicans, respectively. According to these results, the DNA microarray system requires both a technical device and experienced staff support; besides, it requires more expensive kits than Vitek-2. However, this method should be used in conjunction with conventional microbiological methods. Thus, large microbiology laboratories will produce faster, more sensitive, and more successful results in the identification of cultured microorganisms.
Keywords: microarray, Vitek-2, blood culture, bacteremia
Procedia PDF Downloads 350
4158 Some Inequalities Related with Starlike Log-Harmonic Mappings
Authors: Melike Aydoğan, Dürdane Öztürk
Abstract:
Let H(D) be the linear space of all analytic functions defined on the open unit disc. A log-harmonic mapping is a solution of the nonlinear elliptic partial differential equation \overline{f_{\bar{z}}}/\overline{f} = w(z)\, f_z/f, where w(z) ∈ H(D) is the second dilatation, such that |w(z)| < 1 for all z ∈ D. The aim of this paper is to establish some inequalities for starlike log-harmonic functions of order α (0 ≤ α < 1).
Keywords: starlike log-harmonic functions, univalent functions, distortion theorem
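For context, the order-α starlikeness condition for log-harmonic mappings is commonly stated in this literature as follows. This is the standard formulation found in work on log-harmonic mappings (e.g., Abdulhadi's papers), not text quoted from the abstract itself:

```latex
% f is a starlike log-harmonic mapping of order \alpha on the unit disc D
% if f is univalent with f(0) = 0 and
\operatorname{Re}\left\{
  \frac{z\,f_{z}(z) - \bar{z}\,f_{\bar{z}}(z)}{f(z)}
\right\} > \alpha ,
\qquad z \in D ,\quad 0 \le \alpha < 1 .
```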
Procedia PDF Downloads 525
4157 Prototype for Measuring Blue Light Protection in Sunglasses
Authors: A. D. Loureiro, L. Ventura
Abstract:
Exposure to high-energy blue light has been strongly linked to the development of some eye diseases, such as age-related macular degeneration. Over the past few years, people have become more and more concerned about eye damage from blue light and how it can be prevented. We developed a prototype that allows users to self-check the blue light protection of their sunglasses and determines whether the protection is adequate. Weighting functions approximating those defined in ISO 12312-1 were used to measure the luminous transmittance and blue light transmittance of sunglasses. The blue light transmittance value must be less than 1.2 times the luminous transmittance to be considered adequate. The prototype consists of a Golden Dragon Ultra White LED from OSRAM and a TCS3472 photodetector from AMS TAOS. Together, they provide four transmittance values weighted with different functions. These four transmittance values were then linearly combined to produce transmittance values with weighting functions close to those defined in ISO 12312-1 for luminous transmittance and for blue light transmittance. To evaluate our prototype, we used a VARIAN Cary 5000 spectrophotometer, a gold standard in the field, to measure the luminous transmittance and the blue light transmittance of 60 sunglass lenses. Bland-Altman analysis was performed and showed non-significant bias and narrow 95% limits of agreement within predefined tolerances for both luminous transmittance and blue light transmittance. The results show that the prototype is a viable means of providing blue light protection information to the general public and a quick and easy way for industry and retailers to test their products. In addition, our prototype plays an important role in educating the public about a feature to look for in sunglasses before purchasing.
Keywords: blue light, sunglasses, eye protective devices, transmittance measurement, standards, ISO 12312-1
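The decision rule the prototype implements can be stated in a few lines: combine the four weighted channel readings into luminous and blue light transmittance estimates, then flag lenses whose blue light transmittance reaches 1.2 times the luminous transmittance. The combination coefficients below are placeholders; the real ones would come from calibration against the ISO 12312-1 weighting functions.

```python
import numpy as np

# Four transmittance readings from the TCS3472 channels (clear, R, G, B),
# each already normalized against the no-lens baseline. Values illustrative.
channels = np.array([0.35, 0.35, 0.33, 0.20])

# Calibrated linear-combination coefficients approximating the ISO 12312-1
# luminous and blue-light weighting functions (placeholder values).
W_LUMINOUS = np.array([0.10, 0.25, 0.55, 0.10])
W_BLUE     = np.array([0.05, 0.00, 0.15, 0.80])

tau_v = W_LUMINOUS @ channels   # luminous transmittance estimate
tau_b = W_BLUE @ channels       # blue light transmittance estimate

# Adequacy rule from the abstract: tau_b must be less than 1.2 * tau_v.
adequate = tau_b < 1.2 * tau_v
print(f"tau_v={tau_v:.3f}, tau_b={tau_b:.3f}, adequate blue protection: {adequate}")
```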
Procedia PDF Downloads 164
4156 A Method to Evaluate and Compare Web Information Extractors
Authors: Patricia Jiménez, Rafael Corchuelo, Hassan A. Sleiman
Abstract:
Web mining is gaining importance at an increasing pace. Currently, there are many complementary research topics under this umbrella. Their common theme is that they all focus on applying knowledge discovery techniques to data that is gathered from the Web. Sometimes, these data are relatively easy to gather, chiefly when they come from server logs. Unfortunately, there are cases in which the data to be mined are the data displayed on a web document. In such cases, it is necessary to apply a pre-processing step to first extract the information of interest from the web documents. Such pre-processing steps are performed using so-called information extractors, which are software components typically configured by means of rules tailored to extracting the information of interest from a web page and structuring it according to a pre-defined schema. Paramount to getting good mining results is that the technique used to extract the source information is exact, which requires evaluating and comparing the different proposals in the literature from an empirical point of view. According to Google Scholar, about 4,200 papers on information extraction have been published during the last decade. Unfortunately, they were not evaluated within a homogeneous framework, which makes it difficult to compare them empirically. In this paper, we report on an original information extraction evaluation method. Our contribution is three-fold: a) this is the first attempt to provide an evaluation method for proposals that work on semi-structured documents; the little existing work on this topic focuses on proposals that work on free text, which has little to do with extracting information from semi-structured documents; b) it provides a method that relies on statistically sound tests to support the conclusions drawn; the previous work does not provide clear guidelines or recommend statistically sound tests, but rather a survey that collects many features to take into account as well as related work; c) we provide a novel method to compute the performance measures for unsupervised proposals; otherwise, they would require the intervention of a user to compute them by using the annotations on the evaluation sets and the information extracted. Our contributions will definitely help researchers in this area make sure that they have advanced the state of the art not only conceptually, but from an empirical point of view; they will also help practitioners make informed decisions on which proposal is the most adequate for a particular problem. This conference is a good forum to discuss our ideas, so that we can spread them to help improve the evaluation of information extraction proposals and gather valuable feedback from other researchers.
Keywords: web information extractors, information extraction evaluation method, Google scholar, web
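The performance measures at the heart of such an evaluation reduce to set comparisons between the extracted records and the gold annotations. A minimal sketch follows; the (document, slot, value) representation is an assumption for illustration.

```python
def precision_recall_f1(extracted: set, gold: set):
    """Standard extraction metrics over (document, slot, value) triples."""
    true_positives = len(extracted & gold)
    precision = true_positives / len(extracted) if extracted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

gold = {("doc1", "price", "9.99"), ("doc1", "title", "Foo"), ("doc2", "price", "5.00")}
extracted = {("doc1", "price", "9.99"), ("doc1", "title", "Bar")}
print(precision_recall_f1(extracted, gold))  # (0.5, 0.333..., 0.4)
```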
Procedia PDF Downloads 248
4155 The Problem of Child Exploitation on Twitter: A Socio-Anthropological Perspective on Content Filtering Gaps
Authors: Samig Ibayev
Abstract:
This research addresses, from a socio-anthropological perspective, the problem of illegal child abuse content on the Twitter platform bypassing filtering systems and appearing before users. Although the wide access opportunities that social media platforms provide to their users are beneficial in many ways, these platforms contain gaps that pave the way for the spread of harmful and illegal content. The aim of the study is to examine the inadequacies of the current content filtering mechanisms of the Twitter platform, and to understand the psychological effects on young users who unintentionally encounter such content and the social dimensions of this situation. The research followed a qualitative approach, using digital ethnography, content analysis, and user experiences on the Twitter platform. Digital ethnography was used to observe the frequency of child abuse content on the platform and how this content was presented. The content analysis method was used to reveal the gaps in Twitter's current filtering mechanisms. In addition, through interviews with young users exposed to such content, detailed information was collected on the extent of the psychological effects and on how the perception of trust in social media changed. The main contributions of the research are to highlight the weaknesses in the content moderation and filtering mechanisms of social media platforms, to reveal the negative effects of illegal content on users, and to offer suggestions for preventing the spread of such content. As a result, it is suggested that platforms such as Twitter should improve their content filtering policies in order to increase user security and fulfill their social responsibilities. This research aims to create significant awareness about social media content management and ethical responsibilities on digital platforms.
Keywords: Twitter, child exploitation, content filtering, digital ethnography, social anthropology
Procedia PDF Downloads 9
4154 A Controlled Natural Language Assisted Approach for the Design and Automated Processing of Service Level Agreements
Authors: Christopher Schwarz, Katrin Riegler, Erwin Zinser
Abstract:
The management of outsourcing relationships between IT service providers and their customers proves to be a critical issue that has to be stipulated by means of Service Level Agreements (SLAs). Since service requirements differ from customer to customer, SLA content and language structures vary widely; standardized SLA templates may not be usable, and automated processing of SLA content is not possible. Hence, SLA management is usually a time-consuming and inefficient manual process. To overcome these challenges, this paper presents an innovative, ITIL V3-conformant approach for automated SLA design and management using controlled natural language in enterprise collaboration portals. The proposed novel concept is based on a self-developed controlled natural language that follows a subject-predicate-object approach to specify well-defined SLA content structures that act as templates for customized contracts and support automated SLA processing. The derived results eventually enable IT service providers to automate several SLA request, approval, and negotiation processes by means of workflows and business rules within an enterprise collaboration portal. The illustrated prototypical realization gives evidence of the practical relevance in service-oriented scenarios as well as of the high flexibility and adaptability of the presented model. Thus, the prototype enables the automated creation of well-defined, customized SLA documents, providing a knowledge representation that is both human-understandable and machine-processable.
Keywords: automated processing, controlled natural language, knowledge representation, information technology outsourcing, service level management
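The subject-predicate-object structure is what makes the SLA statements machine-processable. Below is a toy sketch of parsing such controlled statements into structured records; the grammar and the sample sentences are invented for illustration and are not the authors' actual controlled language.

```python
import re

# Toy controlled-language pattern: "<Subject> <predicate> <value> <unit>."
PATTERN = re.compile(
    r"^(?P<subject>[A-Z][\w ]*?) (?P<predicate>must not exceed|must be at least) "
    r"(?P<value>[\d.]+) (?P<unit>\w+)\.$"
)

def parse_sla_statement(sentence: str) -> dict:
    match = PATTERN.match(sentence)
    if not match:
        raise ValueError(f"not a valid controlled-language statement: {sentence!r}")
    return match.groupdict()

# Example statements a CNL-based SLA template might contain (invented).
print(parse_sla_statement("Incident response time must not exceed 4 hours."))
print(parse_sla_statement("Service availability must be at least 99.5 percent."))
```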
Procedia PDF Downloads 432
4153 Role of Biomaterial Surface Nanotopography on Protein Unfolding and Immune Response
Authors: Rahul Madathiparambil Visalakshan, Alex Cavallaro, John Hayball, Krasimir Vasilev
Abstract:
The role of biomaterial surface nanotopography in fibrinogen adsorption and unfolding, and the subsequent immune response, was studied. Inconsistent topography and varying chemical functionalities, along with a lack of reproducibility, pose a challenge in determining the specific effects of nanotopography or chemistry on proteins and cells. It is important to have a well-defined nanotopography with a homogeneous chemistry to study the real effect of nanotopography on biological systems. Therefore, we developed a technique that can produce well-defined and highly reproducible topography to identify the role of specific roughness, size, height and density in the presence of a homogeneous chemical functionality. Using plasma polymerisation of oxazoline monomers and immobilized gold nanoparticles, we created surfaces with an equal number density of nanoparticles of different sizes. These surfaces were used to study the role of surface nanotopography and the interplay of surface chemistry on proteins and immune cells. The effect of nanotopography on fibrinogen adsorption was investigated using quartz crystal microbalance with dissipation and micro BCA assays. The mass of fibrinogen adsorbed on the surface increased with increasing size of the nanotopography. Protein structural changes upon adsorption to the nano-rough surfaces were studied using circular dichroism spectroscopy. Fibrinogen unfolding varied depending on the specific nanotopography of the surfaces, and it was revealed that the in vitro immune response to the nanotopography surfaces changed as a result of this protein unfolding. Keywords: biomaterial inflammation, protein and cell responses, protein unfolding, surface nanotopography
Procedia PDF Downloads 176
4152 Philippine Film Industry and Cultural Policy: A Critical Analysis and Case Study
Authors: Michael Kho Lim
Abstract:
This paper examines the status of the film industry as an industry in the Philippines: where and how it is classified in the Philippine industrial classification system, how this positioning gives the film industry an identity (or not), and how it affects (film) policy development and the larger national economy. It is important to look at how the national government officially recognises Philippine cinema, as this has direct and indirect impacts on the industry in terms of its representation, conduct of business and international relations, and most especially in its implications for policy development and implementation. It is therefore imperative that the 'identity' of Philippine cinema be clearly established and defined in the overall industrial landscape. Having a clear understanding of Philippine cinema's industry status provides a better view of the bigger picture and helps determine cinema's position in the national agenda in terms of priority setting and future direction, and how the state perceives, and thereby values, the film industry as an industry. This serves as the frame of reference that anchors the succeeding discussion. Once the Philippine film industry's status is identified, the paper clarifies how cultural policy is defined, understood and applied in the Philippines in relation to Philippine cinema by reviewing and analysing existing policy documents and pending bills in the Philippine Congress and Senate. Lastly, the paper delves into the roles that (national) cultural institutions and industry organisations play as primary drivers or support mechanisms, and how they become platforms (or not) for the upliftment of the independent film sector and the sustainability of the film industry. The paper concludes by arguing that the role of the government, and how government officials perceive and treat culture, is far more important than cultural policy itself, as these policies emanate from them. Keywords: cultural and creative industries, cultural policy, film industry, Philippine cinema
Procedia PDF Downloads 441
4151 An Integrated Theoretical Framework on Mobile-Assisted Language Learning: User’s Acceptance Behavior
Authors: Gyoomi Kim, Jiyoung Bae
Abstract:
In the field of language education research, few studies have empirically examined learners’ acceptance behavior towards mobile-assisted language learning (MALL) and the factors related to it. This study is one of the few attempts to propose an integrated theoretical framework that explains MALL users’ acceptance behavior and its potential factors. Constructs from the technology acceptance model (TAM) and from MALL research are tested in the integrated framework. Based on previous studies, a hypothetical model was developed. Four external variables related to MALL users’ acceptance behavior were selected: subjective norm, content reliability, interactivity and self-regulation. The model was also composed of four other constructs: two latent variables, perceived ease of use and perceived usefulness, were considered cognitive constructs; attitude toward MALL, an affective construct; and behavioral intention to use MALL, a behavioral construct. The participants were 438 undergraduate students enrolled in an intensive English program at a university in Korea. This particular program was held in January 2018, during the vacation period. The students were given eight hours of English classes each day from Monday to Friday for four weeks and were asked to complete MALL courses for practice outside the classroom; all participants therefore experienced a blended MALL environment. The instrument was a self-response questionnaire, with each construct measured by five questions. Once the questionnaire was developed, it was distributed to the participants at the final ceremony of the intensive program in order to collect data from a large number of participants at once. The data showed significant evidence supporting the hypothetical model. The results, confirmed through structural equation modeling analysis, are as follows. First, the four external variables (subjective norm, content reliability, interactivity and self-regulation) significantly affected perceived ease of use. Second, subjective norm, content reliability, self-regulation and perceived ease of use significantly affected perceived usefulness. Third, perceived usefulness and perceived ease of use significantly affected attitude toward MALL. Fourth, attitude toward MALL and perceived usefulness significantly affected behavioral intention to use MALL. These results imply that the integrated framework combining TAM and MALL could be useful when introducing a MALL environment to university students or adult English learners. All key constructs except interactivity showed significant relationships with one another and had direct and indirect impacts on MALL users’ acceptance behavior. The constructs and validated metrics are therefore valuable for language researchers and educators who are interested in MALL. Keywords: blended MALL, learner factors/variables, mobile-assisted language learning, MALL, technology acceptance model, TAM, theoretical framework
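As an illustration of how such a hypothesized model can be tested, below is a minimal sketch of the structural model in semopy's lavaan-style syntax. The construct abbreviations, item names (five per construct, matching the questionnaire design) and data file name are illustrative assumptions, not the authors' actual materials.

```python
from semopy import Model
import pandas as pd

MODEL_DESC = """
# Measurement model: each construct measured by five questionnaire items
SN   =~ sn1 + sn2 + sn3 + sn4 + sn5
CR   =~ cr1 + cr2 + cr3 + cr4 + cr5
INT  =~ int1 + int2 + int3 + int4 + int5
SR   =~ sr1 + sr2 + sr3 + sr4 + sr5
PEOU =~ peou1 + peou2 + peou3 + peou4 + peou5
PU   =~ pu1 + pu2 + pu3 + pu4 + pu5
ATT  =~ att1 + att2 + att3 + att4 + att5
BI   =~ bi1 + bi2 + bi3 + bi4 + bi5

# Structural model: paths reported as significant in the abstract
PEOU ~ SN + CR + INT + SR   # external variables -> perceived ease of use
PU   ~ SN + CR + SR + PEOU  # interactivity -> PU was not supported
ATT  ~ PU + PEOU            # cognitive constructs -> attitude
BI   ~ ATT + PU             # attitude and usefulness -> intention
"""

# Hypothetical survey file: one row per participant (n = 438),
# one column per questionnaire item.
data = pd.read_csv("mall_survey.csv")

model = Model(MODEL_DESC)
model.fit(data)
print(model.inspect())  # path estimates, standard errors, p-values
```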
Procedia PDF Downloads 238
4150 Life-Cycle Assessment of Residential Buildings: Addressing the Influence of Commuting
Authors: J. Bastos, P. Marques, S. Batterman, F. Freire
Abstract:
Due to the demands of a growing urban population, it is crucial to manage urban development and its associated environmental impacts. While most environmental analyses have addressed buildings and transportation separately, both the design and the location of a building affect its environmental performance, and focusing on one or the other can shift impacts and overlook improvement opportunities for more sustainable urban development. Recently, several life-cycle (LC) studies of residential buildings have integrated user transportation, focusing exclusively on primary energy demand and/or greenhouse gas emissions. Additionally, most papers considered only private transportation (mainly the car). Although private transportation is likely to have the largest share both in terms of use and associated impacts, exploring the variability associated with mode choice is relevant for comprehensive assessments and, eventually, for supporting decision-makers. This paper presents a life-cycle assessment (LCA) of a residential building in Lisbon (Portugal), addressing building construction, building use and user transportation (commuting with private and public transportation). Five environmental indicators or categories are considered: (i) non-renewable primary energy (NRE), (ii) greenhouse gas intensity (GHG), (iii) eutrophication (EUT), (iv) acidification (ACID), and (v) ozone layer depletion (OLD). In a first stage, the analysis addresses the overall life-cycle considering the statistical mode mix for commuting in the residence location. A comparative analysis then examines the available transportation modes to address the influence of mode choice variability on the results. The results highlight the large contribution of transportation to the overall LC results in all categories. NRE and GHG show strong correlation, as the three LC phases contribute similar shares to both: building construction accounts for 6-9%, building use for 44-45%, and user transportation for 48% of the overall results. For the other impact categories, however, there is large variation in the relative contribution of each phase. Transport is the most significant phase for OLD (60%), whereas for EUT and ACID building use has the largest contribution to the overall LC (55% and 64%, respectively); in these categories, transportation accounts for 31-38%. A comparative analysis was also performed for four alternative transport modes for the household commuting: car, bus, motorcycle, and company/school collective transport. The car has the largest results in all impact categories. When compared to the overall LC with commuting by car, mode choice accounts for a variability of about 35% in NRE, GHG and OLD (the categories where transportation accounted for the largest share of the LC), 24% in EUT and 16% in ACID. NRE and GHG show a strong correlation because all modes have internal combustion engines. The second largest results for NRE, GHG and OLD are associated with commuting by motorcycle; for ACID and EUT, however, this mode performs better than bus and company/school transport. No single transportation mode performed best in all impact categories. Integrated assessments of buildings are needed to avoid shifting impacts between life-cycle phases and environmental categories, and ultimately to support decision-makers. Keywords: environmental impacts, LCA, Lisbon, transport
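The phase shares quoted above follow from a simple normalization of each phase's contribution within each impact category. The sketch below shows that computation; the absolute contribution values are made-up placeholders, not the paper's inventory data.

```python
import pandas as pd

# Rows: life-cycle phases; columns: impact categories
# (arbitrary units per category, placeholder values only).
contributions = pd.DataFrame(
    {
        "NRE": [70.0, 450.0, 480.0],
        "GHG": [8.0, 44.0, 48.0],
        "OLD": [0.5, 1.5, 3.0],
    },
    index=["construction", "building use", "user transportation"],
)

# Relative share of each phase within each impact category.
shares = contributions.div(contributions.sum(axis=0), axis=1)
print((100 * shares).round(1))  # percent contribution of each phase
```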
Procedia PDF Downloads 362
4149 Towards a Systematic Evaluation of Web Design
Authors: Ivayla Trifonova, Naoum Jamous, Holger Schrödl
Abstract:
A good web design is a prerequisite for a successful business nowadays, especially since the internet is the most common way for people to inform themselves. Web design includes the optical composition, the structure, and the user guidance of websites. The importance of each website leads to the question of whether there is a way to measure its usefulness. The aim of this paper is to suggest a methodology for the evaluation of web design; the desired outcome is an evaluation that concentrates on a specific website and its target group. Keywords: evaluation methodology, factor analysis, target group, web design
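To give a flavour of the factor analysis named in the keywords, below is a minimal sketch of reducing target-group questionnaire ratings of web design criteria to a few latent factors. The criteria names and synthetic ratings are assumptions for illustration, not the paper's instrument.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

criteria = ["layout", "navigation", "readability", "load_time", "search"]

# Synthetic stand-in for survey data: 200 users rating each criterion
# on a 1-7 scale.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 8, size=(200, len(criteria))).astype(float)

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(ratings)

# Loadings: how strongly each criterion maps onto each latent factor.
for name, loadings in zip(criteria, fa.components_.T):
    print(f"{name:12s} {loadings.round(2)}")
```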
Procedia PDF Downloads 635
4148 Parameter Identification Analysis in the Design of Rock Fill Dams
Authors: G. Shahzadi, A. Soulaimani
Abstract:
This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis has been utilized for the numerical simulation. Polynomial and neural-network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements, and the performance of these surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been done based on objective functions and optimization techniques. The objective functions are categorized by considering measured data with and without instrument uncertainty, and are defined by the least squares method, which estimates the norm between the predicted displacements and the measured values. Hydro Quebec provided the data sets of measured values for the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and Differential Evolution (DE) are compared for the minimization problem; all of these techniques take time to converge to an optimum value, but PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis can be effectively used for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers for assessing dam performance and dam safety. Keywords: rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS
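The inverse-analysis loop can be sketched as follows: search for the soil parameters that minimize the least-squares misfit between measured and predicted displacements. A cheap analytic surrogate stands in for the Plaxis finite element run (or a trained response surface), and all parameter names, bounds and measurements are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical measured displacements (mm) at four load levels.
measured = np.array([12.1, 18.4, 22.9, 25.3])

def predict_displacements(params):
    """Stand-in for a Plaxis run or a trained response surface."""
    E, phi = params  # Young's modulus (MPa), friction angle (deg)
    load_levels = np.array([0.25, 0.5, 0.75, 1.0])
    return 1e4 * load_levels / E * (45.0 / phi)

def objective(params):
    """Least-squares norm between predicted and measured displacements."""
    residual = predict_displacements(params) - measured
    return float(np.sum(residual**2))

bounds = [(20.0, 200.0),  # E in MPa
          (25.0, 45.0)]   # phi in degrees

result = differential_evolution(objective, bounds, seed=1, tol=1e-8)
print("best parameters:", result.x, "misfit:", result.fun)
```

The same objective can be handed to a GA or PSO implementation; only the search strategy changes, which is what the paper's comparison exploits.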
Procedia PDF Downloads 146
4147 A Hybrid Multi-Criteria Hotel Recommender System Using Explicit and Implicit Feedbacks
Authors: Ashkan Ebadi, Adam Krzyzak
Abstract:
Recommender systems, also known as recommender engines, have become an important research area and are now being applied in various fields. In addition, the techniques behind recommender systems have improved over time. In general, such systems help users to find the products or services they require (e.g. books, music) by analyzing and aggregating other users’ activities and behavior, mainly in the form of reviews, and making the best recommendations. These recommendations can facilitate the user’s decision-making process. Despite the wide literature on the topic, using multiple data sources of different types as the input has not been widely studied. Recommender systems can benefit from the high availability of digital data to collect input data of different types, which implicitly or explicitly helps the system to improve its accuracy. Moreover, most of the existing research in this area is based on single rating measures, in which a single rating is used to link users to items. This paper proposes a highly accurate hotel recommender system, implemented in several layers. Using a multi-aspect rating system and benefitting from large-scale data of different types, the recommender system suggests hotels that are personalized and tailored to the given user. The system employs natural language processing and topic modelling techniques to assess the sentiment of users’ reviews and extract implicit features. The entire recommender engine contains multiple sub-systems, namely user clustering, a matrix factorization module, and a hybrid recommender system. Each sub-system contributes to the final composite set of recommendations by covering a specific aspect of the problem. The accuracy of the proposed recommender system has been tested extensively, and the results confirm the high performance of the system. Keywords: tourism, hotel recommender system, hybrid, implicit features
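The matrix factorization sub-system can be sketched as learning latent user and hotel factors from a sparse rating matrix by stochastic gradient descent. The dimensions, learning rate and toy ratings below are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def factorize(ratings, k=8, lr=0.01, reg=0.05, epochs=200, seed=0):
    """ratings: dense array with np.nan marking unobserved user-hotel pairs."""
    rng = np.random.default_rng(seed)
    n_users, n_items = ratings.shape
    P = 0.1 * rng.standard_normal((n_users, k))  # latent user factors
    Q = 0.1 * rng.standard_normal((n_items, k))  # latent hotel factors
    observed = np.argwhere(~np.isnan(ratings))   # (user, hotel) index pairs
    for _ in range(epochs):
        rng.shuffle(observed)
        for u, i in observed:
            err = ratings[u, i] - P[u] @ Q[i]
            # Regularized SGD updates on both factor vectors.
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

# Toy rating matrix: three users, three hotels, some cells unobserved.
R = np.array([[5.0, np.nan, 3.0],
              [4.0, 2.0, np.nan],
              [np.nan, 1.0, 4.0]])
P, Q = factorize(R)
print((P @ Q.T).round(2))  # predicted ratings, including unobserved cells
```

In the hybrid engine described above, predictions like these would be blended with the clustering and review-based implicit features rather than used on their own.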
Procedia PDF Downloads 272
4146 The Synthesis and Analysis of Two Long Lasting Phosphorescent Compounds: SrAl2O4: Eu2+, Dy3+
Authors: Ghayah Alsaleem
Abstract:
This research project focussed on specific compounds, while a literature review was completed on the broader subject of long-lasting phosphorescence. For the review and the subsequent laboratory work, long-lasting phosphorescent compounds were defined as materials that have an afterglow decay time greater than a few minutes. The decay time is defined as the time between the end of excitation and the moment the light intensity drops below 0.32 mcd/m²; this definition is widely used in industry and in most research studies. The experimental work focused on known long-lasting phosphorescent compounds: strontium aluminate (SrAl2O4: Eu2+, Dy3+). At first, preparation followed methods from the literature; temperature, dopant levels and mixing methods were then varied in order to expose their effects on long-lasting phosphorescence. The investigation of temperature for SrAl2O4: Eu2+, Dy3+ showed that 1350°C was the only firing temperature, of those tested in the differential scanning calorimeter (DSC), at which the compound achieved any phosphorescence; however, no temperatures above 1350°C were investigated. Varying the mixing method and co-dopant level in the strontium aluminate compounds showed that the dry mixing method using a Turbula mixer resulted in the longest afterglow. It was also found that increasing the europium content from 1 mol% to 2 mol% increased the brightness of the phosphorescence. When batches were mixed using sonication, however, the afterglow time was reduced: green long-lasting phosphorescence persisted for up to 20 minutes following 30 minutes of excitation, and up to 50 minutes when the europium content was doubled. Keywords: long lasting, phosphorescence, excitation, europium
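The decay-time definition above reduces to a simple threshold test on the measured afterglow curve. Below is a minimal sketch of that computation; the sampled decay curve is synthetic, not measured data from this study.

```python
import numpy as np

THRESHOLD = 0.32  # mcd/m^2, the industry-standard afterglow cutoff

def decay_time(t, intensity):
    """t: seconds since end of excitation; returns first time the
    intensity falls below the threshold, or None if it never does."""
    below = np.nonzero(intensity < THRESHOLD)[0]
    return t[below[0]] if below.size else None

t = np.arange(0, 3600, 1.0)            # one hour, 1 s sampling
intensity = 50.0 * np.exp(-t / 240.0)  # synthetic exponential afterglow
print(decay_time(t, intensity), "s")   # ~ 240 * ln(50/0.32) ≈ 1212 s
```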
Procedia PDF Downloads 181