Search results for: analysis and real time information about liquefaction
44397 Affective Robots: Evaluation of Automatic Emotion Recognition Approaches on a Humanoid Robot towards Emotionally Intelligent Machines
Authors: Silvia Santano Guillén, Luigi Lo Iacono, Christian Meder
Abstract:
One of the main aims of current social robotic research is to improve robots' abilities to interact with humans. In order to achieve an interaction similar to that among humans, robots should be able to communicate in an intuitive and natural way and appropriately interpret human affects during social interactions. Similarly to how humans are able to recognize emotions in other humans, machines are capable of extracting information from the various ways humans convey emotions, including facial expression, speech, gesture or text, and using this information for improved human-computer interaction. This can be described as Affective Computing, an interdisciplinary field that spans otherwise distinct fields such as psychology and cognitive science and involves the research and development of systems that can recognize and interpret human affects. Embedding these emotional capabilities in humanoid robots is the foundation of the concept of Affective Robots, whose objective is to make robots capable of sensing the user's current mood and personality traits and adapting their behavior accordingly. In this paper, the emotion recognition capabilities of the humanoid robot Pepper are experimentally explored based on facial expressions for the so-called basic emotions, and its performance is contrasted with other state-of-the-art approaches using both expression databases compiled in academic environments and real subjects showing posed expressions as well as spontaneous emotional reactions. The experiments' results show that the detection accuracy among the evaluated approaches differs substantially. The introduced experiments offer a general structure and approach for conducting such experimental evaluations.
The paper further suggests that the most meaningful results are obtained by conducting experiments with real subjects expressing the emotions as spontaneous reactions.
Keywords: affective computing, emotion recognition, humanoid robot, human-robot interaction (HRI), social robots
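The accuracy comparison the abstract describes can be sketched in a few lines: given ground-truth emotion labels and the outputs of two recognition approaches on the same stimuli, compute each approach's detection accuracy and a confusion count. The labels and approach outputs below are hypothetical placeholders, not the paper's data.

```python
from collections import Counter

def accuracy(true_labels, predicted_labels):
    """Fraction of expressions recognized correctly."""
    hits = sum(t == p for t, p in zip(true_labels, predicted_labels))
    return hits / len(true_labels)

def confusion(true_labels, predicted_labels):
    """Counts of (true, predicted) pairs, showing which emotions get confused."""
    return Counter(zip(true_labels, predicted_labels))

# Hypothetical ground truth and two recognition approaches (toy data)
truth      = ["happiness", "anger", "fear", "sadness", "surprise", "fear"]
approach_a = ["happiness", "anger", "surprise", "sadness", "surprise", "fear"]
approach_b = ["happiness", "sadness", "fear", "sadness", "happiness", "anger"]

acc_a = accuracy(truth, approach_a)  # 5 of 6 correct
acc_b = accuracy(truth, approach_b)  # 3 of 6 correct
```

A confusion count such as `confusion(truth, approach_a)[("fear", "surprise")]` then shows which basic emotions a given approach tends to mix up.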
Procedia PDF Downloads 235
44396 A Tool for Facilitating an Institutional Risk Profile Definition
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach for the easy creation of an institutional risk profile for endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to set up risk factors with just the values that are most important for a particular organisation. Subsequently, the risk profile employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base aggregated from a digital preservation survey in order to detect preservation risks for a particular institution. Another contribution is support for visualisation and analysis of risk factors for a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and automatically aggregated file format metadata from linked open data sources. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert. The sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
Keywords: digital information management, file format, endangerment analysis, fuzzy models
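The aggregation idea behind such a risk profile can be sketched as a weighted combination of risk-factor values mapped to a coarse endangerment level. This is a simplified stand-in for the paper's fuzzy models; the factor names, weights, and thresholds below are assumptions, not the survey-derived values.

```python
def endangerment_score(factors, weights):
    """Weighted average of risk-factor values, each in [0, 1]."""
    total_w = sum(weights[k] for k in factors)
    return sum(factors[k] * weights[k] for k in factors) / total_w

def endangerment_level(score):
    """Map a numeric score to a coarse, fuzzy-style level (thresholds assumed)."""
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "medium"
    return "high"

# Hypothetical risk factors for one file format and expert-assigned weights
factors = {"software_support": 0.8, "format_complexity": 0.6, "community_adoption": 0.2}
weights = {"software_support": 3.0, "format_complexity": 2.0, "community_adoption": 1.0}

score = endangerment_score(factors, weights)
level = endangerment_level(score)
```

Each factor here is one dimension of the multidimensional vector the abstract mentions, so individual dimensions can still be inspected before aggregation.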
Procedia PDF Downloads 404
44395 Use of PACER Application as Physical Activity Assessment Tool: Results of a Reliability and Validity Study
Authors: Carine Platat, Fatima Qshadi, Ghofran Kayed, Nour Hussein, Amjad Jarrar, Habiba Ali
Abstract:
Nowadays, smartphones are very popular. They offer a variety of free, easy-to-use applications, among which are step counters and fitness tests. The number of users is huge, making such applications a potentially efficient new strategy to encourage people to become more active. Nonetheless, data on their reliability and validity are very scarce and, when available, often negative and contradictory. Besides, weight status, which is likely to introduce a bias in physical activity assessment, was not often considered. Hence, the use of these applications as a motivational tool, an assessment tool, or in research is questionable. PACER is one such free step counter application. Even though it is one of the best-rated free applications by users, it has never been tested for reliability and validity. This must be investigated prior to any use of PACER. The objective of this work is to investigate the reliability and validity of the smartphone application PACER in measuring the number of steps and in assessing cardiorespiratory fitness by the 6-minute walking test. Twenty overweight or obese students (10 male and 10 female), aged 18 to 25, were recruited at the United Arab Emirates University. Reliability and validity were tested in real life conditions and in controlled conditions on a treadmill. Test-retest experiments were done with PACER on two days separated by a week in real life conditions (24 hours each time) and in controlled conditions (30 minutes on a treadmill at 3 km/h). Validity was tested against the OMRON pedometer in the same conditions. During the treadmill test, video was recorded and step counts were compared between PACER, the pedometer and the video. The validity of PACER in estimating cardiorespiratory fitness (VO2max) as part of the 6-minute walking test (6MWT) was studied against the 20 m shuttle running test.
Reliability was studied by calculating intraclass correlation coefficients (ICC), 95% confidence intervals (95%CI) and by Bland-Altman plots. Validity was studied by calculating Spearman correlation coefficients (rho) and Bland-Altman plots. PACER reliability was good in both males and females in real life conditions (p≤10⁻³) but only in females in controlled conditions (p=0.01). PACER was valid against the OMRON pedometer in males and females in real life conditions (rho=0.94, p≤10⁻³; rho=0.64, p=0.01, in males and females respectively). In controlled conditions, PACER was not valid against the pedometer. But PACER was valid against video in females (rho=0.72, p≤10⁻³). PACER was valid against the shuttle run test in males and females (rho=0.66, p=0.01; rho=0.51, p=0.04) to estimate VO2max. This study provides data on the reliability and validity of PACER in overweight or obese male and female young adults. Globally, PACER was shown to be reliable and valid in real life conditions in overweight or obese males and females to count steps and assess fitness. This supports the use of PACER to assess and promote physical activity in clinical follow-up and community interventions.
Keywords: smartphone application, pacer, reliability, validity, steps, fitness, physical activity
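The two validity statistics used above are easy to sketch: Spearman's rho as the Pearson correlation of ranks, and a Bland-Altman summary as the mean difference (bias) with 95% limits of agreement. The step counts below are hypothetical, not the study's measurements; the rank routine assumes no ties.

```python
import statistics

def ranks(xs):
    """Rank positions 1..n (assumes no tied values)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman_rho(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = statistics.mean(rx), statistics.mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def bland_altman(x, y):
    """Bias and 95% limits of agreement for paired measurements."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical daily step counts: app vs pedometer for six subjects
app_steps = [8200, 10450, 6100, 12900, 9400, 7800]
ped_steps = [8000, 10600, 6300, 12500, 9100, 8100]

rho = spearman_rho(app_steps, ped_steps)
bias, (loa_low, loa_high) = bland_altman(app_steps, ped_steps)
```

With tie-free data this reproduces the classical formula rho = 1 - 6*Σd²/(n(n²-1)).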
Procedia PDF Downloads 452
44394 Rise in Public Interest in COVID-19 Symptoms and the Need for Proper Information: Insights from the Google Trends Analysis
Authors: Jaweriya Aftab, Madho Mal, Hamida Memon
Abstract:
The first case of coronavirus disease 2019 (COVID-19) in Pakistan was recorded on February 26th, 2020. While the country went through various phases of lockdowns, the importance of proper sensitization campaigns was highlighted by healthcare workers to combat misinformation. Past studies via Google Trends analysis have shown a rise in public interest in multiple COVID-19-related symptoms as well as cardiovascular symptoms. As there is a paucity of data related to these trends in Pakistan, we conducted a retrospective analysis to bridge this information gap. Methods: As per the recommendations of past studies, a Google Trends analysis was conducted for various symptoms, including ‘Fever’, ‘Chest Pain’, ‘Shortness of Breath’, and ‘Cough’, from 1st January 2019 to 31st December 2021. The trends in the various search results were analyzed and modeled. Results: Our analysis found rises in public interest in the four symptoms (fever, chest pain, shortness of breath, and cough) that correspond closely to the waves of the virus's spread in the country. Conclusion: Our study confirms trends in Pakistan similar to those previously reported in studies from India, the USA, and the UK, whereby public interest in various COVID-19 symptoms rose with the number of cases. This further highlights the need for a strong approach to combat misinformation during such a critical period.
Keywords: covid, trend, Pakistan, public
Procedia PDF Downloads 38
44393 Virtual and Visual Reconstructions in Museum Expositions
Authors: Ekaterina Razuvalova, Konstantin Rudenko
Abstract:
In this article, the most successful international examples of visual and virtual reconstructions of historical and cultural objects, based on information and communication technologies, are presented. 3D reconstructions can demonstrate outward appearance and visualize different hypotheses connected to the represented object. Virtual reality can give us any daytime and season, any century and environment. We can see how different people from different countries and eras lived; we can get different information about any object; we can see historical complexes in their real city environment, even those that are damaged or vanished. These innovations confirm the fact that 3D reconstruction is important in museum development. Considering the most interesting examples of visual and virtual reconstructions, we can notice that a visual reconstruction is a 3D image of different objects, historical complexes, buildings and phenomena. They are constant, and we can see them only as momentary objects. A virtual reconstruction, by contrast, is an environment with its own time, rules and phenomena. These reconstructions are continuous; seasons, daytime and natural conditions can change there. They can demonstrate the abilities of virtual world existence. In conclusion: new technologies give us opportunities to expand the boundaries of museum space, improve the abilities of museum expositions, and create an emotional atmosphere of game immersion, which can interest the visitor. Usage of network sources allows increasing the number of visitors, and virtual reconstruction opportunities show the creative side of the museum business.
Keywords: computer technologies, historical reconstruction, museums, museum expositions, virtual reconstruction
Procedia PDF Downloads 329
44392 Development and Validation of Thermal Stability in Complex System ABDM has two ASIC by NISA and COMSOL Tools
Authors: A. Oukaira, A. Lakhssassi, O. Ettahri
Abstract:
To achieve good thermal management in an ABDM (Adapter Board Detector Module) card, we must first control the temperature and its gradient from the first step in the design of the ASIC integrated circuits of our complex system. In this paper, our main goal is to develop and validate the thermal stability analysis in order to get an idea of the flow of heat around the ASIC in the transient regime and thus address the thermal issues for integrated circuits on the ABDM card. However, we need heat source simulations for the ABDM card to establish its thermal mapping. This led us to perform simulations at each ASIC, which allows us to understand the ABDM thermal map and find real solutions for each unit of our complex system, which contains 36 ABDM cards, taking into account the different layers around the ASIC. To do a transient simulation under NISA, we had to build a function of power modulation in time, TIMEAMP. The maximum power generated in the ASIC is 0.6 W. We divided the power uniformly over the volume of the ASIC. This power was applied for 5 seconds to visualize the evolution and distribution of heat around the ASIC. The DBC (Dirichlet boundary conditions) method was applied around the ABDM at 25°C. Just after these simulations in the NISA tool, we validate them with the COMSOL tool, which is a modular finite-element numerical simulation software for modeling a wide variety of physical phenomena characterizing a real problem. It will also be a design tool with its ability to handle 3D geometries for complex systems.
Keywords: ABDM, APD, thermal mapping, complex system
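The transient setup described above (a uniform heat source applied for 5 seconds, Dirichlet boundaries held at 25°C) can be illustrated with a minimal 1-D explicit finite-difference sketch. The geometry, material constants, and mesh below are assumptions for illustration, not values from the ABDM/NISA model.

```python
def simulate(n=21, dx=1e-3, dt=1e-3, steps=5000,
             alpha=1e-4, source_w=0.6, rho_cp=2e6, t_boundary=25.0):
    """Explicit FTCS scheme for 1-D transient conduction with a uniform source.

    n nodes over a bar, Dirichlet ends at t_boundary, 0.6 W spread over the
    bar as a volumetric source, run for steps*dt seconds (5 s with defaults).
    Stability requires alpha*dt/dx**2 <= 0.5 (here it is 0.1).
    """
    temps = [t_boundary] * n
    q = source_w / (n * dx)  # crude volumetric source term (assumed geometry)
    for _ in range(steps):
        new = temps[:]
        for i in range(1, n - 1):
            lap = (temps[i - 1] - 2 * temps[i] + temps[i + 1]) / dx ** 2
            new[i] = temps[i] + dt * (alpha * lap + q / rho_cp)
        temps = new  # Dirichlet ends never updated, so they stay at 25 °C
    return temps

profile = simulate()
```

By symmetry of the source and boundaries, the hottest node sits at the midpoint, which is the qualitative picture a thermal map of the heat flow around a centered ASIC would show.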
Procedia PDF Downloads 264
44391 Biomass Waste-To-Energy Technical Feasibility Analysis: A Case Study for Processing of Wood Waste in Malta
Authors: G. A. Asciak, C. Camilleri, A. Rizzo
Abstract:
The waste management in Malta is a national challenge. Coupled with Malta's recent economic boom, which has seen massive growth in several sectors, especially the construction industry, drastic actions need to be taken. Wood waste, currently being dumped in landfills, is one type of waste which has increased astronomically. This research study aims to carry out a thorough examination of the possibility of using this waste as a biomass resource and adopting a waste-to-energy technology in order to generate electrical energy. This study is composed of three distinct yet interdependent phases, namely, data collection from local SMEs, thermal analysis using a bomb calorimeter, and generation of energy from wood waste using a micro biomass plant. Data collection from SMEs specializing in woodwork was carried out to obtain information regarding the available types of wood waste and the annual weight of imported wood, and to analyse the manner in which wood shavings are used after wood is manufactured. From this analysis, it resulted that the five most common types of wood available in Malta which would be suitable for generating energy are Oak (hardwood), Beech (hardwood), Red Beech (softwood), African Walnut (softwood) and Iroko (hardwood). Subsequently, based on the information collected, a thermal analysis of the five most common types of wood was performed using a 6200 Isoperibol calorimeter. This analysis was done so as to give a clear indication with regard to the burning potential, which is valuable when testing the wood in the biomass plant. The experiments carried out in this phase provided a clear indication that African Walnut generated the highest gross calorific value. This means that this type of wood released the highest amount of heat during combustion in the calorimeter. This is due to the high presence of extractives and lignin, which accounts for a slightly higher gross calorific value. It is followed by Red Beech and Oak.
Moreover, based on the findings of the first phase, both African Walnut and Red Beech are highly imported into the Maltese Islands for various purposes. Oak, which has the third highest gross calorific value, is the most imported and commonly used wood. From the five types of wood, three were chosen for use in the power plant on the basis of their popularity and their heating values. The PP20 biomass plant was used to burn the three types of shavings in order to compare results related to the estimated feedstock consumed by the plant, the high temperatures generated, the time taken by the plant to reach gasification temperatures, and the projected electrical power attributed to each wood type. From the experiments, it emerged that all three types reached the required gasification temperature and thus are feasible for electrical energy generation. African Walnut was deemed to be the most suitable fast-burning fuel, followed by Red Beech and Oak, which required a longer period of time to reach the required gasification temperatures. The results obtained provide a clear indication that wood waste can be treated and used for energy generation rather than being dumped in landfill.
Keywords: biomass, isoperibol calorimeter, waste-to-energy technology, wood
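The bomb-calorimeter reduction behind the gross calorific values above follows a standard form: heat released equals the calorimeter's energy equivalent times the temperature rise, minus run corrections (fuse wire, acid formation), divided by sample mass. The sketch below uses that textbook formula with made-up run data; the numbers are not the study's measurements.

```python
def gross_calorific_value(delta_t_k, energy_equiv_j_per_k, corrections_j, sample_mass_g):
    """Gross calorific value in J/g from one bomb-calorimeter run."""
    return (energy_equiv_j_per_k * delta_t_k - corrections_j) / sample_mass_g

# Hypothetical runs for two wood samples (illustrative values only)
walnut_gcv = gross_calorific_value(2.10, 10150.0, 60.0, 1.05)
oak_gcv = gross_calorific_value(1.95, 10150.0, 60.0, 1.05)
```

With identical calorimeter and sample mass, the sample with the larger temperature rise (the walnut run here) yields the higher gross calorific value, which is the comparison the study draws between wood types.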
Procedia PDF Downloads 243
44390 A 0-1 Goal Programming Approach to Optimize the Layout of Hospital Units: A Case Study in an Emergency Department in Seoul
Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee
Abstract:
This paper proposes a method to optimize the layout of an emergency department (ED) based on real executions of care processes, considering several planning objectives simultaneously. Recently, demand for healthcare services has dramatically increased. As the demand for healthcare services increases, so does the need for new healthcare buildings as well as for redesigning and renovating existing ones. The value of implementing a standard set of engineering facilities planning and design techniques has already been proven in both manufacturing and the service industry, with many significant functional efficiencies. However, the high complexity of care processes remains a major challenge to applying these methods in healthcare environments. Process mining techniques were applied in this study to tackle the problem of complexity and to enhance care process analysis. Process-related information, such as clinical pathways, was extracted from the information system of an ED. A 0-1 goal programming approach is then proposed to find a single layout that simultaneously satisfies several goals. The proposed model was solved by the optimization software CPLEX 12. The solution reached using the proposed method has a 42.2% improvement in terms of walking distance of normal patients and a 47.6% improvement in walking distance of critical patients, at minimum cost of relocation. It has been observed that many patients must unnecessarily walk long distances during their visit to the emergency department because of an inefficient design. A carefully designed layout can significantly decrease patient walking distance and related complications.
Keywords: healthcare operation management, goal programming, facility layout problem, process mining, clinical processes
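The core assignment idea can be illustrated without a commercial solver. The paper's 0-1 goal programming model balances several goals in CPLEX; as a simplified stand-in, the sketch below brute-forces a tiny unit-to-location assignment that minimizes flow-weighted walking distance (the single-objective core of the problem). The flow and distance matrices are hypothetical toy data.

```python
from itertools import permutations

# Toy instance: 3 ED units, 3 candidate locations (hypothetical data).
# flows[u][v]: trips between units u and v, as mined from clinical pathways
# dist[a][b]: walking distance between candidate locations a and b
flows = [[0, 8, 2], [8, 0, 5], [2, 5, 0]]
dist = [[0, 10, 30], [10, 0, 15], [30, 15, 0]]

def total_walk(assign):
    """assign[u] = location of unit u; sum of flow-weighted distances."""
    return sum(flows[u][v] * dist[assign[u]][assign[v]]
               for u in range(3) for v in range(3))

# Exhaustive search over all one-to-one assignments (feasible only for tiny n;
# the paper's 0-1 formulation and solver handle realistic sizes and multiple goals)
best = min(permutations(range(3)), key=total_walk)
```

Here the heavy flow between units 0 and 1 pulls them onto the closest pair of locations, which is exactly the effect that reduced patient walking distance in the case study.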
Procedia PDF Downloads 296
44389 The Relationship of Building Information Modeling (BIM) Capability in Quantity Surveying Practice and Project Performance
Authors: P. F. Wong, H. Salleh, F. A. Rahim
Abstract:
The adoption of building information modeling (BIM) is increasing in the construction industry. However, quantity surveyors have been slower to adopt it than other professions due to a lack of awareness of BIM's potential in their profession. It is still unclear how BIM application can enhance quantity surveyors' work performance and project performance. The aim of this research is to identify the capabilities of BIM in quantity surveying practice and examine the relationship between BIM capabilities and project performance. A questionnaire survey and interviews were adopted for data collection. The literature review identified eleven BIM capabilities in quantity surveying practice. Questionnaire results showed that several BIM capabilities are significantly correlated with project performance in the time, cost and quality aspects, and the results were validated through interviews. These findings show that BIM has the capabilities to enhance quantity surveyors' performance and subsequently improve project performance.
Keywords: Building Information Modeling (BIM), quantity surveyors, capability, project performance
Procedia PDF Downloads 369
44388 Impact of Social Crisis on Property Market Performance and Evolving Strategy for Improved Property Transactions in Crisis Prone Environment: A Case Study of North Eastern Nigeria
Authors: A. Yakub AbdurRaheem
Abstract:
Urban violence in the form of ethnic and religious conflicts has been on the increase in many African cities in recent years, most of it the result of intense and bitter competition for political power and the control of limited economic, social and environmental resources. In Nigeria, the emergence of the Boko Haram insurgency in most of the northeastern parts of the country has ignited violence, bloodshed, refugee exodus and internal migration. Not only do the persistent attacks of the sect create widespread insecurity and fear, but they have also stifled normal processes of trade and investment, most especially real property investment, which is acclaimed to accelerate the economic cycle; hence the need to evolve strategies for an improved property market in such areas. This paper, therefore, examines the impact of this social crisis on the effective and efficient utilization of real property as a resource for the development of the economy, using a descriptive analysis approach with particular emphasis on trends in residential housing values, the volume of estimated property transactions, and the real estate investment decisions of affected individuals. Findings indicate that social crisis in the affected areas has been a clog on the wheels of property development and investment, as properties worth hundreds of millions have been destroyed, thereby greatly affecting property values. Based on these findings, recommendations were made, including the need to strategically continue investing in property during such times, the need for the Nigerian government to establish an active conflict monitoring and management unit for prompt response, and the need to encourage community and neighborhood policing to ameliorate security challenges in Nigeria.
Keywords: social crisis, economy, resources, property market
Procedia PDF Downloads 237
44387 Analyzing Apposition and the Typology of Specific Reference in Newspaper Discourse in Nigeria
Authors: Monday Agbonica Bello Eje
Abstract:
The language of the print media is characterized by the use of apposition. This linguistic element functions strategically in journalistic discourse, where it is communicatively necessary to name individuals and provide information about them. Linguistic studies on the language of the print media with a bias for apposition have largely dwelt on other areas rather than the examination of the typology of appositive reference in newspaper discourse. Yet, such an examination is capable of revealing the ways writers communicate and provide the information necessary for readers to follow and understand the message. The study, therefore, analyses the patterns of appositional occurrences and the typology of reference in newspaper articles. The data were obtained from The Punch and Daily Trust newspapers. A total of six editions of these newspapers were collected randomly, spread over three months. News and feature articles were used in the analysis. Guided by the referential theory of meaning in discourse, the appositions identified were subjected to analysis. The findings show that the semantic relations of coreference and speaker coreference have the highest percentage and frequency of occurrence in the data. This is because the subject matter of news reports and feature articles focuses on humans and the events around them; as a result, readers need to be provided with some form of detail and background information in order to identify as well as follow the discourse. Also, the non-referential relations of absolute synonymy and speaker synonymy have fewer occurrences and lower percentages in the analysis. This is tied to a major feature of the language of the media: simplicity. The paper concludes that apposition is mainly used for the purpose of providing the reader with much detail.
In this way, the writer transmits information which helps him not only to give detailed yet concise descriptions but also, in some way, to help the reader follow the discourse.
Keywords: apposition, discourse, newspaper, Nigeria, reference
Procedia PDF Downloads 174
44386 Evidence-Based Investigation of the Phonology of Nigerian Instant Messaging
Authors: Emmanuel Uba, Lily Chimuanya, Maryam Tar
Abstract:
Orthographic engineering is no longer the preserve of the Short Messaging Service (SMS), which is characterised by limited space. Such stylistic creativity or deviation is fast creeping into real-time messaging, popularly known as Instant Messaging (IM), despite the large number of characters allowed. This occurs at various linguistic levels: phonology, morphology, syntax, etc. Nigerians are not immune to this linguistic stylisation. This study investigates the phonological and meta-phonological conventions of the messages sent and received via WhatsApp by Nigerian graduates. This is an ontological study of 250 instant messages collected from 98 graduates from different ethnic groups in Nigeria. The selection and analysis of the messages are based on the figure and ground principle. The results reveal the use of accent stylisation, phoneme substitution, blending, consonantisation (a specialised form of deletion targeting vowels), numerophony (using a figure/number, usually 1-10, to represent a word or syllable that has the same sound) and phonetic respelling in the IMs sent by Nigerians. The study confirms the existence of linguistic creativity.
Keywords: figure and ground principle, instant messaging, linguistic stylisation, meta-phonology
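The conventions the abstract names (numerophony, consonantisation, phonetic respelling) can be made concrete with a tiny rule-based normaliser. The rule table below is a hypothetical illustration of the phenomena, not the study's attested corpus forms.

```python
import re

# Illustrative rules for the conventions described above (examples assumed):
# numerophony ("gr8", "2moro"), consonantisation ("pls"), phonetic respelling ("wot")
RULES = [
    (re.compile(r"\bgr8\b"), "great"),       # numerophony: 8 stands for /eɪt/
    (re.compile(r"\b2moro\b"), "tomorrow"),  # numerophony: 2 stands for /tu/
    (re.compile(r"\bpls\b"), "please"),      # consonantisation: vowels deleted
    (re.compile(r"\bwot\b"), "what"),        # phonetic respelling
]

def normalise(message):
    """Rewrite each stylised form to its standard spelling."""
    for pattern, standard in RULES:
        message = pattern.sub(standard, message)
    return message

out = normalise("pls reply 2moro, wot happened? gr8 job")
# out == "please reply tomorrow, what happened? great job"
```

Running the table in reverse direction would correspond to generating the stylised IM forms from standard orthography.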
Procedia PDF Downloads 397
44385 Combination of Geological, Geophysical and Reservoir Engineering Analyses in Field Development: A Case Study
Authors: Atif Zafar, Fan Haijun
Abstract:
A sequence of different reservoir engineering methods and tools for reservoir characterization and field development is presented in this paper. The real data of the Jin Gas Field of the L-Basin of Pakistan are used. The basic concept behind this work is to highlight the importance of well test analysis in a broader sense (i.e., reservoir characterization and field development), rather than merely determining the permeability and skin parameters. Normally, in reservoir characterization, we rely on well test analysis to some extent, but for field development planning, well test analysis has become a forgotten tool, specifically for locating new development wells. This paper describes the successful implementation of well test analysis in the Jin Gas Field, where the main uncertainties were identified during the initial stage of field development, when the location of the new development well was marked only on the basis of G&G (geologic and geophysical) data. The seismic interpretation could not detect one of the boundaries (fault, sub-seismic fault, heterogeneity) near the main and only producing well of the Jin Gas Field, whereas the results of the model from the well test analysis played a crucial role in proposing the location of the second well of the newly discovered field. The results from the different methods of well test analysis of the Jin Gas Field are also integrated with, and supported by, other reservoir engineering tools, i.e., the material balance method and the volumetric method. In this way, a comprehensive workflow and algorithm is obtained for integrating well test analyses with geological and geophysical analyses for reservoir characterization and field development. On the basis of this workflow and algorithm, it was established that the proposed location of the new development well was not justified and that it should be placed elsewhere, away from the south direction.
Keywords: field development plan, reservoir characterization, reservoir engineering, well test analysis
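The material balance method mentioned above can be sketched, for a volumetric gas reservoir, as the classical straight-line analysis: p/z declines linearly with cumulative production Gp, and extrapolating the line to p/z = 0 estimates original gas in place (OGIP). The production data below are hypothetical, not the Jin Gas Field's.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept of y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical depletion history: cumulative gas produced (Bscf) and p/z (psia)
gp = [0.0, 2.0, 4.0, 6.0]
p_over_z = [4000.0, 3700.0, 3400.0, 3100.0]

slope, intercept = fit_line(gp, p_over_z)
ogip = -intercept / slope  # Gp at which p/z extrapolates to zero
```

An OGIP estimate from this line can then be cross-checked against the volumetric method, which is the kind of integration the workflow above describes.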
Procedia PDF Downloads 364
44384 Dynamics of a Reaction-Diffusion Problems Modeling Two Predators Competing for a Prey
Authors: Owolabi Kolade Matthew
Abstract:
In this work, we investigate both analytical and numerical studies of a dynamical model comprising a three-species system. We analyze the linear stability of stationary solutions in the one-dimensional multi-species system modeling the interactions of two predators and one prey species. The stability analysis has many implications for understanding the various spatiotemporal and chaotic behaviors of the species in the spatial domain. The analysis results presented establish the possibility of the three interacting species coexisting harmoniously; this feat is achieved by combining the local and global analyses to determine the global dynamics of the system. In the presence of diffusion, a viable exponential time differencing method is applied to the multi-species nonlinear time-dependent partial differential equations to address the points and queries that may naturally arise. The scheme is described in detail and justified by a number of computational experiments.
Keywords: asymptotically stable, coexistence, exponential time differencing method, global and local stability, predator-prey model, nonlinear, reaction-diffusion system
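The exponential time differencing idea can be shown on a scalar test problem. For u' = c*u + N(u), first-order ETD treats the linear part exactly via exp(c*h) and freezes the nonlinear part over the step: u(t+h) = exp(c*h)*u + ((exp(c*h)-1)/c)*N(u). The paper applies this to the full multi-species reaction-diffusion system; the reaction term below (c = -1, N(u) = u/(1+u)) is a made-up scalar stand-in for illustration.

```python
import math

def etd1_step(u, h, c, nonlinear):
    """One first-order exponential-time-differencing step for u' = c*u + N(u)."""
    phi = (math.exp(c * h) - 1.0) / c  # exact integral weight for the frozen N(u)
    return math.exp(c * h) * u + phi * nonlinear(u)

def integrate(u0, h, steps, c=-1.0, nonlinear=lambda u: u / (1.0 + u)):
    u = u0
    for _ in range(steps):
        u = etd1_step(u, h, c, nonlinear)
    return u

# Net reaction is u' = -u**2/(1+u), so a positive state decays toward zero
u_final = integrate(1.0, 0.1, 1000)
```

Because the stiff linear operator is handled exactly, ETD schemes tolerate much larger time steps than explicit Euler on the diffusion-dominated systems the abstract considers.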
Procedia PDF Downloads 412
44383 The Impact of Audit Committee on Real Earnings Management: Evidence from Netherlands
Authors: Sana Masmoudi, Yosra Makni
Abstract:
Regulators highlight the importance of the audit committee (AC) as a key internal corporate governance mechanism. One of the most important roles of this committee is to oversee the financial reporting process. The purpose of this paper is to examine the link between the characteristics of an audit committee and financial reporting quality by investigating whether the formation of audit committees and their characteristics are associated with improved financial reporting quality. This study provides empirical evidence of the association between audit committee independence, financial expertise, gender diversity, and meetings and real earnings management (REM) as a proxy of financial reporting quality. Using a sample of 80 companies listed on the Amsterdam Stock Exchange during 2010-2017, the study finds that independence and AC gender diversity are strongly related to financial reporting quality. In fact, these two characteristics constrain REM. The results also suggest that AC financial expertise reduces, to some extent, the likelihood of engaging in REM. These conclusions thus provide support for the audit committee requirements under the Dutch Corporate Governance Code rules regarding gender diversity and AC meetings.
Keywords: audit committee, financial expertise, independence, real earnings management
Procedia PDF Downloads 171
44382 Between AACR2 and RDA What Changes Occurs in Them
Authors: Ibrahim Abdullahi Mohammad
Abstract:
A library catalogue exists not only as an inventory of the collections of a particular library but also as a retrieval device. It is provided to assist the library user in finding whatever information or information resources they may be looking for. The paper proposes that this location objective of the library catalogue can only be fulfilled if the catalogue is constructed bearing in mind the information needs and searching behavior of the library user. Comparing AACR2 and RDA vis-à-vis the changes RDA has introduced into bibliographic standards, the paper tries to establish the level of viability of RDA in relation to AACR2.
Keywords: library catalogue, information retrieval, AACR2, RDA
Procedia PDF Downloads 54
44381 Modeling Activity Pattern Using XGBoost for Mining Smart Card Data
Authors: Eui-Jin Kim, Hasik Lee, Su-Jin Park, Dong-Kyu Kim
Abstract:
Smart-card data are expected to provide information on activity patterns as an alternative to conventional person trip surveys. The focus of this study is to propose a method trained on person trip surveys to supplement smart-card data, which do not contain the purpose of each trip. We selected only available features from smart-card data, such as spatiotemporal information on the trip and geographic information system (GIS) data near the stations, to train on the survey data. XGBoost, a state-of-the-art tree-based ensemble classifier, was used to train on data from multiple sources. This classifier uses a more regularized model formalization to control over-fitting and shows very fast execution times with good performance. The validation results showed that the proposed method efficiently estimated the trip purpose. The GIS data of the station and the duration of stay at the destination were significant features in modeling trip purpose.
Keywords: activity pattern, data fusion, smart-card, XGboost
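The boosted-trees idea behind XGBoost can be sketched in pure Python: fit depth-1 trees (stumps) to the negative gradient of the log loss and accumulate them into an additive score. This is plain gradient boosting without XGBoost's regularization term or second-order updates, and the (hour of day, duration of stay) features and "work trip" labels are hypothetical stand-ins for the smart-card attributes the study uses.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_stump(xs, residuals):
    """Best single-feature threshold split minimizing squared error of residuals."""
    best = None
    for f in range(len(xs[0])):
        for row in xs:
            thr = row[f]
            left = [r for x, r in zip(xs, residuals) if x[f] <= thr]
            right = [r for x, r in zip(xs, residuals) if x[f] > thr]
            if not left or not right:
                continue
            lv, rv = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((r - lv) ** 2 for r in left)
                   + sum((r - rv) ** 2 for r in right))
            if best is None or sse < best[0]:
                best = (sse, f, thr, lv, rv)
    _, f, thr, lv, rv = best
    return lambda x: lv if x[f] <= thr else rv

def boost(xs, ys, rounds=20, lr=0.3):
    """Gradient boosting with log loss: each stump fits y - sigmoid(score)."""
    scores = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        residuals = [y - sigmoid(s) for y, s in zip(ys, scores)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        scores = [s + lr * stump(x) for s, x in zip(scores, xs)]
    return lambda x: 1 if sigmoid(sum(lr * st(x) for st in stumps)) > 0.5 else 0

# (hour of day, duration of stay in hours) -> 1 = "work" trip, 0 = other (toy data)
X = [(7, 9.0), (8, 8.5), (9, 8.0), (12, 1.0), (18, 2.0), (20, 1.5)]
y = [1, 1, 1, 0, 0, 0]
predict = boost(X, y)
```

Duration of stay separates the toy classes on its own, mirroring the study's finding that duration at the destination was a significant feature for trip purpose.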
Procedia PDF Downloads 247
44380 Comparison and Improvement of the Existing Cone Penetration Test Results: Shear Wave Velocity Correlations for Hungarian Soils
Authors: Ákos Wolf, Richard P. Ray
Abstract:
Due to the introduction of Eurocode 8, the structural design for seismic and dynamic effects has become more significant in Hungary. This has emphasized the need for more effort to describe the behavior of structures under these conditions. Soil conditions have a significant effect on the response of structures by modifying the stiffness and damping of the soil-structural system and by modifying the seismic action as it reaches the ground surface. Shear modulus (G) and shear wave velocity (vs), which are often measured in the field, are the fundamental dynamic soil properties for foundation vibration problems, liquefaction potential and earthquake site response analysis. There are several laboratory and in-situ measurement techniques to evaluate dynamic soil properties, but unfortunately, they are often too expensive for general design practice. However, a significant number of correlations have been proposed to determine shear wave velocity or shear modulus from Cone Penetration Tests (CPT), which are used more and more in geotechnical design practice in Hungary. This allows the designer to analyze and compare CPT and seismic test results in order to select the best correlation equations for Hungarian soils and to improve the recommendations for Hungarian geologic conditions. Based on a literature review, as well as research experience in Hungary, the influence of various parameters on the accuracy of results will be shown. This study can serve as a basis for selecting and modifying correlation equations for Hungarian soils. Test data are taken from seven locations in Hungary with similar geologic conditions. The shear wave velocity values were measured by seismic CPT. Several factors are analyzed, including soil type, behavior index, measurement depth and geologic age, for their effect on the accuracy of predictions.
The final results show an improved prediction method for Hungarian soils. Keywords: CPT correlation, dynamic soil properties, seismic CPT, shear wave velocity
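To make the kind of correlation being compared concrete, the sketch below implements one widely cited CPT-to-Vs correlation of the Robertson type, which links shear wave velocity to corrected cone resistance and the soil behaviour type index Ic. The coefficients are one published generic variant and are exactly the sort of site-independent defaults a study like this would recalibrate for Hungarian soils; treat them as an assumption, not the paper's result.

```python
import math

PA = 100.0  # atmospheric pressure, kPa

def vs_from_cpt(qt_kpa, sigma_v_kpa, ic):
    """Shear wave velocity (m/s) from corrected cone resistance qt,
    total vertical stress sigma_v and soil behaviour type index Ic:
        Vs = [alpha_vs * (qt - sigma_v) / pa]^0.5,
        alpha_vs = 10^(0.55*Ic + 1.68)
    (a Robertson-style correlation with generic coefficients)."""
    alpha_vs = 10.0 ** (0.55 * ic + 1.68)
    return math.sqrt(alpha_vs * (qt_kpa - sigma_v_kpa) / PA)

# Illustrative numbers for a sandy layer at roughly 10 m depth.
vs = vs_from_cpt(qt_kpa=8000.0, sigma_v_kpa=180.0, ic=2.0)
print(round(vs, 1))  # 217.1
```

Comparing such predictions against the seismic-CPT measurements at each of the seven Hungarian sites is what allows the coefficients to be locally adjusted.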
Procedia PDF Downloads 246
44379 Software Quality Assurance in Component Based Software Development – A Survey Analysis
Authors: Abeer Toheed Quadri, Maria Abubakar, Mehreen Sirshar
Abstract:
Component Based Software Development (CBSD) is a new trend in software development. Selection of quality components is not enough to ensure software quality in a Component Based Software System (CBSS). A software product is considered to be a quality product if it satisfies its customer's needs and has minimum defects. The authors survey different research papers and analyze various techniques that ensure software quality in component based software development. This paper includes an investigation of how to improve the quality of a component based software system without affecting its quality attributes. The reported information is identified from a literature survey. The development of component based systems is rising as they reduce development time, effort and cost by means of reuse. After analysis, it has been found that, in order to achieve quality in a CBSS, we need components that are certified through software measurement, because the predictability of the software quality attributes of the system depends on the quality attributes of the constituent components, the integration process and the framework used. Keywords: CBSD (component based software development), CBSS (component based software system), quality components, SQA (software quality assurance)
Procedia PDF Downloads 414
44378 Hierarchical Operation Strategies for Grid Connected Building Microgrid with Energy Storage and Photovoltaic Source
Authors: Seon-Ho Yoon, Jin-Young Choi, Dong-Jun Won
Abstract:
This paper presents hierarchical operation strategies that minimize the operation error between the day-ahead operation plan and real-time operation. Operating power systems between centralized and decentralized approaches can be represented as a hierarchical control scheme comprising primary control, secondary control and tertiary control. Primary control is known as local control, featuring fast response. Secondary control is referred to as the microgrid Energy Management System (EMS). Tertiary control is responsible for coordinating the operations of multiple microgrids. In this paper, we formulate a three-stage microgrid operation strategy that parallels this hierarchical control scheme. The first stage sets the day-ahead scheduled output power of the Battery Energy Storage System (BESS), the only controllable source in the microgrid; the schedule is optimized to minimize the cost of power exchanged with the main grid using the Particle Swarm Optimization (PSO) method. The second stage controls the active and reactive power of the BESS so that it follows the day-ahead schedule when a State of Charge (SOC) error occurs between real-time operation and the schedule. The third stage reschedules the system when the predicted error exceeds a limit value. The first stage can be compared with the secondary control in that it adjusts the active power. The second stage is comparable to the primary control in that it controls the error in a local manner. The third stage is compared with the secondary control in that it manages power balancing. The proposed strategies will be applied to one of the buildings of the Electronics and Telecommunications Research Institute (ETRI). The building microgrid is composed of Photovoltaic (PV) generation, a BESS and load, and it will be interconnected with the main grid. The main purpose is to minimize the operation cost and to operate according to the scheduled plan.
Simulation results support the validation of the proposed strategies. Keywords: Battery Energy Storage System (BESS), Energy Management System (EMS), Microgrid (MG), Particle Swarm Optimization (PSO)
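The first stage, PSO scheduling the BESS output to minimize the cost of power exchanged with the main grid, can be sketched as a toy example. The price profile, net load, power bound and energy-neutrality penalty below are invented for illustration and are far simpler than the actual building-microgrid formulation.

```python
import random

random.seed(0)

# Illustrative day-ahead data for 4 periods (not ETRI's real profile).
price = [0.08, 0.12, 0.20, 0.10]     # cost of grid power per kWh
net_load = [30.0, 40.0, 50.0, 20.0]  # load minus PV forecast, kW
P_MAX = 20.0                         # assumed BESS power limit, kW

def cost(p_bess):
    """Cost of power exchanged with the main grid, plus a penalty
    keeping the BESS schedule energy-neutral over the day."""
    exchange = sum(pr * (nl - pb)
                   for pr, nl, pb in zip(price, net_load, p_bess))
    return exchange + 10.0 * abs(sum(p_bess))

def pso(n_particles=30, iters=200):
    """Minimal particle swarm: inertia 0.7, cognitive/social gains 1.5."""
    dim = len(price)
    X = [[random.uniform(-P_MAX, P_MAX) for _ in range(dim)]
         for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    gbest = min(X, key=cost)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * r1 * (pbest[i][d] - X[i][d])
                           + 1.5 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = max(-P_MAX, min(P_MAX, X[i][d] + V[i][d]))
            if cost(X[i]) < cost(pbest[i]):
                pbest[i] = X[i][:]
                if cost(X[i]) < cost(gbest):
                    gbest = X[i][:]
    return gbest

schedule = pso()
print(cost(schedule) < cost([0.0] * 4))  # the swarm beats the no-BESS baseline
```

The swarm discovers the obvious arbitrage, discharging in the expensive period and charging in the cheap ones, which is the economic role the day-ahead stage plays in the paper's scheme.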
Procedia PDF Downloads 248
44377 A Study on the Correlation Analysis between the Pre-Sale Competition Rate and the Apartment Unit Plan Factor through Machine Learning
Authors: Seongjun Kim, Jinwooung Kim, Sung-Ah Kim
Abstract:
The development of information and communication technology also affects human cognition and thinking; in the field of design especially, new techniques are being tried. In architecture, new design methodologies such as machine learning or data-driven design are being applied. In particular, these methodologies are used in analyzing the factors related to the value of real estate or analyzing feasibility in the early planning stage of apartment housing. However, since the value of apartment buildings is often determined by external factors such as location and traffic conditions, rather than the interior elements of the buildings, data are rarely used in the design process. Therefore, although the technical conditions are available, it is difficult to apply data-driven design to the internal elements of the apartment during the design process. As a result, the designers of apartment housing have been forced to rely on designer experience or modular design alternatives rather than data-driven design at the design stage, resulting in a uniform arrangement of space in apartment housing. The purpose of this study is to propose a methodology that supports designers in producing apartment unit plans with high consumer preference, by deriving, through machine learning, the correlation and importance of the floor plan elements preferred by consumers and reflecting this information in the early design process. Data on the pre-sale competition rate and the elements of the floor plan are collected, and the correlation between the pre-sale competition rate and the independent variables is analyzed through machine learning. This analytical model can be used to review the apartment unit plan produced by the designer and to assist the designer.
Therefore, it is possible to produce an apartment floor plan with high consumer preference, because the trained model can provide feedback on the unit plan when it is used in the floor plan design of apartment housing. Keywords: apartment unit plan, data-driven design, design methodology, machine learning
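The correlation step described above can start from something as simple as a Pearson coefficient between one unit-plan factor and the pre-sale competition rate, before any more elaborate model is trained. The factor name and the numbers below are illustrative, not the study's data.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical records: one unit-plan factor (e.g. living-room bay
# width, m) against the pre-sale competition rate of each project.
bay_width = [3.0, 3.3, 3.6, 3.9, 4.2]
competition_rate = [2.1, 3.5, 4.0, 6.2, 7.8]
r = pearson(bay_width, competition_rate)
print(round(r, 2))  # 0.98
```

A strongly positive r for a factor would flag it as a candidate for higher importance in the trained model and in the designer's feedback loop.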
Procedia PDF Downloads 268
44376 AI/ML Atmospheric Parameters Retrieval Using the “Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN)”
Authors: Thomas Monahan, Nicolas Gorius, Thanh Nguyen
Abstract:
Exoplanet atmospheric parameters retrieval is a complex, computationally intensive, inverse modeling problem in which an exoplanet’s atmospheric composition is extracted from an observed spectrum. Traditional Bayesian sampling methods require extensive time and computation, involving algorithms that compare large numbers of known atmospheric models to the input spectral data. Runtimes are directly proportional to the number of parameters under consideration. These increased power and runtime requirements are difficult to accommodate in space missions where model size, speed, and power consumption are of particular importance. The use of traditional Bayesian sampling methods, therefore, compromises model complexity or sampling accuracy. The Atmospheric Retrievals conditional Generative Adversarial Network (ARcGAN) is a deep convolutional generative adversarial network that improves on previous models’ speed and accuracy. We demonstrate the efficacy of artificial intelligence to quickly and reliably predict atmospheric parameters and present it as a viable alternative to slow and computationally heavy Bayesian methods. In addition to its broad applicability across instruments and planetary types, ARcGAN has been designed to function on low power application-specific integrated circuits. The application of edge computing to atmospheric retrievals allows for real or near-real-time quantification of atmospheric constituents at the instrument level. Additionally, edge computing provides both high-performance and power-efficient computing for AI applications, both of which are critical for space missions.
With the edge computing chip implementation, ARcGAN serves as a strong basis for the development of a similar machine-learning algorithm to reduce the downlinked data volume from the Compact Ultraviolet to Visible Imaging Spectrometer (CUVIS) onboard the DAVINCI mission to Venus. Keywords: deep learning, generative adversarial network, edge computing, atmospheric parameters retrieval
Procedia PDF Downloads 170
44375 An Experimental Study on the Variability of Nonnative and Native Inference of Word Meanings in Timed and Untimed Conditions
Authors: Swathi M. Vanniarajan
Abstract:
Reading research suggests that online contextual vocabulary comprehension while reading is an interactive and integrative process. One’s success in it depends on a variety of factors, including the amount and nature of available linguistic and nonlinguistic cues, one’s analytical and integrative skills, schema memory (content familiarity), and processing speed, characterized along the continuum of controlled to automatic processing. The experiment reported here, conducted with 30 native speakers as one group and 30 nonnative speakers as another (all graduate students), hypothesized that, while working on 24 tasks which required them to comprehend an unfamiliar word in real time without backtracking, the nonnative subjects would be less able than the native subjects to construct the meanings of the unknown words by integrating the multiple but sufficient contextual cues provided in the text, owing to the differences in the nature of their respective reading processes. The results indicated that there were significant inter-group as well as intra-group differences in the quality of the definitions given. However, when given additional time, the nonnative speakers could significantly improve the quality of their definitions, while the native speakers in general did not, suggesting that, all things being equal, time is a significant factor for success in nonnative vocabulary and reading comprehension processes and that accuracy precedes automaticity in the development of nonnative reading processes as well. Keywords: reading, second language processing, vocabulary comprehension
Procedia PDF Downloads 166
44374 Data Mining Meets Educational Analysis: Opportunities and Challenges for Research
Authors: Carla Silva
Abstract:
Recent developments in information and communication technology enable us to acquire, collect and analyse data in various fields of socioeconomic-technological systems. Along with the increase of economic globalization and the evolution of information technology, data mining has become an important approach for economic data analysis. As a result, there has been a critical need for automated approaches to effective and efficient usage of massive amounts of educational data, in order to support institutions in strategic planning and investment decision-making. In this article, we address data from several different perspectives and define their application to the sciences. Many believe that 'big data' will transform business, government, and other aspects of the economy. We discuss how new data may impact educational policy and educational research. Large-scale administrative data sets and proprietary private sector data can greatly improve the way we measure, track, and describe educational activity and educational impact. We also consider whether the big data predictive modeling tools that have emerged in statistics and computer science may prove useful in educational research and, furthermore, in economics. Finally, we highlight a number of challenges and opportunities for future research. Keywords: data mining, research analysis, investment decision-making, educational research
Procedia PDF Downloads 358
44373 Slope Stabilisation of Highly Fractured Geological Strata Consisting of Mica Schist Layers during Construction of a Tunnel Shaft
Authors: Saurabh Sharma
Abstract:
Introduction: The case study deals with the ground stabilisation of Nabi Karim Metro Station in Delhi, India, where an extremely complex geology was encountered while excavating the tunnelling shaft for launching the Tunnel Boring Machine. The borelog investigation and the Seismic Refraction Technique (SRT) indicated the presence of an extremely hard rocky mass from a depth of just 3-4 m, and accordingly, the Geotechnical Interpretation Report (GIR) concluded the presence of Grade-IV rock from 3 m onwards and of Grade-III and better rock from 5-6 m onwards. Accordingly, it was planned to retain the ground by providing secant piles all around the launching shaft and then excavating the shaft vertically after leaving a berm of 1.5 m to prevent the secant piles from getting exposed. To retain the side slopes, rock bolting with shotcreting and wire meshing was proposed, which is normal practice in such strata. However, with increasing depth of excavation, the rock quality kept decreasing at an unexpected pace, with the Grade-III rock mass at 5-6 m giving way to a conglomerate formation at a depth of 15 m. This worsening of geology from high-grade rock to a slushy conglomerate formation could not have been predicted and came as a surprise even to the best geotechnical engineers. Since the excavation had already been cut down vertically to manage the shaft size, execution was continued with enhanced caution to stabilise the side slopes. However, when the shaft work was about to finish, a collapse occurred on one side of the excavation shaft. This collapse was unexpected, since all measures to stabilise the side slopes had been taken after face mapping, and the grid size, diameter and depth of the rock bolts had already been readjusted to accommodate the rock fractures.
This scenario baffled even the best geologists and geotechnical engineers, and it was decided that any further slope stabilisation scheme would have to be designed to ensure safe completion of the works. Accordingly, the following revisions to the excavation scheme were made: (1) the excavation would be carried out while maintaining a slope based on the type of soil/rock; (2) the rock bolt type was changed from SN rock bolts to self-drilling anchors; (3) the grid size of the bolts was adjusted on real-time assessment; (4) the excavation was carried out by implementing a ‘Bench Release Approach’; and (5) an aggressive real-time instrumentation scheme was adopted. Discussion: The above case study again asserts the vital importance of correct interpretation of the geological strata and the need for real-time revision of construction schemes based on actual site data. The excavation is successfully being carried out with the above revised scheme, and further details of the revised slope stabilisation scheme, the instrumentation scheme and the monitoring results, along with actual site photographs, shall form part of the final paper. Keywords: unconfined compressive strength (UCS), rock mass rating (RMR), rock bolts, self drilling anchors, face mapping of rock, secant pile, shotcrete
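As a very rough illustration of why the revised scheme flattens the cut slope in the weaker strata, the snippet below evaluates the classical dry infinite-slope factor of safety, FS = c'/(γ·z·sinβ·cosβ) + tanφ'/tanβ. This is a textbook first-pass check, not the project's actual stabilisation design, and all soil parameters are invented for illustration.

```python
import math

def infinite_slope_fs(c_kpa, phi_deg, gamma_knm3, depth_m, beta_deg):
    """Factor of safety of a dry infinite slope:
        FS = c' / (gamma*z*sin(b)*cos(b)) + tan(phi') / tan(b)
    c' in kPa, gamma in kN/m^3, depth z in m, slope angle b in degrees."""
    b = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    return (c_kpa / (gamma_knm3 * depth_m * math.sin(b) * math.cos(b))
            + math.tan(phi) / math.tan(b))

# Hypothetical values for a weathered mica-schist / conglomerate cut.
fs_steep = infinite_slope_fs(c_kpa=15.0, phi_deg=30.0,
                             gamma_knm3=20.0, depth_m=5.0, beta_deg=60.0)
fs_flat = infinite_slope_fs(c_kpa=15.0, phi_deg=30.0,
                            gamma_knm3=20.0, depth_m=5.0, beta_deg=35.0)
print(fs_flat > fs_steep)  # True: flattening the slope raises the FS
```

With these illustrative parameters a 60° cut is unstable (FS below 1) while a 35° cut is stable, which mirrors the decision to excavate on a slope governed by the encountered soil/rock type.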
Procedia PDF Downloads 66
44372 Estimating the Life-Distribution Parameters of Weibull-Life PV Systems Utilizing Non-Parametric Analysis
Authors: Saleem Z. Ramadan
Abstract:
In this paper, a model is proposed to determine the life distribution parameters of the useful life region of a PV system, utilizing a combination of non-parametric and linear regression analysis of the failure data of these systems. Results showed that this method is dependable for analyzing failure time data for such reliable systems when the data are scarce. Keywords: masking, bathtub model, reliability, non-parametric analysis, useful life
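A combination of non-parametric plotting positions with linear regression can be sketched as median rank regression: non-parametric Benard median ranks supply the plotting positions, and a least-squares line on the Weibull probability plot yields the shape and scale parameters. This mirrors the spirit of the paper's approach rather than reproducing its exact model.

```python
import math

def weibull_mrr(times):
    """Estimate Weibull shape (beta) and scale (eta) by median rank
    regression: fit ln(-ln(1-F_i)) against ln(t_i), where F_i is the
    non-parametric Benard median rank (i - 0.3)/(n + 0.4)."""
    ts = sorted(times)
    n = len(ts)
    xs = [math.log(t) for t in ts]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)
    return beta, eta

# Synthetic check: failure times constructed so the median ranks fall
# exactly on a Weibull with beta = 2, eta = 1000 hours.
n, beta0, eta0 = 10, 2.0, 1000.0
times = [eta0 * (-math.log(1.0 - (i - 0.3) / (n + 0.4))) ** (1.0 / beta0)
         for i in range(1, n + 1)]
b, e = weibull_mrr(times)
print(round(b, 2), round(e, 1))  # 2.0 1000.0
```

A shape parameter near 1 from such a fit would indicate the roughly constant failure rate expected in the useful-life region of the bathtub model.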
Procedia PDF Downloads 562
44371 Multi-Response Optimization of EDM for Ti-6Al-4V Using Taguchi-Grey Relational Analysis
Authors: Ritesh Joshi, Kishan Fuse, Gopal Zinzala, Nishit Nirmal
Abstract:
Ti-6Al-4V is a titanium alloy with high strength, low weight and corrosion resistance, which are required characteristics for a material to be used in the aerospace industry. Being a hard alloy, titanium is difficult to machine via conventional methods, which calls for the use of non-conventional processes. In the present work, the effects of drilling a hole of Ø 6 mm in Ti-6Al-4V using a copper (99%) electrode in the Electric Discharge Machining (EDM) process are analyzed. The effect of various input parameters, like peak current, pulse-on time and pulse-off time, on the output parameters, viz. material removal rate (MRR) and electrode wear rate (EWR), is studied. The multi-objective optimization technique grey relational analysis is used for process optimization. Experiments are designed using an L9 orthogonal array. ANOVA is used for finding the most contributing parameter, followed by confirmation tests for validating the results. An improvement of 7.45% in grey relational grade is observed. Keywords: ANOVA, electric discharge machining, grey relational analysis, Ti-6Al-4V
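The grey relational analysis used for the multi-objective step follows a standard recipe: normalize each response to [0, 1] (larger-is-better for MRR, smaller-is-better for EWR), compute the grey relational coefficient against the ideal with a distinguishing coefficient ζ = 0.5, and average into a grade per run. The sketch below shows that recipe on three invented runs, not the paper's L9 data.

```python
def grey_relational_grade(runs, larger_better, zeta=0.5):
    """Grey relational grade of each run. 'runs' is a list of response
    vectors (e.g. [MRR, EWR] per run); 'larger_better' flags which
    responses to maximise (MRR) versus minimise (EWR)."""
    m = len(runs[0])
    # 1) Normalise each response column to [0, 1].
    norm = [list(run) for run in runs]
    for j in range(m):
        col = [run[j] for run in runs]
        lo, hi = min(col), max(col)
        for i, run in enumerate(runs):
            if larger_better[j]:
                norm[i][j] = (run[j] - lo) / (hi - lo)
            else:
                norm[i][j] = (hi - run[j]) / (hi - lo)
    # 2) Grey relational coefficient against the ideal (all ones).
    devs = [[1.0 - v for v in row] for row in norm]
    dmin = min(min(row) for row in devs)
    dmax = max(max(row) for row in devs)
    coeff = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
             for row in devs]
    # 3) Grade = mean coefficient over the responses.
    return [sum(row) / m for row in coeff]

# Three hypothetical EDM runs: [MRR (mg/min), EWR (mg/min)].
runs = [[12.0, 3.0], [18.0, 5.0], [15.0, 2.0]]
grades = grey_relational_grade(runs, larger_better=[True, False])
best = grades.index(max(grades))
print(best)  # 2: this run balances good MRR with the lowest EWR
```

Ranking the L9 runs by this single grade is what converts the two competing responses into one optimization target.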
Procedia PDF Downloads 364
44370 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy
Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu
Abstract:
The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Additionally, determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, which can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in individual phases with Weibull distribution curve-fitting analysis.
Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications. Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis
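The Laplace trend test invoked above has a compact closed form. For failure times observed over an interval (0, T], the statistic is U = (Σtᵢ/n − T/2)/(T·√(1/12n)), approximately standard normal under a trend-free (homogeneous Poisson) process; strongly negative U flags failures concentrated early (infant mortality giving way to stability), strongly positive U flags a growing failure rate (wear-out). The failure records below are invented for illustration.

```python
import math

def laplace_u(failure_times, t_end):
    """Laplace trend test statistic for failure times over (0, t_end].
    Approximately N(0,1) when the failure rate has no trend:
    U << 0 suggests reliability growth (e.g. the end of an infant
    mortality phase); U >> 0 suggests deterioration / wear-out."""
    n = len(failure_times)
    return ((sum(failure_times) / n - t_end / 2.0)
            / (t_end * math.sqrt(1.0 / (12.0 * n))))

# Hypothetical location-accuracy failure records over 1000 days:
early = [20, 45, 80, 130, 210, 400]    # clustered early: infant phase
late = [600, 750, 820, 880, 930, 980]  # clustered late: wear-out
print(round(laplace_u(early, 1000), 2),
      round(laplace_u(late, 1000), 2))  # -2.99 2.77
```

Both values lie outside ±1.96, so at the 5% level the first record set shows significant reliability growth and the second significant deterioration, which is exactly the phase-boundary evidence the methodology uses to trim infant mortality and wear-out from the Weibull fit.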
Procedia PDF Downloads 65
44369 Patching and Stretching: Development of Policy Mixes for Entrepreneurship in China
Authors: Jian Shao
Abstract:
The effect of entrepreneurship on the economy, innovation and employment has been widely acknowledged by scholars and governments. As an essential factor influencing entrepreneurial activity, entrepreneurship policy creates a conducive environment to support and develop entrepreneurship. However, the challenge in developing entrepreneurship policy is that policy is normally a combination of many different goals and instruments. Instead of examining the effect of individual policy instruments, we argue that attention to a policy mix is necessary. In recent years, much attention has been focused on comparing a single policy instrument to a policy mix, evaluating the interactions between different instruments within a mix, or assessing particular policy mixes. However, another step required in understanding policy mixes is to understand how and why mixes evolve and change over time, and to determine whether any changes are an improvement. In this paper, we trace the development of the policy mix for entrepreneurship in China by mapping the policy goals and instruments, and reveal the process by which the policy mix changes over time. We find two main process mechanisms in China's entrepreneurship policy mix: patching and stretching. Compared with policy repackaging, patching and stretching are more realistic processes in the real world of the policy mix, and they make it possible to achieve effectiveness by avoiding conflicts and promoting synergies among policy goals and instruments. Keywords: entrepreneurship, China, policy design, policy mix, policy patching
Procedia PDF Downloads 198
44368 Developing a Model for Information Giving Behavior in Virtual Communities
Authors: Pui-Lai To, Chechen Liao, Tzu-Ling Lin
Abstract:
Virtual communities have created a range of new social spaces in which to meet and interact with one another. Whether as a stand-alone model or as a supplement to sustain competitive advantage for normal business models, building virtual communities has been hailed as one of the major strategic innovations of the new economy. However, for a virtual community to evolve, the biggest challenge is how to make members actively give information or provide advice. Even in busy virtual communities, usually only a small fraction of members post information actively. In order to investigate the determinants of the information giving willingness of those contributors who usually actively provide their opinions, we propose a model to understand the reasons for contribution in communities. The study will serve as a basis for the future growth of information giving in virtual communities. Keywords: information giving, social identity, trust, virtual community
Procedia PDF Downloads 322