Search results for: revised COSO framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5340

4800 eTransformation Framework for the Cognitive Systems

Authors: Ana Hol

Abstract:

Digital systems are now in the cognitive wave of eTransformation and are increasingly aimed at meeting individuals’ demands, both those of customers requiring services and those of service providers. It is also apparent that successful future systems will not simply open doors for traditional owners/users to offer and receive services, as Uber does today, but will require more customized and cognitively enabled infrastructures that are responsive to system users’ needs. To identify what is required for such systems, this research reviews the historical and current effects of the eTransformation process by studying: 1. eTransitions of company websites and mobile applications, 2. the emergence of new shared-economy business models such as Uber, and 3. new requirements for demand-driven, cognitive systems capable of learning and just-in-time decision making. Based on the analysis, this study proposes a Cognitive eTransformation Framework capable of guiding implementations of new responsive and user-aware systems.

Keywords: system implementations, AI supported systems, cognitive systems, eTransformation

Procedia PDF Downloads 238
4799 Fast Generation of High-Performance Driveshafts: A Digital Approach to Automated Linked Topology and Design Optimization

Authors: Willi Zschiebsch, Alrik Dargel, Sebastian Spitzer, Philipp Johst, Robert Böhm, Niels Modler

Abstract:

In this article, we investigate an approach that digitally links individual development process steps, using the drive shaft of an aircraft engine as a representative example of a fiber polymer composite. Such high-performance, lightweight composite structures have many adjustable parameters that influence the mechanical properties, and only a combination of optimal parameter values can lead to energy-efficient lightweight structures. The development tools required for the Engineering Design Process (EDP) are often isolated solutions, and their compatibility with each other is limited. This study presents a digital framework that links individual specialised tools via the generated data in such a way that automated optimization across programs becomes possible. This is demonstrated by linking geometry generation with numerical structural analysis. The proposed framework demonstrates the feasibility of a complete digital approach to design optimization and shows promising potential for achieving optimal solutions in terms of mass, material utilization, eigenfrequency, and deformation under lateral load with less development effort. The development of such a framework is an important step towards promoting a more efficient design approach that can lead to stable and balanced results.
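
A minimal sketch of such a digitally linked design loop is shown below; the functions generate_geometry and run_structural_analysis are hypothetical placeholders for the CAD and FEA tools, and a simple scalarized objective stands in for the NSGA-2/NSGA-3 optimizers named in the keywords.

```python
# Minimal sketch of a digitally linked design loop (hypothetical functions,
# not the authors' toolchain): geometry generation feeds structural analysis,
# and a weighted-sum objective stands in for a true multi-objective optimizer.
from dataclasses import dataclass
from itertools import product

@dataclass
class DriveshaftDesign:
    ply_angle_deg: float    # fibre orientation
    wall_thickness_mm: float

def generate_geometry(design: DriveshaftDesign) -> dict:
    # Placeholder for a CAD/geometry tool: returns a parametric model description.
    return {"angle": design.ply_angle_deg, "t": design.wall_thickness_mm}

def run_structural_analysis(geometry: dict) -> dict:
    # Placeholder for an FEA call: crude analytic surrogates for mass,
    # first eigenfrequency, and lateral deformation.
    mass = 2.0 * geometry["t"]
    eigenfrequency = 500.0 + 3.0 * geometry["t"] - 0.5 * abs(geometry["angle"] - 45)
    deformation = 10.0 / geometry["t"] + 0.05 * abs(geometry["angle"] - 45)
    return {"mass": mass, "f1": eigenfrequency, "defl": deformation}

def objective(results: dict) -> float:
    # Weighted sum: minimize mass and deformation, reward higher eigenfrequency.
    return results["mass"] + results["defl"] - 0.01 * results["f1"]

candidates = [DriveshaftDesign(a, t)
              for a, t in product(range(0, 91, 15), (1.0, 1.5, 2.0, 2.5))]
best = min(candidates,
           key=lambda d: objective(run_structural_analysis(generate_geometry(d))))
print("best candidate:", best)
```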

Keywords: digital linked process, composite, CFRP, multi-objective, EDP, NSGA-2, NSGA-3, TPE

Procedia PDF Downloads 76
4798 Presenting a Knowledge Mapping Model According to a Comparative Study on Applied Models and Approaches to Map Organizational Knowledge

Authors: Ahmad Aslizadeh, Farid Ghaderi

Abstract:

Mapping organizational knowledge is an innovative concept and a useful instrument for representing, capturing, and visualizing implicit and explicit knowledge. Different researchers have presented a diversity of methods, instruments, and techniques for mapping organizational knowledge to reach determined goals. To apply these methods, it is necessary to know their requirements and the conditions in which they can be used. Integrating the identified methods of knowledge mapping and comparing them would help knowledge managers to select the appropriate method. This research was conducted to present a model and framework for mapping organizational knowledge. First, knowledge maps, their applications, and their necessity are introduced in order to extract a comparative framework and identify their structure. Next, the knowledge mapping models of researchers such as Eppler, Kim, Egbu, Tandukar, and Ebner are presented and surveyed. Finally, the models are compared and a superior model is introduced.

Keywords: knowledge mapping, knowledge management, comparative study, business and management

Procedia PDF Downloads 403
4797 Microplastics in Fish from Grenada, West Indies: Problems and Opportunities

Authors: Michelle E. Taylor, Clare E. Morrall

Abstract:

Microplastics are small particles produced for industrial purposes or formed by the breakdown of anthropogenic debris. Caribbean nations import large quantities of plastic products. The Caribbean region is vulnerable to natural disasters, and climate change is predicted to bring multiple additional challenges to island nations. Microplastics have been found in an array of marine environments and in a diversity of marine species. The occurrence of microplastic in the intestinal tracts of marine fish is a concern to human and ecosystem health, as pollutants and pathogens can associate with plastics. Studies have shown that the incidence of microplastics in marine fish varies with species and location. The prevalence of microplastics (≤ 5 mm) in fish species from Grenadian waters (representing pelagic, semi-pelagic and demersal lifestyles) harvested for human consumption has been investigated via gut analysis. Harvested tissue was digested in 10% KOH, and particles retained on a 0.177 mm sieve were examined. Microplastics identified have been classified according to type, colour and size. Over 97% of fish examined thus far (n=34) contained microplastics. Current and future work includes examining the invasive Lionfish (Pterois spp.) for microplastics, investigating marine invertebrate species, and examining environmental sources of microplastics (i.e. rivers, coastal waters and sand). Owing to concerns about pollutant accumulation on microplastics and potential migration into organismal tissues, we plan to analyse fish tissue for mercury and other persistent pollutants. Despite having ~110,000 inhabitants, the island nation of Grenada imported approximately 33 million plastic bottles in 2013, of which it is estimated less than 5% were recycled. Over 30% of the imported bottles were ‘unmanaged’ and as such are potential litter/marine debris. A revised Litter Abatement Act passed into law in Grenada in 2015, but little enforcement of the law is evident to date. A local non-governmental organization (NGO), ‘The Grenada Green Group’ (G3), is focused on reducing litter in Grenada by lobbying the government to implement the revised act and by running sessions in schools, community groups and on local media and social media to raise awareness of the problems associated with plastics. A local private company has indicated willingness to support an Anti-Litter Campaign in 2018, and local awareness of the need to reduce single-use plastic and litter seems to be high. The Government of Grenada has called for a Sustainable Waste Management Strategy, and bans on both Styrofoam and plastic grocery bags are among the recommendations recently submitted. A Styrofoam ban will be in place at the St. George’s University campus from January 1st, 2018, and many local businesses have already voluntarily moved away from Styrofoam. Our findings underscore the importance of continuing investigations into microplastics in marine life; this will contribute to understanding the associated health risks. Furthermore, our findings support action to mitigate the volume of plastics entering the world’s oceans. We hope that Grenada’s future will involve a lot less plastic. This research was supported by the Caribbean Node of the Global Partnership on Marine Litter.

Keywords: Caribbean, microplastics, pollution, small island developing nation

Procedia PDF Downloads 211
4796 A Conceptual Framework to Study Cognitive-Affective Destination Images of Thailand among French Tourists

Authors: Ketwadee Madden

Abstract:

Product or service image is among the vital factors that predict an individual’s choice to buy a product or service, visit a place, or become attached to a person. Similarly, in the context of tourism, destination image is a very important factor that tourists consider before making their destination decisions. In light of this, the objective of this study is to conceptually investigate, among French tourists, the determinants of Thailand’s tourism destination image. To achieve this objective, prior studies were reviewed, leading to the development of a conceptual framework highlighting the determinants of destination image. In addition, this study develops some hypotheses that are to be empirically investigated. Furthermore, based on the conceptual findings, suggestions are made on how to motivate European tourists to choose Thailand as their preferred tourism destination.

Keywords: cognitive destination image, affective destination image, motivations, risk perception, word of mouth

Procedia PDF Downloads 139
4795 Deep Vision: A Robust Dominant Colour Extraction Framework for T-Shirts Based on Semantic Segmentation

Authors: Kishore Kumar R., Kaustav Sengupta, Shalini Sood Sehgal, Poornima Santhanam

Abstract:

Fashion is a human expression that is constantly changing. One of the prime factors that consistently influences fashion is the change in colour preferences. The role of colour in our everyday lives is very significant; it subconsciously explains a lot about one’s mindset and mood. Analyzing colours by extracting them from outfit images is a critical step in examining individual consumer behaviour. Several research works have been carried out on extracting colours from images, but to the best of our knowledge, there are no studies that extract colours from specific apparel items and identify colour patterns geographically. This paper proposes a framework for accurately extracting colours from T-shirt images and predicting dominant colours geographically. The proposed method consists of two stages: first, a U-Net deep learning model is adopted to segment the T-shirts from the images; second, the colours are extracted only from the T-shirt segments. The proposed method employs the iMaterialist (Fashion) 2019 dataset for the semantic segmentation task. The proposed framework also includes a mechanism for gathering data and analyzing India’s general colour preferences. From this research, it was observed that black and grey are the dominant colours in different regions of India. The proposed method can be adapted to study fashion’s evolving colour preferences.
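
A minimal sketch of the second stage, assuming a binary T-shirt mask has already been produced by a U-Net model (the image and mask arrays below are synthetic placeholders), with scikit-learn's KMeans standing in for the colour-extraction step:

```python
# Dominant-colour extraction from masked T-shirt pixels (sketch).
# Assumes a segmentation mask is already available, e.g. from a U-Net model.
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-ins for a real RGB image and its T-shirt mask.
image = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
mask = np.zeros((256, 256), dtype=bool)
mask[64:192, 64:192] = True  # pretend this region is the segmented T-shirt

tshirt_pixels = image[mask].astype(float)            # (N, 3) RGB values
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(tshirt_pixels)

counts = np.bincount(kmeans.labels_, minlength=5)     # cluster sizes
dominant_rgb = kmeans.cluster_centers_[counts.argmax()]
print("dominant colour (RGB):", dominant_rgb.round(1))
```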

Keywords: colour analysis in t-shirts, convolutional neural network, encoder-decoder, k-means clustering, semantic segmentation, U-Net model

Procedia PDF Downloads 111
4794 Exploiting Domino Games "Cassava H154M" in Order to Improve Students' Understanding about the Value of Trigonometry in Various Quadrants

Authors: Hisyam Hidayatullah

Abstract:

Using games in lessons can provide an appropriate motoric learning model to improve students' skills. The game approach, as one model of motoric learning, is intended to improve student learning outcomes in trigonometry, a subject that is generally taught through memorization and rote learning. The purpose of this study is to produce an innovation that improves students' cognitive abilities in the field, improves student performance, and ultimately improves students' understanding of determining the values of trigonometric functions in the various quadrants. It applies the domino game "Cassava H154M", which is adopted from the cassava game and whose content has been completely revised. The game is divided into three sessions: sine cassava, cosine cassava and tangent cassava. The researchers used an action research method consisting of several stages: planning, implementation, observation, reporting and evaluation. They found that the game approach can improve student learning outcomes, enhance students' creativity in motoric learning, and create a supportive learning environment.

Keywords: cassava "H154M", motoric, value of trigonometry, quadrant

Procedia PDF Downloads 325
4793 The Monitoring of Surface Water Bodies from Tisa Catchment Area, Maramureş County in 2014

Authors: Gabriela-Andreea Despescu, Mădălina Mavrodin, Gheorghe Lăzăroiu, S. Nacu, R. Băstinaş

Abstract:

This study is focused on the monitoring and evaluation of the river water bodies of Maramureş County, using the methodology associated with the EU Water Framework Directive 2000/60. The first part defines the theoretical terms of the monitoring activities related to water body quality and the specific features of the water bodies found in the studied area. The water body features, quality indicators and monitoring frequencies for the rivers situated in the Tisa catchment area are presented. The results show the current ecological and chemical status of these water bodies in relation to the standard values specified in the Water Framework Directive.

Keywords: monitoring, surveillance, water bodies, quality

Procedia PDF Downloads 263
4792 Efficient Mercury Sorbent: Activated Carbon and Metal Organic Framework Hybrid

Authors: Yongseok Hong, Kurt Louis Solis

Abstract:

In the present study, a hybrid sorbent of the metal organic framework (MOF) UiO-66 and powdered activated carbon (pAC) is synthesized to remove cationic and anionic metals simultaneously. UiO-66 is an octahedron-shaped MOF with a Zr₆O₄(OH)₄ metal node and a 1,4-benzene dicarboxylic acid (BDC) organic linker. Zr-based MOFs are attractive for trace element remediation in wastewaters because Zr is relatively non-toxic compared to other classes of MOF and therefore will not cause secondary pollution. Most remediation studies with UiO-66 target anions such as fluoride, but trace element oxyanions such as arsenic, selenium, and antimony have also been investigated. There have also been studies involving mercury removal by UiO-66 derivatives; however, these require post-synthetic modifications or have lower effective surface areas. Activated carbon is known for being a readily available, well-studied, effective adsorbent for metal contaminants. A solvothermal method was employed to prepare the hybrid sorbent from UiO-66 and activated carbon, which can be used to remove mercury and selenium simultaneously. The hybrid sorbent was characterized using FSEM-EDS, FT-IR, XRD, and TGA. The results showed that UiO-66 and activated carbon were successfully composited. From BET studies, the hybrid sorbent has an SBET of 1051 m² g⁻¹. Adsorption studies showed that the hybrid has maximum adsorption capacities of 204.63 mg g⁻¹ and 168 mg g⁻¹ for Hg(II) and selenite, respectively, and follows the Langmuir model for both species. Kinetics studies revealed that the Hg uptake of the hybrid follows pseudo-second-order kinetics with a rate constant of 5.6E-05 g mg⁻¹ min⁻¹, while the selenite uptake follows the simplified Elovich model with α = 2.99 mg g⁻¹ min⁻¹ and β = 0.032 g mg⁻¹.
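
A short sketch of how such equilibrium data can be fitted to the Langmuir model with SciPy; the data below are synthetic and the parameter values are illustrative, not the study's measurements.

```python
# Langmuir isotherm fit, q = q_max * K * C / (1 + K * C)  (sketch, synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    return q_max * K * C / (1.0 + K * C)

# Synthetic equilibrium data (C in mg L^-1, q in mg g^-1), not measured values.
C_eq = np.array([5, 10, 25, 50, 100, 200], dtype=float)
q_eq = langmuir(C_eq, 204.6, 0.05) + np.random.normal(0, 3, C_eq.size)

(q_max_fit, K_fit), _ = curve_fit(langmuir, C_eq, q_eq, p0=(150.0, 0.01))
print(f"fitted q_max = {q_max_fit:.1f} mg/g, K_L = {K_fit:.3f} L/mg")
```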

Keywords: adsorption, flue gas wastewater, mercury, selenite, metal organic framework

Procedia PDF Downloads 175
4791 Framework for Performance Measure of Super Resolution Imaging

Authors: Varsha Hemant Patil, Swati A. Bhavsar, Abolee H. Patil

Abstract:

Image quality assessment plays an important role in image evaluation. This paper presents an investigation of the classic techniques in use for image quality assessment, especially for super-resolution imaging. Researchers have contributed much towards the development of super-resolution imaging techniques; however, not much attention has been paid to the development of metrics for testing the performance of the developed techniques. This paper reviews existing image quality measures and classifies the reviewed approaches according to functionality and suitability for super-resolution imaging. Probable modifications and improvements to suit super-resolution imaging are presented. The prime goal of the paper is to provide a comprehensive reference source for researchers working towards super-resolution imaging and to suggest a better framework for measuring the performance of super-resolution imaging techniques.
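
As a concrete reference for two of the classic metrics listed in the keywords, a small sketch computing MSE and PSNR between a reference image and a super-resolved estimate (SSIM would typically come from a library such as scikit-image); the arrays here are synthetic.

```python
# MSE and PSNR between a reference image and a super-resolved estimate (sketch).
import numpy as np

def mse(reference: np.ndarray, estimate: np.ndarray) -> float:
    return float(np.mean((reference.astype(float) - estimate.astype(float)) ** 2))

def psnr(reference: np.ndarray, estimate: np.ndarray, max_value: float = 255.0) -> float:
    error = mse(reference, estimate)
    return float("inf") if error == 0 else 10.0 * np.log10(max_value ** 2 / error)

reference = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
estimate = np.clip(reference + np.random.normal(0, 5, reference.shape), 0, 255).astype(np.uint8)
print(f"MSE = {mse(reference, estimate):.2f}, PSNR = {psnr(reference, estimate):.2f} dB")
```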

Keywords: interpolation, MSE, PSNR, SSIM, super resolution

Procedia PDF Downloads 98
4790 Optimal Portfolio of Multi-service Provision based on Stochastic Model Predictive Control

Authors: Yifu Ding, Vijay Avinash, Malcolm McCulloch

Abstract:

With the proliferation of decentralized energy systems, the UK power system allows small-scale entities such as microgrids (MGs) to tender multiple energy services, including energy arbitrage and frequency responses (FRs). However, their operation requires balancing uncertain renewable generation and loads in real time, and the provision requirements of contracted services have to be fulfilled continuously during the agreed time window; otherwise, the under-delivered provision is penalized. To hedge against risks due to uncertainties and maximize the economic benefits, we propose a stochastic model predictive control (SMPC) framework to optimize operation for multi-service provision. Distinguished from previous works, we include a detailed economic-degradation model of the lithium-ion battery to quantify the costs of different service provisions and to accurately describe the changing dynamics of the battery. Considering a set of load and generation scenarios and the battery aging, we formulate a risk-averse cost function using conditional value at risk (CVaR), which aims to achieve the maximum expected net revenue while avoiding severe losses. The framework is demonstrated on a case study of a PV-battery grid-tied microgrid in the UK with real-life data. To highlight its performance, the framework is compared with the case without the degradation model and with the deterministic formulation.
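
A minimal sketch of the risk measure in the cost function: the empirical conditional value at risk (CVaR) of sampled scenario costs, i.e. the mean of the worst tail. The scenario values are synthetic; in the full SMPC problem this term would be embedded in the optimization rather than computed after the fact.

```python
# Empirical CVaR of scenario costs (sketch): the mean of the worst (1 - alpha) tail.
import numpy as np

def cvar(costs: np.ndarray, alpha: float = 0.95) -> float:
    var = np.quantile(costs, alpha)      # value at risk at level alpha
    tail = costs[costs >= var]           # worst-case tail scenarios
    return float(tail.mean())

# Synthetic net-cost scenarios for one control horizon (negative = revenue).
rng = np.random.default_rng(0)
scenario_costs = rng.normal(loc=-50.0, scale=20.0, size=1000)

print(f"expected cost = {scenario_costs.mean():.1f}, CVaR(95%) = {cvar(scenario_costs):.1f}")
```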

Keywords: model predictive control (MPC), battery degradation, frequency response, microgrids

Procedia PDF Downloads 122
4789 Mobile Agents-Based Framework for Dynamic Resource Allocation in Cloud Computing

Authors: Safia Rabaaoui, Héla Hachicha, Ezzeddine Zagrouba

Abstract:

Nowadays, cloud computing is becoming an increasingly popular technology for companies and consumers, who benefit from its efficiency, cost optimization, data security, unlimited storage capacity, etc. One of the biggest challenges of cloud computing is resource allocation, whose efficiency directly influences the performance of the whole cloud environment. Finding an effective method to address this critical issue and increase cloud performance is therefore necessary. This paper proposes a mobile agents-based framework for dynamic resource allocation in cloud computing that minimizes both the cost of using virtual machines and the makespan. Furthermore, its impact on response time and power consumption has been studied. The simulation showed that our method gave better results than the compared existing approaches.
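
The paper's mobile-agent mechanics are not reproduced here, but a toy greedy allocation that trades off virtual-machine cost against makespan illustrates the kind of decision each agent faces; all VM parameters and task lengths below are hypothetical.

```python
# Toy cost/makespan-aware task-to-VM allocation (illustrative only,
# not the mobile-agent framework itself).
vms = [  # hypothetical VM catalogue: cost per second of runtime and speed factor
    {"name": "small", "cost_per_s": 0.01, "speed": 1.0, "busy_until": 0.0},
    {"name": "large", "cost_per_s": 0.04, "speed": 3.0, "busy_until": 0.0},
]
tasks = [12.0, 7.0, 30.0, 5.0, 18.0]  # task lengths in abstract work units

def score(vm, length, w_cost=0.5, w_makespan=0.5):
    runtime = length / vm["speed"]
    finish = vm["busy_until"] + runtime
    return w_cost * runtime * vm["cost_per_s"] + w_makespan * finish

for length in tasks:
    chosen = min(vms, key=lambda vm: score(vm, length))  # greedy choice per task
    chosen["busy_until"] += length / chosen["speed"]

makespan = max(vm["busy_until"] for vm in vms)
total_cost = sum(vm["busy_until"] * vm["cost_per_s"] for vm in vms)
print(f"makespan = {makespan:.1f} s, cost = {total_cost:.3f}")
```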

Keywords: cloud computing, multi-agent system, mobile agent, dynamic resource allocation, cost, makespan

Procedia PDF Downloads 103
4788 Shock Compressibility of Iron Alloys Calculated in the Framework of Quantum-Statistical Models

Authors: Maxim A. Kadatskiy, Konstantin V. Khishchenko

Abstract:

Iron alloys are widespread components in various types of structural materials that are exposed to intensive thermal and mechanical loads. Various quantum-statistical cell models with the self-consistent field approximation can be used to predict the behavior of these materials under extreme conditions; the application of these models becomes more valid as the temperature and density of matter increase. Results of Hugoniot calculations for iron alloys in the framework of three quantum-statistical models (the Thomas–Fermi model, the Thomas–Fermi model with quantum and exchange corrections, and the Hartree–Fock–Slater model) are presented. The results of the quantum-statistical calculations are compared with results from other reliable models and with available experimental data. Good agreement between the calculated results and experimental data is revealed at terapascal pressures. Advantages and disadvantages of this approach are shown.
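
For reference, the Hugoniot states referred to above satisfy the standard Rankine-Hugoniot energy condition; closing it requires an equation of state E = E(P, V), which is what the quantum-statistical models supply.

```latex
% Rankine-Hugoniot energy condition linking the initial state (P_0, V_0, E_0)
% to the shocked state (P, V, E); an equation of state E = E(P, V) from the
% Thomas-Fermi or Hartree-Fock-Slater model closes the relation.
E - E_0 = \tfrac{1}{2}\,(P + P_0)\,(V_0 - V)
```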

Keywords: alloy, Hugoniot, iron, terapascal pressure

Procedia PDF Downloads 342
4787 A Goal-Driven Crime Scripting Framework

Authors: Hashem Dehghanniri

Abstract:

Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or overly complicated understanding of the crime commission process, which defeats the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts. It was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., SCP measure identification or policy-making.

Keywords: attack modelling, crime commission process, crime script, situational crime prevention

Procedia PDF Downloads 126
4786 Identifying Organizational Culture to Implement Knowledge Management: Case Study of BKN, Indonesia

Authors: Maria Margaretha, Elin Cahyaningsih, Dana Indra Sensuse Lukman

Abstract:

One key to an organization's success can be seen in its culture. Employees, the environment, and other factors enable an organization to achieve its goals and build a competitive advantage. The type of organizational culture can be a guide to implementing Knowledge Management (KM) in an organization, especially in BKN. Culture determines the behavior of employees and the environment that supports KM. This paper describes the process of deciding which culture the organization belongs to, and suggests strategic moves for implementing KM in the future. The OCAI (Organizational Culture Assessment Instrument) and its framework (the Competing Values Framework) were used to determine the type of organizational culture. To implement KM in the organization, clan is an appropriate culture, because the clan culture represents the cultural values and leadership type needed for successful KM. The results of the measurement will serve as a reference for BKN to improve its organizational culture and achieve its goals and organizational effectiveness.

Keywords: organizational culture, government, knowledge management, OCAI

Procedia PDF Downloads 621
4785 MB-SLAM: A SLAM Framework for Construction Monitoring

Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han

Abstract:

Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To use SLAM effectively for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM and BIM can provide essential insights for construction managers to identify construction deficiencies in real time and ultimately reduce rework. Registering SLAM to BIM in real time can also boost the accuracy of SLAM, since SLAM can then use features from both images and 3D models. However, registering SLAM with the BIM in real time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real time. The framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera’s images and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM is used to calculate a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real time by aligning the keyframe’s perspective with the equivalent BIM view. The alignment method is based on perspective detection that estimates vanishing lines and points by detecting straight edges in images. This process generates the associated BIM views from the keyframes’ views. The calculated poses are later improved by a real-time gradient descent-based iterative method. Two case studies were presented to validate MB-SLAM. The validation demonstrated promising results: SLAM was accurately registered to BIM, and the SLAM’s localization accuracy improved significantly. Moreover, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate the workflows of past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework for both research and commercial usage, which aims to monitor construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of SLAM by providing a rough 3D model of the environment, further extending SLAM to practical usage.
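
A very reduced sketch of the iterative refinement idea, assuming only a single yaw offset between a keyframe's rough SLAM pose and the BIM view, corrected by gradient descent on the misalignment between a detected vanishing direction and the BIM-predicted one. All quantities are synthetic; the actual framework optimizes full poses using depth, vanishing points, and vanishing lines.

```python
# Gradient-descent refinement of a yaw offset so that a detected vanishing
# direction aligns with the direction predicted from the BIM (toy example).
import numpy as np

def rotate(v, yaw):
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([c * v[0] - s * v[1], s * v[0] + c * v[1]])

bim_direction = np.array([1.0, 0.0])          # direction predicted from the BIM view
true_yaw_error = np.deg2rad(7.0)              # unknown error in the rough SLAM pose
detected_direction = rotate(bim_direction, true_yaw_error)  # direction in the keyframe

yaw_correction, lr, eps = 0.0, 0.5, 1e-6
for _ in range(100):
    residual = rotate(detected_direction, -yaw_correction) - bim_direction
    loss = float(residual @ residual)
    # numerical gradient of the alignment loss w.r.t. the yaw correction
    r_eps = rotate(detected_direction, -(yaw_correction + eps)) - bim_direction
    grad = (float(r_eps @ r_eps) - loss) / eps
    yaw_correction -= lr * grad

print(f"recovered yaw error: {np.degrees(yaw_correction):.2f} deg (true 7.00 deg)")
```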

Keywords: perspective alignment, progress monitoring, SLAM, stereo matching

Procedia PDF Downloads 224
4784 How to Enhance Performance of Universities by Implementing Balanced Scorecard with Using FDM and ANP

Authors: Neda Jalaliyoon, Nooh Abu Bakar, Hamed Taherdoost

Abstract:

The present research recommends a balanced scorecard (BSC) framework to appraise the performance of universities. As the original balanced scorecard model has four perspectives, the same model, with the perspectives “financial”, “customer”, “internal process” and “learning and growth”, is used to implement the BSC in the present research. By applying the fuzzy Delphi method (FDM) and a questionnaire, sixteen performance measures were identified. Moreover, using the analytic network process (ANP), the weights of the selected indicators were determined. Results indicated that the most important BSC aspects were internal process (0.3149), customer (0.2769), learning and growth (0.2049), and financial (0.2033), respectively. The proposed BSC framework can help universities enhance their efficiency in a competitive environment.

Keywords: balanced scorecard, higher education, fuzzy delphi method, analytic network process (ANP)

Procedia PDF Downloads 426
4783 Autoimmune Diseases Associated to Autoimmune Hepatitis: A Retrospective Study of 24 Tunisian Patients

Authors: Soumaya Mrabet, Imen Akkari, Amira Atig, Elhem Ben Jazia

Abstract:

Introduction: Autoimmune hepatitis (AIH) is a chronic inflammatory liver disease of unknown cause. Concomitant autoimmune disorders have been described in 30–50% of patients with AIH. The aim of our study is to determine the prevalence and the type of autoimmune disorders associated with AIH. Material and Methods: This is a retrospective study over a period of 16 years (2000-2015) including all patients followed for AIH. The diagnosis of AIH was based on the criteria of the revised International AIH Group (IAIHG) scoring system. Results: Twenty-four patients (21 women and 3 men) followed for AIH were collected. The mean age was 39 years (17-65 years). Among these patients, 11 (45.8%) had at least one autoimmune disease associated with AIH. These diseases were Hashimoto's thyroiditis (n=5), Gougerot-Sjogren syndrome (n=5), primary biliary cirrhosis (n=2), primary sclerosing cholangitis (n=1), Addison's disease (n=1) and systemic sclerosis (n=1). Patients were treated with corticosteroids alone or with azathioprine, associated with the specific treatment of the associated diseases, with complete remission of AIH in 90% of cases and clinical improvement of the other diseases. Conclusion: In our study, the prevalence of autoimmune diseases in AIH patients was 45.8%. These diseases were dominated by autoimmune thyroiditis and Gougerot-Sjogren syndrome. The investigation of autoimmune diseases in autoimmune hepatitis must be systematic because of their frequency and the importance of adequate management.

Keywords: autoimmune diseases, autoimmune hepatitis, autoimmune thyroiditis, gougerot sjogren syndrome

Procedia PDF Downloads 263
4782 NanoSat MO Framework: Simulating a Constellation of Satellites with Docker Containers

Authors: César Coelho, Nikolai Wiegand

Abstract:

The advancement of nanosatellite technology has opened new avenues for cost-effective and faster space missions. The NanoSat MO Framework (NMF) from the European Space Agency (ESA) provides a modular and simpler approach to the development of flight software and operations of small satellites. This paper presents a methodology using the NMF together with Docker for simulating constellations of satellites. By leveraging Docker containers, the software environment of individual satellites can be easily replicated within a simulated constellation. This containerized approach allows for rapid deployment, isolation, and management of satellite instances, facilitating comprehensive testing and development in a controlled setting. By integrating the NMF lightweight simulator in the container, a comprehensive simulation environment was achieved. A significant advantage of using Docker containers is their inherent scalability, enabling the simulation of hundreds or even thousands of satellites with minimal overhead. Docker's lightweight nature ensures efficient resource utilization, allowing for deployment on a single host or across a cluster of hosts. This capability is crucial for large-scale simulations, such as in the case of mega-constellations, where multiple traditional virtual machines would be impractical due to their higher resource demands. This ability for easy horizontal scaling based on the number of simulated satellites provides tremendous flexibility to different mission scenarios. Our results demonstrate that leveraging Docker containers with the NanoSat MO Framework provides a highly efficient and scalable solution for simulating satellite constellations, offering not only significant benefits in terms of resource utilization and operational flexibility but also enabling testing and validation of ground software for constellations. The findings underscore the importance of taking advantage of already existing technologies in computer science to create new solutions for future satellite constellations in space.
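
A minimal sketch of the container-spawning idea using the Docker SDK for Python; the image name nmf-satellite-sim and the SATELLITE_ID environment variable are hypothetical placeholders for an image bundling the NMF software and its lightweight simulator.

```python
# Spawn N simulated satellites as Docker containers (sketch).
# "nmf-satellite-sim" is a hypothetical image name; adjust to your own build.
import docker

NUM_SATELLITES = 10
client = docker.from_env()

containers = []
for i in range(NUM_SATELLITES):
    container = client.containers.run(
        "nmf-satellite-sim",                    # hypothetical NMF + simulator image
        name=f"sat-{i:03d}",
        environment={"SATELLITE_ID": str(i)},   # hypothetical configuration variable
        detach=True,
    )
    containers.append(container)

print(f"started {len(containers)} simulated satellites")
# Teardown later with: for c in containers: c.stop(); c.remove()
```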

Keywords: containerization, docker containers, NanoSat MO framework, satellite constellation simulation, scalability, small satellites

Procedia PDF Downloads 49
4781 Structural Reliability of Existing Structures: A Case Study

Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol

Abstract:

A reliability-based methodology for the analysis, assessment, and evaluation of reinforced concrete structural elements is presented herein. The results of the reliability analysis and assessment of the structural elements are verified against the results obtained from deterministic methods. The outcomes of the reliability-based analysis are compared against the safety limits of the required reliability index β according to international standards and codes. The methodology is based on probabilistic analysis using reliability concepts and the statistics of the main random variables that are relevant to the subject matter and that enter the performance-function equation(s) for the structural elements under study. These techniques yield the reliability index β, also known as the reliability measure, which can be utilized to assess and evaluate the safety, human risk, and functionality of the structural component. These methods can also yield revised partial safety factors for certain target reliability indices, which can be used for redesigning the reinforced concrete elements of the building and can assist in considering other remedial actions to improve the safety and functionality of the member.
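
A compact sketch of the Monte Carlo route to the reliability index for a generic performance function g = R - S (resistance minus load effect); the distributions and parameters below are illustrative, not those of the case study.

```python
# Monte Carlo estimate of the failure probability and reliability index beta
# for a generic performance function g = R - S (illustrative parameters).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 1_000_000
resistance = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # capacity R
load_effect = rng.normal(loc=200.0, scale=30.0, size=n)              # demand S

g = resistance - load_effect
p_f = np.mean(g <= 0.0)        # probability of failure
beta = -norm.ppf(p_f)          # reliability index

print(f"p_f = {p_f:.2e}, beta = {beta:.2f}")
```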

Keywords: structural reliability, concrete structures, FORM, Monte Carlo simulation

Procedia PDF Downloads 518
4780 Measuring Entrepreneurial Success through Specific Sustainable Development Goals by Linking Entrepreneurship Attitude and Intentions

Authors: Mohit Taneja, Ravi Kiran, S. C. Bose

Abstract:

Entrepreneurs’ role in achieving Sustainable Development Goals is crucial, as the growth potential of any region depends upon the number and success rate of entrepreneurial firms. This paper examines the relationship of sustainable growth (SG) with entrepreneurial attitude (EA) and entrepreneurial intention (EI) in the context of the Indian economy, considering the mediation effect of EI between EA and SG. Partial least squares structural equation modeling (PLS-SEM) software was used to design the framework. Students enrolled in entrepreneurship courses of higher educational institutes (HEIs) of Punjab, Haryana, and the National Capital Region (NCR) were contacted for data collection. The National Institutional Ranking Framework (NIRF) was used in selecting HEIs, and data collected from 589 students were considered for analysis. McGee’s multi-dimensional scale for measuring ESE and the scale of Linan & Chen for measuring EI and ES (SG) were used. Results highlight that EA has a strong impact on EI (p ≤ 0.001) and that EI has a positive and strong relationship with SG (ES), with a β value of 0.683 (p ≤ 0.001). The current study also reflects the mediating effect of EI between EA and ES, as the results show that the combined β value of EA and EI (i.e., 0.684 × 0.683 = 0.467) is greater than the direct influence of EA on ES (β = 0.265). EA, with the mediating effect of EI, can enhance the opportunity for achieving SG, which suggests that in order to increase the venture success rate and attain SG, emphasis should be given to EI along with EA. The study was conducted in three regions of India; future studies can be extended to other South Asian countries for generalization.

Keywords: entrepreneurship, sustainable growth, entrepreneurship intention, entrepreneurship attitude

Procedia PDF Downloads 94
4779 Construction of Graph Signal Modulations via Graph Fourier Transform and Its Applications

Authors: Xianwei Zheng, Yuan Yan Tang

Abstract:

The classical windowed Fourier transform has been widely used in signal processing, image processing, machine learning and pattern recognition. The related Gabor transform is powerful enough to capture the texture information of any given dataset. Recently, in the emerging field of graph signal processing, researchers have been developing a theory to handle so-called graph signals. Within this developing theory, the windowed graph Fourier transform has been constructed to establish a time-frequency analysis framework for graph signals. The windowed graph Fourier transform is defined using the translation and modulation operators of graph signals, following calculations similar to those of the classical windowed Fourier transform. Specifically, both operators are defined using the Laplacian eigenvectors: where the classical translation operator can be expressed using the Fourier atoms, the graph signal translation is defined analogously using the Laplacian eigenvectors, and the graph modulation is likewise established using the Laplacian eigenvectors. The windowed graph Fourier transform based on these two operators has been applied to obtain time-frequency representations of graph signals. Fundamentally, the existing modulation operator is defined, in analogy with classical modulation, by multiplying a graph signal with the entries of each Fourier atom. However, a single Laplacian eigenvector entry cannot play the same role as a Fourier atom, and this definition ignores the relationship between the translation and modulation operators. In this paper, a new definition of the modulation operator is proposed, and thus another time-frequency framework for graph signals is constructed. The relationship between translation and modulation can be established through the Fourier transform: for any signal, the Fourier transform of its translation is the modulation of its Fourier transform, so the modulation of a signal can equivalently be defined as the inverse Fourier transform of the translation of its Fourier transform. Analogously, the graph modulation of any graph signal can be defined as the inverse graph Fourier transform of the translation of its graph Fourier transform. This novel definition of the graph modulation operator establishes a relationship between the translation and modulation operations. The new modulation operation and the original translation operation are applied to construct a new framework for graph signal time-frequency analysis. Furthermore, a windowed graph Fourier frame theory is developed, and necessary and sufficient conditions for constructing windowed graph Fourier frames, tight frames and dual frames are presented. The novel graph signal time-frequency analysis framework is applied to signals defined on well-known graphs, e.g., the Minnesota road graph, and on random graphs. Experimental results show that the novel framework captures new features of graph signals.
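
A compact way to write the construction described above, assuming the graph Fourier transform \hat f = U^{\top} f with U the matrix of Laplacian eigenvectors and T_k the graph translation operator (sign and normalization conventions may differ from the paper's):

```latex
% Classical identity motivating the construction: modulation is the inverse
% Fourier transform of a translated spectrum,
%   (M_\omega f)(t) = e^{i\omega t} f(t) = \mathcal{F}^{-1}\!\big[T_\omega \hat f\big](t).
% Proposed graph analogue, with \hat f = U^{\top} f the graph Fourier transform
% (U = Laplacian eigenvectors) and T_k the graph translation operator:
M_k f \;:=\; U\,\big(T_k\,\hat f\big) \;=\; U\,\big(T_k\,U^{\top} f\big)
```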

Keywords: graph signals, windowed graph Fourier transform, windowed graph Fourier frames, vertex frequency analysis

Procedia PDF Downloads 341
4778 Radical Web Text Classification Using a Composite-Based Approach

Authors: Kolade Olawande Owoeye, George R. S. Weir

Abstract:

The spread of terrorist and extremist activities on the internet has become a major threat to governments and national security, necessitating intelligence gathering via the web and real-time monitoring of potential websites for extremist activities. However, manual classification of such content is practically difficult and time-consuming. In response to this challenge, an automated classification system called the composite technique was developed: a computational framework that combines the semantic and syntactic features of the textual content of a web page. We implemented the framework on a dataset of extremist webpages that had been subjected to manual classification. We then developed a classification model on the data using the J48 decision tree algorithm to measure how well each page can be classified into its appropriate class. The classification results obtained from our method, when compared with the state of the art, indicated a 96% success rate in classifying webpages when matched against the manual classification.
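
An analogous pipeline sketch is given below, using scikit-learn's CART decision tree in place of Weka's J48 (C4.5) and plain TF-IDF features in place of the composite semantic/syntactic features; the two example pages are invented.

```python
# Text-classification sketch: TF-IDF features + a decision tree, standing in
# for the paper's composite (semantic + syntactic) features and Weka's J48.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

pages = [  # invented toy examples, not the study's dataset
    "call to violent action against the state",
    "community bake sale this weekend at the town hall",
]
labels = ["extremist", "benign"]

model = make_pipeline(TfidfVectorizer(), DecisionTreeClassifier(random_state=0))
model.fit(pages, labels)
print(model.predict(["join the violent struggle"]))
```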

Keywords: extremist, web pages, classification, semantics, posit

Procedia PDF Downloads 145
4777 Developing and Testing a Questionnaire of Music Memorization and Practice

Authors: Diana Santiago, Tania Lisboa, Sophie Lee, Alexander P. Demos, Monica C. S. Vasconcelos

Abstract:

Memorization has long been recognized as an arduous and anxiety-evoking task for musicians, and yet, it is an essential aspect of performance. Research shows that musicians are often not taught how to memorize. While the memorization and practice strategies of professionals have been studied, little research has been done to examine how student musicians learn to practice and memorize music in different cultural settings. We present the process of developing and testing a questionnaire of music memorization and musical practice for student musicians in the UK and Brazil. The survey was developed for a cross-cultural research project aiming to examine how young orchestral musicians (aged 7–18 years) in different learning environments and cultures engage in instrumental practice and memorization. The questionnaire development involved members of a UK/US/Brazil research team of music educators and performance science researchers. A pool of items was developed for each aspect of practice and memorization identified, based on the literature and personal experience, and adapted from existing questionnaires. Item development took into consideration the varying levels of cognitive and social development of the target populations, as well as the diverse target learning environments. Items were initially grouped in accordance with a single underlying construct/behavior. The questionnaire comprised three sections: a demographics section, a section on practice (containing 29 items), and a section on memorization (containing 40 items). Next, the response process was considered, and a 5-point Likert scale ranging from ‘always’ to ‘never’, with a verbal label and an image assigned to each response option, was selected, following principles of effective questionnaire design for children and youths. Finally, a pilot study was conducted with young orchestral musicians from diverse learning environments in Brazil and the United Kingdom. Data collection took place in either one-to-one or group settings to accommodate the participants. Cognitive interviews were utilized to establish response process validity by confirming the readability and accurate comprehension of the questionnaire items or highlighting the need for item revision. Internal reliability was investigated by measuring the consistency of the item groups using Cronbach’s alpha. The pilot study successfully relied on the questionnaire to generate data about the engagement of young musicians of different levels and instruments, across different learning and cultural environments, in instrumental practice and memorization. Interaction analysis of the cognitive interviews undertaken with these participants, however, exposed the fact that certain items, and the response scale, could be interpreted in multiple ways. The questionnaire text was, therefore, revised accordingly. The low Cronbach’s alpha scores of many item groups indicated another issue with the original questionnaire: its low level of internal reliability. Several reasons for this poor reliability can be suggested, including the issues with item interpretation revealed through interaction analysis of the cognitive interviews, the small number of participants (34), and the elusive nature of the construct in question. The revised questionnaire measures 78 specific behaviors or opinions. It can be seen to provide an efficient means of gathering information about the engagement of young musicians in practice and memorization on a large scale.
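
For reference, a short sketch of the internal-consistency check: Cronbach's alpha computed for one item group on a synthetic response matrix (respondents in rows, items in columns); the data are not from the study.

```python
# Cronbach's alpha for one group of Likert items (rows = respondents, cols = items).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]                               # number of items in the group
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Synthetic 1-5 Likert responses from 34 respondents to a 6-item group.
rng = np.random.default_rng(2)
base = rng.integers(1, 6, size=(34, 1))
responses = np.clip(base + rng.integers(-1, 2, size=(34, 6)), 1, 5)

print(f"alpha = {cronbach_alpha(responses.astype(float)):.2f}")
```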

Keywords: cross-cultural, memorization, practice, questionnaire, young musicians

Procedia PDF Downloads 123
4776 Governance Framework for an Emerging Trust Ecosystem with a Blockchain-Based Supply Chain

Authors: Ismael Ávila, José Reynaldo F. Filho, Vasco Varanda Picchi

Abstract:

The ever-growing consumer awareness of food provenance in Brazil is driving the creation of a trusted ecosystem around the animal protein supply chain. The traceability and accountability requirements of such an ecosystem demand a blockchain layer to strengthen the weak links in that chain. For that, direct involvement of the companies in the blockchain transactions, including as validator nodes of the network, implies formalizing a partnership with the consortium behind the ecosystem. Yet, their compliance standards usually require that a formal governance structure is in place before they agree with any membership terms. In light of such a strategic role of blockchain governance, the paper discusses a framework for tailoring a governance model for a blockchain-based solution aimed at the meat supply chain and evaluates principles and attributes in terms of their relevance to the development of a robust trust ecosystem.

Keywords: blockchain, governance, trust ecosystem, supply chain, traceability

Procedia PDF Downloads 120
4775 The Mediation Effect of PTSD and Aggression on the Relationship of Childhood Physical Abuse and Suicidal Behavior in Homeless People

Authors: Jina Hong, Seongeun Ryu, Sungeun You

Abstract:

The suicide rate among homeless people is much higher than that in the general population. The purpose of this study was to examine the mediating effect of PTSD and aggression in the relationship between childhood physical abuse and suicidal behavior among homeless people. One hundred and one homeless people were recruited from streets and shelters in Korea. Face-to-face interviews were conducted by master's-level graduate students or facility employees of the shelters. All participants completed the Suicidal Behaviors Questionnaire-Revised (SBQ-R), the Life History of Aggression Questionnaire (LHAQ), the Primary Care PTSD screen (PC-PTSD), and the Traumatic Life Events Questionnaire (TLEQ). The average age of the homeless people who participated in the study was 55.2 years (SD = 10.7), with an age range of 30 to 87. Results indicated that PTSD symptoms and aggression fully mediated the relationship between childhood physical abuse and suicidal behavior among the homeless. These findings suggest the need for trauma-informed care for the homeless and warrant psychological services for PTSD and aggression in order to reduce suicide risk among homeless people.

Keywords: aggression, homeless, PTSD, suicidal behavior

Procedia PDF Downloads 381
4774 Mastering Test Automation: Bridging Gaps for Seamless QA

Authors: Rohit Khankhoje

Abstract:

The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test-case and reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between the manual QA and automation teams, while TestRail is automatically updated with newly added automated test cases as soon as they become part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.
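
A minimal sketch of the Jira side of such an integration, filing a bug through Jira's REST API when an automated test fails; the URL, project key, and credentials below are placeholders.

```python
# On test failure, file a bug in Jira via its REST API (sketch; URL, project
# key, and credentials are placeholders).
import requests

JIRA_URL = "https://your-company.atlassian.net"     # placeholder
AUTH = ("automation-bot@example.com", "api-token")  # placeholder credentials

def report_failure(test_name: str, error_message: str) -> str:
    payload = {
        "fields": {
            "project": {"key": "QA"},               # placeholder project key
            "summary": f"Automated test failed: {test_name}",
            "description": error_message,
            "issuetype": {"name": "Bug"},
        }
    }
    response = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
    response.raise_for_status()
    return response.json()["key"]                   # e.g. "QA-123"

# Example: bug_key = report_failure("test_login", "Timeout waiting for dashboard")
```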

Keywords: automation framework, API integration, test automation, test management tools

Procedia PDF Downloads 73
4773 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index

Authors: Todd Zhou, Mikhail Yurochkin

Abstract:

Out-of-distribution (OOD) detection is receiving increasing amounts of attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This newly burgeoning field has created a need for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. By exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore," as the evaluation criterion. This proposed score was created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through the equality-of-opportunity violation measurement. We also improved machine learning model performance through model calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
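
One plausible reading of the scoring idea is sketched below: a candidate model's predictive-entropy score on unlabeled OOD data is combined with its labeled source-domain accuracy through a harmonic mean. The exact formulation of "CombinedScore" in the paper may differ; all data here are synthetic.

```python
# Sketch of an entropy-based OOD score combined with source accuracy via a
# harmonic mean; one interpretation of "CombinedScore", not the exact formula.
import numpy as np

def mean_predictive_entropy(probs: np.ndarray) -> float:
    # probs: (n_samples, n_classes) predicted probabilities on unlabeled OOD data
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return float(entropy.mean())

def combined_score(probs_ood: np.ndarray, source_accuracy: float) -> float:
    # Map entropy to a "confidence" in (0, 1], then harmonic-mean it with accuracy.
    confidence = 1.0 / (1.0 + mean_predictive_entropy(probs_ood))
    return 2.0 * confidence * source_accuracy / (confidence + source_accuracy)

rng = np.random.default_rng(3)
logits = rng.normal(size=(500, 4))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax
print(f"CombinedScore (sketch) = {combined_score(probs, source_accuracy=0.82):.3f}")
```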

Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index

Procedia PDF Downloads 124
4772 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments; therefore, criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence is invaluable in identifying crime. It has been observed that algorithms based on artificial intelligence (AI) are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. Researchers and other authorities have used the available data as evidence in court to convict a person. This research paper aims at developing a multi-agent framework for digital investigations using specific intelligent software agents (ISA). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent are dependent on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK is implemented using the Java Agent Development Framework, together with Eclipse, a Postgres repository, and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.

Keywords: artificial intelligence, computer science, criminal investigation, digital forensics

Procedia PDF Downloads 212
4771 A Systematic Review on Development of a Cost Estimation Framework: A Case Study of Nigeria

Authors: Babatunde Dosumu, Obuks Ejohwomu, Akilu Yunusa-Kaltungo

Abstract:

Cost estimation in construction is often difficult, particularly when dealing with risks and uncertainties, which are inevitable and peculiar to developing countries like Nigeria; the direct consequences are major deviations in cost, duration, and quality. The fundamental aim of this study is to develop a framework for assessing the impacts of risk on cost estimation, which in turn causes variability between the contract sum and the final account. This is very important, as initial estimates given to clients should reflect a certain magnitude of consistency and accuracy, upon which the client builds other planning-related activities, and should also enhance the capabilities of construction industry professionals by enabling better prediction of the final account from the contract sum. To achieve this, a systematic literature review was conducted with cost variability and construction projects as the search string within three databases: Scopus, Web of Science, and EBSCO (Business Source Premier); the results were further analyzed and gaps in knowledge and research identified. From the extensive review, it was found that the factors causing deviation between final accounts and contract sums ranged between 1 and 45. Besides, it was discovered that a cost estimation framework similar to the Building Cost Information Service (BCIS) is unavailable in Nigeria, which is a major reason why initial estimates are very often inconsistent, leading to project delay, abandonment, or determination at the expense of the huge sums of money invested. It was concluded that the development of a cost estimation framework, adjudged an important tool in risk shedding rather than risk sharing in project risk management, would be a panacea to the cost estimation problems that lead to cost variability in the Nigerian construction industry; such a framework is to be delivered by the time this ongoing Ph.D. research is completed. It was recommended that practitioners in the construction industry should always take risk into account in order to facilitate the rapid development of the construction industry in Nigeria; this should give stakeholders in both the private and public sectors a more in-depth understanding of the estimation effectiveness and efficiency to be adopted.

Keywords: cost variability, construction projects, future studies, Nigeria

Procedia PDF Downloads 209