Search results for: parallel projection methods

15212 Virtual Customer Integration in Innovation Development: A Systematic Literature Review

Authors: Chau Nguyen Pham Minh

Abstract:

The aim of this study is to answer the following research question: what do we know about virtual customer integration in innovation development based on existing empirical research? The paper is based on a systematic review of 136 articles published in the past 16 years. The analysis focuses on three areas: what forms of virtual customer integration (e.g., netnography, online co-creation, virtual experience) have been applied in innovation development; how virtual customer integration methods have been effectively utilized by firms; and what the influences of virtual customer integration on innovation development activities are. Through the detailed analysis, the study provides researchers with a broad understanding of virtual customer integration in innovation development. The study shows that practitioners and researchers increasingly pay attention to virtual customer integration methods in developing innovation, since those methods have clear advantages in interacting with customers to generate the best ideas for innovation development. Additionally, the findings indicate that netnography has been the most common method for integrating customers in idea generation, while virtual product experience has mainly been used in product testing. Moreover, the analysis also reveals the positive and negative influences of virtual customer integration on innovation development from both process and strategic perspectives. Most of the reviewed studies examined the phenomenon from the company's perspective to understand the process of applying virtual customer integration methods and their impacts; however, the customers' perspective on participating in the virtual interaction has been inadequately studied, which opens many potentially interesting research paths for future studies.

Keywords: innovation, virtual customer integration, co-creation, netnography, new product development

Procedia PDF Downloads 334
15211 Determination of the Local Elastic Moduli of Shungite by Laser Ultrasonic Spectroscopy

Authors: Elena B. Cherepetskaya, Alexander A. Karabutov, Vladimir A. Makarov, Elena A. Mironova, Ivan A. Shibaev

Abstract:

In our study, the object of laser ultrasonic testing was a plane-parallel plate of shungite (length 41 mm, width 31 mm, height 15 mm, average density 2247 kg/m³). We used a laser-ultrasonic flaw detector with a wideband optoacoustic transducer to investigate the velocities of longitudinal and shear elastic ultrasound waves. The duration of the generated elastic pulses was less than 100 ns. Given the known material thickness, the velocities were determined from the time delay of the pulses reflected from the bottom surface of the sample with respect to the reference pulses. The measurement accuracy was 0.3% for the longitudinal wave velocity and 0.5% for the shear wave velocity (the scanning pitch along the surface was 2 mm). On the basis of the measured elastic wave velocities, the local elastic moduli of shungite (Young's modulus, shear modulus, and Poisson's ratio) were uniquely determined.
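
For an isotropic solid, the moduli follow directly from the two wave velocities and the density via the standard elasticity relations. The minimal sketch below shows the computation; the two velocity values are illustrative assumptions, since the abstract reports only the density.

```python
# Standard isotropic relations linking ultrasonic velocities to elastic moduli.
# The velocity values are assumed for illustration; only the density is from the abstract.

rho = 2247.0   # average density, kg/m^3 (from the abstract)
vp = 5400.0    # longitudinal wave velocity, m/s (assumed)
vs = 3200.0    # shear wave velocity, m/s (assumed)

G = rho * vs**2                                    # shear modulus, Pa
nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))   # Poisson's ratio
E = 2 * G * (1 + nu)                               # Young's modulus, Pa

print(f"G  = {G / 1e9:.2f} GPa")   # ~23.0 GPa for these inputs
print(f"E  = {E / 1e9:.2f} GPa")   # ~56.6 GPa
print(f"nu = {nu:.3f}")            # ~0.229
```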

Keywords: laser ultrasonic testing, local elastic moduli, shear wave velocity, shungite

Procedia PDF Downloads 306
15210 Infringement of Patent Rights under the Doctrine of Equivalents in Turkey

Authors: Duru Helin Ozaner

Abstract:

Under the doctrine of equivalents, the words of the claims alone are insufficient to define the protection area provided by patent registration. While this doctrine widens the boundaries of the protected area, it also blurs them, creating uncertainty for third parties. The doctrine of equivalents therefore aims to establish a balance between the rights of patent owners and the legal security of third parties. The current Turkish legal system has been designed to parallel widely applied regulations, so its rules on the protection provided by patents are similar to those of many countries. However, infringement through equivalents by third parties is common. This study aims to explain that the protection provided by a patent is not limited to the literal wording of the claims but extends to the wide-ranging protection the claims provide under the doctrine of equivalents. The study is important for determining the limits of the protection available to the patent right holder and for indicating the importance of the equivalent elements of that protection.

Keywords: patent, infringement, intellectual property, doctrine of equivalents

Procedia PDF Downloads 212
15209 Interior Design: Changing Values

Authors: Kika Ioannou Kazamia

Abstract:

This paper examines the action research cycle of the second phase of a longitudinal study on sustainable interior design practices between two groups of stakeholders, designers and clients. During this phase of the action research, the second step of Lewin's change management model, the change stage, was utilized to change values, approaches, and attitudes toward sustainable design practices among the participants. Affective domain learning theory is utilized to instill new values. Learning with the use of information technology, collaborative learning, and problem-based learning are the learning methods implemented toward the acquisition of the objectives. The learning methods and aims require the design of interventions involving participants in activities that lead to acknowledgment of the benefits of sustainable practices. Interventions are steered to measure participants' judgments of the worth and relevance of ideas and experiences, and their acceptance of, or commitment to, a particular stance or action. The data collection methods used in this action research are observers' reports, participant questionnaires, and interviews. The data analyses use both quantitative and qualitative methods. The main benefit of the quantitative method was to separate the many factors that obscured the main qualitative findings. The qualitative method allowed data to be categorized following a deductive approach and then examined for commonalities that could reflect relevant categories or themes. The results indicate that during the second phase, designer and client participants altered their behaviours.

Keywords: design, change, sustainability, learning, practices

Procedia PDF Downloads 76
15208 Crystal Structure, Vibration Study, and Calculated Frequencies by Density Functional Theory Method of Copper Phosphate Dihydrate

Authors: Soufiane Zerraf, Malika Tridane, Said Belaaouad

Abstract:

CuHPO₃.2H₂O was synthesized by the direct method. It crystallizes in the orthorhombic system, space group P2₁2₁2₁, a = 6.7036 (2) Å, b = 7.3671 (4) Å, c = 8.9749 (4) Å, Z = 4, V = 443.24 (4) ų. The crystal structure was refined to R₁ = 0.0154, R₂ = 0.0380 for 19018 reflections satisfying the criterion I ≥ 2σ(I). The structural resolution shows the existence of chains of HPO₃ ions linked together by hydrogen bonds. The crystalline structure is formed by chains of distorted Cu[O₃(H₂O)₃] octahedra connected at the vertices. The chains extend parallel to b and are mutually linked by PO₃ groups. The structure is closely related to those of CuSeO₃.2H₂O and CuTeO₃.2H₂O. Experimental infrared and Raman spectra were used to confirm the presence of the phosphate ion and were compared in the 0-4000 cm⁻¹ region with the theoretical frequencies calculated by the density functional theory (DFT) method, providing reliable assignments of all bands observed in the experimental spectra.

Keywords: crystal structure, X-ray diffraction, vibration study, thermal behavior, density functional theory

Procedia PDF Downloads 115
15207 A General Variable Neighborhood Search Algorithm to Minimize Makespan of the Distributed Permutation Flowshop Scheduling Problem

Authors: G. M. Komaki, S. Mobin, E. Teymourian, S. Sheikh

Abstract:

This paper addresses minimizing the makespan of the distributed permutation flow shop scheduling problem. In this problem, there are several identical parallel factories or flow shops, each with a series of machines. Each job must be allocated to one of the factories, and all operations of a job must be performed in its allocated factory. This problem has recently gained attention, and, owing to its NP-hard nature, metaheuristic algorithms have been proposed to tackle it. The majority of the proposed algorithms require large computational time, which is their main drawback. In this study, a general variable neighborhood search algorithm (GVNS) is proposed into which several time-saving schemes have been incorporated. The GVNS also uses a sophisticated method to adapt the shaking (perturbation) procedure according to the progress of the incumbent solution, preventing stagnation of the search. The performance of the proposed algorithm is compared to state-of-the-art algorithms on standard benchmark instances.
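
To make the GVNS skeleton concrete, the following minimal sketch (not the authors' implementation: their time-saving schemes are omitted and the instance is randomly generated) implements the shake/local-search/move loop for the distributed permutation flow shop, escalating the shaking strength whenever the incumbent fails to improve.

```python
import random

def flowshop_makespan(seq, p):  # p[job][machine]: processing times
    if not seq:
        return 0
    c = [0] * len(p[0])
    for j in seq:               # standard permutation flow shop recursion
        c[0] += p[j][0]
        for k in range(1, len(c)):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def dpfsp_makespan(assign, p):  # assign: one job sequence per factory
    return max(flowshop_makespan(seq, p) for seq in assign)

def shake(assign, strength):    # move `strength` random jobs between factories
    new = [seq[:] for seq in assign]
    for _ in range(strength):
        src = random.choice([f for f in range(len(new)) if new[f]])
        job = new[src].pop(random.randrange(len(new[src])))
        dst = random.randrange(len(new))
        new[dst].insert(random.randrange(len(new[dst]) + 1), job)
    return new

def local_search(assign, p):    # first-improvement insertion in the critical factory
    improved = True
    while improved:
        improved = False
        f = max(range(len(assign)), key=lambda i: flowshop_makespan(assign[i], p))
        seq = assign[f]
        best = flowshop_makespan(seq, p)
        for i in range(len(seq)):
            for j in range(len(seq)):
                if i == j:
                    continue
                cand = seq[:]
                cand.insert(j, cand.pop(i))
                if flowshop_makespan(cand, p) < best:
                    assign[f], best, improved = cand, flowshop_makespan(cand, p), True
                    seq = assign[f]
    return assign

def gvns(p, n_factories, max_iter=200, k_max=3):
    jobs = list(range(len(p)))
    random.shuffle(jobs)
    best = local_search([jobs[f::n_factories] for f in range(n_factories)], p)
    for _ in range(max_iter):
        k = 1
        while k <= k_max:
            cand = local_search(shake(best, k), p)
            if dpfsp_makespan(cand, p) < dpfsp_makespan(best, p):
                best, k = cand, 1   # move: restart with the smallest neighborhood
            else:
                k += 1              # escalate the perturbation on failure
    return best, dpfsp_makespan(best, p)

random.seed(1)
p = [[random.randint(1, 9) for _ in range(3)] for _ in range(10)]  # 10 jobs, 3 machines
print(gvns(p, n_factories=2))
```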

Keywords: distributed permutation flow shop, scheduling, makespan, general variable neighborhood search algorithm

Procedia PDF Downloads 353
15206 Cultural Embeddedness of E-Participation Methods in Hungary

Authors: Hajnalka Szarvas

Abstract:

The research examines the effectiveness of e-participation tools and methods from the point of view of their cultural fit with Hungarian community traditions. Participation can have very different meanings depending on the local cultural and historical traditions and experiences of a given society. Generally, research on e-democracy or e-participation tools deals with their technological sides and novelties, while little is said about the cultural and social context of the different platforms. From the perspective of their success, however, it is essential to look at the human factor too, the actual users, and at how a given DMS or online platform fits the way of thought and functioning of a given society. The paper therefore explores to what extent different online platforms (Loomio, DemocracyOS, Your Priorities, EVoks, Populus, miutcank.hu, Liquid Democracy, Brain Bar Budapest Lab) are compatible with Hungarian mental structures and community traditions, that is, the contents of the collective mind about community functioning. As a result, the influence of the cultural embeddedness of e-participation development tools on their success will be clearly seen. Furthermore, the most crucial factors determining the efficiency of e-participation development tools in Hungary will be demonstrated.

Keywords: cultural embeddedness, e-participation, local community traditions, mental structures

Procedia PDF Downloads 303
15205 Protein Remote Homology Detection by Using Profile-Based Matrix Transformation Approaches

Authors: Bin Liu

Abstract:

As one of the most important tasks in protein sequence analysis, protein remote homology detection has been studied for decades. Currently, profile-based methods show state-of-the-art performance. The Position-Specific Frequency Matrix (PSFM) is a widely used profile. However, the profiles contain noise introduced by amino acids with low frequencies. In this study, we propose a method to remove this noise from the PSFM by discarding the amino acids with low frequencies, called the top frequency profile (TFP). Three matrix transformation methods, autocross covariance (ACC), tri-gram, and k-separated bigram (KSB), are applied to these profiles to convert them into fixed-length feature vectors. Combined with support vector machines (SVMs), the predictors are constructed. Evaluated on two benchmark datasets, the proposed methods outperform other state-of-the-art predictors.
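
A rough sketch of the two core transformations is given below. The cutoff k=4 and the lag bound max_lag=2 are illustrative assumptions, not the paper's settings, and the profile is randomly generated.

```python
import numpy as np

def top_frequency_profile(psfm, k=4):
    """Zero out all but the k highest-frequency amino acids at each position,
    removing the noise contributed by low-frequency residues (the TFP idea)."""
    tfp = np.zeros_like(psfm)
    for i, row in enumerate(psfm):
        top = np.argsort(row)[-k:]
        tfp[i, top] = row[top]
    return tfp

def acc_features(profile, max_lag=2):
    """Autocross covariance: converts an L x 20 profile into a fixed-length
    vector of 20 * 20 * max_lag lagged covariances, suitable for an SVM."""
    L, A = profile.shape
    mean = profile.mean(axis=0)
    feats = []
    for lag in range(1, max_lag + 1):
        for a in range(A):
            for b in range(A):
                feats.append(np.mean((profile[:L - lag, a] - mean[a]) *
                                     (profile[lag:, b] - mean[b])))
    return np.array(feats)

rng = np.random.default_rng(0)
psfm = rng.dirichlet(np.ones(20), size=120)     # toy 120-residue profile
x = acc_features(top_frequency_profile(psfm))
print(x.shape)                                  # (800,) for max_lag=2
```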

Keywords: protein remote homology detection, protein fold recognition, top frequency profile, support vector machines

Procedia PDF Downloads 124
15204 An Integrated Web-Based Workflow System for Design of Computational Pipelines in the Cloud

Authors: Shuen-Tai Wang, Yu-Ching Lin

Abstract:

As more and more workflow systems adopt the cloud as their execution environment, various challenges arise that must be addressed for the cloud to be utilized efficiently. This paper introduces a method for resource provisioning based on our previous research on dynamic allocation and its pipeline processes. We present an abstraction for workload scheduling in which independent tasks are scheduled among the available processors of a distributed computing environment for optimization. We also propose an integrated web-based workflow designer that takes advantage of HTML5 technology to chain multiple tools together. To allow multiple pipelines to execute on the cloud in parallel, we developed a script translator and an execution engine for workflow management in the cloud. All information is known in advance by the workflow engine, and tasks are allocated according to the prior knowledge in the repository. The proposed effort has the potential to support process definition, workflow enactment, and monitoring of workflow processes. Users benefit from a web-based system that allows the creation and execution of pipelines without scripting knowledge.
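
One simple way to realize "tasks allocated according to prior knowledge" is greedy list scheduling over estimated task costs. The sketch below is an assumption-level illustration, not the paper's engine; the task names and cost estimates are hypothetical.

```python
import heapq

def schedule(tasks, n_workers):
    """Greedy list scheduling: assign each task (longest first) to the worker
    that becomes free earliest, a common heuristic when task costs are known
    in advance, as with the prior knowledge kept in the workflow repository."""
    heap = [(0.0, w) for w in range(n_workers)]   # (accumulated load, worker id)
    heapq.heapify(heap)
    plan = {w: [] for w in range(n_workers)}
    for name, cost in sorted(tasks.items(), key=lambda t: -t[1]):
        load, w = heapq.heappop(heap)
        plan[w].append(name)
        heapq.heappush(heap, (load + cost, w))
    return plan

tasks = {"align": 40, "sort": 25, "index": 10, "call": 30, "report": 5}  # est. minutes
print(schedule(tasks, n_workers=2))
```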

Keywords: workflow systems, resources provisioning, workload scheduling, web-based, workflow engine

Procedia PDF Downloads 158
15203 Learning Fashion Construction and Manufacturing Methods from the Past: Cultural History and Genealogy at the Middle Tennessee State University Historic Clothing Collection

Authors: Teresa B. King

Abstract:

In the millennial age, more students desire a fashion major, yet fewer arrive with sewing and manufacturing knowledge, which increases the demands on academicians to educate adequately. While fashion museums have a prominent place in historical preservation, working collections of handmade or mass-manufactured apparel for apparel education are lacking in most universities in the United States, especially in the Southern region. Created in 1988, Middle Tennessee State University's historic clothing collection provides opportunities to study apparel construction methods throughout history, to compare and apply them to today's construction and manufacturing methods, and to learn the cyclical nature and importance of historic styles in current and upcoming fashion. In 2019, a class exercise experiment was implemented in which students researched their family genealogy using Ancestry.com, identified the oldest visual media (photographs, etc.) available, and analyzed the garments represented in that media. Each student then located a comparable garment in the historic collection and evaluated the construction methods of the ancestor's time period. A class 'fashion genealogy' tree was created and mounted for public viewing and education. Results of this exercise indicated that student learning increased due to the personal and familial connection, which triggered more interest in historic garments as related to the students' own culture. Students better identified garments by historical time period, fiber content, fabric, and construction methods, thus increasing learning and retention. Students also developed increased recognition of custom construction methods versus current mass-manufacturing techniques, which impact today's fashion industry. This longitudinal effort will continue as the historic collection grows and students continue to utilize it.

Keywords: ancestry, clothing history, fashion history, genealogy, historic fashion museum collection

Procedia PDF Downloads 134
15202 Quantifying Product Impacts on Biodiversity: The Product Biodiversity Footprint

Authors: Leveque Benjamin, Rabaud Suzanne, Anest Hugo, Catalan Caroline, Neveux Guillaume

Abstract:

Human product consumption is one of the main drivers of biodiversity loss. However, few pertinent ecological indicators of a product life cycle's impact on species and ecosystems have been built. Life cycle assessment (LCA) methodologies are well under way toward standardized methods to assess this impact, already taking partial account of three of the Millennium Ecosystem Assessment pressures (land use, pollution, climate change). Coupling LCA with ecological data and methods is an emerging challenge in developing a product biodiversity footprint. This approach was tested on three case studies from the food processing, textile, and cosmetic industries. It allowed, first, improvement of the environmental relevance of the Potentially Disappeared Fraction of species, the end-point indicator typically used in life cycle analysis methods, and second, the introduction of new indicators on overexploitation and invasive species. This type of footprint is a major step in helping companies identify their impacts on biodiversity and propose potential improvements.

Keywords: biodiversity, companies, footprint, life cycle assessment, products

Procedia PDF Downloads 325
15201 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images

Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir

Abstract:

The landing phase of a UAV is very critical as there are many uncertainties in this phase, which can easily entail a hard landing or even a crash. In this paper, the estimation of relative distance and velocity to the ground, as one of the most important processes during the landing phase, is studied. Using accurate measurement sensors as an alternative approach can be very expensive for sensors like LIDAR, or with a limited operational range, for sensors like ultrasonic sensors. Additionally, absolute positioning systems like GPS or IMU cannot provide distance to the ground independently. The focus of this paper is to determine whether we can measure the relative distance and velocity of UAV and ground in the landing phase using just low-resolution images taken by a monocular camera. The Lucas-Konda feature detection technique is employed to extract the most suitable feature in a series of images taken during the UAV landing. Two different approaches based on Extended Kalman Filters (EKF) have been proposed, and their performance in estimation of the relative distance and velocity are compared. The first approach uses the kinematics of the UAV as the process and the calculated optical flow as the measurement; On the other hand, the second approach uses the feature’s projection on the camera plane (pixel position) as the measurement while employing both the kinematics of the UAV and the dynamics of variation of projected point as the process to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera that is moving on a specifically developed testbed has been used to compare the performance of the proposed algorithm. The case studies show that the quality of images results in considerable noise, which reduces the performance of the first approach. On the other hand, using the projected feature position is much less sensitive to the noise and estimates the distance and velocity with relatively high accuracy. This approach also can be used to predict the future projected feature position, which can drastically decrease the computational workload, as an important criterion for real-time applications.
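
To make the second approach concrete, below is a minimal one-dimensional EKF sketch whose measurement is a feature's pixel position under a pinhole projection model. All numbers (focal length, feature offset, noise levels) are illustrative assumptions, and the real filter would include the full UAV kinematics.

```python
import numpy as np

f_px, d = 800.0, 0.5    # focal length [px] and feature lateral offset [m] (assumed)
dt = 0.05
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity descent model, state [z, vz]
Q = np.diag([1e-4, 1e-3])               # process noise (tuning assumption)
R = np.array([[4.0]])                   # pixel measurement noise variance

def h(x):                               # pinhole projection of the ground feature
    return np.array([f_px * d / x[0]])

def H(x):                               # Jacobian of h w.r.t. [z, vz]
    return np.array([[-f_px * d / x[0]**2, 0.0]])

x, P = np.array([10.0, 0.0]), np.diag([4.0, 1.0])   # initial guess: 10 m up, at rest
rng = np.random.default_rng(0)
true_z, true_vz = 8.0, -0.6                         # simulated descent ground truth

for _ in range(100):
    true_z += true_vz * dt
    u = f_px * d / true_z + rng.normal(0, 2.0)      # noisy pixel measurement
    x, P = F @ x, F @ P @ F.T + Q                   # EKF predict
    Hk = H(x)
    K = P @ Hk.T @ np.linalg.inv(Hk @ P @ Hk.T + R) # EKF gain and update
    x = x + K @ (np.array([u]) - h(x))
    P = (np.eye(2) - K @ Hk) @ P

print(f"estimated z={x[0]:.2f} m (true {true_z:.2f}), vz={x[1]:.2f} m/s (true {true_vz})")
```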

Keywords: altitude estimation, drone, image processing, trajectory planning

Procedia PDF Downloads 109
15200 Risk Measure from Investment in Finance by Value at Risk

Authors: Mohammed El-Arbi Khalfallah, Mohamed Lakhdar Hadji

Abstract:

Managing and controlling risk is a central research topic in finance. When facing a risky situation, stakeholders need to compare positions and actions, and financial institutions must take specific measures of market and credit risk. In this work, we study a model of risk measure in finance: Value at Risk (VaR), a tool for measuring an entity's risk exposure. We explain the concept of value at risk and its average and tail variants, and describe the three methods for computing it: the parametric method, the historical method, and the numerical Monte Carlo method. Finally, we briefly describe the advantages and disadvantages of the three computation methods.
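
The three computation methods can be contrasted in a few lines. The sketch below uses a toy synthetic return series and, for brevity, a normal model in both the parametric and Monte Carlo variants; in practice the Monte Carlo step would simulate from a richer fitted model.

```python
import numpy as np
from scipy.stats import norm

returns = np.random.default_rng(42).normal(0.0005, 0.01, 1000)  # toy daily returns
alpha = 0.99

# 1. Parametric (variance-covariance): assumes normally distributed returns
var_param = -(returns.mean() + norm.ppf(1 - alpha) * returns.std())

# 2. Historical: empirical quantile of observed losses
var_hist = -np.quantile(returns, 1 - alpha)

# 3. Monte Carlo: quantile of returns simulated from a fitted model
sims = np.random.default_rng(1).normal(returns.mean(), returns.std(), 100_000)
var_mc = -np.quantile(sims, 1 - alpha)

print(f"99% one-day VaR: parametric={var_param:.4f} "
      f"historical={var_hist:.4f} monte-carlo={var_mc:.4f}")
```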

Keywords: average value at risk, conditional value at risk, tail value at risk, value at risk

Procedia PDF Downloads 440
15199 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm for Max-Cut graphs. Whilst classical algorithms have since improved and returned to being faster and more efficient, this was a huge milestone for quantum computing, and the work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. Alongside other famous algorithms like Grover's or Shor's, it highlights the potential that quantum computing holds, and the prospect of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate in creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problem that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear approximation based method, and in some instances it can even produce a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
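
The gradient-free, population-based optimizer described above can be sketched independently of any quantum stack. In the sketch below the QAOA expectation is replaced by a smooth multimodal placeholder so the code is self-contained; a real run would evaluate the circuit on a simulator or QPU inside expected_cut, and the population evaluation loop is the part that parallelizes naturally.

```python
import numpy as np

rng = np.random.default_rng(0)
P = 2   # QAOA depth -> 2*P angles (gamma_1..gamma_P, beta_1..beta_P)

def expected_cut(angles):
    """Placeholder for the QAOA expectation <C>. A real implementation would
    run the circuit and average measured cut sizes; this surrogate merely
    gives the EA a multimodal landscape to search."""
    g, b = angles[:P], angles[P:]
    return np.sum(np.sin(2 * g) * np.sin(4 * b)) + 0.1 * np.sum(np.cos(g + b))

def evolve(pop_size=40, generations=60, sigma=0.3, elite=10):
    pop = rng.uniform(0, np.pi, size=(pop_size, 2 * P))      # random initial angles
    for _ in range(generations):
        fitness = np.array([expected_cut(ind) for ind in pop])  # embarrassingly parallel
        parents = pop[np.argsort(fitness)[-elite:]]             # truncation selection
        children = []
        while len(children) < pop_size - elite:
            a, b = parents[rng.integers(elite, size=2)]
            cut = rng.integers(1, 2 * P)                        # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(0, sigma, 2 * P)                # Gaussian mutation
            children.append(child)
        pop = np.vstack([parents] + children)
    best = max(pop, key=expected_cut)
    return best, expected_cut(best)

angles, value = evolve()
print(np.round(angles, 3), round(value, 3))
```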

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 58
15198 Comparison of the Effect of Pre-Stressed Strand Diameters Providing Beam-to-Column Connection

Authors: Mustafa Kaya

Abstract:

In this study, the effect of the diameters of the pre-stressed strands providing the beam-to-column connection was investigated from both experimental and analytical aspects. In the experimental studies, the strength, stiffness, and energy dissipation capacities of precast specimens with pre-stressed strands of 12.70 mm and 15.24 mm diameter were compared with those of a reference specimen. The precast specimen with 15.24 mm strands reached 96% of the maximum strength of the reference specimen; the energy it dissipated by the end of the test reached 48% of that dissipated by the reference specimen, and its stiffness at a 1.5% drift reached 77% of the reference specimen's stiffness at that drift. Parallel results were obtained in the analytical studies in terms of strength and behavior, but the initial stiffness of the analytical models was lower than that of the test specimens.

Keywords: precast beam to column connection, moment resisting connection, post tensioned connections, finite element method

Procedia PDF Downloads 550
15197 Meet Automotive Software Safety and Security Standards Expectations More Quickly

Authors: Jean-François Pouilly

Abstract:

This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) proactively address vulnerabilities and bugs, using formal methods and abstract interpretation techniques to identify potential weaknesses early in the development process and reduce the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on the bugs that matter, with close to no false positives; catching flaws earlier minimizes rework and retesting, leading to faster development cycles, improved efficiency, and cost savings; 3) enhance software dependability, since combining static analysis by abstract interpretation, with full context sensitivity and hardware memory awareness, allows a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 and ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.

Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods

Procedia PDF Downloads 17
15196 A Pedagogical Case Study on Consumer Decision Making Models: A Selection of Smart Phone Apps

Authors: Yong Bum Shin

Abstract:

This case focuses on the weighted additive difference (WAD), conjunctive, disjunctive, and elimination-by-aspects (EBA) methodologies in consumer decision-making models, and on the simple additive weighting (SAW) approach in the multi-criteria decision-making (MCDM) area. Most decision-making models illustrate that the rank reversal phenomenon is unpreventable. This paper shows that rank reversal occurs in popular managerial methods such as WAD, the conjunctive method, the disjunctive method, and EBA, as well as in MCDM methods such as SAW, and finally presents the unified commensurate multiple (UCM) model, which successfully addresses these rank reversal problems found in the most popular MCDM methods.
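
A small numeric sketch makes SAW rank reversal concrete; the alternatives and weights below are invented for illustration. Adding a third alternative changes the column maximum used for normalization and flips the ranking of the original two.

```python
import numpy as np

def saw(matrix, weights):
    """Simple Additive Weighting: normalize each benefit criterion by its
    column maximum, then score alternatives by the weighted sum."""
    return (matrix / matrix.max(axis=0)) @ weights

w = np.array([0.5, 0.5])
ab = np.array([[8.0, 4.0],     # alternative A
               [6.0, 6.0]])    # alternative B
print(saw(ab, w))              # [0.833 0.875] -> B ranks above A

abc = np.array([[8.0, 4.0],
                [6.0, 6.0],
                [4.0, 12.0]])  # a new alternative C enters
print(saw(abc, w))             # [0.667 0.625 0.750] -> A now ranks above B
```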

Keywords: multiple criteria decision making, rank inconsistency, unified commensurate multiple, analytic hierarchy process

Procedia PDF Downloads 79
15195 Experimental Investigation of Flow Structure around a Rectangular Cylinder in Different Configurations

Authors: Cemre Polat, Dogan B. Saydam, Mustafa Soyler, Coskun Ozalp

Abstract:

In this study, the flow structure around two rectangular cylinders, placed perpendicular and parallel to the flow direction, was investigated by the particle image velocimetry (PIV) method at Re = 26000. After obtaining streamwise and spanwise velocity data, the average vorticity, streamlines, velocity magnitude, turbulence kinetic energy, and root mean square of the streamwise and spanwise velocity fluctuations were calculated, and the critical points of the flow structure are explained. As a result of the study, the vertical configuration was seen to have less effect on the flow structure in the region behind the body than the horizontal configuration. When the streamwise velocity component is examined in both configurations, the negative velocity component is stronger on the long sides than on the short sides. The vertically positioned cylinder was observed to widen the flow separation region compared to the horizontally positioned cylinder; the vertical cylinder also produces an increase in turbulence kinetic energy compared to the horizontal cylinder.

Keywords: bluff body, flow characteristics, PIV, rectangular cylinder

Procedia PDF Downloads 150
15194 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion

Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro

Abstract:

In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulty meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes these issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing repetitive work and manual effort. Scalable CI/CD for development can be implemented using cloud services such as ECS (Elastic Container Service), AWS Fargate, ECR (Elastic Container Registry, to store Docker images with all dependencies), serverless computing (serverless virtual machines), cloud logging (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (a CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit); such a setup can efficiently handle the demands of diverse development environments and accommodate dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing it with a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Building scalable automation testing on cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding the detection of race conditions and performance issues and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources used, and it increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
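
The parallel test fan-out can be illustrated with a local stand-in: the sketch below shards hypothetical pytest suites across a process pool, whereas the setup described above would launch each batch in its own Fargate task/container. Suite paths and worker counts are assumptions for illustration.

```python
from concurrent.futures import ProcessPoolExecutor
import subprocess

# Hypothetical test files; a cloud deployment would map each to a container task.
SUITES = [f"tests/suite_{i:02d}.py" for i in range(8)]   # 500+ in the real setup

def run_suite(path: str) -> tuple[str, int]:
    """Run one test file via pytest and return its exit code (0 = pass)."""
    result = subprocess.run(["python", "-m", "pytest", path, "-q"],
                            capture_output=True)
    return path, result.returncode

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(run_suite, SUITES))
    failed = [p for p, rc in results if rc != 0]
    print(f"{len(SUITES) - len(failed)} passed, {len(failed)} failed")
```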

Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment

Procedia PDF Downloads 42
15193 Worldbuilding as Critical Architectural Pedagogy

Authors: Jesse Rafeiro

Abstract:

This paper discusses worldbuilding as a pedagogical approach to the first-year architectural design studio. The studio ran for three consecutive terms between 2016 and 2018. Taking its departure from the fifty-five city narratives in Italo Calvino's Invisible Cities, students collectively designed in a "nowhere" space where intersecting and diverging narratives could play out. Along with Calvino, students navigated between three main exercises and their imposed limits to develop architectural insight at the three scales that simulate the considerations of architectural practice: detail, building, and city. The first exercise asked each student to design and model a ruin based on randomly assigned incongruent fragments. Each student was given one plan fragment and two section fragments from different Renaissance treatises. Students were asked to translate these through alternating axonometric projection and model-making explorations. Although the fragments themselves were imposed, students were free to interpret how the drawings fit together by imagining new details and atypical placements. An undulating terrain model was introduced in the second exercise to ground the worldbuilding exercises. Here, students were required to negotiate with one another to design a city of ruins. Free to place their models anywhere on the site, students were restricted by the negotiation of territories marked by other students and the requirement to provide thresholds, open spaces, and corridors. The third exercise introduced new life into the ruined city through a series of design interventions. Each student was assigned an atypical building program suggesting a place for an activity, human or nonhuman. The atypical nature of the programs challenged the triviality of functional planning through explorations in spatial narratives free from preconceived assumptions. By contesting, playing out, or dreaming responses to realities taught in other coursework, this third exercise actualized learning that too often remains self-contained in the silos of differing course agendas. As such, the studio fostered an initial worldbuilding space within which to sharpen sensibility and criticality for subsequent years of education.

Keywords: architectural pedagogy, critical pedagogy, Italo Calvino, worldbuilding

Procedia PDF Downloads 130
15192 Design, Modeling and Analysis of 2×2 Microstrip Patch Antenna Array System for 5G Applications

Authors: Vinay Kumar K. S., Shravani V., Spoorthi G., Udith K. S., Divya T. M., Venkatesha M.

Abstract:

In this work, the mathematical modeling, design, and analysis of a 2×2 microstrip patch antenna array (MSPA) configuration are presented. The array utilizes a small microstrip antenna module with two vertical slots for 5G applications at an operating frequency of 5.3 GHz. Phased array antenna systems (PAAS) such as the proposed array are used ubiquitously, from defense radar applications to commercial applications like 5G/6G. Microstrip patch antennas with slot arrays for linear polarization parallel and perpendicular to the axis are fed through transverse slots in the side wall of the circular waveguide and through longitudinal slots in the narrow wall of the rectangular waveguide, respectively. The microstrip patch antenna is developed using the Ansys HFSS (High-Frequency Structure Simulator) simulation tool. A maximum gain of 6.14 dB is achieved at 5.3 GHz for a single MSPA. For the 2×2 array structure, a gain of 7.713 dB at 5.3 GHz is observed. Such antennas find many applications in 5G devices and technology.
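
Before simulating in HFSS, a first-cut patch size for 5.3 GHz can be estimated with the standard transmission-line model. The sketch below assumes an FR-4 substrate (εr = 4.4, h = 1.6 mm), which is a common choice but is not stated in the abstract.

```python
import math

c = 3e8
f0 = 5.3e9                 # target resonant frequency (from the paper)
eps_r, h = 4.4, 1.6e-3     # substrate assumed: FR-4, 1.6 mm thick

W = c / (2 * f0) * math.sqrt(2 / (eps_r + 1))                  # patch width
eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 * h / W)
dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / \
     ((eps_eff - 0.258) * (W / h + 0.8))                       # fringing-field extension
L = c / (2 * f0 * math.sqrt(eps_eff)) - 2 * dL                 # patch length

print(f"W = {W*1000:.2f} mm, L = {L*1000:.2f} mm, eps_eff = {eps_eff:.2f}")
# -> roughly W = 17.2 mm, L = 12.9 mm for these assumptions
```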

Keywords: Ansys HFSS, gain, return loss, slot array, microstrip patch antenna, 5G antenna

Procedia PDF Downloads 110
15191 Comparing Deep Architectures for Selecting Optimal Machine Translation

Authors: Despoina Mouratidis, Katia Lida Kermanidis

Abstract:

Machine translation (MT) is a very important task in natural language processing (NLP). MT evaluation is crucial in MT development, as it constitutes the means to assess the success of an MT system and also helps improve its performance. Several methods have been proposed for the evaluation of MT systems. Some of the most popular in automatic MT evaluation are score-based, such as the BLEU score; others are based on lexical or syntactic similarity between the MT outputs and the reference, involving higher-level information like part-of-speech (POS) tagging. This paper presents a language-independent machine learning framework for classifying pairwise translations. The framework uses vector representations of two machine-produced translations, one from a statistical machine translation model (SMT) and one from a neural machine translation model (NMT). The vector representations consist of automatically extracted word embeddings and string-like language-independent features. These representations are used as input to a multi-layer neural network (NN) that models the similarity between each MT output and the reference, as well as between the two MT outputs. To evaluate the proposed approach, a professional translation and a "ground-truth" annotation are used. The parallel corpora used are English-Greek (EN-GR) and English-Italian (EN-IT), in the educational domain and of informal genres (video lecture subtitles, course forum text, etc.) that are difficult to translate reliably. Three basic deep learning (DL) architectures are tested within this schema: (i) fully-connected dense, (ii) convolutional neural network (CNN), and (iii) long short-term memory (LSTM). Experiments show that all tested architectures achieve better results than well-known baseline approaches such as random forest (RF) and support vector machine (SVM). Better accuracy results are obtained when LSTM layers are used in the schema. In terms of a balance between the results, dense layers perform better, because the model correctly classifies more sentences of the minority class (SMT). For a more integrated analysis of the accuracy results, a qualitative linguistic analysis was carried out. In this context, problems were identified with some figures of speech, such as metaphors, and with certain linguistic phenomena, such as paronyms. It is quite interesting to investigate why all the classifiers led to worse accuracy results in Italian compared to Greek, given that the linguistic features employed are language-independent.
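
The dense variant of the pairwise classifier can be sketched as below. The data here is a random stand-in, the embedding dimensionality is an assumption, and the paper's actual feature pipeline (word embeddings plus string features) is not reproduced; the sketch only shows the classification schema.

```python
import numpy as np
from tensorflow import keras

D = 300                      # per-sentence embedding dimensionality (assumed)
# x = [reference ; SMT output ; NMT output] embeddings concatenated per pair
x_train = np.random.rand(1000, 3 * D).astype("float32")   # toy stand-in data
y_train = np.random.randint(0, 2, 1000)                   # 1 = NMT judged better

model = keras.Sequential([
    keras.Input(shape=(3 * D,)),
    keras.layers.Dense(256, activation="relu"),   # models similarity interactions
    keras.layers.Dropout(0.3),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),  # pairwise preference output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=32, verbose=0)
```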

Keywords: machine learning, machine translation evaluation, neural network architecture, pairwise classification

Procedia PDF Downloads 130
15190 Postmortem Magnetic Resonance Imaging as an Objective Method for the Differential Diagnosis of a Stillborn and a Neonatal Death

Authors: Uliana N. Tumanova, Sergey M. Voevodin, Veronica A. Sinitsyna, Alexandr I. Shchegolev

Abstract:

An important question in forensic and autopsy research in perinatology is distinguishing live birth from stillbirth. Postmortem magnetic resonance imaging (MRI) is an objective, non-invasive research method that allows data to be stored for a long time without exhuming the body to clarify the diagnosis. The purpose of this research is to study the capability of postmortem MRI to distinguish stillbirth from the death of a newborn who breathed spontaneously and died within the first day after birth. MRI and morphological data were compared for 23 stillborns who died prenatally at a gestational age of 22-39 weeks (group I) and 16 newborns who died 2 to 24 hours after birth (group II). Before autopsy, postmortem MRI was performed on a Siemens Magnetom Verio 3T scanner with the body in the supine position. The control group for the MRI studies consisted of 7 live newborns without lung disease (group III). On T2-weighted images (T2WI) in the sagittal projection, the MR signal intensity (SI) was measured in the lung tissue (L) and the shoulder muscle (M). At autopsy, a pulmonary flotation (swimming) test was evaluated, and macro- and microscopic studies were performed. According to the postmortem MRI, the highest mean SI of the lung (430 ± 27.99) and of the muscle (405.5 ± 38.62) on T2WI was found in group I, exceeding the corresponding values of group II by a factor of 2.7. The lowest values were found in the control group: 77.9 ± 12.34 and 119.7 ± 6.3, respectively. In group II, the lung SI was 1.6 times higher than the muscle SI, whereas in group I and in the control group the muscle SI was 2.1 and 1.8 times larger than the lung SI, respectively. On the basis of the clinical and morphological data, we derived a formula for a breathing index (BI) on postmortem MRI: BI = (SI_L × SI_M) / 100. The mean BI in group I (1801.14 ± 241.6; values ranged from 756 to 3744) was significantly higher than that in group II (455.89 ± 137.32, p < 0.05; values ranged from 305 to 638.4). In the control group, the mean BI was 91.75 ± 13.3 (values ranged from 53 to 154). The BI was compared with the results of the pulmonary flotation tests and microscopic examination of the lungs. A BI boundary value of 700 allowed the differential diagnosis between a stillborn and a newborn that had breathed. Using postmortem MRI thus allows stillborns to be differentiated from newborns who breathed.
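
The index and its decision rule reduce to a one-line computation, sketched below with the group I mean intensities from the abstract; the group II values shown are illustrative, since only group means and ranges are reported.

```python
def breathing_index(si_lung: float, si_muscle: float) -> tuple[float, str]:
    """BI = SI_lung * SI_muscle / 100; BI > 700 suggests stillbirth,
    BI <= 700 suggests a newborn that breathed (boundary from the study)."""
    bi = si_lung * si_muscle / 100
    return bi, "stillborn" if bi > 700 else "live-born, breathed"

print(breathing_index(430.0, 405.5))   # group I mean SIs -> (1743.65, 'stillborn')
print(breathing_index(192.0, 120.0))   # illustrative group II values -> breathed
```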

Keywords: lung, newborn, postmortem MRI, stillborn

Procedia PDF Downloads 126
15189 Developing Learning in Organizations with Innovation Pedagogy Methods

Authors: T. Konst

Abstract:

Most jobs include training and communication tasks, but the people in these jobs often lack the pedagogical competences to plan, implement, and assess learning. This paper discusses how a learning approach called innovation pedagogy, developed in higher education, can be utilized for learning development in various organizations. The methods presented for implementing innovation pedagogy, such as process consultation and the train-the-trainer model, can provide added value in developing pedagogical know-how in organizations and thus support their internal learning and development.

Keywords: innovation pedagogy, learning, organizational development, process consultation

Procedia PDF Downloads 366
15188 Arabic Handwriting Recognition Using Local Approach

Authors: Mohammed Arif, Abdessalam Kifouche

Abstract:

Optical character recognition (OCR) plays a major role at the present time. It is capable of solving many serious problems and simplifying human activities. OCR dates back to the 1970s; many solutions have been proposed since, but unfortunately they mostly supported only Latin scripts. This work proposes a system for off-line Arabic handwriting recognition. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area, and we address the normalization problems we encountered. After a comparison between Arabic handwritten characters and the segmentation methods, we introduce our contribution: a segmentation algorithm.
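
The classification phase can be sketched with a standard SVM pipeline. The glyph features below are a random stand-in (so the accuracy printed is chance-level); the paper's structural segmentation and its real features are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy stand-in: each segmented character (or PAW sub-part) is reduced to a
# flattened, size-normalized binary glyph.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(600, 16 * 16)).astype(float)   # 16x16 glyph grids
y = rng.integers(0, 28, size=600)                           # 28 Arabic letter classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf", C=10, gamma="scale")                # classification phase
clf.fit(X_tr, y_tr)
print(f"accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```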

Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM

Procedia PDF Downloads 69
15187 Research of Data Cleaning Methods Based on Dependency Rules

Authors: Yang Bao, Shi Wei Deng, WangQun Lin

Abstract:

This paper introduces the concept and principles of data cleaning, analyzes the types and causes of dirty data, and proposes the key steps of a typical cleaning process. It puts forward a data cleaning framework with good scalability and versatility. For data with attribute dependency relations, it designs several violation-data discovery algorithms expressed as formal formulas, which can find data inconsistent with a condition-attribute dependency across all target columns, whether the data is structured (SQL) or unstructured (NoSQL), and it gives six data cleaning methods based on these algorithms.
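
The core of violation-data discovery for a functional dependency can be sketched in a few lines: group rows on the left-hand-side attributes and flag groups whose right-hand-side values disagree. The records below are invented for illustration; the paper's formal formulas and NoSQL handling are not reproduced.

```python
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Find tuples violating the dependency lhs -> rhs: any group of rows that
    agrees on the lhs attributes but not on the rhs attribute is inconsistent."""
    groups = defaultdict(set)
    for row in rows:
        groups[tuple(row[a] for a in lhs)].add(row[rhs])
    bad_keys = {k for k, vals in groups.items() if len(vals) > 1}
    return [row for row in rows if tuple(row[a] for a in lhs) in bad_keys]

records = [  # works equally for dicts loaded from SQL rows or NoSQL documents
    {"zip": "100080", "city": "Beijing"},
    {"zip": "100080", "city": "Bejing"},    # dirty: same zip, different city
    {"zip": "200030", "city": "Shanghai"},
]
print(fd_violations(records, lhs=["zip"], rhs="city"))
```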

Keywords: data cleaning, dependency rules, violation data discovery, data repair

Procedia PDF Downloads 562
15186 Effects and Mechanisms of an Online Short-Term Audio-Based Mindfulness Intervention on Wellbeing in Community Settings and How Stress and Negative Affect Influence the Therapy Effects: Parallel Process Latent Growth Curve Modeling of a Randomized Controlled Trial

Authors: Man Ying Kang, Joshua Kin Man Nan

Abstract:

The prolonged pandemic has posed alarming public health challenges to various parts of the world, and with face-to-face mental health treatment largely discounted to control virus transmission, online psychological services and self-help mental health kits have become essential. Online self-help mindfulness-based interventions have proved their effects in fostering mental health for different populations over the globe. This paper tests the effectiveness of an online short-term audio-based mindfulness (SAM) program in enhancing wellbeing and dispositional mindfulness and in reducing stress and negative affect in community settings in China, and it explores possible mechanisms of how dispositional mindfulness, stress, and negative affect influence the intervention effects on wellbeing. Community-dwelling adults were recruited via online social networking sites (e.g., QQ, WeChat, and Weibo). Participants (n=100) were randomized into a mindfulness group (n=50) and a waitlist control group (n=50). In the mindfulness group, participants were advised to spend 10-20 minutes listening to the audio content, including mindful-form practices (e.g., eating, sitting, walking, or breathing), and to practice daily mindfulness exercises for 3 weeks (a total of 21 sessions); those in the control group received the same intervention after data collection in the mindfulness group. Participants in the mindfulness group completed the World Health Organization Five Well-Being Index (WHO), the Positive and Negative Affect Schedule (PANAS), the Perceived Stress Scale (PSS), and the Freiburg Mindfulness Inventory (FMI) four times: at baseline (T0) and at 1 (T1), 2 (T2), and 3 (T3) weeks, whereas those in the waitlist control group completed the same scales only at pre- and post-intervention. Repeated-measures analysis of variance, paired-sample t-tests, and independent-sample t-tests were used to analyze the variable outcomes of the two groups. Parallel process latent growth curve modeling was used to explore the longitudinal moderated mediation effects. The dependent variable was the WHO slope from T0 to T3, the independent variable was group (1=SAM, 2=control), the mediator was the FMI slope from T0 to T3, and the moderators were baseline negative affect (T0NA) and baseline stress (T0PSS), examined separately. Moderator effects on the WHO slope were explored at low (mean - SD), medium (mean), and high (mean + SD) levels of T0NA or T0PSS. The results showed that SAM significantly improved and predicted higher WHO and FMI slopes, and it significantly reduced NA and PSS. The FMI slope positively predicted the WHO slope and partially mediated the relationship between SAM and the WHO slope. Baseline NA and PSS were significant moderators of the paths from SAM to the WHO slope and from SAM to the FMI slope, respectively. The conclusion is that SAM was effective in promoting mental wellbeing, positive affect, and dispositional mindfulness, as well as in reducing negative affect and stress, in community settings in China. SAM improved wellbeing faster through faster enhancement of dispositional mindfulness. For participants with medium-to-high baseline negative affect and stress, these factors buffered the therapy effects of SAM on the speed of wellbeing improvement.

Keywords: mindfulness, negative affect, stress, wellbeing, randomized control trial

Procedia PDF Downloads 108
15185 Auditing of Building Information Modeling Application in Decoration Engineering Projects in China

Authors: Lan Luo

Abstract:

In China's construction industry, it is normal practice to subcontract the decoration engineering part separately from construction engineering, and Building Information Modeling (BIM) is likewise done separately. The application of BIM in decoration engineering should be integrated with other disciplines, but current Chinese practice makes this very difficult and complicated. Currently, there are three barriers to auditing BIM application in decoration engineering in China: heavy workload, scarcity of qualified professionals, and lack of literature concerning audit contents, standards, and methods. It is therefore significant to research what (contents) should be evaluated, in which phase, and by whom (with what professional qualifications) in BIM application in decoration construction, so that the application of BIM can be better promoted. Based on this consideration, four principles of BIM auditing are proposed: comprehensiveness of information, accuracy of data, aesthetic attractiveness of appearance, and scheme optimization. In the model audit, three methods should be used: collision, observation, and contrast. In addition, BIM auditing at six stages is discussed, and a checklist of work items and results to be submitted is proposed. This checklist can be used for reference by decoration project participants.

Keywords: audit, evaluation, dimensions, methods, standards, BIM application in decoration engineering projects

Procedia PDF Downloads 341
15184 Benders Decomposition Approach to Solve the Hybrid Flow Shop Scheduling Problem

Authors: Ebrahim Asadi-Gangraj

Abstract:

The hybrid flow shop scheduling problem (HFS) involves sequencing in a flow shop where, at any stage, there exist one or more related or unrelated parallel machines. This production system is a common manufacturing environment in many real industries, such as steel manufacturing, ceramic tile manufacturing, and car assembly. In this research, a mixed integer linear programming (MILP) model is presented for the hybrid flow shop scheduling problem, in which the objective is minimizing the maximum completion time (makespan). A Benders decomposition (BD) method is developed to solve the problem. The proposed approach is tested on small- to moderate-scale test problems. The experimental results show that the Benders decomposition approach can solve the hybrid flow shop scheduling problem in reasonable time, especially for small and moderate-size test problems.
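
To make the problem structure concrete, the sketch below evaluates the makespan of a hybrid flow shop under a simple dispatching rule. It illustrates what the MILP is optimizing over, not the paper's Benders decomposition itself, and the instance data is invented.

```python
import heapq

def hfs_makespan(jobs, machines_per_stage):
    """Evaluate a hybrid flow shop schedule under list scheduling: at every
    stage, each job (in order of arrival from the previous stage) takes the
    parallel machine that frees up earliest. jobs[j][s] is the processing
    time of job j at stage s."""
    ready = [0.0] * len(jobs)                    # completion time at previous stage
    for s in range(len(machines_per_stage)):
        free = [0.0] * machines_per_stage[s]     # machine availability times
        heapq.heapify(free)
        for j in sorted(range(len(jobs)), key=lambda j: ready[j]):  # FIFO per stage
            start = max(heapq.heappop(free), ready[j])
            ready[j] = start + jobs[j][s]
            heapq.heappush(free, ready[j])
    return max(ready)

jobs = [(3, 2, 4), (2, 5, 1), (4, 1, 3), (2, 3, 2)]   # 4 jobs, 3 stages
print(hfs_makespan(jobs, machines_per_stage=[2, 1, 2]))
```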

Keywords: hybrid flow shop, mixed integer linear programming, Benders decomposition, makespan

Procedia PDF Downloads 187
15183 Effect of Brewing on the Bioactive Compounds of Coffee

Authors: Ceyda Dadali, Yeşim Elmaci

Abstract:

Coffee was introduced as an economic crop during the fifteenth century; nowadays it is among the most important traded food commodities, often ranked second after crude oil. Desirable sensory properties make coffee one of the most frequently consumed and most popular beverages in the world. The coffee preparation method has a significant effect on the flavor and composition of coffee brews. Three extraction methodologies, namely decoction, infusion, and pressure methods, are used for coffee brew preparation. Each of these methods is associated with a specific grind size, water-to-coffee ratio, temperature, and brewing time. Coffee is a mixture of about 1500 chemical compounds. The chemical composition of coffee depends strongly on the brewing method, the coffee bean species, and the roasting time and temperature. Coffee contains a number of very important bioactive compounds, such as the diterpenes cafestol and kahweol; the alkaloids caffeine, theobromine, and trigonelline; melanoidins; and phenolic compounds. The phenolic compounds of coffee include the chlorogenic acids (quinic acid esters of hydroxycinnamic acids) and caffeic, ferulic, and p-coumaric acids. In coffee, caffeoylquinic acids, feruloylquinic acids, and dicaffeoylquinic acids are the three main groups of chlorogenic acids, constituting 6%-10% of the dry weight of coffee. The bioavailability of the chlorogenic acids in coffee depends on their absorption and metabolization to biomarkers in individuals. The interaction of coffee polyphenols with other compounds, such as dietary proteins, also affects the biomarkers. Since the bioactive composition of coffee depends on the brewing method, the effect of the brewing method on the bioactive compounds of coffee is discussed in this study.

Keywords: bioactive compounds of coffee, biomarkers, coffee brew, effect of brewing

Procedia PDF Downloads 194