Abstracted/Indexed in: Ulrich's International Periodical Directory, Google Scholar, SCIRUS, getCITED, Genamics JournalSeek, EBSCO Information Services
"Inventi Impact: Software Engineering" is a peer-reviewed journal under Engineering & Technology. It invites articles from academicians, practicing engineers, and also from the new generation of college-dropout computer geeks. The journal aims to process manuscripts without regard to the academic credentials or affiliation of the author.
The study treats a specific technological approach for the production of small manufacturing series of highly precise hyperboloid gears with a small tooth module and compact gear-mechanism dimensions. It is based on the application of the mathematical models, algorithms, and computer programs elaborated by the authors for synthesis upon a pitch contact point and upon a mesh region. A special feature of the established approach is the application of 3D software prototyping and 3D printing of the designed transmissions. The models of transmissions with crossed axes and face-mated gears presented here are intended for implementation in the driving of two types of robots: a bio-robot hand and a walking robot with four insect-type legs....
In our previous work, we proposed wavelet shrinkage estimation (WSE) for nonhomogeneous Poisson process (NHPP)-based software reliability models (SRMs), where WSE is a data-transform-based nonparametric estimation method. Among many variance-stabilizing data transformations, the Anscombe transform and the Fisz transform were employed. We have shown that it can provide higher goodness-of-fit performance than the conventional maximum likelihood estimation (MLE) and least squares estimation (LSE) in many cases, in spite of its nonparametric nature, through numerical experiments with real software-fault count data. With the aim of improving the estimation accuracy of WSE, in this paper we introduce three other data transformations to preprocess the software-fault count data and investigate the influence of the different data transformations on the estimation accuracy of WSE through goodness-of-fit tests...
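The Anscombe transform mentioned in this abstract has a closed form, A(x) = 2·sqrt(x + 3/8), which maps Poisson counts to data with approximately unit variance, the property that wavelet shrinkage estimators rely on. A minimal sketch (the simulation parameters are illustrative, not taken from the paper):

```python
import math
import random

def anscombe(x):
    """Anscombe variance-stabilizing transform for Poisson counts."""
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (biased for small counts)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

# Approximate Poisson(20) samples via a binomial with small success probability.
random.seed(1)
counts = [sum(random.random() < 0.02 for _ in range(1000)) for _ in range(2000)]

transformed = [anscombe(c) for c in counts]
m = sum(transformed) / len(transformed)
var = sum((t - m) ** 2 for t in transformed) / len(transformed)
print(round(var, 2))  # roughly 1: the transform stabilized the variance
```

The raw counts have variance near 20; after the transform it collapses to about 1 regardless of the Poisson rate, which is what lets a single shrinkage threshold work across the whole fault-count series.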
The advent of technology has opened unprecedented opportunities in the health care delivery system, as the demand for intelligent and knowledge-based systems has increased while modern medical practices become more knowledge-intensive. As a result, there is a greater need to investigate the pervasiveness of software faults in safety-critical medical systems for proper diagnosis. The sheer volume of code in these systems creates significant concerns about the quality of the software. The rate of untimely deaths nowadays is alarming, partly due to the medical devices used to carry out the diagnosis process. A safety-critical medical (SCM) system is a complex system in which the malfunctioning of software could result in death or injury of the patient, or damage to the environment. The malfunctioning of the software could be a result of inadequacy in software testing due to the test-suite problem or the oracle problem. Testing an SCM system poses great challenges to software testers. One of these challenges is the need to generate a limited number of test cases from a given regression test suite in a manner that does not compromise its defect-detection ability. This paper presents a novel five-stage fault-based testing procedure for SCM systems, a model-based approach to generate test cases for the differential diagnosis of tuberculosis. We used Prime Path Coverage and Edge-Pair Coverage as coverage criteria to ensure maximum coverage and to identify feasible paths. We analyzed the proposed testing procedure with the help of three metrics: Fault Detection Density, Fault Detection Effectiveness, and Mutation Adequacy Score. We evaluated the effectiveness of our testing procedure by running the suggested test cases on sample historical data of tuberculosis patients. The experimental results show that our testing procedure has advantages such as creating mutant graphs and a Fuzzy Cognitive Map engine while resolving the problem of eliminating infeasible test cases for effective decision making....
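For reference, the Mutation Adequacy Score named among the three metrics is conventionally the fraction of non-equivalent mutants killed by the test suite. A minimal sketch with hypothetical counts (not figures from the study):

```python
def mutation_adequacy_score(killed, total_mutants, equivalent=0):
    """MAS = killed mutants / (total mutants - equivalent mutants)."""
    effective = total_mutants - equivalent
    if effective <= 0:
        raise ValueError("no non-equivalent mutants to score against")
    return killed / effective

# Hypothetical run: 45 of 50 generated mutants killed, 2 judged equivalent.
score = mutation_adequacy_score(45, 50, equivalent=2)
print(round(score, 3))  # 0.938
```

Equivalent mutants are excluded from the denominator because no test case can distinguish them from the original program, so counting them would unfairly depress the score.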
Implementing artificial neural networks is commonly achieved via high-level programming languages such as Python and easy-to-use deep learning libraries such as Keras. These software libraries come preloaded with a variety of network architectures, provide autodifferentiation, and support GPUs for fast and efficient computation. As a result, a deep learning practitioner will favor training a neural network model in Python, where these tools are readily available. However, many large-scale scientific computation projects are written in Fortran, making it difficult to integrate with modern deep learning methods. To alleviate this problem, we introduce a software library, the Fortran-Keras Bridge (FKB). This two-way bridge connects environments where deep learning resources are plentiful with those where they are scarce. The paper describes several unique features offered by FKB, such as customizable layers, loss functions, and network ensembles. The paper concludes with a case study that applies FKB to address open questions about the robustness of an experimental approach to global climate simulation, in which subgrid physics are outsourced to deep neural network emulators. In this context, FKB enables a hyperparameter search of more than one hundred candidate models of subgrid cloud and radiation physics, initially implemented in Keras, to be transferred and used in Fortran. Such a process allows the model's emergent behavior to be assessed, i.e., when fit imperfections are coupled to explicit planetary-scale fluid dynamics. The results reveal a previously unrecognized strong relationship between offline validation error and online performance, in which the choice of the optimizer proves unexpectedly critical. This in turn reveals many new neural network architectures that produce considerable improvements in climate model stability, including some with reduced error, for an especially challenging training dataset....
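What a bridge such as FKB must guarantee is that the Fortran side reproduces the inference arithmetic of the trained Keras layers exactly. Stripped of frameworks, a fully connected forward pass is just repeated affine maps and activations; the sketch below illustrates that arithmetic in plain Python with hypothetical weights and shapes (it is not FKB's API):

```python
def dense(x, weights, bias, activation):
    """One fully connected layer: y = activation(W @ x + b)."""
    out = []
    for row, b in zip(weights, bias):
        s = sum(w * xi for w, xi in zip(row, x)) + b
        out.append(activation(s))
    return out

relu = lambda v: max(0.0, v)
identity = lambda v: v

# Hypothetical 2-layer network: 3 inputs -> 2 hidden units (ReLU) -> 1 output.
w1 = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]
b1 = [0.0, 0.1]
w2 = [[1.0, -1.0]]
b2 = [0.2]

x = [1.0, 2.0, 3.0]
h = dense(x, w1, b1, relu)        # hidden activations
y = dense(h, w2, b2, identity)    # network output
print([round(v, 3) for v in y])
```

Whether the layer runs in Keras or in Fortran, the same weights fed through this arithmetic must yield bitwise-comparable outputs; that equivalence is what makes an offline-trained emulator usable inside a running climate model.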
New possibilities and challenges have evolved in the setting of the software engineering
sector’s rapid transition to Industry 5.0, wherein sustainability takes centre stage. Appropriate evaluation
approaches are required for analysing the long-term viability of software engineering practices
within this paradigm. This study proposes an innovative approach to evaluating sustainability in
software engineering within Industry 5.0 by utilising the fuzzy technique for order of preference by
similarity to ideal solution (fuzzy TOPSIS) methodology. The fuzzy TOPSIS approach is effective
at accounting for the inherent uncertainties as well as imprecisions related to sustainability assessments,
allowing for informed decision-making. This approach helps in the recognition of the most
sustainable software engineering practices in Industry 5.0 by taking into account a defined set of
sustainability parameters. We rigorously analyse the current literature and expert views to provide
an extensive set of relevant sustainability standards for the area of software engineering. Following
that, we develop an evaluation methodology based on fuzzy TOPSIS that can handle the subjectivity
as well as fuzziness inherent in sustainability evaluations. A case study with a software development
company functioning in Industry 5.0 demonstrates the utility and efficacy of our suggested framework.
The case study outcomes reveal the benefits and drawbacks of various software engineering
methodologies in terms of sustainability. The study’s findings provide substantial information for
decision-makers in the software engineering field, assisting them in making educated decisions about
sustainable practices. Finally, this study helps to establish environmentally and socially appropriate techniques
within the context of Industry 5.0....
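The closeness coefficient that drives the ranking, CC_i = d_i^- / (d_i^+ + d_i^-), is easiest to see in crisp TOPSIS; the fuzzy variant applies the same steps to triangular fuzzy numbers. A sketch with hypothetical practice scores (not the case-study data):

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (crisp TOPSIS)."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply the weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # Ideal takes the best value per criterion; anti-ideal takes the worst.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_plus = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ideal)))
        d_minus = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, anti)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores

# Three hypothetical practices scored on (maintainability, energy use, cost);
# only maintainability is a benefit criterion, the other two are costs.
matrix = [[7, 3, 4], [9, 5, 6], [6, 2, 3]]
scores = topsis(matrix, weights=[0.5, 0.3, 0.2], benefit=[True, False, False])
best = max(range(len(scores)), key=scores.__getitem__)
print(best)  # index of the practice closest to the ideal solution
```

The fuzzy extension replaces each crisp score with a triangular fuzzy number elicited from experts, which is how the method absorbs the subjectivity the abstract emphasizes.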
This paper presents a bottom-up approach for a multiview measurement of statechart size, topological properties, and internal structural complexity for understandability prediction and assurance purposes. It tackles the problem at different conceptual depths or, equivalently, at several abstraction levels. The main idea is to study and evaluate a statechart at different levels of granulation corresponding to different conceptual depth levels or levels of detail. The highest level corresponds to a flat process-view diagram (depth = 0); the adequate upper depth limit is determined by the modelers according to the inherent complexity of the problem under study and the level of detail required for the situation at hand (it corresponds to the all-states view). For purposes of measurement, we proceed using a bottom-up strategy: starting with the all-states view diagram, identifying and measuring its deepest composite-state constituent parts, and then gradually collapsing them to obtain the next intermediate view (we decrement the depth) while aggregating measures incrementally, until reaching the flat process-view diagram. To this end, we first identify, define, and derive a relevant metrics suite useful to predict the level of understandability and other quality aspects of a statechart, and then we propose a fuzzy rule-based system prototype for understandability prediction, assurance, and validation purposes....
This work aims to develop an appropriate set of mathematical
models and a relevant software program to calculate the voltage distribution and energy consumption
of a Hall–Héroult reduction cell, together with a deeper understanding of the complex physical and
chemical phenomena underlying the alumina electrolysis process. The work involves an analysis
of the basic principles governing the alumina reduction process, the presentation of the sets of the
applied mathematical equations to predict the main electrolysis bath physicochemical properties
related to the cell voltage, the mass balance of the main cell material inputs and outputs, the energy
consumption of the electrolysis cell, and the estimation of the cell voltage distribution along the
various elements constituting the cell. All the mathematical models were included in an easy-to-use
software program to enable aluminium cell operators and engineers to introduce and retrieve all
the necessary cell operational data and study the effect of the key process parameters on the cell....
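As a hedged illustration of the voltage-distribution idea, the total cell voltage can be viewed as a series sum of the reversible decomposition voltage, electrode overvoltages, and ohmic drops, and the specific energy consumption then follows from Faraday's law (2.980 kWh per kg of aluminium per volt at 100% current efficiency). The component values below are illustrative, not results from this work:

```python
def cell_voltage(components):
    """Total cell voltage as the sum of its series contributions (volts)."""
    return sum(components.values())

def specific_energy(voltage, current_efficiency):
    """Energy use in kWh per kg Al; 2.980 Ah/g of Al from Faraday's law."""
    return 2.980 * voltage / current_efficiency

# Illustrative values only; real figures depend on bath chemistry,
# anode-cathode distance, and line current.
components = {
    "reversible_decomposition": 1.20,
    "anode_overvoltage": 0.55,
    "cathode_overvoltage": 0.05,
    "bath_ohmic_drop": 1.70,
    "anode_drop": 0.35,
    "cathode_drop": 0.40,
    "external_busbar_drop": 0.15,
}
v = cell_voltage(components)
print(round(v, 2))                           # total in the 4-5 V range typical of industrial cells
print(round(specific_energy(v, 0.92), 1))    # kWh per kg Al at 92% current efficiency
```

Breaking the voltage into these additive terms is what lets a simulator show operators how one parameter (say, anode-cathode distance, which drives the bath ohmic drop) propagates into energy consumption.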
The underlying infrastructure paradigms behind the novel usage scenarios and services
are becoming increasingly complex—from everyday life in smart cities to industrial environments.
Both the number of devices involved and their heterogeneity make the allocation of software components
quite challenging. Despite the enormous flexibility enabled by component-based software
engineering, finding the optimal allocation of software artifacts to the pool of available devices and
computation units could bring many benefits, such as improved quality of service (QoS), reduced
energy consumption, reduction of costs, and many others. Therefore, in this paper, we introduce a
model-based framework that aims to solve the software component allocation problem (CAP). We
formulate it as an optimization problem with either single or multiple objective functions and cover
both cases in the proposed framework. Additionally, our framework also provides visualization
and comparison of the optimal solutions in the case of multi-objective component allocation. The
main contributions introduced in this paper are: (1) a novel methodology for tackling CAP-like
problems based on the usage of model-driven engineering (MDE) for both problem definition and
solution representation; (2) a set of Python tools that enable the workflow, starting from CAP
model interpretation, through the generation of optimal allocations, to result visualization.
The proposed framework is compared to other similar works using linear optimization, genetic
algorithms (GA), or ant colony optimization (ACO) within experiments based on
notable papers on this topic, covering various usage scenarios, from Cloud and Fog computing
infrastructure management to embedded systems, robotics, and telecommunications. According to
the achieved results, our framework performs much faster than GA and ACO-based solutions. Apart
from various benefits of adopting a multi-objective approach in many cases, it also shows significant
speedup compared to frameworks leveraging single-objective linear optimization, especially in the
case of larger problem models....
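In its single-objective form, the component allocation problem the framework targets can be stated as: choose a device for each software component so that total cost is minimal and no device capacity is exceeded. A brute-force sketch (the costs, capacities, and edge-vs-cloud framing are hypothetical; realistic instances need MILP solvers or the GA/ACO metaheuristics the paper compares against):

```python
from itertools import product

def best_allocation(cost, capacity, demand):
    """Brute-force optimal allocation of components to devices.

    cost[c][d]  - cost of running component c on device d
    capacity[d] - resource units available on device d
    demand[c]   - resource units component c consumes
    """
    n_comp, n_dev = len(cost), len(capacity)
    best, best_cost = None, float("inf")
    for assign in product(range(n_dev), repeat=n_comp):
        used = [0] * n_dev
        for c, d in enumerate(assign):
            used[d] += demand[c]
        if any(u > cap for u, cap in zip(used, capacity)):
            continue  # violates a device capacity constraint
        total = sum(cost[c][d] for c, d in enumerate(assign))
        if total < best_cost:
            best, best_cost = assign, total
    return best, best_cost

# 3 components, 2 devices (say, an edge node and a cloud VM), hypothetical costs.
cost = [[4, 1], [2, 6], [3, 3]]
capacity = [2, 3]
demand = [1, 1, 1]
print(best_allocation(cost, capacity, demand))
```

The search space grows as devices^components, which is exactly why the paper's comparison of linear optimization against GA and ACO matters for larger problem models.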
This paper describes the development of an application for mobile devices under the iOS platform with the objective of monitoring patients with alterations or affections from cardiac pathologies. The software tool developed for mobile devices gives a patient and a specialist doctor the ability to handle and treat disease remotely while monitoring through the technique of non-contact photoplethysmography (PPG). The mobile application works by processing red, green, and blue (RGB) color video images of a specific region of the face, thus obtaining the intensity of the pixels in the green channel. The results are then processed using mathematical algorithms and the Fourier transform, moving from the time domain to the frequency domain to ensure proper interpretation and to obtain the pulses per minute (PPM). The results are favorable: compared against a medical-grade pulse oximeter, our application showed an error rate of 3%, indicating acceptable performance. The present technological development provides an application tool with significant potential in the area of health....
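The green-channel processing chain described in this abstract (remove the DC component, take a Fourier transform, pick the dominant frequency in the physiological band, convert to beats per minute) can be sketched with a naive DFT. The frame rate and the synthetic 1.2 Hz pulse below are assumptions for illustration, not data from the paper:

```python
import math
import cmath

def estimate_bpm(signal, fs):
    """Estimate pulse rate from a PPG-like intensity signal via a naive DFT."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]       # remove the DC component
    best_freq, best_mag = 0.0, 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if not (0.7 <= freq <= 4.0):            # plausible heart-rate band (42-240 BPM)
            continue
        coeff = sum(centered[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_freq, best_mag = freq, abs(coeff)
    return best_freq * 60.0

fs = 30.0                                       # assumed camera frame rate (fps)
t = [i / fs for i in range(300)]                # 10 s of video
# Synthetic green-channel trace with a 1.2 Hz (72 BPM) pulse component.
green = [0.5 + 0.05 * math.sin(2 * math.pi * 1.2 * ti) for ti in t]
print(round(estimate_bpm(green, fs)))           # 72
```

A real pipeline would use an FFT and face tracking instead of this O(n²) loop, but the frequency-domain peak-picking step that yields the PPM value is the same.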
With the rapid development and wide application of multimedia technology, the demand for the actual development of multimedia software in many industries is increasing. How to measure and improve the quality of multimedia software is an important problem to be solved urgently. In order to handle the complicated situation and fuzziness of software quality, this paper introduces a software quality evaluation model based on the fuzzy matter element, using a method known as fuzzy matter element analysis combined with the TOPSIS method and the close degree. Compared with existing typical software measurement methods, the results are basically consistent with the typical software measurement results. Then, the Pearson simple correlation coefficient was used to analyse the correlation between the existing four measurement methods and the metric of practical experience; the results show that software quality measures based on the fuzzy matter element are more in accordance with practical experience. Meanwhile, the results of this method are much more precise than the results of the other measurement methods....
© Copyright 2013 Inventi Journals Pvt. Ltd. All Rights Reserved.