|
|
The expression in a bit string comprehension is limited to a
literal bit string expression. That is, the following code
is legal:
<< <<X>> || X <- List >>
but not this code:
<< foo(X) || X <- List >>
The limitation is annoying. For one thing, tools that transform
the abstract format must be careful not to produce code such as:
<< begin
     %% Some instrumentation code.
     <<X>>
   end || X <- List >>
One reason for the limitation could be that we'll get
reduce/reduce conflicts if we try to allow an arbitrary
expression in a bit string comprehension:
binary_comprehension -> '<<' expr '||' lc_exprs '>>' :
{bc,?anno('$1'),'$2','$4'}.
Unfortunately, there does not seem to be an easy way to work
around that problem. The best we can do is to allow 'expr_max'
expressions (as in the binary syntax):
binary_comprehension -> '<<' expr_max '||' lc_exprs '>>' :
{bc,?anno('$1'),'$2','$4'}.
That will work, but function calls must be enclosed in
parentheses:
<< (foo(X)) || X <- List >>
|
|
* maint:
tools: Add a Cover test
tools: Fix wrong instrumentation of binary comprehensions
|
|
When cover instruments binary comprehensions it generates a
{block, ...} abstract code term inside the {bc, ...} term, which causes
the evaluation to fail at runtime. Removing the block statement
eliminates the error.
The template of a bit string comprehension cannot have a counter, since
it is not allowed to be a block.
|
|
cover:compile_beam and cover:compile_beam_directory crashed when
trying to compile a beam file without a 'file' attribute. This has
been corrected, so an error is returned instead.
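For illustration, a sketch of the calling pattern this enables (the helper name is hypothetical, not from cover.erl):

    maybe_cover_compile(BeamFile) ->
        case cover:compile_beam(BeamFile) of
            {ok, _Module} ->
                ok;
            {error, Reason} ->
                %% e.g. a beam lacking abstract code or, after this fix,
                %% one without a 'file' attribute
                {skipped, BeamFile, Reason}
        end.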
|
|
This was introduced by ab435488a.
If a module contains lines with numbers less than 1, for example a module
that includes `eunit.hrl`, the coverage lines are missing from its cover
output file.
|
|
* gomoripeti/tools/cover-no-beam/OTP-12806:
cover: handle undefined module when analysing to file
|
|
It is possible that not just the source but even the beam of a module
is unavailable when calling analyse_to_file, for example when coverdata
was imported from an old file and the module has since been removed.
Before this fix cover:analyse_to_file/3 could never return, because the
helper process crashed with error:undef and never replied to the caller.
At the same time, link the helper process to cover_server so that any
further error won't leave the caller waiting indefinitely.
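A usage sketch of the call in question (the output file name and options are illustrative):

    analyse_module(Module, OutFile) ->
        %% With the fix, a module whose beam and source are gone, e.g.
        %% one known only from imported coverdata, yields an error tuple
        %% instead of leaving the caller hanging.
        case cover:analyse_to_file(Module, OutFile, [html]) of
            {ok, _OutFile}  -> ok;
            {error, Reason} -> {skipped, Module, Reason}
        end.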
|
|
If it is not unstuck, faulty error messages will appear in error_logger_warn_SUITE.
|
|
Use the 'raw', 'delayed_write', and 'read_ahead' options to speed up file
operations. The analysis at the end of:
ts:run(compiler, [batch,cover]).
is now roughly twice as fast.
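A minimal sketch of the option combinations referred to (the helper and file names are assumptions, not the actual call sites in cover.erl):

    open_for_analysis(SrcFile, OutFile) ->
        %% 'raw' bypasses the file server process, 'read_ahead' buffers
        %% reads and 'delayed_write' batches small writes.
        {ok, In}  = file:open(SrcFile, [read, raw, read_ahead]),
        {ok, Out} = file:open(OutFile, [write, raw, delayed_write]),
        {In, Out}.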
|
|
Add functions for cover compilation and analysis on multiple
files. This allows for more parallelisation.
All functions for cover compilation can now take a list of
modules/files.
cover:analyse/analyze and cover:analyse_to_file/analyze_to_file can be
called without the Modules argument in order to analyse all cover
compiled and imported modules, or with a list of modules.
The number of lookups in ets tables is also reduced, which has
improved the performance when analysing and resetting cover data.
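A rough usage sketch of the extended API (return values are only bound, not matched; their exact shapes are in the cover documentation):

    run(Modules) ->
        %% cover:compile_beam/1 now also accepts a list of modules.
        CompileResults = cover:compile_beam(Modules),
        %% ... run the test suites here ...
        %% With the Modules argument left out, all cover compiled and
        %% imported modules are analysed in one call.
        AnalyseResult = cover:analyse(coverage, line),
        {CompileResults, AnalyseResult}.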
|
|
OTP-8188 introduced a fix for handling last expressions in
expressions like case, try and friends. However, the fix did not
account for the fact that some of those expressions, like receive, may
have no clauses (only an after clause), leading to a function
clause error when cover compiling code with such expressions.
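A minimal example of such an expression, added for illustration:

    receive
    after Timeout ->
        ok
    end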
|
|
* nox/maps-support-cover/OTP-11764:
Support maps in cover
Conflicts:
lib/tools/src/cover.erl
|
|
The raw_abstract_v1 format that is currently used was introduced
in R9C. Beam files that old cannot be executed by the current
run-time system, so there is no need to keep supporting the old
formats. Removing that support will increase coverage.
|
|
We want to see at least some coverage of cover itself.
|
|
* nox/tools/cover-record-update:
Properly munge record updates in cover
Don't munge record and field names in cover
|
|
Trees {record,Line,Arg,Name,Fields} were not munged.
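For reference, a hedged illustration of the tree shape in question (the expression and annotations are an example only):

    show_record_update_tree() ->
        {ok, Tokens, _} = erl_scan:string("S#state{count = f(X)}."),
        {ok, [Expr]} = erl_parse:parse_exprs(Tokens),
        %% Expr = {record, 1, {var,1,'S'}, state,
        %%         [{record_field, 1, {atom,1,count},
        %%           {call, 1, {atom,1,f}, [{var,1,'X'}]}}]}
        %% Both Arg and the field values contain expressions that
        %% need munging.
        Expr.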
|
|
They are bare atoms or variables in the abstract format; there is no
need to pass them through munge_expr/2.
|
|
This adds optional names to fun expressions. A named fun expression
is parsed as a tuple `{named_fun,Loc,Name,Clauses}` in erl_parse.
If a fun expression has a name, it must be present and be the same in
every one of its clauses. The function name shadows the environment of
the expression, and it is in turn shadowed by the environment of the
clauses' arguments. An unused function name triggers a warning unless
it is prefixed by _, just like any other variable.
Variable _ is allowed as a function name.
It is not an error to put a named function in a record field default
value.
When transforming to Core Erlang, the named fun Fun is changed into
the following expression:
letrec 'Fun'/Arity =
    fun (Args) ->
        let <Fun> = 'Fun'/Arity
        in Case
in 'Fun'/Arity
where Args is the list of arguments of 'Fun'/Arity and Case is the
Core Erlang expression corresponding to the clauses of Fun.
This transformation allows us to entirely skip any k_var to k_local
transformation in the bodies of the fun's clauses.
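A source-level example of the new syntax, added for illustration:

    Fact = fun F(0) -> 1;
               F(N) when N > 0 -> N * F(N - 1)
           end,
    120 = Fact(5)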
|
|
Similarly to cover compiling from source (in which case some user
specified compiler options are allowed), when cover compiling from an
existing beam, take a filtered list of compiler options from the beam
file. This way e.g. export_all can be preserved. See use case in
eb02beb1c3
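A rough sketch of reading and filtering the options from the beam file (the exact filter is an assumption, not the one in cover.erl):

    filtered_options(BeamFile) ->
        {ok, {_Mod, [{compile_info, Info}]}} =
            beam_lib:chunks(BeamFile, [compile_info]),
        Options = proplists:get_value(options, Info, []),
        [Opt || Opt <- Options, keep_option(Opt)].

    %% Keep only options that are safe to reapply when recompiling
    %% the munged forms, e.g. export_all and macro/include settings.
    keep_option(export_all) -> true;
    keep_option({i, _})     -> true;
    keep_option({d, _})     -> true;
    keep_option({d, _, _})  -> true;
    keep_option(_)          -> false.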
|
|
When cover:stop(Node) was called on a non-existing node, a process
waiting for cover data from the node would hang forever. This has been
corrected.
|
|
Commit 29231033 made cover fall back to compile info if source was not
found in pwd or ../src. This isn't sufficient for source that lies in
subdirectories of ../src when beams and source have been moved since
compilation (e.g. the install of some OTP applications), so first try finding
source relative to the beam directory.
For example, given a beam path
/installed/path/to/app-1.0/ebin/root.beam
and a source path
/compiled/path/to/app/src/subdir/root.erl
then look for
/installed/path/to/app-1.0/ebin/../src/subdir/root.erl
before the source path.
|
|
Cover was rewriting remote calls in guards as non-remote calls. That
is, if a guard contained erlang:is_binary(Binary), Cover incorrectly
removed the erlang prefix, which led to errors if is_binary is not
auto-imported. This commit keeps the abstract format as it is.
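A minimal example of the kind of code affected (hypothetical module, for illustration only):

    -module(guard_demo).
    -export([handle/1]).
    -compile({no_auto_import, [is_binary/1]}).

    %% If cover rewrote erlang:is_binary/1 as a bare is_binary/1 call,
    %% the guard would no longer refer to the BIF and compilation of
    %% the munged forms would fail.
    handle(Bin) when erlang:is_binary(Bin) -> binary;
    handle(_Other) -> other.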
|
|
Whenever a module is compiled via compile:forms/2, the source is set
to the current directory unless a source option is passed to compile.
This commit ensures that cover passes the source information to
compile:forms/2 so that the recorded source won't be modified after
the module is cover compiled.
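A sketch of the idea (the helper and option list are assumptions, not the exact call in cover.erl):

    recompile(MungedForms, SourceFile, UserOptions) ->
        %% Passing the source keeps the original path in the compile
        %% info of the cover compiled module.
        {ok, _Module, _Binary} =
            compile:forms(MungedForms,
                          [binary, {source, SourceFile} | UserOptions]).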
|
|
Prior to this commit, cover relied on a simple heuristic that
traverses directories from the beam file to find a source file.
The heuristic is kept by this patch but, if it fails, cover falls
back to the source value in the module's compile info.
To illustrate how it works, one of the tests that could not find its
source now passes successfully (showing the source lookup is more
robust).
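A sketch of the fallback (the helper name is an assumption):

    source_from_compile_info(Module) ->
        %% Fall back to the path recorded by the compiler if the
        %% directory heuristic around the beam file finds nothing.
        CompileInfo = Module:module_info(compile),
        proplists:get_value(source, CompileInfo).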
|
|
* siri/cover/new-bugs-r16/OTP-10638:
[cover] Cleanup by stopping cover between tests
[common_test] Stop cover on slave node after node is terminated
[test_server] Stop cover on node after node is terminated
[cover] Fix timing dependent bug in cover_SUITE:reconnect
[cover] Remove stopped node also from lost_nodes list
[cover] Don't mark stopped node as lost
|
|
Code written by Siri Hansen.
|
|
A node that was stopped with cover:stop/1 while marked as lost would
not be removed from the list of lost nodes. Therefore, if a nodeup was
later received for a node with the same name, it would be
reconnected. This has been corrected.
|
|
Nodes that were stopped with cover:stop/1 were marked as lost and
would be reconnected if a nodeup was later received for a node with
the same name. This has been corrected.
|
|
OTP-10523
Earlier, if the connection to a remote cover node was lost, all cover
data was lost and the cover_server on the remote node would die. This
would cause problems if there were cover compiled modules that would
still be executed since they would attempt to write to the no longer
existing ets tables belonging to the cover_server.
This commit changes this behavior so that the cover_server on the
remote node will survive connection loss and continue collecting cover
data. If the connection is re-established then the main node will sync
with the remote node again and cover data will not be lost (unless the
node was down).
|
|
OTP-10523
Since there will probably be cover compiled modules left on the remote
node, we want to keep the cover_server and its ets tables even if the
connection to the remote node is lost. This way it will not crash (due
to ets:insert into a non-existing table) if functions in cover compiled
modules are executed.
|
|
OTP-10523
* Added cover:flush(Nodes), which will fetch data from remote nodes
without stopping cover on those nodes.
* Added cover:get_main_node(), which returns the node name of the main
node. This is used by test_server to avoid {error,not_main_node}
when a slave starts another slave (e.g. in test_server's own tests).
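A usage sketch of the two new functions (the surrounding flow is illustrative only):

    collect_remote_cover(Nodes) ->
        %% Fetch counters from the remote nodes without stopping cover
        %% there, so the cover compiled modules keep counting.
        ok = cover:flush(Nodes),
        %% Ask which node runs the main cover server, e.g. before a
        %% slave tries to start cover on yet another slave.
        cover:get_main_node().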
|
|
After stopping cover with cover:stop() there could still be a leftover
{'DOWN',...} message in the calling process's message queue.
This unexpected leftover is eliminated by using erlang:demonitor/2 with
the flush option at the appropriate points.
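A generic illustration of the pattern (not the actual cover code):

    call(Pid, Request) ->
        Ref = erlang:monitor(process, Pid),
        Pid ! {self(), Request},
        receive
            {reply, Pid, Reply} ->
                %% 'flush' removes any {'DOWN', Ref, ...} message that
                %% has already arrived, so none is left in the queue.
                erlang:demonitor(Ref, [flush]),
                Reply;
            {'DOWN', Ref, process, Pid, Reason} ->
                {error, Reason}
        end.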
|
|
File descriptors used to import cover data were left open.
When cover data is exported and imported many times, the leaked
descriptors cause an error.
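The failing pattern, roughly (file names and counts are illustrative):

    export_import_cycle(N) ->
        %% Before the fix, each import left its file descriptor open,
        %% so repeating this often enough exhausted the descriptors.
        lists:foreach(
          fun(I) ->
                  File = "cover_" ++ integer_to_list(I) ++ ".coverdata",
                  ok = cover:export(File),
                  ok = cover:import(File)
          end,
          lists:seq(1, N)).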
|
|
* ts/cover-with-export_all:
add user specified compiler options on form reloading
OTP-9204
|
|
Add write concurrency to the cover master's ?COVER_TABLE
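A generic illustration of the ets option involved (the table name and option list here are a sketch, not the actual ?COVER_TABLE definition):

    create_table() ->
        %% {write_concurrency, true} lets counter bumps from many
        %% processes proceed with less lock contention.
        ets:new(cover_internal_data_table,
                [set, public, named_table, {write_concurrency, true}]).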
|
|