[BlockSparseArrays] Use new SparseArrayDOK type in BlockSparseArrays #1272

Merged · 34 commits · Dec 2, 2023
Changes from 1 commit
Commits
6401807
[NDTensors] Start SparseArrayDOKs module
mtfishman Nov 25, 2023
9e4142c
Reorganization
mtfishman Nov 26, 2023
0677dfa
Format
mtfishman Nov 26, 2023
995c3f9
Start making SparseArrayInterface module, to be used by SparseArrayDO…
mtfishman Nov 28, 2023
b696a7e
Update interface
mtfishman Nov 28, 2023
e984cbe
Add another test
mtfishman Nov 28, 2023
475ee31
Improved map
mtfishman Nov 28, 2023
36961f8
New SparseArrayDOKs using SparseArrayInterface
mtfishman Nov 28, 2023
25f8229
Fix namespace issues
mtfishman Nov 29, 2023
3b3c50d
One more namespace issue
mtfishman Nov 29, 2023
a20e819
More namespace issues
mtfishman Nov 29, 2023
d571bd9
Julia 1.6 backwards compatibility
mtfishman Nov 29, 2023
410df32
Use SparseArrayInterface in DiagonalArrays
mtfishman Nov 29, 2023
41c9816
Format
mtfishman Nov 29, 2023
420692a
Fix loading issue
mtfishman Nov 29, 2023
c6dcefe
Missing include, improve README
mtfishman Nov 29, 2023
2478166
[BlockSparseArrays] Start using SparseArrayDOK
mtfishman Nov 29, 2023
43929be
Small fixes
mtfishman Nov 30, 2023
f70a520
Merge branch 'main' into NDTensors_new_BlockSparseArrays
mtfishman Nov 30, 2023
41b92b8
Change SparseArray to SparseArrayDOK
mtfishman Nov 30, 2023
af5ff02
Format
mtfishman Nov 30, 2023
a1733f6
Temporarily remove broken tests
mtfishman Nov 30, 2023
1be1d00
Introduce AbstractSparseArray, start rewriting BlockSparseArray
mtfishman Dec 1, 2023
97f3df4
Move AbstractSparseArray to SparseArrayInterface
mtfishman Dec 1, 2023
32d375a
Improve testing and organization
mtfishman Dec 1, 2023
0ea3eee
DiagonalArrays reorganization and simplification
mtfishman Dec 1, 2023
f918aee
Get more BlockSparseArrays tests passing
mtfishman Dec 1, 2023
fc0ff14
Move arraytensor code to backup files
mtfishman Dec 1, 2023
8487d0c
Move arraystorage code to backup files
mtfishman Dec 1, 2023
fc9ff82
Try fixing tests
mtfishman Dec 1, 2023
b5b643d
Comment
mtfishman Dec 1, 2023
796f33d
Merge main
mtfishman Dec 1, 2023
4d1453d
Fix namespace issue
mtfishman Dec 1, 2023
1f04d9c
Remove arraytensor test
mtfishman Dec 1, 2023
[NDTensors] Start SparseArrayDOKs module
mtfishman committed Nov 25, 2023
commit 6401807f237d937b2e3908ee2b07d94261f6a540
46 changes: 19 additions & 27 deletions NDTensors/src/NDTensors.jl
@@ -19,33 +19,25 @@ using Strided
using TimerOutputs
using TupleTools

# TODO: List types, macros, and functions being used.
include("lib/AlgorithmSelection/src/AlgorithmSelection.jl")
using .AlgorithmSelection: AlgorithmSelection
include("lib/BaseExtensions/src/BaseExtensions.jl")
using .BaseExtensions: BaseExtensions
include("lib/SetParameters/src/SetParameters.jl")
using .SetParameters
include("lib/BroadcastMapConversion/src/BroadcastMapConversion.jl")
using .BroadcastMapConversion: BroadcastMapConversion
include("lib/Unwrap/src/Unwrap.jl")
using .Unwrap
include("lib/RankFactorization/src/RankFactorization.jl")
using .RankFactorization: RankFactorization
include("lib/TensorAlgebra/src/TensorAlgebra.jl")
using .TensorAlgebra: TensorAlgebra
include("lib/DiagonalArrays/src/DiagonalArrays.jl")
using .DiagonalArrays
include("lib/BlockSparseArrays/src/BlockSparseArrays.jl")
using .BlockSparseArrays
include("lib/NamedDimsArrays/src/NamedDimsArrays.jl")
using .NamedDimsArrays: NamedDimsArrays
include("lib/SmallVectors/src/SmallVectors.jl")
using .SmallVectors
include("lib/SortedSets/src/SortedSets.jl")
using .SortedSets
include("lib/TagSets/src/TagSets.jl")
using .TagSets
for lib in [
:AlgorithmSelection,
:BaseExtensions,
:SetParameters,
:BroadcastMapConversion,
:Unwrap,
:RankFactorization,
:TensorAlgebra,
:DiagonalArrays,
:SparseArrayDOKs,
:BlockSparseArrays,
:NamedDimsArrays,
:SmallVectors,
:SortedSets,
:TagSets,
]
include("lib/$(lib)/src/$(lib).jl")
@eval using .$lib: $lib
end

using Base: @propagate_inbounds, ReshapedArray, DimOrInd, OneTo

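The diff above replaces fourteen explicit `include`/`using` pairs with a loop over library names. A minimal, self-contained sketch of the same `@eval using .$lib: $lib` metaprogramming pattern — using hypothetical toy submodules defined inline instead of files pulled in with `include` — could look like:

```julia
# Sketch of the loop-based submodule loading pattern from the diff above.
# Hypothetical toy submodules Foo and Bar stand in for the NDTensors libs;
# the real code instead does include("lib/$(lib)/src/$(lib).jl").
module Demo
for lib in [:Foo, :Bar]
    # Define a toy submodule named `lib` (the real code uses `include`).
    @eval module $lib
        greet() = string($(QuoteNode(lib)), " loaded")
    end
    # Bring the submodule into scope under its own name.
    @eval using .$lib: $lib
end
end

Demo.Foo.greet()  # "Foo loaded"
```

The interpolation `$lib` splices each symbol into the quoted expression before `@eval` evaluates it at the top level of `Demo`, which is what lets one loop body stand in for a column of near-identical `include`/`using` pairs.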
2 changes: 2 additions & 0 deletions NDTensors/src/abstractarray/similar.jl
@@ -1,3 +1,5 @@
using .Unwrap: IsWrappedArray

## Custom `NDTensors.similar` implementation.
## More extensive than `Base.similar`.

3 changes: 3 additions & 0 deletions NDTensors/src/abstractarray/tensoralgebra/contract.jl
@@ -1,4 +1,7 @@
using LinearAlgebra: BlasFloat
using .Unwrap: expose

# TODO: Delete these exports
export backend_auto, backend_blas, backend_generic

@eval struct GemmBackend{T}
5 changes: 4 additions & 1 deletion NDTensors/src/array/permutedims.jl
@@ -1,4 +1,7 @@
## Create the Exposed version of Base.permutedims
using .Unwrap: Exposed, unexpose

# TODO: Move to `Unwrap` module.
# Create the Exposed version of Base.permutedims
function permutedims(E::Exposed{<:Array}, perm)
## Creating Mperm here to evaluate the permutation and
## avoid returning a Stridedview
2 changes: 2 additions & 0 deletions NDTensors/src/array/set_types.jl
@@ -1,3 +1,5 @@
using .SetParameters: Position, set_parameters

"""
TODO: Use `Accessors.jl` notation:
```julia
@@ -1,3 +1,6 @@
using .BlockSparseArrays: BlockSparseArray
using .DiagonalArrays: DiagonalArray

# Used for dispatch to distinguish from Tensors wrapping TensorStorage.
# Remove once TensorStorage is removed.
const ArrayStorage{T,N} = Union{
@@ -1,3 +1,7 @@
# TODO: Change to:
# using .SparseArrayDOKs: SparseArrayDOK
using .BlockSparseArrays: SparseArray

# TODO: This is inefficient, need to optimize.
# Look at `contract_labels`, `contract_blocks` and `maybe_contract_blocks!` in:
# src/blocksparse/contract_utilities.jl
7 changes: 7 additions & 0 deletions NDTensors/src/imports.jl
@@ -1,3 +1,10 @@
# Makes `cpu` available as `NDTensors.cpu`.
# TODO: Define `cpu`, `cu`, etc. in a module `DeviceAbstractions`,
# similar to:
# https://github.com/JuliaGPU/KernelAbstractions.jl
# https://github.com/oschulz/HeterogeneousComputing.jl
using .Unwrap: cpu

import Base:
# Types
AbstractFloat,
156 changes: 156 additions & 0 deletions NDTensors/src/lib/SparseArrayDOKs/src/SparseArrayDOKs.jl
@@ -0,0 +1,156 @@
module SparseArrayDOKs
using Dictionaries: Dictionary, set!
using SparseArrays: SparseArrays, AbstractSparseArray

# Also look into:
# https://juliaarrays.github.io/ArrayInterface.jl/stable/sparsearrays/

# Required `SparseArrayInterface` interface.
# https://github.com/Jutho/SparseArrayKit.jl interface functions
nonzero_keys(a::AbstractArray) = error("Not implemented")
nonzero_values(a::AbstractArray) = error("Not implemented")
nonzero_pairs(a::AbstractArray) = error("Not implemented")

# A dictionary-like structure
# TODO: Rename `nonzeros`, `structural_nonzeros`, etc.?
nonzero_structure(a::AbstractArray) = error("Not implemented")

# Derived `SparseArrayInterface` interface.
nonzero_length(a::AbstractArray) = length(nonzero_keys(a))
is_structural_nonzero(a::AbstractArray, I) = I ∈ nonzero_keys(a)

# Overload if zero value is index dependent or
# doesn't match element type.
zerotype(a::AbstractArray) = eltype(a)
getindex_nonzero(a::AbstractArray, I) = nonzero_structure(a)[I]
getindex_zero(a::AbstractArray, I) = zero(zerotype(a))
function setindex_zero!(a::AbstractArray, value, I)
# TODO: This may need to be modified.
nonzero_structure(a)[I] = value
return a
end
function setindex_nonzero!(a::AbstractArray, value, I)
nonzero_structure(a)[I] = value
return a
end

struct Zero
end
(::Zero)(type, I) = zero(type)

default_zero_type(type::Type) = type
default_zero() = Zero() # (eltype, I) -> zero(eltype)
default_keytype(ndims::Int) = CartesianIndex{ndims}
default_data(type::Type, ndims::Int) = Dictionary{default_keytype(ndims),type}()

# TODO: Define a constructor with a default `zero`.
struct SparseArrayDOK{T,N,ZT,Zero} <: AbstractSparseArray{T,Int,N}
data::Dictionary{CartesianIndex{N},T}
dims::NTuple{N,Int}
zero::Zero
end

# Constructors
function SparseArrayDOK{T,N,ZT,Zero}(dims::Tuple{Vararg{Int}}, zero) where {T,N,ZT,Zero}
return SparseArrayDOK{T,N,ZT,Zero}(default_data(T, N), dims, zero)
end

function SparseArrayDOK{T,N,ZT}(dims::Tuple{Vararg{Int}}, zero) where {T,N,ZT}
return SparseArrayDOK{T,N,ZT,typeof(zero)}(dims, zero)
end

function SparseArrayDOK{T,N,ZT}(dims::Tuple{Vararg{Int}}) where {T,N,ZT}
return SparseArrayDOK{T,N,ZT}(dims, default_zero())
end

function SparseArrayDOK{T,N}(dims::Tuple{Vararg{Int}}) where {T,N}
return SparseArrayDOK{T,N,default_zero_type(T)}(dims)
end

function SparseArrayDOK{T}(dims::Tuple{Vararg{Int}}) where {T}
return SparseArrayDOK{T,length(dims)}(dims)
end

function SparseArrayDOK{T}(dims::Int...) where {T}
return SparseArrayDOK{T}(dims)
end

# undef
function SparseArrayDOK{T,N,ZT,Zero}(::UndefInitializer, dims::Tuple{Vararg{Int}}, zero) where {T,N,ZT,Zero}
return SparseArrayDOK{T,N,ZT,Zero}(dims, zero)
end

function SparseArrayDOK{T,N,ZT}(::UndefInitializer, dims::Tuple{Vararg{Int}}, zero) where {T,N,ZT}
return SparseArrayDOK{T,N,ZT}(dims, zero)
end

function SparseArrayDOK{T,N,ZT}(::UndefInitializer, dims::Tuple{Vararg{Int}}) where {T,N,ZT}
return SparseArrayDOK{T,N,ZT}(dims)
end

function SparseArrayDOK{T,N}(::UndefInitializer, dims::Tuple{Vararg{Int}}) where {T,N}
return SparseArrayDOK{T,N}(dims)
end

function SparseArrayDOK{T}(::UndefInitializer, dims::Tuple{Vararg{Int}}) where {T}
return SparseArrayDOK{T}(dims)
end

function SparseArrayDOK{T}(::UndefInitializer, dims::Int...) where {T}
return SparseArrayDOK{T}(dims...)
end

# Required `SparseArrayInterface` interface
nonzero_structure(a::SparseArrayDOK) = a.data
# TODO: Make this a generic function.
nonzero_keys(a::SparseArrayDOK) = keys(nonzero_structure(a))

# Julia Base `AbstractSparseArray` interface
SparseArrays.nnz(a::SparseArrayDOK) = nonzero_length(a)

# Optional SparseArray interface
# TODO: Use `SetParameters`.
zerotype(a::SparseArrayDOK{<:Any,<:Any,ZT}) where {ZT} = ZT
zero_value(a::SparseArrayDOK, I) = a.zero(zerotype(a), I)

# Accessors
Base.size(a::SparseArrayDOK) = a.dims

function Base.getindex(a::SparseArrayDOK{<:Any,N}, I::Vararg{Int,N}) where {N}
return a[CartesianIndex(I)]
end

function Base.getindex(a::SparseArrayDOK{<:Any,N}, I::CartesianIndex{N}) where {N}
if !is_structural_nonzero(a, I)
return getindex_zero(a, I)
end
return getindex_nonzero(a, I)
end

# `SparseArrayInterface` interface
function setindex_zero!(a::SparseArrayDOK, value, I)
# TODO: This is specific to the `Dictionaries.jl`
# interface, make more generic?
set!(nonzero_structure(a), I, value)
return a
end


function Base.setindex!(a::SparseArrayDOK{<:Any,N}, value, I::Vararg{Int,N}) where {N}
a[CartesianIndex(I)] = value
return a
end

function Base.setindex!(a::SparseArrayDOK{<:Any,N}, value, I::CartesianIndex{N}) where {N}
if !is_structural_nonzero(a, I)
setindex_zero!(a, value, I)
end
setindex_nonzero!(a, value, I)
return a
end

# similar
# TODO: How does this deal with the converting the zero type?
Base.similar(a::SparseArrayDOK{T,N,ZT,Zero}) where {T,N,ZT,Zero} = SparseArrayDOK{T,N,ZT,Zero}(undef, size(a), a.zero)

end
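For context, the dictionary-of-keys (DOK) storage scheme this file implements can be sketched in a few lines. The `DOKMatrix` type below is a hypothetical minimal illustration of the idea, not the `SparseArrayDOK` API:

```julia
# Minimal dictionary-of-keys (DOK) sparse matrix sketch (hypothetical type,
# not SparseArrayDOK itself): stored entries live in a Dict keyed by
# CartesianIndex, and absent keys read as zero(T).
struct DOKMatrix{T} <: AbstractMatrix{T}
    data::Dict{CartesianIndex{2},T}
    dims::Tuple{Int,Int}
end
DOKMatrix{T}(dims::Tuple{Int,Int}) where {T} = DOKMatrix{T}(Dict{CartesianIndex{2},T}(), dims)

Base.size(a::DOKMatrix) = a.dims
Base.getindex(a::DOKMatrix{T}, i::Int, j::Int) where {T} = get(a.data, CartesianIndex(i, j), zero(T))
function Base.setindex!(a::DOKMatrix, v, i::Int, j::Int)
    a.data[CartesianIndex(i, j)] = v
    return a
end

a = DOKMatrix{Float64}((3, 4))
a[1, 2] = 1.5  # only this entry is stored; a[3, 3] still reads as 0.0
```

The real implementation layers a configurable `zero` functor and the `nonzero_*` interface functions on top of this basic scheme, which is what the `getindex_zero`/`setindex_zero!` indirection above is for.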
43 changes: 43 additions & 0 deletions NDTensors/src/lib/SparseArrayDOKs/test/runtests.jl
@@ -0,0 +1,43 @@
@eval module $(gensym())
using Test: @test, @testset
using NDTensors.SparseArrayDOKs: SparseArrayDOK, nonzero_keys, nonzero_length
using SparseArrays: nnz
@testset "SparseArrayDOK (eltype=$elt)" for elt in (
Float32, ComplexF32, Float64, ComplexF64
)
a = SparseArrayDOK{elt}(3, 4)
@test a == SparseArrayDOK{elt}((3, 4))
@test a == SparseArrayDOK{elt}(undef, 3, 4)
@test a == SparseArrayDOK{elt}(undef, (3, 4))
@test iszero(a)
@test iszero(nnz(a))
@test nonzero_length(a) == nnz(a)
@test size(a) == (3, 4)
@test eltype(a) == elt
for I in eachindex(a)
@test iszero(a[I])
@test a[I] isa elt
end
@test isempty(nonzero_keys(a))

x12 = randn(elt)
x23 = randn(elt)
b = copy(a)
@test b isa SparseArrayDOK{elt}
@test iszero(b)
b[1, 2] = x12
b[2, 3] = x23
@test iszero(a)
@test !iszero(b)
@test b[1, 2] == x12
@test b[2, 3] == x23

# To test:
# reshape
# zero (PermutedDimsArray)
# map[!]
# broadcast
# Custom zero type
# conversion to `SparseMatrixCSC`
end
end
6 changes: 6 additions & 0 deletions NDTensors/src/lib/Unwrap/README.md
@@ -1,3 +1,9 @@
# Unwrap

A module to unwrap complex array types to assist in the generic programming of array-type based functions.

Related:
- https://juliaarrays.github.io/ArrayInterface.jl/stable/wrapping/
- https://github.com/JuliaGPU/Adapt.jl
- https://github.com/chengchingwen/StructWalk.jl
- https://github.com/FluxML/Functors.jl
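The kind of wrapper peeling the README describes can be sketched with plain dispatch on Base's wrapper types. The `unwrap` function below is a hypothetical helper for illustration, not the Unwrap module's API:

```julia
# Hypothetical sketch of recursively unwrapping nested array wrappers
# down to the innermost parent array (not the actual Unwrap API).
unwrap(a::AbstractArray) = a  # base case: an unwrapped array
unwrap(a::SubArray) = unwrap(parent(a))
unwrap(a::PermutedDimsArray) = unwrap(parent(a))
unwrap(a::Base.ReshapedArray) = unwrap(parent(a))

x = randn(4, 4)
v = view(PermutedDimsArray(x, (2, 1)), 1:2, 1:2)
unwrap(v) === x  # the nested wrappers peel back to the original array
```

Generic array code can use this kind of helper to dispatch on the innermost storage type (e.g. a GPU array) regardless of how many views, permutations, or reshapes are layered on top.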
1 change: 1 addition & 0 deletions NDTensors/test/lib/runtests.jl
@@ -10,6 +10,7 @@ using Test: @testset
"SetParameters",
"SmallVectors",
"SortedSets",
"SparseArrayDOKs",
"TagSets",
"TensorAlgebra",
"Unwrap",
4 changes: 3 additions & 1 deletion src/imports.jl
@@ -110,14 +110,16 @@ import LinearAlgebra:
tr,
transpose

using ITensors.NDTensors.Unwrap:
cpu

using ITensors.NDTensors:
Algorithm,
@Algorithm_str,
EmptyNumber,
_Tuple,
_NTuple,
blas_get_num_threads,
cpu,
cu,
disable_auto_fermion,
double_precision,
1 change: 1 addition & 0 deletions test/Project.toml
@@ -9,6 +9,7 @@ ITensors = "9136182c-28ba-11e9-034c-db9fb085ebd5"
JLD2 = "033835bb-8acc-5ee8-8aae-3f567f8a3819"
KrylovKit = "0b1a1467-8014-51b9-945f-bf0ae24f4b77"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
NDTensors = "23ae76d9-e61a-49c4-8f12-3f1a16adf9cf"
OptimKit = "77e91f04-9b3b-57a6-a776-40b61faaebe0"
Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
QuadGK = "1fd47b50-473d-5c70-9696-f719f8f3bcdc"